FIRE’s campus free-speech rankings: A closer look

By Ken Paulson, published on September 26, 2023


Photo courtesy iStock

The 2023 College Free Speech Rankings survey from the Foundation for Individual Rights and Expression is an ambitious effort to essentially answer one question: “What is the state of free speech on America’s college campuses?”


That’s quite a challenge. Some colleges are private. Others are public. Demographics, curricula, locations and institutions all vary. 


It’s a formidable task with a valuable objective. The next generation of American citizens needs to live and learn in environments where ideas are valued and opinions are shared. 

The study encompassed data from 55,102 students at 254 colleges and universities. From the data collected by College Pulse, FIRE ranked the schools from best to worst in support for free speech, with Michigan Technological University at the top and Harvard University — with an abysmal rating — at the bottom. 


Among the study’s findings: 


    • In contrast to the top-ranked colleges’ spirit of openness, the lowest-ranked schools were less tolerant of controversial speakers on campus and more accepting of violent protests to block such speakers.

    • More than half of the students reported that they were concerned about harming their reputation by saying or doing something that others would misunderstand.

    • 45 percent of respondents said it would be acceptable for protesting students to bar other students from a speech, while 27 percent said using violence to stop a controversial speech on campus could be acceptable.

The researchers’ assessments included 13 components. Six focused on students’ perceptions of the campus climate for free speech, including their comfort level in expressing ideas and tolerance for liberal or conservative speakers. 


The remaining seven components tracked the conduct of administrators, faculty and students in addressing free-speech issues.


FIRE’s longstanding Spotlight Database — a rating of campus policies that have an impact on freedom of student expression — was also factored into the ratings.


The full results and methodology can be found on FIRE’s website.


The survey continues to evolve. The Free Speech Center asked Adam Goldstein, FIRE’s vice president for research, and Sean Stevens, director of polling and analytics, about the survey’s process and potential in this Q&A:


Q: As researchers, what were your own takeaways from the study? Should it be viewed as a literal ranking of best to worst?


A: No, not in a literal sense, though that’s one way to express it — best and worst in a particular window of time on one particular issue. You could also think of it as a temperature sample. We measured the temperature for speech and here’s a ranking from hot to cold within a particular time frame. None of it is a referendum on the value of the institution, the quality of its administration or faculty, or its ability to produce students who believe in freedom of speech. 


I think focusing too much on the actual score or rank of a school misses the forest for the trees. For instance, we have conducted this survey for the past four years. We’ve expanded the number of schools surveyed each year, but there are 159 that have been surveyed at least three times, and 55 of them were surveyed all four years. We can say with a degree of confidence that the following schools consistently do well:


    • U. of Chicago

    • U. of New Hampshire

    • Florida State

    • Kansas State

    • Oregon State

    • Mississippi State

    • George Mason

    • Purdue

    • North Carolina State

    • U. of Virginia

    • U. of Colorado, Boulder 

    • (Objectively speaking) Texas A&M. I say “objectively speaking” because it’s accurate to put them here based on how they’ve done all four years, but they probably won’t do as well next year. After this year’s scores and rankings were finalized, FIRE became aware of at least one incident that will likely result in A&M’s being penalized in next year’s rankings.

And, we can also say with confidence that the following schools consistently perform poorly:


    • Boston College

    • Fordham

    • Georgetown

    • Grinnell

    • Harvard

    • Marquette

    • Middlebury

    • Rensselaer Polytechnic Institute

    • U. of Texas, Austin

Plus, because we have increased the number of schools ranked each year, each rankings survey constitutes the largest-ever survey of college students’ free-speech attitudes (until the next year’s survey comes out). Many of the topline findings suggest that a notable portion of students embodies a “free speech for me but not for thee” mindset — as evidenced by some of the large discrepancies in support for allowing controversial liberal speakers on campus compared to controversial conservative ones (and in the rare cases of a heavily conservative student body, like Hillsdale and Liberty, the reversal of this pattern). 


Q: We see that FIRE’s very valuable speech-code assessment is included in the scoring of colleges, and that you’re incorporating it with the addition or subtraction of deviations. Can you tell us what the relative weight of the speech code is to the other factors? Was there a speech-code score for every school?


A: Short version: A green light provides a boost of one standard deviation. A yellow light results in a penalty of half a standard deviation, and a red light results in a penalty of one standard deviation. Essentially the spotlight rating is like an adjusted starting position.


A slightly longer version is, it’s messy, because … math. Because the scores are standardized before the speech-code score is applied, one standard deviation is 10, compared with a potential overall score of 100. But it’s precisely because the score is standardized before the speech code is factored in that it’s difficult to precisely measure the weight of the speech code against any other discrete component. 


But to weigh it against the other overall components is pretty easy — if you think of it in the old 100-point scoring systems we all encountered in high school, it’s a letter grade up for Green; half a letter grade down for Yellow; and a full letter grade down for Red.


Q: Is there any consideration given to external forces on the college? For example, when Florida imposes limits on what teachers can discuss in the classroom, that’s going to have a significant impact on public universities throughout the state. Is that factored in anywhere?


A: It depends. One of the reasons the University of Florida is ranked where it is is that it prevented law professors from offering pro-bono services and/or testifying in court cases. It appears that the administration feared backlash from the state government in a number of these incidents, so they are entries in our Scholars Under Fire database. In other words, if the limits on academic freedom imposed on faculty result in specific individual cases that result in Scholar Under Fire entries, then yes, it’s factored in. But, the overall policy itself is not — this is mainly because our spotlight ratings look only at student-speech policies. 


So, in a nutshell, in this version, it’s counted if a specific scholar has been affected. We’ve definitely talked about how to denote it further in future rankings. But because the primary purpose of the survey is to help parents and students pick colleges that respect their speech rights, it’s tricky. From the student’s perspective, it probably doesn’t matter whether a school policy limiting their speech is inspired by a state law or an administrative whim. Also, those state laws are going to affect student perception on the ground, and so they’re already being factored in by the survey itself, and adding a weight might double-count it. But we’ve thought about how we might highlight those laws in the future, whether it’s through math, or more like an asterisk on the rankings.


Q: Comfort expressing ideas and openness are, of course, essential elements of free speech on campus. Does any outside research exist for non-students in this age group?


A: There is no corresponding research on non-students in the same age group that uses our survey questions specifically. But, research in communication, political science, political psychology, and social psychology has been done on closely related constructs in non-student samples.


Q: Comfort expressing ideas appears to constitute about a third of a school’s overall score. Are we reading that correctly?


A: The maximum score on Comfort Expressing Ideas is 33, and that’s the highest possible score on any of the components. The Openness component is next with a maximum score of 20. Because we then add up all the components and standardize the scores, it’s not accurate to say Comfort accounts for 33% of the score, but it does carry the most weight out of any of the components — which we think makes sense since it is made up of the 5 questions that ask about comfort expressing views in different campus settings and the 3 questions that ask about self-censorship (without mentioning the term self-censorship).


Q: Does FIRE have any suggestions about how a university with a low ranking can best remedy its shortcomings?


A: The lowest-hanging fruit is revising speech policies that earn a yellow- or red-light rating. Making it clear to the students that the policies are being revised in favor of protecting speech would likely result in an increased Administrative Support score — schools with administrations that are known for their staunch support of free speech (e.g., U. Chicago, Purdue) tend to do very well on this component.


But (and I’m pulling this directly from a blog I wrote that just went up today; why reinvent the wheel, right?), publicly stating that free speech is the core value of a school — something that likely contributes to a strong score on Administrative Support — and writing good speech policies are only the beginning of creating an environment of open and robust conversation. The proof of whether a school truly supports free expression as a core value comes when that core value is inevitably tested by controversy.


The decisions administrators make in response to campus speech controversies are likely to have a more lasting influence on an individual school’s climate for free expression than its policies or its students’ perceptions of “Administrative Support.” When a decision is made unequivocally in defense of free speech, it sends one kind of message to a school’s students and faculty. When a response is tepid or, worse, is one that violates someone’s speech rights, it sends a very different kind of message.


Q: I understand that your schools come from College Pulse’s database, but do you have any flexibility within that framework? Is every state represented? Are there schools with FIRE speech-code ratings that are not in the study?


A: College Pulse has been a great partner and has worked to add schools on our request, which means actually recruiting panels. It’s a more difficult task than I’m making it sound, but they’ve been really good at it. Almost every state is represented, and we’ve worked to maintain a balance of demographic, economic, and regional categories. Of course, we’d like to add more!


Forty-nine of 50 states are represented for the second year in a row. The one state we have yet to survey a school in is South Dakota; we’re hoping to add at least one of their larger state universities to the pool for next year’s rankings. College Pulse will also attempt to build a panel at a specific school if there is interest in including it; it would only be ranked if we were able to meet a minimum sample size. This year, the average sample size per school was 217 individuals. There were 13 schools with fewer than 100 students sampled; the average sample size among these 13 was 85. Most of them are small liberal arts colleges (e.g., Bard, Harvey Mudd, Scripps, Smith) with enrollments around 2,000 or in some cases a bit less (Harvey Mudd’s undergraduate enrollment is under 1,000). So, even with an N of under 100 we are actually sampling a larger percentage of the student body at most of these schools, compared with the larger state universities that enroll tens of thousands of students (e.g., Ohio State, Rutgers, etc.).


The Spotlight ratings have been around much longer than the [College Free-Speech Rankings] — there are nearly twice as many schools with rated speech codes as there are schools in this year’s rankings. So we have a lot of room to grow there. We’ve rated just under 500 schools in total, so there are schools that have FIRE ratings that have not yet been included in the rankings. 


Q: If your resources were dramatically increased, what more would you like to know about free speech on campus?


A: First, expanding the rankings themselves. We’d love to add every school in the Spotlight database, and then expand both the rankings and the database to reach, eventually, about 600 schools. At that point, we’d be measuring institutions making up the vast majority — over 75% — of students in four-year higher education. Even adding another 50 or so might get us to 50%. 


Second, surveying faculty would add a much deeper understanding of the campuses. It’s also immensely difficult, for a number of reasons. Faculty are much more reluctant to express views that might affect their employers, and are much harder to incentivize than students. But we’re working on that. It’s just difficult with limited resources. 


Third, for the first time this year we asked students to identify who it is they do talk to about controversial political issues. I think this is an area that is underexplored and deserves more attention. Adding questions to examine this in more detail is something we are considering.


Finally, we’ve always envisioned this survey as a resource for academic researchers too, something akin to the General Social Survey or the American National Election Studies. We have identified a core set of questions that we’ve asked in at least 3 of the 4 surveys, and will continue to ask them every subsequent year. These questions are what we use to calculate the component scores that go into the overall score. 


But we will also introduce new questions each year and/or bring back questions asked in previous years that are not part of the rankings calculation. All of this allows us to track trends over time in students’ speech attitudes. It also allows us to have some flexibility to ask about new controversial issues that inevitably emerge over time (e.g., in 2021 we asked about comfort discussing George Floyd and COVID-19; last year we replaced those and asked about vaccine and mask mandates, and this year didn’t ask about mandates and replaced them with issues like inflation and the war in Ukraine).


Ken Paulson is the director of the Free Speech Center at Middle Tennessee State University.

The Free Speech Center newsletter offers a digest of First Amendment and news media-related news every other week. Subscribe for free here: https://bit.ly/3kG9uiJ

