Tuesday, November 16, 2021

Some Musings on College Rankings

 

College rankings continue to proliferate, and in many cases muddy the waters for prospective students and their families.

 

Most of the leading institutional rankings use an in-house algorithm that combines selected data points into a score. Many of those data points are meaningful measures of institutional health and selectivity, but they do not necessarily correlate with students’ academic experiences.

 

U.S. News, the most prominent of the undergraduate rankings, continues to adjust its formula. It reports that it currently uses the following data to build its scores (a brief sketch of how such a weighted composite works follows the list):

 

- Outcomes (40%)
  - First-year retention
  - Graduation rates and deviation from predicted graduation rate
  - Social mobility (graduating more lower-income students)
  - Student debt
- Faculty resources (20%)
  - Percentage of classes with fewer than 20 students
  - Percentage of classes with more than 50 students
  - Student-faculty ratio
  - Percentage of faculty who are full-time
- Expert opinion (20%)
- Financial resources (10%)
- Student excellence (7%)
  - Acceptance rate
  - 25th-75th percentile SAT or ACT scores
  - Percentage of first-year students who were in the top 10% of their high school class
- Alumni giving rate (3%)
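
To make the arithmetic concrete, here is a minimal sketch, in Python, of how a weighted composite of this kind might be computed. The weights mirror the published category percentages above, but the 0-100 category scores, the normalization, and the names used in the code are illustrative assumptions, not U.S. News’s actual method.

CATEGORY_WEIGHTS = {
    "outcomes": 0.40,
    "faculty_resources": 0.20,
    "expert_opinion": 0.20,
    "financial_resources": 0.10,
    "student_excellence": 0.07,
    "alumni_giving": 0.03,
}

def composite_score(category_scores):
    """Combine per-category scores (assumed to already be normalized
    to a 0-100 scale) into a single weighted composite."""
    return sum(weight * category_scores.get(name, 0.0)
               for name, weight in CATEGORY_WEIGHTS.items())

# A hypothetical college with strong outcomes but a modest reputation score:
example = {
    "outcomes": 85, "faculty_resources": 70, "expert_opinion": 60,
    "financial_resources": 55, "student_excellence": 65, "alumni_giving": 40,
}
print(round(composite_score(example), 2))  # 71.25

Nothing in that calculation touches what actually happens in a classroom; the composite simply rewards whatever inputs the ranker chooses to weight.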

 

Most of these are meaningful measures, but not all are linked to quality. Just because far more prospective students apply to an elite institution than it will accept doesn’t mean those who enroll will have a better experience.

 

The “expert opinion” category is especially problematic. Each year, U.S. News sends a questionnaire to the President, Provost, and Chief Enrollment Officer of each college, along with a list of all the institutions in its cohort. At Susquehanna, we receive a list of 222 “national liberal arts colleges.” We are asked to rate them on a five-point scale from “marginal” to “distinguished.” In recent years, the highest composite score any institution in our cohort has received in this category is 4.7.

 

Only about one-third of those who receive the questionnaire complete and submit it.

A number of years ago, a significant portion of an Annapolis Group meeting was spent debating whether the leaders of its member institutions should boycott the questionnaire. We discussed the absurdity of any of us having a meaningful understanding of that many institutions. We also noted that it was not a helpful measure for prospective students, but many of us recognized that if we did not report, our own institutions would lose the benefit of our “vote.” There has been at least one news story about a leaked copy of the questionnaire in which every school was rated at the lowest level except the home institution of the person completing it.

 

When the “expert opinion” component was initially added, high school guidance counselors also completed a ranking questionnaire; that input has recently been removed from the process.

 

Other national rankings use different combinations of data and weight each element differently. Some, like the Wall Street Journal/Times Higher Education ranking, assess all institutions together rather than separating liberal arts colleges from large universities, or institutions that recruit regionally from those that recruit nationally. This creates comparisons among remarkably heterogeneous schools: putting a small residential private liberal arts college on the same scale as a large state university does not yield much insight.

 

I must confess that I celebrate when Susquehanna climbs in a ranking, and I complete the reputation survey to be sure that we receive one more “distinguished” ballot. I give that same rating to a number of other institutions that deserve it, and I don’t provide ratings for schools about which I am uninformed. I don’t find these ratings to be of value, but they are an influential reality.

 

The National Survey of Student Engagement (NSSE) was developed in the late 1990s as a measure of the student experience and a scholarly counterpoint to the rankings. This survey of currently enrolled students (usually first-years and seniors) measures student engagement, including academic rigor, learning with peers, experiences with faculty, and the campus environment.

 

The NSSE survey also measures how much students engage in selected high-impact practices, including learning communities, service learning, independent research with a faculty member, internships, study abroad, and capstone projects.

 

Early in NSSE’s history, its researchers reported that, among the surveyed institutions, four schools placed in the top quintile in every category measured. None of those institutions were in the U.S. News top 50. NSSE’s leadership encouraged prospective students to focus on the experiences they would have in college as the basis for choosing an alma mater.

 

Students who are selecting an institution should look at graduation rates and at placement in jobs and graduate schools; these are important measures of outcomes. Families may also find reassurance in the recent Georgetown University study on the return on investment of a degree, which provides lifetime earnings data. Those data are most meaningful when comparing otherwise similar institutions: comparing the earnings of graduates from schools that mostly educate teachers and social workers with those from schools focused on engineering and business is not revealing.

 

Abstract rankings offer little help in discerning which school will be best for a given student. Understanding the student experience is a much better guide.

 

Prospective students should find out how many of their classes will be taught by faculty rather than by graduate students, and they should ask how big those classes will be. Small classes aren’t a guarantee of quality, but they increase the possibility of individualized learning. Lastly, students should ask how many of those high-impact practices are built into the institution’s curriculum and co-curriculum. If these aspects of an institution are favorable, it offers a quality program in the student’s chosen field of study, and the campus visit is positive, that institution should move to the top of the list.

 

 

 

 
