The problem is, college rankings aren't going away. As information- and choice-glut proliferate throughout our society, the need to organize and make sense of potentially overwhelming amounts of information will only increase the presence and usage of tools such as rankings, however reductionist they may be. This is especially true when investment in higher education amounts to one of the biggest financial decisions an individual and family can make. Our only real option is to come together in our various stake-holding communities (e.g. higher education) and proactively shape ranking systems that better reflect our actual realities, instead of merely complaining about them on the one hand while attempting to measure up on the other.
How much do college rankings matter? According to one recent study, a one-rank improvement in the U.S. News & World Report rankings (the most influential of all rankings) results in a 0.9% increase in applicants. Another study found that moving onto the front page of the U.S. News rankings increases admissions applications across all kinds of institutions, although the same article determined that among liberal arts colleges, raising tuition made more of a positive difference in application increases. The rankings also lead some college administrators to "game" the ranking system, which can have real impacts on a given institution's reporting procedures, class sizes, alumni relationships, faculty salary determinations, and the like. Indeed, in one high-profile case, a U.S. college president's financial compensation was directly tied to his institution's U.S. News & World Report ranking. Another report found that college rankings are especially influential with academically high-achieving students from more affluent households, as opposed to lower-achieving students and students from lower-income and minority backgrounds.
Ranking System Methodologies and Criticisms. How do the major college ranking systems work? And what are the perceived limitations of such ranking systems? Below is a brief survey and analysis of some of the more representative systems (for a more comprehensive treatment, see the following report):
1. U.S. News & World Report College Rankings. This system judges institutions of higher education based on their perceived quality, as determined by U.S. News researchers. Weighted factors used as determinants of quality include:
A. "Reputation". 22.5 to 25% weight depending on if a national or liberal arts institution. A 1-to-5 point scale of subjective quality as judged by thousands of college administrators and high school college counselors nationwide.
B. "Retention". 20-25% weight depending on if judged a national or regional institution. Measures graduation rates within six years, freshmen to sophomore matriculation rates, etc.
C. "Faculty resources". 20% weight. Considers faculty-to-student ratios, faculty salaries, etc.
D. "Student Selectivity". 15% weight. Factors in student SAT and ACT test scores and student class rankings.
E. "Financial resources". 10% weight. Institutional spending per student on educationally-related items like instruction, research, and student services.
F. "Graduation rate performance". 7.5% weight. A formula striving to determine individual institutional success at graduating students within six years.
G. "Alumni Giving Rate". 5% weight. The percentage of bachelor-achieving alumni during the past two years who have donated to their alma mater, as a measure of student satisfaction. For a full explanation of methodology and weighting of criteria, see here.
Criticisms are legion from multiple quarters, and include disagreement over the use of subjective judgments of reputation as a measure of institutional quality, methodological imprecisions that can greatly distort differences between institutions, an unbalanced emphasis on student inputs (e.g. high school test scores, class ranking) as opposed to outputs (student satisfaction surveys, educational attainment tests, job placement rates, etc.) as measures of institutional quality, and much more. For a fairly comprehensive review of such criticisms, see here.
2. Princeton Review's Best Colleges. Rankings based upon the results of student surveys. Respondents are self-selected and reply almost entirely online (only 5% reply via mail), across eight overall categories: academics, politics, demographics, quality of life, parties, extracurricular activities, schools by type, and social life. Criticisms include the fact that such a ranking system is wholly subjective in nature, and that the respondents are not representative of college students in general, but rather of those predisposed to responding to such surveys in an online format.
3. Kiplinger's. This financially-oriented survey seeks to rate higher education institutions by their perceived value, as determined by measures of academic quality (including the school's student-faculty ratio, its admission rate, and its four-year graduation rate) and affordability (such as the total cost of attendance with or without financial aid). Kiplinger's is not very forthcoming with regard to its methodology, and what information it does provide can be found here. Criticisms include the difficulty of determining the quality of an institution of higher education by its perceived economic value when so many aspects of a college education are difficult to quantify, and of equating institutional affordability with quality.
4. Forbes. This economics-oriented magazine joined with the Center on College Affordability and Productivity to focus on factors that directly concern incoming students: Will my courses be interesting? Is it likely I will graduate in four years? Will I incur a ton of debt getting my degree? And once I get out of school, will I get a good job? To answer these questions, five general categories are weighted: student satisfaction survey results, postgraduate success as measured by payscale.com and prominent alumni listings, student debt and loan default measures, four-year graduation rates (actual vs. predicted), and competitive awards as measured by student award achievements of various kinds. Criticisms include the problems of using student surveys (such as Ratemyprofessor.com) as a barometer of institutional quality, given their subjective nature and limited, self-selected sample populations. Also, measuring institutional effectiveness by alumni and student listings in various "who's-who" and academic award lists can disproportionately favor institutions that attract faculty and students who focus on such achievements.
5. Washington Monthly. This iconoclastic, left-of-center political magazine uses an output-based, service-oriented methodology to rate institutions of higher education. Schools are ranked according to community service (percentage of students enrolled in the Army or Navy Reserve Officer Training Corps; percentage currently serving in the Peace Corps; percentage of the school's work-study grants devoted to community service projects), social mobility (based on a regression-analysis formula seeking to determine a school's success at graduating low-income students, using Pell Grant statistics), and research success (based on total institutional research expenditures and the percentage of bachelor's recipients who go on to earn Ph.D.s). Criticisms include limiting community service measures to the above categories, to the exclusion of faith-based service and other activities; the limits of determining social mobility success from Pell Grant recipients' graduation rates when so many low-income, minority, and otherwise disadvantaged students fail to successfully apply or even qualify for Pell Grant aid; and using research funding and Ph.D. attainment as measures of institutional quality when so many colleges and universities across the nation are simply not research-oriented institutions.
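The social mobility metric is worth pausing on, since "regression-based" can sound opaque. Below is a minimal sketch of the general technique: fit a line predicting graduation rate from Pell Grant share, then treat each school's residual (actual minus predicted) as over- or under-performance. All data are hypothetical, and this is only the kind of calculation involved, not Washington Monthly's actual formula.

```python
# Sketch of a regression-based social-mobility measure: predict each
# school's graduation rate from its Pell Grant share, then treat the
# residual (actual minus predicted) as over/under-performance.
# All numbers below are hypothetical.
import numpy as np

pell_share = np.array([0.15, 0.25, 0.35, 0.45, 0.55])  # fraction of students on Pell
grad_rate = np.array([0.82, 0.71, 0.64, 0.55, 0.49])   # six-year graduation rate

slope, intercept = np.polyfit(pell_share, grad_rate, 1)  # simple OLS line fit
predicted = intercept + slope * pell_share
performance = grad_rate - predicted  # positive = graduating more than expected

for share, perf in zip(pell_share, performance):
    print(f"Pell share {share:.0%}: performance {perf:+.3f}")
```

The criticism noted above lands precisely here: if many disadvantaged students never receive Pell aid in the first place, the `pell_share` input understates the population the metric is supposed to capture.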
Of course, many other college ranking systems exist, including Newsweek/Kaplan, College Prowler, and Collegeconfidential.com, which tend to rely heavily on self-selected student surveys (Collegeconfidential provides a comprehensive comparison of the various ranking systems) and so shed little new light on the higher education ranking dilemma. Then there are ranking systems that narrowly focus on specific and subjective measures of quality, such as Sierra Magazine's ranking of institutions by "Cool" (a subjective mixture of perceived environmentally 'green' measures and available activities deemed desirable), which are difficult to operationalize with respect to the differing missions of colleges and universities nationwide.
A Better Way to Rank our Colleges and Universities.
If rankings are indeed here to stay, and can be used for positive as well as negative purposes, how can they be improved? Based on the brief survey and critical analysis of the respective ranking systems above, I suggest the following measures as a starting point for creating a better college and university ranking system (and welcome input here from my instructor and colleagues in this class):
1. Carefully consider what is best from each of the ranking systems mentioned above (and others), and merge them into one comprehensive ranking system. Use peer-reviewed research findings, econometric measures, student surveys, and other methods to develop a more comprehensive, rigorously defined, and tested ranking system. And make it as transparent as possible.
2. Assess and rank a given institution based on three general categories (see the sketch after this list for how these might combine into an overall score):
a. Inputs. That is, student, faculty, administration, and staff recruitment success, creatively measured not only by traditional measures such as student test scores and faculty graduate-degree attainment, but also by demonstrated ability to recruit students and workers from across a given institution's socio-economic and racial/ethnic locality, and to recruit international students from a wide variety of national and academic backgrounds. Also measure the quality of institutional investment in terms of well-constructed campus infrastructure that adequately meets the needs of the entire campus community, and the degree to which the campus/institution is connected to its local community in terms of measures such as bus schedules, course availability, media connectivity, primary- and secondary-level education guidance and communications, lower-income and minority-empowerment programs with respect to higher education access, activities that invite political, neighborhood, and cultural input and involvement, and more.
b. Process. Fashion measures that determine how successfully a given college or university carries out its mission on a day-to-day basis. What is the drop-out rate at the institution, as well as the graduation rates within, say, four- and six-year periods? How often are students using student services such as counseling, housing, employment, and academic advising, particularly among marginalized populations like part-time students, students of color, and students from economically disadvantaged backgrounds? How well are student complaints throughout the college/university community being addressed, in terms of meetings and hearings held, measures taken, and other indicators? Within budgetary constraints, how well is Student Affairs overall meeting the needs of its diverse clientele, according to such measures as the time taken to process financial aid applications, the number of active student clubs on campus, housing slots filled vs. those requested, and so forth?
c. Outputs. How effectively did campus faculty and administration meet the educational goals and needs of students, as measured by tools such as the Collegiate Learning Assessment (CLA)? What percentage of graduating students actually received career counseling and/or job placement assistance? What percentage of students received exit financial aid information and other information designed to ease the transition into post-college life? What percentage of graduates and drop-outs actually found work within, say, three to six months of graduation; work that actually supported their minimum financial needs and was in some way related to their college studies? What percentage of students, both during and after college, were involved in some service activity, according to a broad metric including service organizations and activities of all kinds?
3. Effectively broadcast to and elicit critical feedback from all stakeholders in the college ranking process (current students, prospective students and their parents, alumni, faculty, administrators and staff, political and community leaders, etc.) prior to launching the system.
4. Make this ranking system as transparent as possible via media outlets, Facebook, higher education forums, and other means of communication.
5. Invite and incorporate as much research and research oversight into the process as feasible, from doctoral departments, associations such as NASPA and ACPA, think tanks, governmental organizations, and other interested entities.
6. Constantly update and evolve this comprehensive ranking system based on new research findings, educational forum inputs, and reviews and insights provided by interested community and governmental groups.
7. Ensure that all funding for such a ranking enterprise be divorced from any conflict-of-interest possibilities, such as accepting funding from ranked institutions.
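As promised in item 2, here is a minimal sketch of how the inputs/process/outputs structure might be expressed as a composite score. The category weights and all metric values are hypothetical placeholders for illustration; the proposal above deliberately leaves weighting open to stakeholder input.

```python
# A minimal sketch of the proposed three-category ranking structure.
# The weights and example scores are hypothetical, not part of the proposal.
from dataclasses import dataclass

@dataclass
class CategoryScore:
    inputs: float   # recruitment reach, infrastructure, community ties (0-100)
    process: float  # retention, service usage, complaint resolution (0-100)
    outputs: float  # learning gains, job placement, service participation (0-100)

def overall(score: CategoryScore,
            w_in: float = 0.3, w_proc: float = 0.3, w_out: float = 0.4) -> float:
    """Weighted blend of the three categories; weights are illustrative."""
    return w_in * score.inputs + w_proc * score.process + w_out * score.outputs

print(overall(CategoryScore(inputs=70, process=65, outputs=80)))  # -> 72.5
```

Keeping each category's score visible alongside the blended total would serve the transparency goals in items 3 and 4: a family could see, for instance, that a school scores high on outputs but low on inputs, rather than receiving a single opaque number.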
That was a lengthy post :)
I have never worked at an institution that has made an appearance in U.S. News & World Report for any rankings at all, so it is very difficult for me to connect with the purpose of the rankings. With that said, I can also say that the rankings are probably irrelevant to a large portion of higher education clientele (the prospective student, in my world). I think the rankings probably serve a purpose with a specific population of students and families. However, I would argue that for our growing population of first-generation college students and their families, the reports do not have much meaning. I have not seen a report category for anything along the lines of "creating/establishing a culture of care". For the families I work with, this is what is important to them. Will my child be taken care of and treated respectfully? Will my family be in a better position for having a student attend this college? Can I trust my child in your care?
I really don’t believe the rankings mentioned have much value. However, there are other rankings out there that I do think grab the attention of college-bound students; for example, the top party schools. Again, I would imagine it is a certain demographic of student who would be influenced by that ranking.
Also, just to throw it out there, I think another big factor in yearly increases to the application pool has to do with the performance of athletic programs. I remember reading that Butler University saw a huge increase in applications for 2011 after their appearance in the 2010 NCAA Final Four.
I cannot agree or disagree in the way the data is collected and the ranking is determined, because I just don’t see the effect, I suppose.
I recently stumbled on a website that ranked colleges, but the ranking criteria were not the standard criteria we are used to seeing. There were several categories, including presence of Greek life, time the average student spends studying, town life, and "birkenstock wearing, tree hugging, clove smoking vegetarians", just to name a few.
Traditional rankings rarely capture the culture at a college or university. Rankings such as these help prospective students get a better understanding of what life is really like at the institution.
Student affairs professionals talk about the college search process as if they are talking about shoe shopping. It is important that students find the institution that fits them best, where they are most comfortable and where they believe they will get the best support. When students are making their decisions, perhaps as student affairs professionals we should steer them toward untraditional ranking systems as well as more traditional ones.
Last week, after attending my son's graduation from basic training, my daughter and I visited Auburn University. It is on her top 10 list for colleges to attend next fall. We have made several college visits and all of the universities boast about their rankings in some form.
I have never given the rankings much thought but have found some of the categories interesting. I have not done any research on the criteria for the rankings. The different rankings have come from U.S. News and World Report, Princeton Review, Forbes, and I believe even Kiplinger.
Auburn's claims to fame are the following:
Ranked in the top 50 public institutions in the nation for providing a quality education with educational value (US News & World Report)
One of the top five universities nationally for producing NASA scientists and astronauts
Consistently ranked in the top 100 "Best Value" universities by Kiplinger
Located in a small, vibrant community that is ranked as one of the Top 10 Places to Live in 2010
(www.auburn.edu/admissions/auburn/benefits-of-attending-auburn.html)
So after reading your post, I asked my daughter if any of these rankings mattered to her and if it would have an influence on her decision. She said, "I think it is interesting to hear about the rankings but it won't make any difference in the college that I decide on." She is looking for a college with awesome traditions, great environment, a variety of social organizations, good academics and a feeling of community. Not too much to ask for...right?
What she did say, though, was that one important statistic she found interesting was that "More than 95 percent of alumni say that if they could do it all again -- they would again choose Auburn. That's among the highest satisfaction ratings in the country." (www.auburn.edu/admissions/auburn/benefits-of-attending-auburn.html). Both she and I found this statistic interesting, and we asked how they obtained this data. The University sends out surveys and questionnaires to alumni who graduated 4 to 5 years earlier, asking several questions. One of the questions is whether they would come back and do it all over again at Auburn. 95% of the alumni said they would, and the survey has a significant number of responses! Again, we realize that only those who enjoyed the Auburn experience may be responding, but it was a fact that we both found comforting and interesting about Auburn University.
So do college ranking systems help these colleges recruit students? Do potential students care about such rankings? My answer, coming from experience, is that they have very little impact. And if the ranking systems were revised and revamped, more emphasis might be put on their reliability!
Michael, I thought you had a great starting point for revamping the systems. I feel that students, professors, staff, parents, research organizations, and professional organizations should all be a part of a comprehensive ranking system. Retention rates, graduation rates, costs, and accessibility should all be areas of consideration. Social, religious, and political organizations should be reviewed. Academic successes, innovative programs, international programs, and research programs can be assessed.
After visiting a few colleges throughout the US this past year with my daughter, I do believe that the rankings are another selling point for the universities. But I feel that the upcoming student will likely choose a college that has a good feel and the programs and academics that they are interested in, and the rankings won't really have an impact!