Photo Credit: Jessica Dang

On Thursday, February 2, the Forum and the Claremont Port Side sat down with Claremont McKenna College President Pamela Gann to discuss the recent SAT reporting scandal.

President Gann began the discussion by emphasizing honesty, integrity and the core values of the college. Gann felt it was important “to recognize all of the historical work of prior leaders within the college that makes this community what it is.”

The College’s overarching goal was to get to the “root of the problem.” Gann stated that “there was never a question that we would be forthcoming” about the falsified SAT data. The College wanted to be open and transparent in its findings and get the information out “quickly but accurately.”

Gann went on to explain the chronology of recent events and how the falsely reported data first came to her attention. On January 9, an unnamed source from within the college approached President Gann and raised a question about the accuracy of the SAT data for CMC’s 2011 incoming class.

That day, Gann asked Vice President and Dean Emeritus Jerome Garris to look into the questions raised about the SAT scores. Gann noted that Garris is a man of unquestionable integrity. Over two weeks later, on January 24, Gann sent an email to Garris to check in on the investigation. She awoke on Wednesday morning, January 25, to an email from Garris. The email included news that someone within the Office of Admission had confessed to falsely reporting SAT data since 2005.

President Gann was in disbelief when she first heard the news.

Richard C. Vos, former Vice President & Dean of Admission and Financial Aid, is widely assumed to be at fault and resigned on Monday after news of the scandal was made public.

On the morning of January 25, Gann immediately notified Chairman of the Board of Trustees, Harry McMahon ‘75, of Garris’ findings. McMahon formed a small working group of board members that met later that day by phone. McMahon, the Board of Trustees, and Gann worked in parallel through this process. “The board was totally in sync with us as we did our work,” said Gann. Gann then held an executive committee meeting over the phone on Thursday and a full meeting of the board on Friday.

Early Monday morning, January 30, the college began telephoning all entities to which the affected data was reported and informing them of the falsified data. At 9:00 AM, Gann met with her senior staff to inform them of the news, and at 10:00 AM, she met with the Office of Admissions and Financial Aid staff. Gann’s email was sent at 11:00 AM to all students, faculty, staff, alumni, and parents of current students.

Soon after, Gann informed this year’s Early Decision I applicants—those who had already received notification of their binding admission—and Early Decision II applicants about the falsification of SAT scores.

On Wednesday, February 1, the college formally released the corrected data and sent it to outside agencies, including the New York Times, the LA Times, the college’s auditors, Moody’s Investors Service, and other interested parties. The corrected data has not yet been sent to all entities, but the college hopes to finish its distribution by the end of the day.


Gann emphasized the important distinction between data construction and data reporting. Data construction, she explained, is the way in which the college compiles the data internally. Applicants often take the SATs and the ACTs more than once, and all scores are sent to the college. Like many of its peer institutions, Claremont McKenna takes an applicant’s highest critical reading and highest math score to create the combined SAT score used for the college’s admission decision. If a student’s ACT score is higher than his or her SAT score, the former score is used in the admission process.

Gann explained that roughly fifteen to sixteen percent of applicants now submit only ACT scores to the college. Since a higher ACT score can trump the SAT score, Gann said, “There will not be an SAT score for every student.”


Data reporting, she explained, is the way in which the college’s admissions data is presented to outside entities. According to Gann, the manipulation of SAT score data was an issue of data reporting and not data construction.

“As far as we know,” said Gann, “there was no falsification of data construction.”

Gann noted two main reasons for how the data could have been misreported for over six years. First, she said, “a sole person had too much authority over the reporting of data.” Gann admitted that there was “no internal checks and balances system in place” when the senior administrator was falsifying reports of the data.

Second, the reported SAT scores “did not trigger suspicion,” said Gann. Gann explained that the data were relatively flat, and “the falsified numbers were almost the same every year.” Nothing in the data raised any suspicions amongst those who saw it.

Citing personnel matters protected by California privacy law, Gann could not comment on the former senior administrator’s motivation for fudging the numbers. She also could not comment on whether the person responsible was asked to resign or whether there was a resignation letter.

President Gann has said the college will move forward next week with an independent review conducted by the law firm O’Melveny & Myers and led by the Board of Trustees. Gann emphasized that no one from within the college can be responsible for leading the independent review as it would be an obvious conflict of interest.

Though she can’t predict when the review will be completed, Gann said she hoped it would be finished quickly. The findings of the independent review will only be made public if the Board of Trustees chooses to do so. Although the college has no reason to believe that other data has been falsely reported, the independent investigation will look at data before 2005.


Gann believes the college acted quickly and “used good governance” to address the root of the problem and manage the situation. Gann asserts that the College has “been very prompt, open, and honest” in its handling of the issue.

While some students have expressed frustration with the minimal communication from Gann and the college administration, Gann emphasized that her plan of action thus far has prioritized (1) obtaining the right information and (2) getting the correct information out to the appropriate agencies. Now, she said, the college is in the midst of her third objective—to repair the trust of the community—and will continue to reach out and inform students as best it can.

Gann thought going to student publications such as the Forum and the Claremont Port Side was more effective than immediately holding a town hall-style meeting. However, Gann has been present in college dealings over the past week: she attended a senior class reception on Tuesday evening, appeared at a Board of Trustees breakfast with students, and may appear alongside Vice President of Public Affairs Max Benavidez on Monday evening at the ASCMC Senate meeting.

President Gann could not speculate on how this incident will impact CMC’s rankings in the future. On NPR’s All Things Considered, Robert Morse, director of data research for U.S. News and World Report, indicated that the dip in scores is likely to have only a small effect on the ranking. “It’s certainly not going to drop the school to twentieth place,” said Morse, “but I guess there’s some chance that it could drop out of the top ten.”

President Gann added that she does not see any data suggesting that rankings drive a student’s decision to go to a college. “The primary reason that students come here is the high quality education and the academic program is a good fit for them,” said Gann. She continued, “rankings and guides are only part of the process.”

Many believe that rankings played a role in causing this incident. Gann stated that the Office of Admission has no explicit goals for SAT scores. “Our aspiration is to have a talented student body,” said Gann, “and SAT scores are a part of that.”

In 2002, the Board of Trustees adopted a general policy statement to guide the admissions office on shaping incoming classes. Some considerations include leadership, diversity, and support for co-curricular programs. According to Gann, one change in this policy since the beginning of her presidency was to increase the number of international students.

Despite the recent SAT score incident, President Gann believes that Claremont McKenna remains a strong institution. “We have wonderful students, wonderful faculty, and I’m very proud of this college,” she said.

Gann hopes that this unfortunate incident will also become a learning experience for students. She explained that the past week has been an excellent lesson in “crisis leadership.” Gann stated that “lapses in leadership are where you learn the most.”

Updates: February 2, 2012 at 3:14pm

Since the incident first came to light, the college has taken a number of steps to ensure that this will never happen again. Before any data is released from the Office of Admission, two vice presidents from different areas of the college, who have no authority in the Office of Admission, must sign off on the data. President Gann believes this method should be extended to all data reporting at the college. Vice President for Administration and Planning, General Counsel, and Secretary of the College Matthew Bibbens and Vice President for Academic Affairs and Dean of the Faculty Gregory Hess signed off on the corrected SAT score data before it was released yesterday. President Gann also signed off on the SAT data.

Heath Hyatt ’12, Caroline Nyce ’13, and Nathan Falk ’14 contributed reporting.

Editor’s note. This article was updated on February 2 at 2:37pm. The original article stated that a senior administrator had “falsified reporting” of SAT data since 2005. The updated article clarifies and states the administrator confessed to “falsely reporting” the data.

30 COMMENTS

  1. Just a question.  “The email included news that someone within the Office of Admission had confessed to falsifying the reporting of SAT data since 2005.” Falsifying and misreporting are two entirely different issues. How can they say someone admitted to being dishonest and providing false information and then call it misreporting? Dean Vos was always kind to me and it saddens me to know this all happened, but I am wondering how they are attempting to reconcile these two opposing words.

  2. I am calling bullshit. So the college brings in an outside firm that represented it in past scandals? We need an accounting firm, not the college’s lawyers doing the investigation. Gann needs to go.

    • Right. Because integrity is the long suit of public accounting firms?  Have you been reading the newspapers since 2008?  I’d far rather have a top law firm like O’Melveny, which has deep experience in white-collar investigations.

      • I do read the newspapers. There has been quite a lull in major accounting investigations as compared to the beginning of the decade, the last time there was a significant amount of corporate turmoil and crime. Since 2008, there have been only two public accounting scandals in the United States, both surrounding the crimes of either Bernie Madoff or Lehman Brothers (both of whom managed to fool an exceedingly large number of qualified people, their accountants included). You should probably restrain yourself from impugning an entire industry based on the actions of very few.

        That said, I don’t really know what an accounting firm can do here that a law firm with experience in business investigation can’t. The SAT falsifications here aren’t like the highly sophisticated accounting tricks like the ones that Madoff or Andrew Fastow were engaged in. Still, it’s not unfair to question O’Melveny’s objectivity regarding CMC.

        The bottom line is that it’s hard to accept that Dean Vos was alone in his actions. There needs to be more transparency from the administration. The fact that they haven’t officially acknowledged that Dean Vos was the perpetrator is telling enough: clearly they are doing everything they can to escape culpability.  

      • Right… because they did such a good job with Jonathan Petropoulos. /sarc off.

         After the L.A. Times found out about his scheme to extort a Holocaust survivor, he resigned as head of the Holocaust Center. O’Melveny was the one that was brought on to look into it, but they found nothing. Apparently they don’t Google at top law firms. 

        The point, I think, is that you aren’t going to get an “independent” investigation of a college from that college’s general counsel. There is this thing called “attorney-client” privilege. Maybe you heard of it? It’s really hard for attorneys to hurt their clients’ reputations. I expect a whitewashing.

        • So… do you actually know what happened with Petropoulos?

          He’s awesome.  There’s nothing wrong with imposing a fee for hours of expert research – which is what he did.

  3. If Vos had too much authority making decisions and he only reported to Gann, isn’t she negligent?

    • It’s highly probable that CMC hasn’t released Dean Vos’ name so that Pam Gann can’t be identified as negligent.

      • She needs to be fired. If one of my employees lied to me for seven years about our financials, I would be investigated by my shareholders. Is CMC the Enron of academia? 

  4. Duh. Why doesn’t Gann admit it? “I want to increase the number of full-paying international students because I can count them as a minority and get their money!” 

  5. Gann needs to go! It is the only way to bring justice! It’s not like the Dean of Admissions just went rogue out of his own sudden desire to climb in the rankings after being an employee for 25+ years. He obviously was pushed by a leader. That leader is Pam Gann. She led us astray and she needs to go! I call for her to resign immediately, or if not, the Board of Trustees should force her resignation.

    • While I think a forced resignation would be an extreme measure, it’s clear that Dean Vos did this because of very extreme pressure. He did a great job in his role over those 25 years, and it makes no sense that he would start lying about something so obvious for personal gain.

  6. Was there any discussion during the “interview” of what Dean Vos’s motive was in falsifying the data?

  7. President Gann, Thank you for your work on behalf of the college and its students and staff.

  8. Luckily we live in America and we have the right to freely voice our opinions.  If you believe, as I do, that CMC deserves better than 

      • Coming from someone presumptuous enough to pen this, I’m hardly surprised.

        I won’t be signing your petition Patrick, and neither will rational students & alums.

        At this point there isn’t even a scintilla of evidence to tie Gann to this mess, and I’m not willing to toss out the best fundraiser CMC has ever seen based on your deluded conclusory assertion of her blameworthiness.  Until real proof of her involvement surfaces, your reaction is characteristically premature.

        • Ouch. Going after a guy’s thesis anonymously? 

          Well, I signed it, Patrick. I encourage others to sign it, too. 

          Gann’s got to go so that everyone knows how seriously we treat this. There are lots of people that can raise money for the college. 

  9. “Although some institutions refuse to cooperate with the U.S. News & World Report rankings, the vast majority comply with the magazine’s request for data. But to appear more selective, some institutions count incomplete applications as denials to lower their acceptance rate. (Students may have mailed the first part but never submitted test scores or high-school transcripts.) Other colleges defer admission to the second semester and then don’t report those students as admitted to U.S. News. Similarly, some institutions record impressive increases in giving rates by purging their databases of graduates who haven’t given in many years. A number of colleges don’t require the SAT exam but proudly report high average SAT scores, ignoring the obvious fact that when SATs are optional, only the highest scorers submit their scores. Others omit the SAT scores of international students or recruited athletes.

    A fundamental moral question is this: Even if a college feels strongly that the rankings are bogus, is it acceptable to fool around with the numbers? 

    (From “Moral Reasoning and Higher-Education Policy,” The Chronicle of Higher Education, September 7, 2007)

  10. Wonder if President Gann would confirm whether higher SAT scores impact her compensation and that of other administrators at CMC? This goes to the heart of the matter. Her statement about SAT scores, explicit goals, and aspirations was vague at best, misleading at worst. 


  12. Cheat Sheets: Colleges Inflate SATs And Graduation Rates In Popular Guidebooks — Schools Say They Must Fib To U.S. News and Others To Compete Effectively — Moody’s Requires the Truth (The Wall Street Journal, April 5, 1995)


    By Steve Stecklow, Staff Reporter of The Wall Street Journal


    In Money magazine’s 1994 college guide, New College of the University of South Florida was ranked No. 1 overall. Among the school’s strengths, the guide noted, was the freshman class’s average Scholastic Aptitude Test score: an impressive 1296.

    This appeared to place New College among the most selective schools in the nation. But the score — as well as the pretense of exclusivity — was false.

    For years, the Sarasota-based school concedes, it deliberately inflated its SAT scores by lopping off the bottom-scoring 6% of students, thereby lifting the average about 40 points. Admission Director David Anderson describes the practice, which he says he recently discontinued, as part of the college’s “marketing strategy.” 

    Though Mr. Anderson acknowledges that such a strategy raises “some ethical questions,” he points out that New College is far from alone. In their heated efforts to woo students, many colleges manipulate what they report to magazine surveys and guidebooks — not only on test scores but on applications, acceptances, enrollment and graduation rates.

    “This is awful stuff,” says Thomas Anthony, former dean of admission at Colgate University in Hamilton, N.Y. “But when the American public comes to you and says you’re not in the top 20 and they’re going to make their decision based on that, it puts incredible pressure on you to have the right-looking numbers.”

    The guidebooks, which have become a powerful influence on parents and students choosing schools, routinely publish the erroneous statistics. Meanwhile, many of the same colleges provide accurate — and much less flattering — data to debt-ratings agencies, as required when the schools sell bonds or notes. Lying to the ratings agencies violates federal securities laws and can expose schools to huge liabilities; there are no legal penalties for misleading guidebook publishers.

    The publishers say they try to tailor their survey questions to reduce the opportunity for fudging. But, says Max Reed, senior editor at Barron’s Profiles of American Colleges, “If they give us incorrect data, there’s really not much we can do about it.”

    Excluding certain groups of low-scoring students from their SAT numbers is one of the colleges’ most common tactics, even though most of the guidebooks specifically prohibit it. Many admission officials argue that including these students, sometimes admitted under special preferences, would put a school at a disadvantage in the ratings. Not every college excludes students, but those that do so follow rules they make up themselves.

    Northeastern University in Boston excludes both international students and remedial students, who together represent about 20% of the freshman class. The practice boosts the school’s SAT average by about 50 points, says provost Michael Baer.

    New York University excludes from its SAT scores economically disadvantaged students in a special state-sponsored program. So does Manhattanville College in Purchase, N.Y. But Marist College in Poughkeepsie, N.Y., includes them — except for about 25 who are learning disabled. Marist also excludes international students.

    “The reason those two groups are excluded is it just skews the average, and it’s not accurate for kids who are trying to figure out if they’re admissible or not,” says Harry Wood, Marist’s vice president for admissions.

    Then there is Boston University. It excludes the verbal SAT scores, but not the math scores, of about 350 international students. The reason: Foreign students often have trouble with English and tend to do poorly on verbal SATs, but many score better than U.S. students in math.

    Monmouth University in West Long Branch, N.J., provides no rationale for overstating its SAT scores by more than 200 points for the College Handbook, a guide published by the College Board. David Waggoner, director of undergraduate admissions, says the guidebook numbers appear to have been “fabricated” by a former Monmouth employee. “They’re off the wall,” he concedes.

    “We accept what the colleges tell us,” says Robert Seaver, a spokesman for the College Board. He calls the Monmouth listing “extremely embarrassing.”

    “The problem is we’re all trying to look better,” says Steven T. Syverson, former ethics-committee chairman of the National Association of College Admission Counselors. Mr. Syverson says colleges widely ignore NACAC’s ethics code, which includes explicit rules against statistical manipulation. In particular, the code requires that schools report data on “all first-year admitted or enrolled students, or both, including special subgroups.”

    Mary Lee Hoganson, head guidance counselor at the University of Chicago Laboratory High School, says some colleges are so obsessed with looking good that they employ tactics that hurt students. To appear more selective, she says, colleges solicit applications from students they don’t really want, raising false hopes but pumping up the closely scrutinized ratio of rejections vs. acceptances. “They need more applications so they can turn down more people so they look better in the ratings,” Mrs. Hoganson says.

    Colleges are busily playing this numbers game at a time when admissions has become a buyer’s market, with many schools fiercely scrambling for quality applicants. This school year, no fewer than 150 colleges made presentations at Mrs. Hoganson’s school over a two-month application period. “We can’t even accommodate them all any longer,” she says. In this intense climate, college-admission directors who don’t recruit the right mix of students often find themselves out of a job.

    Money and U.S. News & World Report, though not the other guidebook publishers, use the data they receive to rank the schools with what appears to be methodical exactitude. College officials almost universally disdain these rankings, arguing that a college’s quality can’t be judged merely by statistics and opinion polls. But they dare not refuse to participate, knowing that the rankings can profoundly affect numbers of applications, the quality of students who apply and even alumni donations.

    Applications jumped 7%, for instance, after U.S. News named Susquehanna University in Selinsgrove, Pa., the No. 1 northern regional-liberal-arts college last fall. When the magazine named Lyon College in Batesville, Ark., the No. 1 southern regional-liberal-arts college, more highly qualified students applied, lifting the average SAT scores of applicants 73 points in a single year. When the same guide dropped Hamilton College in Clinton, N.Y., from its top 25 national liberal-arts colleges, two dozen disappointed alumni called the school demanding to know what was wrong.

    Even the most prestigious colleges trumpet their guidebook successes. When Massachusetts Institute of Technology summoned reporters to a news conference on March 9, it was to announce a competitive breakthrough, not a scientific one: Two MIT graduate schools had placed first in a U.S. News survey. “We all live and die by those rankings,” says Gordon Holland, president of Gettysburg College in Gettysburg, Pa.

    Most of the statistics in magazines and guidebooks such as Barron’s and Peterson’s Four-Year Colleges are self-reported and unaudited. Each guide uses its own formula for evaluating colleges, including pollster-style surveys, but nearly all factor in the self-reported data heavily. As a result, says Roland King, director of public relations at the University of Maryland in College Park, “They’re subject to cooking all the time.”

    To gauge the accuracy of colleges’ reported numbers, this newspaper compared data colleges provide to the guides with similar statistics they give to debt-rating agencies and investors. A review of more than 100 credit reports on colleges by Moody’s Investors Service Inc. and Standard & Poor’s, as well as bond prospectuses, showed more than two dozen discrepancies in SAT scores, acceptance rates and other enrollment data. In nearly every case, the Moody’s and S&P numbers were less favorable to the colleges than the guidebook figures.

    A Moody’s credit report on Richard Stockton College of New Jersey in Pomona, for example, lists an average SAT score of 991 on the 1600-point scale for freshmen entering in the fall of 1993. But U.S. News lists the average score as 1095. Harvey Kesselman, the college’s vice president for student services, acknowledges that the Moody’s number is correct and says he cannot explain the numbers given to U.S. News.

    Editors at U.S. News, Money, Barron’s and Peterson’s say that though they try to fact check, they have never compared their own data with information reported to the debt-rating agencies; indeed, only U.S. News was aware that the agencies collected such information. U.S. News, whose college guide sells more copies than any other, says it hopes to compare the databases in the future.

    Fudging the guidebook numbers can have a direct positive effect on rankings. Edward Hershey, former director of communications at Colby College in Waterville, Maine, says Colby moved up significantly in U.S. News’s fall 1992 rankings of national liberal-arts colleges through “numbers massage” and an inadvertent error.

    In a letter last fall to the student newspaper at Cornell University, where he now works, Mr. Hershey recounted how officials at one college — which he now confirms was Colby — huddled at “a meeting that could only be described as a strategy session on how to cheat on the survey.”

    Though he won’t detail the “numbers massage,” Mr. Hershey says that in completing the U.S. News survey, he mistakenly reported that 80%, rather than 60%, of Colby’s freshmen were in the top 10% of their high-school class. “It was pure innocence, I swear,” he says. “Of course, the thing just rolled right on through.” He says no one at U.S. News caught the error, even though the year before Colby had reported the figure as 54%.

    When U.S. News’s rankings came out in September 1992, Colby jumped to 15th place from 20th place. In his letter to the Cornell newspaper, Mr. Hershey wrote, “The downside was that we spent the following year figuring out how to play with some other numbers to preserve our competitive advantage and forestall a subsequent plunge in the rankings that would have to be explained to concerned alumni.”

    Sally Baker, Colby’s current director of communications, concedes that the school made an inadvertent error on applicants’ class rank but denies that officials ever intentionally fudged any figures: “The data is real, and we are as honest as humans can be,” she says.

    Christian Brothers University in Memphis, Tenn., is another school that benefited from giving U.S. News questionable data. The magazine’s America’s Best Colleges 1995 College Guide said that in the fall of 1993 the school accepted 59% of its freshman applicants, a figure that helped place the school in U.S. News’s top tier of southern colleges and universities. But Moody’s, in a credit report dated Jan. 3, 1995, said that in 1993 the school accepted 73% of its applicants.

    Christian Brothers says the Moody’s figure is accurate, and it can’t explain the number in U.S. News. “We try to be honest,” says Nick Scully, vice president for institutional advancement. “This doesn’t look real honest and I don’t know if it was on purpose or not.”

    Acceptance rates at Bard College, a highly regarded liberal-arts school, don’t square, either. Moody’s says the college, in Annandale-on-Hudson, N.Y., accepted 62.6% of its freshman applicants for fall 1993. U.S. News reports the figure at 44%.

    Both figures are wrong, says Mary Inga Backlund, Bard’s director of admission. She says the figures given to Moody’s were provided by the college’s business office and excluded disadvantaged applicants excused from paying a $40 application fee, but included transfer students who should have been excluded. “This is a conflict that we have regularly with the business office,” Ms. Backlund says. The U.S. News figure, she adds, “was a transcription error on my part.” Bard’s actual acceptance rate: 47%, she says.

    Even Harvard University, often the top-ranked school in the nation, shows up with slightly conflicting SAT data. Harvard gave U.S. News a range of SAT scores for fall 1993 freshmen, and the magazine placed the midpoint at 1400 — a benchmark score that’s a well-established mark of excellence. But Harvard’s midpoint in Moody’s reports was 1385, derived from a lower range that Harvard provided Moody’s.

    Marlyn McGrath Lewis, director of admission for Harvard and Radcliffe Colleges, describes the Moody’s numbers as a “mystery” and says the U.S. News figures were accurate. “I don’t think this is significant,” she says. A Moody’s spokeswoman says the score it published was exactly what Harvard told it.

    Graduation rates are also subject to sleight of hand. The National Collegiate Athletic Association requires its members to disclose graduation rates for the student body as a whole and for student-athletes. A comparison of the numbers reported by 300 colleges to the NCAA and to U.S. News found discrepancies in more than 50 instances; in nearly every case, the overall graduation rates reported to the NCAA were lower. Schools may have an incentive to play down overall graduation rates to the NCAA so that their student-athletes’ rates look better in comparison, college officials say.

    Acceptance rates provide further opportunities for manipulation. Conrad Sharrow, who was dean of admission at Rensselaer Polytechnic Institute in Troy, N.Y., from 1986 to 1993, says about 20% of the school’s applicants were rejected by the specific program for which they applied but were accepted into another undergraduate program at RPI. All these students were counted as rejects in the school’s reported admission figures. That way the school appeared to be more selective overall than it actually was.

    He says he also used waiting lists to enhance the numbers. “Suppose you had 5,000 applications and suppose in the first round you accepted 2,500 of those. Then you had a waiting list,” he explains. “So when the question comes up, how many did you accept, you can in good conscience say you accepted 2,500. That’s true.” The school, however, would later accept another “400 or 500” students off the waiting list but continue to count them as rejects, he says.

    Mr. Sharrow says he justified such actions because the rankings emphasize a college’s acceptance rate, a figure that he believes doesn’t connote quality. He says if college guides “abuse” such numbers “then what you’ve got to do as an admissions person is to juggle them in such a way so the abuse is minimized.” Rensselaer’s new dean of admissions, Teresa C. Duffy, says she stopped such practices last year.

    Mel Elfin, special-projects editor at U.S. News, defends the magazine’s college guide despite some schools’ efforts to deceive readers. Roughly 90% of the information U.S. News prints is accurate, he estimates, adding: “Our job now is not to throw our hands up in the air and say it doesn’t work but to continue to beef up our defensive line.”

    But colleges are so accustomed to cheating by now, contends college consultant Martin Nemko, that it may be impossible to put out a reliable guidebook. Mr. Nemko, of Oakland, Calif., says he was slated to write a guide for Little, Brown & Co., a unit of Time Warner Inc., that would have required schools to distribute questionnaires to random groups of students.

    But in January, he wrote to colleges saying he had been forced to scrap the project after receiving “sufficient evidence that more than a few institutions will be taking extraordinary measures to guarantee that their student questionnaires paint an inordinately flattering picture.”

    One example was Texas A&M University, which told him it planned to distribute the surveys mostly to honor students, Mr. Nemko says. Admission Director Gary Engelgau now says he had not made a final decision on a distribution list but notes that honor students are generally more likely to fill out surveys. He adds: “If you ask me to do something, as much as I can within the rules, I’ll try to do it so that it makes me look good.”

    — Graduation Rates

    College Location U.S. News NCAA

    Bowling Green State University Bowling Green, Ohio 65% 58%

    Campbell University Buies Creek, N.C. 52 46

    University of Cincinnati Cincinnati 52 47

    Coastal Carolina University Myrtle Beach, S.C. 37 32

    University of Hartford West Hartford, Conn. 59 53

    Indiana University Bloomington, Ind. 66 62

    Jackson State University Jackson, Miss. 34 23

    La Salle University Philadelphia 76 68

    Long Island University Brooklyn, N.Y. 55 28

    Louisiana Tech University Ruston, La. 44 38

    University of Md., Eastern Shore Princess Anne, Md. 29 20

    Mercer University Macon, Ga. 75 40

    University of Minn.-Twin Cities Minneapolis 43 38

    University of Missouri Kansas City 43 39

    Niagara University Niagara, N.Y. 59 55

    Nicholls State University Thibodaux, La. 30 20

    N’western State Univ. of Louisiana Natchitoches, La. 27 20

    Ohio University Athens, Ohio 62 58

    Providence College Providence, R.I. 93 87

    St. Francis College Loretto, Pa. 62 57

    University of San Diego San Diego 64 59

    San Jose State University San Jose, Calif. 38 32

    University of South Florida Tampa, Fla. 46 40

    Southern Illinois University Carbondale, Ill. 43 37

    Southern Utah University Cedar City, Utah 45 27

    Tennessee State University Nashville, Tenn. 40 24

    University of Toledo Toledo, Ohio 46 39

    Wagner College Staten Island, N.Y. 67 43

    Wake Forest University Winston-Salem, N.C. 86 82

    Note: Colleges gave a variety of reasons for discrepancies, including misinterpretation or misunderstanding of instructions; clerical or computer errors; and inclusion of part-time, summer or associate-degree students in one survey but not in the other.

    Sources: U.S. News’s America’s Best Colleges 1995 College Guide and 1994 NCAA Division 1 Graduation-Rates Report. Figures represent the average percentage of freshmen enrolled 1984-1987 who graduated within six years. Graduation rates count for 15% of U.S. News’s overall rankings.

    — SAT Scores

    College/Location Scores reported

    Boston University/Boston

    Gave U.S. News a middle range of SAT scores with a midpoint of 1150. Excluded the verbal, but not the math, scores of about 350 foreign students.

    Florida Institute/Melbourne, Fla.

    Gave U.S. News a middle range of SAT scores with a midpoint of 1065. The scores excluded foreign students.

    Harvard University/Cambridge, Mass.

    Gave U.S. News a middle range of SAT scores with a midpoint of 1400. Gave Moody’s a middle range of scores with a midpoint of 1385.

    Manhattanville College/Purchase, N.Y.

    Gave U.S. News a middle range of SAT scores with a midpoint of 1068. The scores excluded economically disadvantaged students in a special state-sponsored program.

    Marist College/Poughkeepsie, N.Y.

    Gave U.S. News a middle range of SAT scores with a midpoint of 985. The scores excluded about 25 students who are learning disabled, as well as international students.

    Marshall University/Huntington, W.Va.

    Told Barron’s 47% of its students scored 21 or better on the American College Test (ACT). Told Peterson’s 36% of its students scored 21 or better.

    Monmouth University/W. Long Branch, N.J.

    Gave College Handbook a middle range of SAT scores with a midpoint of 1115. Gave U.S. News a middle range of scores with a midpoint of 930, which the admission director says excluded 150 remedial students. Told Moody’s the median score was 816.

    New York University/New York

    Gave U.S. News a middle range of SAT scores with a midpoint of 1145. The scores excluded about 100 economically disadvantaged students in a special state-sponsored program.

    Northeastern University/Boston

    Gave U.S. News a middle range of SAT scores with a midpoint of 995. The scores excluded foreign and remedial students.

    Richard Stockton College of N.J./Pomona, N.J.

    Gave U.S. News a middle range of SAT scores with a midpoint of 1095. Told S&P the average SAT score was 1041. Told Moody’s the average SAT score was 991. An official says only the Moody’s figure includes all freshmen.

    Sarah Lawrence College/Bronxville, N.Y.

    Gave U.S. News a middle range of SAT scores with a midpoint of 1215. Gave College Handbook a middle range of scores with a midpoint of 1145. Told Moody’s the median score was 1150.

    Note: Test scores are for fall 1993 freshman class. SAT scores represent combined verbal and math scores. SAT test-score ranges are for the middle 50% of students. Median is the middle score: Half are below, half are above. Average is the sum total of freshman scores divided by the number of freshmen.

    — Acceptance Rates

    College Location U.S. News Moody’s

    Bard College Annandale-on-Hudson, N.Y. 44% 62.6%

    Christian Bros. University Memphis, Tenn. 59 73.0

    DePaul University Chicago 74 82.4

    Elizabeth City State University Elizabeth City, N.C. 54 73.0

    Georgian Court College Lakewood, N.J. 81 84.4

    Hood College Frederick, Md. 81 84.6

    Kent State University Kent, Ohio 33 86.7

    Nicholls State University Thibodaux, La. 94 100.0

    Wesleyan University Middletown, Conn. 41 42.5

    Note: Acceptance rate represents the percentage of applicants accepted for the fall 1993 freshman class. Some colleges attributed discrepancies to transcription or clerical errors; others said they could find no explanation.

    Sources: U.S. News’s America’s Best Colleges 1995 College Guide and recent Moody’s Investors Service Inc. credit reports

    (See related letters: “Letters to the Editor: Cooking the Campus Admissions Books” — WSJ April 27, 1995)

  13. Gann needs to go. The buck stops with her; she cannot just hide behind some fancy
    law firm and pretend everything is fine. I don’t believe she had no knowledge of what was
    going on since 2005 with all these fake SAT numbers. Gann needs to step down.
