It’s About Time for Transparency

(Editor’s Note: This article is an expanded version of an article published by the Raleigh News and Observer on January 4, 2013.)

Back in 2007, American universities faced a threat—the Department of Education wanted them to show that they were actually teaching something! Education Secretary Margaret Spellings was on a tear with her “Spellings Commission on the Future of Higher Education,” and it looked as though the federal government might start mandating some kind of evidence of student learning.

Some enterprising universities quickly marshaled forces and created a Voluntary System of Accountability (VSA), designed to get enough universities to demonstrate their success voluntarily that federal regulation could be fended off.

Then the housing crash occurred, Margaret Spellings left office, and government watchdogs turned to other matters.

And the VSA languished.

But it may be coming to life—at least at the University of North Carolina.

In 2007-2008, the university system took some serious steps toward accountability. The general administration covered the costs for each school to evaluate student learning as part of its participation in the Voluntary System of Accountability pilot program.

The schools were expected to measure learning outcomes using one of several available assessments. They were told to provide “clear, accessible, and comparable information on the undergraduate student experience.” That information was to be posted on the VSA’s website: CollegePortraits.org.

Ten schools have reported learning outcomes thus far. [See Table.] Under the VSA, institutions are not required to report student learning outcomes until four years after initial sign-up—that is, this year. UNC-Chapel Hill declined to post its CLA results “because campus leaders/faculty believed the test results weren’t representative,” despite the study’s use of statistically sound, publisher-recommended sample sizes—an omission that the Board of Governors should address immediately. Going forward, schools will update results every three years.

There are three widely accepted methods of assessment, described in an earlier Pope Center article. Many schools use them, but in the past they rarely made the results public.

Generally viewed as the most effective is the Collegiate Learning Assessment (CLA). This is the assessment that Richard Arum and Josipa Roksa used in their book Academically Adrift, which shocked readers by revealing that, nationally, many of today’s college students show little or no academic progress during their first two years. Nine of the ten UNC campuses that have provided information have reported their CLA scores.

These data, although limited, start to form a useful picture of student learning in North Carolina. The UNC Board of Governors seems to think so. Its plans for the upcoming years, laid out in a statement by the Strategic Directions Committee, list strengthening academic quality as a top priority and include the directive to “identify [the] most effective ways to assess and assure student learning.”

The 2007-08 VSA pilot assessment program is an excellent start toward fulfilling that goal, assuming that the January 2013 reporting deadline is met. Two additions to those efforts would create a truly open and useful assessment program for North Carolina.

First, all schools in the UNC system should use one standard test. The Collegiate Learning Assessment is the logical choice; eleven of the 16 UNC schools used the CLA in the 2007-08 pilot program.

The CLA assesses students’ abilities to think critically, reason analytically, solve problems, and communicate clearly and cogently. It consists of a performance task and an analytical writing task; the writing task, in turn, has a make-an-argument section and a critique-an-argument section. Scores are aggregated at the institutional level to show how an institution’s students as a whole are performing. After controlling for college entrance scores (SAT or ACT), freshman scores are compared with graduating seniors’ scores to estimate the institution’s contribution to students’ results. Students’ entrance scores allow the CLA to determine whether a university is at, above, or below expected performance.
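
To make the “value added” idea concrete, here is one rough way to formalize the approach just described (an illustration only, not the test publisher’s published formula): predict each class’s expected CLA score from its entering SAT or ACT scores, then credit the institution with the amount by which its seniors exceed their prediction over and above the amount by which its freshmen exceed theirs.

\[
\text{value added} \approx \left(\overline{CLA}_{\text{seniors}} - \widehat{CLA}_{\text{seniors}}\right) - \left(\overline{CLA}_{\text{freshmen}} - \widehat{CLA}_{\text{freshmen}}\right)
\]

Here a bar denotes a class’s actual average score and a hat denotes the score predicted from entrance-exam results. A positive value suggests the institution adds more learning than its students’ entering ability would predict; a negative value suggests less.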

Not only did the Academically Adrift authors use the CLA, but Bill Gates endorsed it in early December. He wrote on his blog, The Gates Notes, “most people would agree that skills like critical thinking, complex reasoning and writing—the things the [CLA] test does measure—are pretty important.”

Moreover, Arum and Roksa found that postgraduate outcomes track CLA results. “For example, students in the bottom quintile of CLA performance as seniors are more than three times as likely to be unemployed two years after college than graduates whose CLA scores were in the top quintile; they were also twice as likely to be living back at home with their parents,” Arum said.

The second improvement is to publish the results—regularly and in a place that’s easy to access. Universities shouldn’t be able to selectively withhold results when they don’t like them. Results should be broken down by university and department (or, if that requires testing too many students, then at least by school).

Parents, students, legislators, employers, and taxpayers should know how much students are actually learning—and which departments are delivering real value to the students and citizens of North Carolina. That level of detail would help students choose where to go to college, help employers identify the best departments from which to recruit, and let North Carolina’s taxpayers and parents know how well their money is spent.

The limited results we already have invite important questions: Why is NC State, one of our flagship universities, underperforming UNC Asheville, a smaller school that receives less funding? How did UNC Pembroke increase learning outcomes by 152 points given its relatively limited resources? Are there some best practices that other schools could incorporate?

Moving forward, more transparency will allow stakeholders to ask these important questions and receive clear answers. And it’s about time.