
Formula evaluation

CAUT News

Faculty groups in the United States are warning that the increasing reliance upon metrics to assess their “productivity” is threatening the traditional system of peer review in hiring, tenure and promotion decisions.

Full-time faculty members at Rutgers University have protested the university’s decision to contract with Academic Analytics, a private company that has developed a patented algorithm — the Faculty Scholarly Productivity Index — to assess individual performance.

“Most faculty members have some sort of direct experience of metrics used to assess performance,” the American Association of University Professors said in a statement released earlier this year. “There is, however, good reason to doubt the utility of such metrics in tenure and promotion decisions and/or in judgments affecting hiring, compensation or working conditions.”

According to the Rutgers chapter of the AAUP, the FSPI “encroach[es] upon academic freedom, peer evaluation and shared governance by imposing its own criteria on emphasizing research” while “utterly ignoring the teaching, service and civic engagement that faculty perform.”

In Canada, universities and colleges have recently signed similar contracts with companies such as SciVal and Faculty180. In each case, metrics on individual academic productivity are aggregated into a score for a program or department, which is then used to assess the scholarship quality of the whole unit.

“Significantly, individual academic staff are not permitted access to the data,” notes CAUT executive director David Robinson. “Only senior administrators can see the results, and faculty members aren’t always permitted to check the accuracy of how they’ve been assessed.”

Other critics question whether quantitative indicators can fully capture research excellence.

A recent report produced by the Higher Education Funding Council for England expresses “skepticism” about the use of metrics, insisting that peer review should remain the primary procedure for evaluating research quality. The same study also found that indicators can be misused or “gamed,” and that “it is not currently feasible to assess research outputs or impacts … using quantitative indicators alone.”

“We need to ensure that metrics alone do not determine tenure, promotion, or hiring decisions,” adds Robinson.
