
Book review / Bibliometrics and research evaluation: Uses and abuses

Yves Gingras. The MIT Press, 2016; 119 pp; ISBN: 978-0-262-03512-5.

By Michael Hohner

In Bibliometrics and Research Evaluation, Yves Gingras provides a compact and accessible account of bibliometrics, tracing the field from its early applications to the contemporary misuse of metrics for research evaluation at universities. Gingras, a professor and Canada Research Chair in History and Sociology of Science at Université du Québec à Montréal, contends that these tools pose a mounting danger when wielded so poorly, becoming potential weapons in the hands of administrators and researchers alike.

Bibliometrics was initially used by librarians to separate frequently used journals from those rarely cited and thus considered obsolete, allowing the latter to be moved to make space for recent issues; it later developed into a set of tools for retrieving the burgeoning literature of science. From the outset of the book, Gingras illustrates how things have gone awry in the past few decades as these tools have been misapplied to evaluating the performance of individual researchers and the quality of their research.

At a macro scale, Gingras demonstrates that the study of publications and citation patterns can yield insights into the global dynamics of science over time. Taken to the micro level and applied to individuals, however, these ill-defined quantitative indicators often generate perverse and unintended effects on the direction of research.

In many ways, Gingras illustrates how the social sciences, and even the arts and humanities, are following the sciences in adopting the present-day mantra of evaluating scholarly output. Doubtless, these metrics are driving decisions about where and what to publish, reshaping and even constraining scholarly discourse in many unexpected ways. As researchers conform by publishing in journals with high impact factors, which are typically international in scope, many topics of local interest are neglected. Books become a much less important medium of publication, since journal citations count for more and accumulate with less of a lag. (This work was originally published in French as Dérives de l’évaluation de la recherche and translated into English, a reminder that English has become the prevalent language of scholarship, surpassing and perhaps on the verge of supplanting all others.)

Gingras questions why researchers allow their research to be misrepresented by the impact factor of the journals they publish in or by an h-index, which he compares to a broken thermometer: it only goes up, never down, and, as he clearly illustrates, does not provide a measure of quality independent of productivity. The drive to appear objective has fuelled a metric tide in which any number beats no number, and in which invalid indicators built on poorly articulated instruments end up measuring something contrary to what was proposed in the first place.

Gingras also argues that universities seem eager to let invalid indicators rank and diminish their contributions to society. He discusses compilations such as Maclean’s and the so-called Shanghai rankings, along with questionable practices such as trading highly cited researchers to garner more favourable numbers.

Given the trends in research evaluation, it might have seemed advisable for Gingras to publish this work initially as a series of articles in English in some major international journal. That he instead chose to publish this volume, fully aware of how such scholarship would likely be evaluated, demonstrates tremendous academic leadership in producing a work of real quality. It is a very short monograph, but of such critical importance that it should be acquired by every academic library. It is highly accessible and well documented, with 20 pages of endnotes. It should also be required reading for anyone who needs to be better informed about the many abuses and disturbing trends in attempting to evaluate the quality of scholarship at universities … so everybody!

_______________________________________________________
Michael Hohner is a librarian at the University of Winnipeg.
