
Professors protest use of Academic Analytics in faculty, department assessments

 – Photo by Burning Silver Photography


In 1996, as part of his research as a professor, David Hughes constructed a map of 17 households considered to be within the Rusitu Botanical Reserve. His map showed that people lived there before the national park was legally declared, and as a result, the National Parks Department ceased threatening residents with eviction.

But Hughes, president of the Rutgers University faculty union, said his research did not show up in a new data mining tool by a company called Academic Analytics, LLC, that the University licensed in 2013.

The database, which tracks professors’ journal articles, citations, books, research grants and awards, and then compares those numbers to national averages, is often inaccurate, he said.

Academic Analytics does not track what others call “publicly-engaged scholarship,” such as his national park research. After submitting a records request, he found that Academic Analytics reported that Hughes wrote three articles, won two awards and published two books. In reality, he wrote one article and received one award.

On Dec. 14, the School of Arts and Sciences faculty met to pass a resolution regarding how the University uses Academic Analytics. Rutgers is paying $492,500 for the database over four years, according to the resolution.

The resolution stipulates that the University not use Academic Analytics in tenure and promotion decisions, or in allocating resources among departments and grant-writing.

On the basis of individual data, the University could aggregate the information and score whole departments or schools, Hughes said.

“(Academic Analytics) is part of what we call audit culture, where everything is reduced to a quantifiable variable and measured and compared in order to rank people,” he said. “It’s equivalent to high school testing.”

The resolution also stipulates that the Deans’ Office of the School of Arts and Sciences should distribute Academic Analytics data to each faculty member. Without this, Hughes said, the University would be violating the terms of its contract.

"The contract ensures that the process is fair and transparent and accountable. A big part of fairness, transparency and accountability is that the faculty member has access to everything in their personal file, with the sole exception of letters you get from other institutions," Hughes said.

Two days after the School of Arts and Sciences passed the resolution, Hughes met with Chancellor Richard Edwards to present a memorandum of agreement on Academic Analytics. But the University did not agree to the memorandum, which asked the school not to use Academic Analytics in faculty promotion or tenure decisions.

Signing the memorandum was unnecessary, Edwards said. The University already does use citation indexing and considers the quality of journals during the promotion process.

"Promotion standards change over time, the bar gets raised, someone who got tenured or promoted here 10 or 15 years ago might not be promoted here now because the expectations are different," he said. "We are looking at a variety of things in terms of output."

The University uses a wide scope of tools to determine a faculty member’s promotion or tenure, he said. The process spans an entire year.

Candidates submit a detailed form listing everything they have done at the University and develop a personal statement. The University compiles a list of the candidate’s publications, any grants they may have received and letters sought from other schools and national experts.

This information is sent to a department committee, an appointments and promotions committee at the school level, a University-wide appointments and review committee, the president and the Board of Governors.

Academic Analytics is simply one piece of a larger puzzle that would be used in considering tenure or promotion.

"We don't look at any one single thing," he said.

As for its “inaccuracy,” Edwards said the database does not collect specific information such as state grants or grants from foundations.

“Academic Analytics is very clear on what it measures and what it doesn’t,” Edwards said.

Even so, the use of Academic Analytics promotes the mass marketing of universities, Hughes said.

Unlike similar tools such as Google Scholar, the Academic Analytics database compares departments at Rutgers to departments at other colleges. And Hughes argues that comparisons only undermine the uniqueness of each university.

"You have U.S. News and World Report ranking colleges. If a college isn't ranking well, then they turn to Academic Analytics to justify the claim that they are the best in another way. None of that is helpful," Hughes said. "This advertising and ranking system encourages colleges to be more like each other and replicate each other."

But Edwards disagrees, saying the University invested in a tool that improves the school’s image and, in turn, makes students’ degrees more valuable to employers and draws in Ph.D. candidates.

“Individual Ph.D. programs can use it to bring in students,” Edwards said. "It's a question of 'Our program is ranked in the top 10 among all the Ph.D. programs in that field nationally.' That is a piece of information that might be useful to someone who is deciding whether to go here or Texas or Michigan."
