* Academic Analytics

UCSC entered into a contract with a company called Academic Analytics in 2013. Academic Analytics describes itself as a provider of “custom business intelligence data and solutions for research universities in the United States and the United Kingdom.” Their mission, in their words, is “to provide universities and university systems with objective data that administrators can use to support the strategic decision-making process as well as a method for benchmarking in comparison to other institutions.” Academic Analytics sells their data to universities as a tool to guide university leaders “in understanding strengths and weaknesses, establishing standards, allocating resources, and monitoring performance.”

One of their databases, “Faculty Counts,” provides “a numerical summary of productivity on a person-by-person basis….[with] a numeric tally of each faculty member’s total scholarly productivity in each of the five areas of scholarly research (journal articles, citations, books, research grants and honorific awards) measured by Academic Analytics.” In other words, AA tallies the productivity of individual faculty across numerous universities.

The Academic Analytics database, however, suffers from a number of systematic flaws that lead to incompleteness and inaccuracy, making it of questionable value. For example, the measures mentioned above fail to count numerous journals, many granting agencies, various forms of collaboration on grants, and numerous awards.

SCFA learned about UCSC’s contract with AA last spring and began an investigation based on reports of widespread inaccuracies in AA data collection at other universities. Our concerns were amplified in October 2016 when Georgetown University’s Provost, Robert Groves, published a blog post with the results of research he and others conducted showing serious inaccuracies in Academic Analytics’ datasets. As just one example of the Georgetown researchers’ findings: they compared AA data with 348 faculty members’ CVs and found that AA captured only 48% of the academic papers — journal articles and conference papers — listed on those CVs. The departments that fared least well were Computer Science and Psychology. They also found that AA significantly undercounted faculty grant activity. There were other troubling findings as well, but for purposes of brevity, suffice it to say that they found AA’s data to be both incomplete and inaccurate.

In Provost Groves’ words: “the quality of AA coverage of the scholarly products of those faculty studied are far from perfect.” He added, and we think this point is perhaps more important: “Even with perfect coverage, the data have differential value across fields that vary in book versus article production and in their cultural supports for citations of others’ work.” The Georgetown Provost concludes: “we will be dropping our subscription to Academic Analytics.”

Along with the fact that AA’s data are incomplete and inaccurate, SCFA has additional concerns and questions.

1. Why did UCSC contract with AA? How does the campus leadership plan to use the data?

Labor Relations informed the SCFA that the University planned to use AA “as a source of curated comparative scholarly productivity data (e.g., grant activity, published articles/conference proceedings, citations) that will help leaders (e.g., at the decanal and campus-wide levels) understand opportunities we may have overlooked and to inform planning.”

With regard to opportunities “overlooked,” we presume this means grants that faculty should or could be seeking. It is hard to understand how an incomplete and inaccurate database would provide better information and guidance than the various staff whose job it is to help faculty secure grants.

The University’s statement that it will use AA to “inform planning” is vague but raises concerns that AA data would be used to allocate FTE and other departmental resources.

2. Insofar as there already exist other ways of comparing departments across universities — the National Research Council rankings, for example — why is UCSC paying presumably hundreds of thousands of dollars, perhaps even half a million, for inaccurate and incomplete datasets?

Stay tuned for SCFA updates on this topic.