A citation-evaluation robot can help you sort out science articles

Anny Gano
March 13, 2020
Robot works at laptop while man drinks coffee. Image by jossnatu. Licensed from iStock.com.

The many excellent tools available to a scientist today do not all come from Thermo Fisher Scientific or live on a lab bench. An ever-expanding list of apps and programs is designed to make our science lives easier than ever. One such cool tool has recently come across my desk(top), and I wanted to tell you a little bit about it.

Scite.ai is an artificial intelligence (AI)-guided smart citation service brought to us by a group of developers at a Brooklyn-based start-up. The site has been live for less than a year but has already achieved some important milestones, including receiving a Small Business Innovation Research grant from the National Institutes of Health (NIH) and a mention in a February 7, 2020, Science magazine news brief. The site seems to be gaining both traction and notice very quickly.

Looking at how frequently a scientist’s article has been cited in other peer-reviewed publications is a time-honored way to evaluate their published work. Sites such as Google Scholar and Scopus allow one to look up an author and find their most cited papers, or to see how many times a given paper has been cited.

Scite.ai offers a new twist: it will not only tell you how many times the paper has been cited, aka its total number of “mentions,” but also how many times it has been cited as “supporting” versus “contradicting.” In other words, you will know how many other groups of researchers have independently confirmed the findings or perhaps found them hard to replicate. 

“Scite.ai can help scientists evaluate an article’s credibility by quickly checking whether the paper they are looking up (...) is controversial, popular, suspect or dependable.”

-- Anny Gano, postdoctoral fellow

The issue of replicability in research is a hot topic. The “publish or perish” culture of academia, predatory journal practices and sometimes just plain old mistakes add up to an uncomfortably high rate of unreproducible research in the biomedical sciences. These mistakes in published work can send other scientists on misguided forays down the wrong avenue of research. Researchers may, for example, use a poorly supported paper as the basis for their new experiments, which leads to wasted effort and resources.

Scite.ai can help scientists evaluate an article’s credibility by quickly checking whether the paper they are looking up (paper A) is controversial, popular, suspect or dependable. Another useful feature is that when you look up paper A, Scite.ai will show you a brief excerpt of the text that surrounds the citation of paper A in paper B. This feature enables readers to contextualize the way paper A is being used in the argument others are making and to glimpse for themselves how others are evaluating A’s work.

Scite.ai also notes each specific citation event separately. For instance, if paper A is cited in the introduction of paper B as a background source, and then again in the methods of paper B as a methods source, both instances will be reported and shown separately. You can further narrow down the search by specifying that you only want to see mentions of paper A that have appeared in other papers’ methods sections, or perhaps only in review papers.

After immediately Scite-ing my own papers and finding nothing salacious to report, I decided to look up a few others. Did you know that Lisa Kudrow, who once upon a time played Phoebe Buffay on the sitcom Friends, has a scientific publication that came out the same year Friends premiered? The neurology paper, published in 1994 in the journal Cephalalgia, sought to examine whether being left- or right-handed had an effect on suffering from migraines. The study found no effect of handedness on headaches, and Scite.ai has informed me that, while the study was mentioned in one other article, no follow-up work has either confirmed or denied its findings.

After trialing this tool, I have decided that I quite like it. The handiest feature, and the one that will make it easy to keep using this service, is the Scite.ai Google Chrome extension that I downloaded. Now, whenever I am reading an article in my browser, a fairly minimalist (which is a plus!) pop-up appears on the right and informs me of the article’s citation stats. Being able to evaluate papers in real time using this feature will of course help me find articles for use in papers or experiments. However, it is also pretty handy for reading the primary literature of another field with which I am less familiar. For instance, if I want to learn a little bit about how the coronavirus works, I can use Scite.ai to help me identify articles that have been widely supported by other scientists, or ones that are controversial.

Of course, this tool was launched fairly recently, so there is room for refinement and improvement. Not all articles have been catalogued yet. Also, to enable the AI to scan works for content, the makers of Scite.ai have to strike agreements with the publishing companies that control access to articles. The list of sources continues to expand, but some papers may not be accessible because they are either not yet available to the Scite.ai team or because their file format is more difficult for the AI to process. And a potential issue with any AI is that it is only as good as the people who “train” it, so biases and errors may of course occur.

Overall, however, Scite.ai is an interesting lab resource that I recommend exploring!