Citation Research and Impact Metrics

Methods and metrics for evaluating scholarly and research impact.

Introduction to Alternative Metrics

Alternative metrics, also known as altmetrics, attempt to measure the reach and impact of scholarship and research beyond the citation counts that underpin more commonly used metrics such as the Journal Impact Factor and the author h-index. Alternative metrics offer a more complete picture of how a given scholarly work is used, as well as where, why, and by whom. Additionally, alternative metrics may be the only way to evaluate the impact of non-traditional scholarly outputs, such as data, tools, software, websites, blogs, and videos.

In particular, alternative metrics take advantage of the internet's ability to track interactions with online items in order to measure how many times they are accessed, used, and shared.

Alternative metrics can answer questions such as:

  • How many times was my work downloaded?
  • Who is reading my work?
  • Was it covered by any news agencies?
  • Are any other researchers commenting on it?
  • How many times was it shared on Facebook, Twitter, and other platforms?
  • Which countries are looking at my work?

This page was inspired by and modified from "Altmetrics" from the University Library System, University of Pittsburgh, which was licensed under a CC-BY 4.0 License.

Benefits of Alternative Metrics

  • Capture elements of societal impact: Alternative metrics can inform researchers about elements of the societal impact of their research. For example, altmetrics data can help researchers understand how the public, government, policy makers, and other researchers interact with their work.
  • Complement traditional metrics: Alternative metrics draw on a wider range of data, from a wider range of sources, than traditional metrics. Altmetrics data can also be highly nuanced, offering fine-grained detail and preserving the context in which each interaction originated.
  • Offer speed and discoverability: Alternative metrics accumulate faster than traditional metrics. In disciplines where citations grow slowly, or for early-career researchers, this speed helps identify which outputs are gaining online attention.
  • Open access advantage: Providers like Altmetric.com and ImpactStory give access to their APIs and source code. Alternative metrics providers also pull their data from open sources, which expose their APIs or raw usage data; this makes altmetrics data more easily replicable than data in proprietary databases. A sketch of querying one such API appears below.
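
As an illustration, the following sketch queries Altmetric.com's free public API for the attention record of a single DOI. The /v1/doi endpoint is openly documented, but the response fields used here (score, cited_by_posts_count) and the rate limits are assumptions based on the public documentation and may change over time; treat this as a minimal sketch rather than a definitive integration.

    # Minimal sketch: fetch the Altmetric.com attention record for one DOI.
    # Assumes the free public endpoint https://api.altmetric.com/v1/doi/<doi>;
    # field names and rate limits may change, so check Altmetric's API docs.
    import json
    import urllib.error
    import urllib.request

    def altmetric_summary(doi: str) -> dict:
        """Return Altmetric's JSON record for a DOI, or {} if it is untracked."""
        url = f"https://api.altmetric.com/v1/doi/{doi}"
        try:
            with urllib.request.urlopen(url) as resp:
                return json.load(resp)
        except urllib.error.HTTPError as err:
            if err.code == 404:  # Altmetric has no record of this DOI
                return {}
            raise

    record = altmetric_summary("10.1038/nature12373")  # substitute any DOI
    if record:
        print("Altmetric attention score:", record.get("score"))
        print("Tracked posts:", record.get("cited_by_posts_count"))
    else:
        print("No altmetric data found for this DOI.")

Because each provider counts different sources, treat numbers like these as conversation starters about reach rather than as scores comparable across tools.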

Considerations for Alternative Metrics

  • Lack of a standard definition: The field of alternative metrics has not yet settled on what these metrics truly measure. However, the NISO Alternative Assessment Metrics (Altmetrics) Initiative is working to create a standard definition of the term and has a draft of its definition open for public comment.
  • Alternative metrics data are not normalized: We do not advise comparisons between different sources and data sets for alternative metrics, as different providers collect different kinds of data. Instead, we suggest using these metrics to tell a story about your research.
  • Alternative metrics are time-dependent: Alternative metrics provide information about the use of a work, but much of this use has a lifespan, and that lifespan is unknown. For older works there may not be much recent activity, but that does not necessarily mean the work is not heavily used!
  • Alternative metrics have known tracking issues: Alternative metrics work best with items that have a Digital Object Identifier (DOI), since providers generally match online mentions to outputs through persistent identifiers. A quick way to check whether an output's DOI is registered is sketched below.
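
Since DOI-based matching underlies most altmetrics tracking, a useful first step is to confirm that an output's DOI is registered with a metadata agency. The sketch below uses Crossref's openly documented REST API (api.crossref.org); the exact response structure, and the assumption that your provider matches against Crossref-registered DOIs, are details to verify for your own case.

    # Minimal sketch: check that a DOI is registered with Crossref, one of the
    # open metadata sources altmetrics providers can match against. The
    # /works/<doi> endpoint returns {"status": "ok", "message": {...}}.
    import json
    import urllib.error
    import urllib.request

    def crossref_metadata(doi: str) -> dict:
        """Return Crossref's metadata for a DOI, or {} if it is not registered."""
        url = f"https://api.crossref.org/works/{doi}"
        try:
            with urllib.request.urlopen(url) as resp:
                return json.load(resp).get("message", {})
        except urllib.error.HTTPError as err:
            if err.code == 404:  # DOI not registered with Crossref
                return {}
            raise

    meta = crossref_metadata("10.1038/nature12373")  # substitute any DOI
    titles = meta.get("title") or ["(not registered)"]
    print("Registered title:", titles[0])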

A note on usage counts

Usage counts show interest in a work even when the reader never ends up citing it. At the same time, a usage count reflects only the usage an item has received through that specific database or website; consequently, it may not reflect the full level of interest.

Important Points:

  • Items may lack a usage count because these counts appear only in some databases and on some publisher platforms. Consequently, the absence of a usage count does not mean the work has never been used.
  • No consistency: each database and website counts usage differently. Sometimes the count is how often a link to the landing page (usually just the abstract, not the full text) was clicked; in other cases, views of the full text may be counted but not downloads.
  • Theoretically, works available for a longer time have an advantage over more recently published works; however, usage counts are a relatively new metric, so the count may reflect only a few recent years of usage even for older works.
  • Do not sum usage counts from different sources, as you may count a single transaction more than once.
  • Usage counts do not indicate the quality of a work; there are both positive and negative reasons to view an item.

Uses:

  • Usage counts help to determine whether there is interest in a work.
  • Compare usage counts for different items only if the counts come from the same database or website and the works are from the same time span.

Alternative Metrics by type of work

Articles

  • Citations: Scopus, Web of Science, PubMed Central, and Google Scholar citations; citations in policy documents.
  • Bookmarks: scholarly bookmarks on Mendeley and CiteULike; bookmarks by the public on Del.icio.us and Pinboard; Twitter favorites.
  • Discussion: peer reviews on F1000, Publons, PubPeer, and other post-publication peer review websites; Twitter mentions and Facebook wall posts; newspaper articles, videos, and podcasts; mentions on scholarly blog networks.
  • Shares: Twitter mentions, Facebook shares.
  • Views: page view and download statistics from the journal website and/or the repository where you've archived your paper.

Books and book chapters

  • Citations: Web of Science and Scopus citations; Google Books citations.
  • WorldCat holdings: the number of libraries worldwide that have purchased your book.
  • Views: page view and download statistics from your publisher's website or the repository where you've archived your book or chapter.
  • Ratings: Amazon.com and Goodreads ratings.
  • Discussion: see "Articles".
  • Bookmarks: see "Articles".

Datasets

  • Citations: Data Citation Index and Google Scholar citations.
  • Views: views and downloads from Figshare, Zenodo, Dryad, ICPSR, or another subject or institutional repository.
  • Reuse: GitHub forks.
  • Discussion: Figshare comments; see also "Articles".
  • Bookmarks: see "Articles".

Posters

  • Views: views and downloads on Figshare, Zenodo, or another institutional or subject repository.
  • Discussion: Figshare comments; see also "Articles".
  • Bookmarks, Shares: see "Articles".

Presentations and slide decks

  • Views: views and downloads on Slideshare, Speakerdeck, Figshare, or another institutional or disciplinary repository.
  • Discussion: Slideshare and Figshare comments; see also "Articles".
  • Shares: Slideshare embeds on other websites; Twitter mentions, Facebook shares, LinkedIn shares.
  • Likes: Slideshare and Speakerdeck likes.
  • Bookmarks: see "Articles".

Software and code

  • Citations: Google Scholar citations.
  • Downloads: download statistics from GitHub, Bitbucket, SourceForge, or another institutional or subject repository.
  • Adaptations: GitHub forks, Bitbucket clones.
  • Collaborators: GitHub collaborators.
  • Discussion: GitHub gists; Twitter mentions; Figshare comments.
  • Bookmarks, Shares: see "Articles".

Videos

  • Views: YouTube, Vimeo, and Figshare views.
  • Likes/Dislikes: YouTube likes and dislikes; Vimeo likes.
  • Discussion: YouTube, Vimeo, and Figshare comments; see also "Articles".
  • Shares, Bookmarks: see "Articles".

Use Cases

An increasing number of researchers are using alternative metrics to help document the varied impacts of their work in their CVs, tenure & promotion dossiers, and grant and job applications. When using alternative metrics to document your research's influence, keep in mind that context is essential for making the numbers you list meaningful. Provide contextual information, such as percentiles and maps, that communicates to your reader how your paper or other research output has performed relative to others' papers and outputs.
