Citation Research and Impact Metrics: Home

Index

Introduction to:
Article Assessments
Author Assessments
Country Assessments
Journal Rankings

Metrics:
Altmetric Score

Citation Benchmarking
Citation Counts for: 
---Articles
---Authors
---Countries
Citation Distribution, see Citation Benchmarking
CiteScore 
Collaboration

Eigenfactor Score, see Other Journal Rankings
ERIH Plus, see Other Journal Rankings

Field-weighted citation impact (FWCI), see Citation Benchmarking
FWCI, see Citation Benchmarking

Google Scholar (Journal) Metrics, see Other Journal Rankings

Harzing, see Other Journal Rankings
Hirsch index, see h-index
h-index

iCite for:
---Articles, see Citation Benchmarking
---Authors

JIF, see Journal Impact Factor
Journal Impact Factor

NIH ranking, see iCite

Publish or Perish software, see Citation Counts for Authors: Other Sources

RCR, see iCite
Relative Citation Ratio, see iCite

Scimago Country Rank (SCR)
Scimago Journal Rank, see CiteScore 
SJR, see CiteScore
SNIP, see CiteScore
Source Normalized Impact per Paper, see CiteScore

Usage Counts

How to Use This Guide

  • Use the tabs above to select what you are trying to assess: an article, an author, a country, or a journal.
  • Read the section's introduction first. It contains general information about assessing that type of item or individual, the overall strengths and weaknesses of this type of assessment, and suggested uses for the data.
  • After reading the introduction, select a metric. Each metric has its own page, which can be reached by clicking the "down arrow" on the tab, from links within the introduction, or from the guide index in the left-hand column of each page.
  • The metric's page discusses what data the metric produces, its strengths and weaknesses (important points), and uses beyond those recommended on the section's introduction page.

If you have been told to use a specific metric, look in the left-hand column (index) to locate it.

This guide does not attempt to list every metric that exists; instead, we concentrate on those to which ASU faculty have easy access or that we think will be most useful.

 

When using metrics, keep in mind these "Rules of Thumb":

  1. Use at least two different metrics for assessment.
    Each metric has its strengths and weaknesses. Selecting metrics that balance each other reduces the possibility of inadvertent favoritism or penalization.
     
  2. Compare "apples to apples" not "apples to oranges."
    Do not mix scores from different metrics, as each metric uses different sources to obtain its data. Compare Web of Science counts to other Web of Science counts; do not compare a Web of Science count for Article A to the Google Scholar count for Article B. Additionally, do not mix scores across different subject fields, as citation behavior varies considerably; a low citation count in one field may be considered high in another.
     
  3. Include qualitative assessment in addition to numerical metrics.  
    As tempting as using the raw numbers may be, they must be put in context. Is the citation count due to positive or negative attention? How does the count compare to others in the same subject field, the same journal, and the same timeframe? Is the count increasing or decreasing with each successive year? Are there weaknesses in the metric that would favor or penalize the item or person under review?
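For readers who compile citation data in a spreadsheet or script, Rule 2 can be expressed as a simple guard: only compare counts that share the same data source and subject field. The following is an illustrative sketch only; the record structure, field names, and example numbers are our own inventions, not taken from any metrics database.

```python
# Illustrative sketch of Rule 2 ("apples to apples"): treat two citation
# counts as comparable only when they come from the same data source
# and the same subject field.

def comparable(a, b):
    """Return True only if two citation records share source and field."""
    return a["source"] == b["source"] and a["field"] == b["field"]

# Hypothetical records with made-up counts, for illustration only.
article_a = {"title": "A", "source": "Web of Science", "field": "Chemistry", "citations": 40}
article_b = {"title": "B", "source": "Google Scholar", "field": "Chemistry", "citations": 95}
article_c = {"title": "C", "source": "Web of Science", "field": "Chemistry", "citations": 12}

# Apples to oranges: different sources, so the counts should not be compared.
print(comparable(article_a, article_b))  # False
# Apples to apples: same source and same field.
print(comparable(article_a, article_c))  # True
```

The same guard generalizes to any pair of attributes you decide must match (timeframe, document type, and so on) before a numerical comparison is meaningful.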
