
CHEM 184/284 (Chemical Literature) - Huber - Winter 2024: Lecture 17

A two-credit course in the techniques and tools for effectively searching the literature of chemistry, biochemistry, chemical engineering, and related fields.

Lecture 17: Altmetrics


What are "metrics" in scholarly communication?

  • Metrics, in this context, are statistical analyses of scholarly output. Librarians have been doing this sort of thing for a long time.  The term "bibliometrics" was coined around 1969 to describe statistical analysis of books, articles, bibliographies, etc., inspired by earlier terms like "scientometrics" and "econometrics".
  • A metric (singular) is a standard of measurement applied to some body of literature.  The first widely recognized metric in scholarly publishing was the impact factor (see Lecture 9, part 3) devised by the late Professor Eugene Garfield in the sixties.  Lots of others have followed it.

Why are metrics important to scientists?

  • Originally, the impact factor, which measures the average "impact" of articles published in a given journal, was used by scholars to decide to which journal(s) to submit their research papers - the idea being that the higher the impact factor of the journal, the more exposure (and prestige) your research would have.
  • However, over time, impact factors came to be used as a quantitative (true!) and objective (more dubious) shortcut way to measure the quality of the research published in those journals.  The theory here was that high impact factor journals receive more article submissions, and therefore can (and must) be more selective in what they accept.  Therefore, if your paper is published in Nature it must be higher-quality research than if it were published in the (hypothetical) Antarctican Journal of Penguin Fat Chemistry.
  • Who wants to use such shortcuts?  Anyone who has to evaluate the quality of a scientist's work and doesn't have the time or the expertise to do so by reading and evaluating the actual papers themselves.  This can include:
    • Hiring committees
    • Tenure and promotion committees
    • Funding agencies
    • Immigration agencies (who must decide whether to issue work permits to visiting scientists).
  • Since almost every scientist is bound to encounter at least one of these situations in the course of their career, impact factors and other metrics are important to them as well.
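The two-year impact factor described above is simple arithmetic: citations received in a given year to the items a journal published in the previous two years, divided by the number of citable items it published in those two years. A minimal sketch, with invented counts for a hypothetical journal:

```python
# Two-year journal impact factor: citations in year Y to items the
# journal published in years Y-1 and Y-2, divided by the number of
# citable items it published in those two years.
def impact_factor(citations_to_prior_two_years, items_prior_two_years):
    return citations_to_prior_two_years / items_prior_two_years

# Hypothetical journal: 210 articles published in 2022-2023 drew
# 630 citations during 2024.
print(impact_factor(630, 210))  # 3.0
```

Note that both numbers are journal-level aggregates; the formula says nothing about any individual article, which is one root of the problems discussed in the next section.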

Why are impact factors, etc. imperfect tools for evaluating scholarly output?

  • First, most classic metrics do not directly evaluate the impact of the individual researcher - only that of the journals in which they publish.
  • Second, metrics based on citation have an inherent "time lag".  A work of scholarship can begin to have impact in its field months, or even years, before any works citing it are published.
  • Third, traditional citation metrics ignore other types of impact that scholarship can have, especially in the social/public arenas.
  • Fourth, they do not track the impact of non-traditional publications, e.g. slideshows, software, videos, podcasts, etc. See, for example, the list of publication types monitored by Plum Analytics.

Enter...alternative assessment metrics = "altmetrics"

  • Starting around 2011, a number of groups began collecting and tabulating other kinds of statistics to measure scholarly impact.
  • At present, there are no standards for how to measure altmetrics, though NISO (the National Information Standards Organization) is developing them. See, for example, the Draft Standard for the Altmetrics Data Quality Code of Conduct at the NISO Alternative Metrics Initiative page.  Each organization that measures altmetrics uses its own set of values and analyzes them in its own way.
  • There are far more types of potentially impactful publications than just scholarly journal articles. Plum Analytics has a list of over 67 publication types that they consider for impact.
  • Some typical sources of altmetric information (taken from Plum Analytics - see below)
    • Usage - views, clicks, downloads
    • Captures - bookmarks, favorites, followers
    • Mentions - social networks, blogs, tweets, news items, reviews
    • Social Media - "likes", recommendations, shares
    • Citations - yes, they look at this traditional statistic, too.
  • Where do they get the numbers?
    • Pretty much anywhere they can get free data, such as...
    • Social media "likes", shares, mentions - Facebook, Twitter, Instagram, etc.
    • Video views, downloads, "favorites" and subscribers - YouTube, Vimeo
    • Slideshow views and downloads - SlideShare
    • Figure/poster views and downloads - FigShare
    • Software downloads and other "uses" - SourceForge, Github
    • Blog mentions -  ScienceSeeker, other blogs
    • Reviews - Goodreads, Amazon, Reddit
    • News media mentions - online newspapers
    • Social bookmarking/bibliography sites - Mendeley
    • Open access journals - PLoS, PubMed Central
    • Citation sources - CrossRef, PubMed Central, USPTO
    • Commercial vendors willing to provide data.
    • And more.  See, for example, the source lists published by each provider.
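Conceptually, an altmetrics aggregator just buckets a stream of raw events from these sources into categories like Plum's five. A toy sketch (the event names and the mapping are invented for illustration; each real provider defines its own):

```python
from collections import Counter

# Map raw event types (invented names) onto Plum-style categories.
CATEGORY = {
    "view": "Usage", "download": "Usage",
    "bookmark": "Captures", "follower": "Captures",
    "tweet": "Mentions", "news_item": "Mentions",
    "like": "Social Media", "share": "Social Media",
    "citation": "Citations",
}

def tally(events):
    """Aggregate a stream of raw events into per-category totals."""
    totals = Counter()
    for event in events:
        totals[CATEGORY[event]] += 1
    return dict(totals)

print(tally(["view", "view", "tweet", "citation", "share"]))
# {'Usage': 2, 'Mentions': 1, 'Citations': 1, 'Social Media': 1}
```

The interesting (and unstandardized) part in practice is not the tally but the weighting: each provider decides how much a tweet is "worth" relative to a citation.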

Who collects altmetric data?

  • Currently, there are three major sources of altmetric data, all founded in 2011: Altmetric, ImpactStory and Plum Analytics.

  • Altmetric is a small, private, for-profit company.  It is now associated with Digital Science.
  • Altmetric data is used by several major publishers, including Wiley and Taylor and Francis.
  • They generate altmetric data for journals, articles, institutions (universities, departments, research groups) and individuals (who work for institutions with an institutional contract).  They also provide some free data for individual researchers: see Altmetric for Researchers.
  • Draws data from news, blogs, tweets, social media, social networking, and (recently) government policy documents. They have recently begun collaboration with the publishers of Web of Science for additional citation data.
  • Provides a free "bookmarklet", which works with your web browser to provide data on individual articles.  Note: this does not work with all publishers' websites.
  • Provides a variety of "badges" which can be inserted into articles, curricula vitae, and so forth to enhance them with altmetric information.  The badges are available free to institutional repositories and to individual researchers.
  • The Altmetric "donut":

Altmetric "donut" sample
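The data behind the bookmarklet and the "donut" can also be queried by DOI over Altmetric's free Details Page API (v1). A hedged sketch: the URL shape follows Altmetric's public documentation, but to stay self-contained it parses a trimmed, invented sample response rather than making a live call, and the fields shown (`title`, `score`) are illustrative:

```python
import json
from urllib.parse import quote

def altmetric_url(doi):
    """Build the Altmetric Details Page API v1 URL for a DOI."""
    return "https://api.altmetric.com/v1/doi/" + quote(doi)

# Trimmed, invented sample of the kind of JSON the endpoint returns.
sample = '{"title": "Penguin fat chemistry revisited", "score": 42.5}'

record = json.loads(sample)
print(altmetric_url("10.1000/example"))
print(record["title"], "-", record["score"])
```

In a real script you would fetch `altmetric_url(doi)` with your HTTP client of choice and handle the 404 the API returns for articles it has no data on.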


ImpactStory

  • ImpactStory is a not-for-profit company, funded in part by the National Science Foundation and the Sloan Foundation, and in part by subscription fees from users.
  • ImpactStory lets individual researchers create profiles - a sort of online curriculum vitae, which can link to all sorts of online publications (articles at a publisher's site or repository, slideshows, figures, software, etc.) and link them to altmetric data.  They can also provide analyses of the data, such as geographical information on where users/viewers of your work are located.
  • Currently, ImpactStory is offering free accounts, using a Twitter login.
  • For an example of an ImpactStory profile, see this one for University of Florida professor Ethan White.
  • Note that ImpactStory can import profile data from ORCID (see below), SlideShare and other sources, so you don't have to rekey your information.
  • ImpactStory is now part of a larger organization, Our Research, which engages in a number of research-related projects, including a database of open access articles, an open altmetric data source called Paperbuzz, and more.

Plum Analytics

  • Plum Analytics was a for-profit company, acquired in 2014 by the major publisher EBSCO and purchased in 2017 from EBSCO by Elsevier.
  • They assess scholarly impact at the individual researcher, research group, department, university, journal, or publisher level.
  • See the Plum Analytics About Metrics page.
  • PlumX is their "dashboard" for displaying altmetric data.

How can I enhance my altmetrics?

  • Make sure that your work is accessible and discoverable.
    • Publish articles as open access and/or deposit copies of them in institutional repositories, such as UC's e-Scholarship, or the equivalent at whatever institutions you may be connected to. Note that many funding agencies (e.g. NIH) and institutions (e.g. the UC Faculty Senate) require (mandate) open access deposition.
    • Similarly, deposit your data in open data repositories, such as UC's Merritt, so your data can be discovered, downloaded and cited.  Note that the NSF and many other funding agencies now require a data management plan of those applying for grants.
    • Other types of materials may be deposited in other venues:  slide presentations in SlideShare; figures and posters in FigShare; open source software you've created in SourceForge or GitHub.
    • Create publicly accessible scholarly materials of other types: videos, podcasts, blogs.
    • Become active in social networking - both to promote your own publications, and to share, recommend or comment on the publications of others that you think are worthwhile or important.  Commenting on others' works raises your own presence on the Web, and leads others to take a look at your material.
      • Social media: Facebook, Twitter, LinkedIn
      • Social bookmarking and bibliographies: CiteULike, Delicious, Mendeley, etc.
    • Make your chemical publications more discoverable via search engines by incorporating InChI identifiers for the chemical substances in your papers.
    • Perhaps most importantly, make sure that all your publications are linked to you by creating an ORCID profile (see below.)
    • For an example of a researcher who is deeply involved in activities recognized by altmetrics, see Antony Williams' online curriculum vitae.

ORCID - the ID number for authors

  • To quote their website, "ORCID is an open, non-profit, community-driven effort to create and maintain a registry of unique researcher identifiers and a transparent method of linking research activities and outputs to these identifiers."
  • Especially if you have a non-distinctive name, it is important to have a way for others to uniquely identify your scholarly publications.  ORCID provides such a means. Similarly, if you have published under multiple names (nicknames, aliases, name changes), it's important to be able to bring all your publications together so you can get proper credit.
  • Creating a profile is free.  You can bring in lists of your publications in a variety of ways, such as keying them in directly or importing them.  ResearcherID, for example (see Lecture 9, part 1) can export profiles to ORCID and import them from ORCID.  ResearcherID, in turn, can import and export to and from an EndNote online database.
  • For examples of ORCID profiles, see Antony Williams' profile (linked from his webpage above) or Chuck Huber's profile.
  • Some databases (e.g. Web of Science) allow you to search by ORCID number when doing author searches.  This trend is likely to increase.
  • ImpactStory can import your ORCID profile in order to know what publications to track for altmetric purposes.
  • Some organizations are beginning to require their researchers to create ORCIDs. e-Scholarship, for example, will require an ORCID for deposition of papers.
  • For more information, see the UCSB Library's LibGuide ORCID (Open Researcher Contributor Identifier).
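An ORCID iD is sixteen characters in four groups of four, and the final character is a checksum computed with the ISO 7064 MOD 11-2 algorithm, so a mistyped iD can be caught without any database lookup. A sketch, verified against ORCID's own documented sample iD, 0000-0002-1825-0097:

```python
def orcid_checksum(base_digits):
    """ISO 7064 MOD 11-2 check character for the first 15 digits."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid):
    """True if a hyphenated 16-character ORCID iD has a valid checksum."""
    digits = orcid.replace("-", "")
    return len(digits) == 16 and orcid_checksum(digits[:15]) == digits[15]

print(is_valid_orcid("0000-0002-1825-0097"))  # True
print(is_valid_orcid("0000-0002-1825-0098"))  # False (wrong check digit)
```

The "X" case arises because MOD 11-2 can yield a check value of 10, which ORCID writes as the letter X in the final position.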

© 2024 Charles F. Huber

Creative Commons License
This work by Charles F. Huber is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

