Metrics, in this context, are statistical analyses of scholarly output. Librarians have been doing this sort of thing for a long time. The term "bibliometrics" was coined around 1969 to describe the statistical analysis of books, articles, bibliographies, etc., inspired by earlier terms like "scientometrics" and "econometrics".
A metric (singular) is a standard of measurement applied to some body of literature. The first widely recognized metric in scholarly publishing was the impact factor (see Lecture 9, part 3) devised by the late Professor Eugene Garfield in the sixties. Lots of others have followed it.
Why are metrics important to scientists?
Originally, the impact factor, which measures the average "impact" of articles published in a given journal, was used by scholars to decide to which journal(s) to submit their research papers - the idea being that the higher the impact factor of the journal, the more exposure (and prestige) your research would have.
However, over time, impact factors came to be used as a quantitative (true!) and objective (more dubious) shortcut way to measure the quality of the research published in those journals. The theory here was that high impact factor journals receive more article submissions, and therefore can (and must) be more selective in what they accept. Therefore, if your paper is published in Nature it must be higher-quality research than if it were published in the (hypothetical) Antarctican Journal of Penguin Fat Chemistry.
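As a concrete illustration, the classic two-year impact factor for a journal in year Y is the number of citations received in Y by items the journal published in the two preceding years, divided by the number of citable items it published in those years. The numbers below are invented for a hypothetical journal:

```python
# Hypothetical illustration of the classic two-year journal impact factor.
# IF(Y) = citations in year Y to items published in years Y-1 and Y-2,
#         divided by the number of citable items published in Y-1 and Y-2.

def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Return the two-year impact factor, rounded to three decimals."""
    return round(citations_to_prior_two_years / citable_items_prior_two_years, 3)

# Invented numbers: 2,450 citations in 2020 to the journal's 2018-2019
# papers; 610 citable items published in 2018-2019.
print(impact_factor(2450, 610))  # → 4.016
```

Note that the result depends entirely on what counts as a "citable item", which is one reason the metric is contested.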
Who wants to use such shortcuts? Anyone who has to evaluate the quality of a scientist's work and doesn't have the time or the expertise to do so by reading and evaluating the actual papers themselves. This can include:
Tenure and promotion committees
Immigration agencies (who must decide whether to issue work permits to visiting scientists).
Since almost every scientist is bound to encounter at least one of these situations in his/her career, impact factors and other metrics are important to them as well.
Why are impact factors, etc. imperfect tools for evaluating scholarly output?
First, most classic metrics do not directly evaluate the impact of the individual researcher - only that of the journals in which they publish.
Second, metrics based on citation have an inherent "time lag". A work of scholarship can begin to have impact in its field months, or even years, before any works citing it are published.
Third, traditional citation metrics ignore other types of impact that scholarship can have, especially in the social/public arenas.
Fourth, they do not track the impact of non-traditional publications, e.g. slideshows, software, videos, podcasts, etc. See, for example, the list of publication types monitored by Plum Analytics.
Starting around 2011, a number of groups began collecting and tabulating other kinds of statistics to measure scholarly impact. These alternative metrics are now generally known as "altmetrics".
At present, there are no standards for how to measure altmetrics, though NISO (the National Information Standards Organization) is developing them. See, for example, the Draft Standard for the Altmetrics Data Quality Code of Conduct at the NISO Alternative Metrics Initiative page (http://www.niso.org/topics/tl/altmetrics_initiative/). Each organization that measures altmetrics uses its own set of values and analyzes them in its own way.
Altmetric.com is a small, private, for-profit company. It is now associated with Digital Science (http://www.digital-science.com/)
Altmetric.com data is used by several major publishers, including Wiley and Taylor and Francis.
They generate altmetric data for journals, articles, institutions (universities, departments, research groups) and individuals (who work for institutions with an institutional contract). They also provide some free data for individual researchers: see Altmetric for Researchers (https://www.altmetric.com/audience/researchers/)
Draws data from news, blogs, tweets, social media, social networking, and (recently) government policy documents. They have recently begun a collaboration with the publishers of Web of Science for additional citation data.
Provides a free "bookmarklet" (https://www.altmetric.com/products/free-tools/), which works with your web browser to provide Altmetric.com data on individual articles. Note: this does not work with all publishers' websites.
Provides a variety of "badges" which can be inserted into articles, curricula vitae, and so forth to enhance them with altmetric information. The badges are available free to institutional repositories and to individual researchers (https://www.altmetric.com/products/free-tools/free-badges-for-researchers/).
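Altmetric.com also exposes its article-level data through a free, rate-limited public API keyed by DOI. The endpoint below reflects its v1 REST interface as I understand it, so treat it as an assumption and check Altmetric's current API documentation before relying on it:

```python
# Sketch of querying Altmetric.com's free public API for a single DOI.
# The endpoint and response fields are assumptions based on the v1 REST
# interface; consult Altmetric's current API documentation before use.

def altmetric_url(doi):
    """Build the (assumed) Altmetric v1 API URL for a given DOI."""
    return "https://api.altmetric.com/v1/doi/" + doi

# Example with an arbitrary DOI:
url = altmetric_url("10.1038/nature12373")
print(url)

# To actually fetch the data (requires network access):
#   import urllib.request, json
#   with urllib.request.urlopen(url) as resp:
#       data = json.load(resp)   # e.g. data.get("score") for the
#                                # Altmetric attention score
```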
ImpactStory is a not-for-profit company, funded in part by the National Science Foundation and Sloan Foundation, and in part by subscription fees from users.
ImpactStory lets individual researchers create profiles - a sort of online curriculum vitae - which can link to all sorts of online publications (articles at a publisher's site or repository, slideshows, figures, software, etc.) and connect them to altmetric data. They can also provide analyses of the data, such as geographical information on where users/viewers of your work are located.
Currently, ImpactStory is offering free accounts, using a Twitter login.
For an example of an ImpactStory profile, see this one for University of Florida professor Ethan White (https://impactstory.org/u/0000-0001-6728-7745).
Note that ImpactStory can import profile data from ORCID (see below), SlideShare and other sources, so you don't have to rekey your information.
ImpactStory is now part of a larger organization, Our Research (https://our-research.org/), which engages in a number of research-related projects, including a database of open access articles, an open access altmetric data source called Paperbuzz, and more.
PlumX (https://plu.mx/) is Plum Analytics' "dashboard" for displaying altmetric data.
How can I enhance my altmetrics?
Make sure that your work is accessible and discoverable.
Publish articles as open access and/or deposit copies of them in institutional repositories, such as UC's e-Scholarship (http://escholarship.org/) or the equivalent at whatever institutions you may be connected to. Note that many funding agencies (e.g., NIH) and institutions (e.g., the UC Faculty Senate) require (mandate) open access deposition.
Similarly, deposit your data in open data repositories, such as UC's Merritt (https://merritt.cdlib.org/), so your data can be discovered, downloaded, and cited. Note that the NSF and many other funding agencies now require a data management plan from those applying for grants.
Other types of materials may be deposited in other venues: slide presentations in SlideShare (http://www.slideshare.net/); figures and posters in FigShare (http://figshare.com/); open source software you've created in SourceForge (http://sourceforge.net/) or GitHub (https://github.com/).
Create publicly accessible scholarly materials of other types: videos, podcasts, blogs.
Become active in social networking - both to promote your own publications, and to share, recommend, or comment on the publications of others that you think are worthwhile or important. Commenting on others' works raises your own presence on the Web, and leads others to take a look at your material.
Social media: Facebook, Twitter, LinkedIn
Social bookmarking and bibliographies: CiteULike, Delicious, Mendeley, etc.
Make your chemical publications more discoverable via search engines by incorporating InChI identifiers (see http://www.iupac.org/home/publications/e-resources/inchi.html) for the chemical substances in your papers.
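A standard InChI encodes a structure as slash-separated "layers", and the layer immediately after the version prefix is the molecular formula, which is one reason search engines can index it usefully. As a small sketch (using ethanol's standard InChI as the example), the formula layer can be pulled out like this:

```python
# A standard InChI string is made of slash-separated "layers"; the layer
# right after the "InChI=1S" version prefix is the molecular formula.

def inchi_formula(inchi):
    """Return the molecular-formula layer of a standard InChI string."""
    if not inchi.startswith("InChI="):
        raise ValueError("not an InChI string")
    return inchi.split("/")[1]

# Ethanol's standard InChI:
ethanol = "InChI=1S/C2H6O/c1-2-3/h3H,1-2H3"
print(inchi_formula(ethanol))  # → C2H6O
```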
Perhaps most importantly, make sure that all your publications are linked to you by creating an ORCID profile (see below.)
ORCID - the ID number for authors (http://orcid.org/)
To quote their website, "ORCID is an open, non-profit, community-driven effort to create and maintain a registry of unique researcher identifiers and a transparent method of linking research activities and outputs to these identifiers."
Especially if you have a non-distinctive name, it is important to have a way for others to uniquely identify your scholarly publications. ORCID provides such a means. Similarly, if you have published under multiple names (nicknames, aliases, name changes), it's important to be able to bring all your publications together so you can get proper credit.
Creating a profile is free. You can bring in lists of your publications in a variety of ways, such as keying them in directly or importing them. ResearcherID, for example (see Lecture 9, part 1), can export profiles to ORCID and import them from ORCID. ResearcherID, in turn, can import and export to and from an EndNote online database.
For examples of ORCID profiles, see Antony Williams' profile (linked from his webpage above) or Chuck Huber's profile (http://orcid.org/0000-0003-0205-2261)
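Incidentally, an ORCID iD is not an arbitrary 16-character string: per ORCID's documentation, the final character is a check digit computed with the ISO 7064 MOD 11-2 algorithm. As a sketch, the iDs in the example profiles above can be validated like this:

```python
# An ORCID iD has 16 characters; the last is a check digit computed with
# the ISO 7064 MOD 11-2 algorithm (per ORCID's documentation).

def orcid_check_digit(base_digits):
    """Compute the MOD 11-2 check character for the first 15 digits."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid):
    """Validate the format and check digit of an iD like 0000-0003-0205-2261."""
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    return orcid_check_digit(digits[:15]) == digits[15]

print(is_valid_orcid("0000-0003-0205-2261"))  # → True
print(is_valid_orcid("0000-0001-6728-7745"))  # → True
```

This is also why a single-digit typo in an ORCID is almost always detectable.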
Some databases (e.g. Web of Science) allow you to search by ORCID number when doing author searches. This trend is likely to increase.
ImpactStory can import your ORCID profile in order to know what publications to track for altmetric purposes.
Some organizations are beginning to require their researchers to create ORCIDs. e-Scholarship, for example, will require an ORCID for deposition of papers.