PlumX Metrics
Research metrics (also known as altmetrics) immediately measure awareness and interest, and give us new ways to uncover and tell the stories of research.
PlumX Metrics help answer questions about research and tell those stories.
Technologies that encourage communication, sharing, and other interaction with research output leave “footprints” that show the way back to who is interested in the research and why. Technologies that make processing big data possible allow all of the metric data from those many interactions to be categorized and analyzed.
About PlumX Metrics
PlumX Metrics provide insights into the ways people interact with individual pieces of research output (articles, conference proceedings, book chapters, and many more) in the online environment. Examples include research being mentioned in the news or included in a policy citation. Collectively known as PlumX Metrics, these metrics are divided into five categories to help make sense of the huge amounts of data involved and to enable analysis by comparing like with like.
PlumX gathers appropriate research metrics for all types of scholarly research output and sorts them into five categories: Citations, Usage, Captures, Mentions, and Social Media:
Category | Description |
---|---|
Citations | A category that contains both traditional citation indexes, such as Scopus, and citations that help indicate societal impact, such as clinical or policy citations. Examples: citation indexes, patent citations, clinical citations, policy citations |
Usage | A way to signal whether anyone is reading the articles or otherwise using the research. Usage is the number one statistic researchers want to know after citations. Examples: clicks, downloads, views, library holdings, video plays |
Captures | Indicates that someone wants to come back to the work. Captures can be a leading indicator of future citations. Examples: bookmarks, code forks, favorites, readers, watchers |
Mentions | Measurement of activities such as news articles or blog posts about research. Mentions are a way to tell that people are truly engaging with the research. Examples: blog posts, comments, reviews, Wikipedia references, news media |
Social Media | Includes the shares, likes, and comments that reference the research. Social media can help measure “buzz” and attention, and can also be a good measure of how well a particular piece of research has been promoted. Examples: shares, likes, comments |
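To make this grouping concrete, the short sketch below shows one way that per-metric counts for a single artifact could be rolled up into the five category totals. It is purely illustrative and assumes a hypothetical metric-to-category mapping and made-up counts; it is not the PlumX API or data model.

```python
from collections import defaultdict

# Hypothetical mapping of individual metrics to the five PlumX categories,
# based on the examples in the table above (not an official PlumX schema).
CATEGORY_OF_METRIC = {
    "Citation Indexes": "Citations",
    "Policy Citations": "Citations",
    "Abstract Views": "Usage",
    "Downloads": "Usage",
    "Readers": "Captures",
    "News Mentions": "Mentions",
    "Shares, Likes & Comments": "Social Media",
}

def roll_up(counts_by_metric):
    """Sum per-metric counts into per-category totals for one artifact."""
    totals = defaultdict(int)
    for metric, count in counts_by_metric.items():
        category = CATEGORY_OF_METRIC.get(metric)
        if category is not None:
            totals[category] += count
    return dict(totals)

# Illustrative, made-up counts for a single artifact:
print(roll_up({"Citation Indexes": 12, "Downloads": 340, "Readers": 57}))
# -> {'Citations': 12, 'Usage': 340, 'Captures': 57}
```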
Citation metrics
Citation counts in PlumX are measures of how many times your research has been cited by others. Including citation counts alongside the other modern metrics categories allows for side-by-side analysis.
The following are the sources of citation counts that are currently in PlumX.
Metric | Source(s) | Description |
---|---|---|
Citation Indexes | Chinese Science Citation Database | The number of Chinese Science Citation Database (CSCD) works that cite the artifact |
Citation Indexes | CrossRef | The number of articles that cite the artifact according to CrossRef |
Citation Indexes | PubMed Central | The number of PubMed Central articles that cite the artifact |
Citation Indexes | PubMed Central Europe | The number of PubMed Central Europe articles that cite the artifact |
Citation Indexes | SciELO | The number of SciELO articles that cite the artifact |
Citation Indexes | Scopus | The number of articles that cite the artifact according to Scopus |
Citation Indexes | SSRN | The number of SSRN works that cite the artifact |
Patent Citations | USPTO | The number of patents that reference the artifact according to the United States Patent and Trademark Office |
Patent Family Citations | EPO, IPO, JPO, USPTO, WIPO | The number of patent families that reference the artifact according to the European Patent Office (EPO), World Intellectual Property Organization (WIPO), Intellectual Property Office of the United Kingdom (IPO), United States Patent and Trademark Office (USPTO) and Japan Patent Office (JPO) |
Clinical Citations | PubMed | The number of Clinical Guidelines from PubMed that reference the artifact |
Policy Citations | Policy document source lists curated by PlumX and Overton.io | The number of policy documents that reference an artifact |
Usage metrics
Article-level usage metrics are the number one statistic that researchers want to know after their citation counts.
Is anyone reading our work?
Did anyone watch our videos?
PlumX is unique in combining artifact-level Usage data with other artifact-level metrics. Below is a listing of the current Usage metrics that PlumX supports, and the providers of the data.
Metric | Source(s) | Description |
---|---|---|
Abstract Views | Airiti iRead eBooks, Digital Commons, SciELO, SSRN | The number of times the abstract of an artifact has been viewed |
Downloads | Airiti iRead eBooks, Airiti Library, Digital Commons, Institutional Repositories, Mendeley Data, Pure (for select customers only), RePEc, SSRN | The number of times an artifact has been downloaded |
Full Text Views | Airiti iRead eBooks, PubMed Central (for PLOS articles only), SciELO | The number of times the full text of an article has been viewed |
Views | Mendeley Data | The number of times the artifact has been viewed |
Capture metrics
Captures track when end users bookmark, favorite, become a reader, become a watcher, etc. Captures indicate that someone wants to come back to the work, and they are important because they are an early, leading indicator of future citations.
Below is a table of the metrics sources that PlumX uses for capture metrics.
Metric | Source(s) | Description |
---|---|---|
Readers | Mendeley, SSRN | The number of people who have added the artifact to their library/briefcase |
Exports/Saves | | The number of times an artifact’s citation has been exported directly to bibliographic management tools or as file downloads, plus the number of times an artifact’s citation/abstract and HTML full text (if available) have been saved, emailed or printed |
Mention metrics
Mentions are the blog posts, comments, reviews, and Wikipedia links about your research. This category measures when people are truly engaging with your research. Mentions are where the stories of how people are interacting with research can be discovered. The PlumX platform automatically uncovers mentions.
Below is a listing of the sources of mentions that PlumX monitors.
Metric | Source(s) | Description |
---|---|---|
Blog Mentions | Blog lists curated by PlumX | The number of blog posts written about the artifact |
News Mentions | News source lists from Lexis-Nexis Metabase and curated by PlumX | The number of news articles written about the artifact |
References | Wikipedia (English, Chinese, Dutch, Egyptian Arabic, French, German, Italian, Japanese, Persian, Polish, Portuguese, Russian, Spanish, Swedish, Turkish, Ukrainian) | The number of references to the artifact found in Wikipedia |
Social media metrics
Social media metrics are the +1s, likes, and shares about research. By tracking social media metrics, you can see how well a researcher is promoting their work. This is especially important for early-career researchers, who can use it to measure and understand who is interacting with their work. Of course, social media also allows us to track the buzz and attention surrounding research.
The following table lists the sources that PlumX tracks for Social Media.
Metric | Source(s) | Description |
---|---|---|
Shares, Likes & Comments | | The number of times a link was shared, liked or commented on |
Ratings | | The average user rating of the artifact |