From Bibliometrics to Altmetrics
Bibliometrics is the statistical analysis of books, journals, scientific articles, and authors. The term encompasses word frequency analysis, citation analysis, and counting the number of articles per author, amongst other forms of statistical analysis (Karanatsiou, Misirlis, & Vlachopoulou, 2017, p. 16).
Scientometrics is the study of science and technology with a focus on the interaction between scientometric theories and scientific communication (Mingers & Leydesdorff, 2015; Hood & Wilson, 2001). It is also the study of bibliographies, as well as the evaluation of scientific research and information systems (Van Raan, 1997).
Additionally, Tague-Sutcliffe (1992) explains the concept of informetrics as the study of the quantitative aspects of information in any form, not just records or bibliographies, and in any social group, not only scientists. The term informetrics was coined by Nacke in 1979, but it only became an accepted category within bibliometrics and scientometrics in 1984, covering both the basic definitions of these metrics and the characteristics of retrieval performance measures (Hood & Wilson, 2001; Brookes, 1990).
In 1999, with the advent of Web 2.0 and its effects on scientific discussions, webometrics was introduced to analyse linking, web citation, and search engine evaluation (Bar-Ilan, 2008; Thelwall, 2008). Online data is dynamic and can be regarded as a huge bibliographic database of sites that work like scientific journals and from which web citations can be extracted. By applying data mining techniques to user-generated content on the internet, researchers can extract information on the impact of such data on scientists and researchers (Thelwall, 2008). Studies show that web citations are closely related to the Web of Science (WoS) citation count. Web citations can be collected from online conference proceedings, science blogs, and similar platforms (Thelwall & Kousha, 2015).
The quick growth of Web 2.0, the considerable use of social media, the rapidly increasing availability of online literature, and online scholarship tools all contributed to the rise of scholarly communication online (Liu & Adie, 2013). Alternative metrics and measurements therefore developed as a supplement to scientometrics and webometrics, in order to evaluate the influence of online practices on science; this kind of study became known as altmetrics (Priem, Groth, & Taraborelli, 2012). Altmetrics can be used as filters, which “reflect the broad, rapid impact of scholarship in this burgeoning ecosystem” (Priem et al., 2010). These terms are similar and often combined. As with “bibliometrics” or “scientometrics”, the term altmetrics has been used to refer both to the field of study and to the metrics or collected statistics themselves. Since altmetrics is concerned with measuring scholarly activity, it is a subset of scientometrics, and because it measures activities on the Web, it can also be considered a subcategory of webometrics. Altmetrics focuses more narrowly on online tools and environments rather than on the Web as a whole (Priem, 2015). The picture below illustrates how the terms are linked.
There are a variety of ways to access altmetrics data. The Scopus database has a web application from altmetric.com that displays the altmetric value of an article if available, and publishers such as PLOS One and Elsevier have also integrated usage statistics, including, for example, HTML views and PDF downloads. Aggregators collect metric data and information from multiple online sources, and how they collect and present data varies. This section provides information about the largest tools currently used in this field: Altmetric.com, Impactstory, and Plum Analytics.
Altmetric.com (www.altmetric.com)
Altmetric.com is a London-based company that traces and analyses the altmetrics activity of scientific articles. The company markets itself to institutions, publishers, and researchers. It offers three products: the Altmetric Bookmarklet, Altmetric Bookmarklet Integrations, and Altmetric Explorer for Institutions (Roemer & Borchardt, 2015).
The Altmetric Bookmarklet is compatible with all major browsers other than Internet Explorer (IE) and Microsoft Edge, and provides the user with altmetrics scores as well as essential and detailed information from different platforms. Moreover, information about the people who tweeted or used these materials can be viewed and used for further analysis.
In addition to the Altmetric Bookmarklet functionalities, Altmetric Bookmarklet Integration can be combined with individual journal articles on Scopus, in institutional repositories such as DSpace, and on journal articles published by businesses such as SAGE, HighWire, and the Nature Publishing Group (ibid.).
Altmetric Explorer for Institutions provides summaries of data at higher levels of evaluation. The service allows an individual to view altmetrics data for many journal articles, grouped by author or by source (journal). Apart from small variations in the interface, which aim to address different target groups, both offer valuable analysis across a wide range of altmetrics comparisons (ibid., p. 15). Altmetric.com weights its data sources in its analysis: news items are weighted more heavily than blogs, and blogs are more highly regarded than tweets. The algorithm also takes into account how authoritative the authors are. Results are presented visually with a donut that shows the proportional distribution of mentions by source type, with each source type displayed in a different color: blue for Twitter, yellow for blogs, and red for mainstream media sources (Baykoucheva, 2015).
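The source-weighting idea described above can be sketched in a few lines of code. The weights below are assumptions chosen for illustration only; Altmetric.com's actual algorithm is more elaborate and, as noted, also factors in how authoritative the mentioning authors are.

```python
# Illustrative sketch of a source-weighted attention score, in the spirit of
# Altmetric.com's ranking: news counts more than blogs, blogs more than tweets.
# These weights are assumptions for illustration, not Altmetric.com's real values.
SOURCE_WEIGHTS = {"news": 8.0, "blog": 5.0, "tweet": 1.0}

def weighted_score(mentions):
    """Sum mention counts per source type, scaled by the assumed weights."""
    return sum(SOURCE_WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

# Example: 2 news stories, 1 blog post, and 10 tweets.
print(weighted_score({"news": 2, "blog": 1, "tweet": 10}))  # 2*8 + 1*5 + 10*1 = 31.0
```

The effect of such weighting is that a single news story moves the score far more than a single tweet, which matches the ranking behaviour described above.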
However, a digital object identifier (DOI) is required to retrieve data from Altmetric.com. The lack of a DOI is a problem for older articles, even for some articles written after the invention of the internet. To view an individual researcher’s metric data, the researcher’s institution must subscribe. Illustration two on the next page shows an example of how an article’s metric score may look.
Illustration 2: Altmetric information from Altmetric.com about article: Impact of climate change on the future of biodiversity (Bellard, et al., 2012). [Accessed 2018-05-19]
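Because retrieval from Altmetric.com is keyed by DOI, a minimal programmatic lookup can be sketched as follows. The endpoint path and the `score` field reflect Altmetric.com's public details API as documented at the time of writing, and the example DOI is a placeholder, not a real article.

```python
import json
import urllib.error
import urllib.request

# Altmetric.com's public "details page" API: one record per DOI.
API_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi):
    """Build the lookup URL; the DOI is the only key this endpoint accepts."""
    return API_BASE + doi

def fetch_score(doi):
    """Fetch the record for a DOI and return its Altmetric score.

    Returns None when no record exists (the API answers 404), which is
    exactly the limitation described above for articles lacking a DOI record.
    """
    try:
        with urllib.request.urlopen(altmetric_url(doi)) as resp:
            record = json.loads(resp.read().decode("utf-8"))
        return record.get("score")
    except urllib.error.HTTPError:
        return None

# Placeholder DOI for illustration only.
print(altmetric_url("10.1234/example-doi"))
```

This sketch illustrates both the convenience and the constraint of the service: one request per DOI, and no DOI means no data.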
ImpactStory (www.impactstory.org)
ImpactStory was founded by Heather Piwowar and Jason Priem as a non-governmental organization (NGO) with the goal of helping researchers examine the impact of their research. Its research profiles are based on altmetric sources such as Altmetric.com, arXiv, Scopus, and Wikipedia. ImpactStory collects a variety of metrics with the goal of collecting and sharing impact data from all research projects by all researchers through open data. Even though it is a non-profit organization with entirely open data, there is a fee for creating a profile on ImpactStory (Roemer & Borchardt, 2015). The company offers a free one-month trial. The online service tracks journal articles, preprints, datasets, presentation slides, research code, and other research outputs. ImpactStory is known to aggregate data from Mendeley, GitHub, and Twitter, but the company does not disclose all its sources (Baykoucheva, 2015).
ImpactStory’s uniqueness lies in its analysis and ability to display the impact of one’s research in an easily understandable format, called an “impact story”. Users may need to import their items into ImpactStory, which in turn automatically gathers impact statistics from Scopus, Mendeley, Google Scholar, Slideshare, ORCID, and Pubmed Central. However, ImpactStory is not synchronized with the systems mentioned above and cannot automatically update its content. This application is an excellent tool for scholars who want to trace the impact of their web-native scholarship (Yang & Li, 2015, p. 233).
PlumX (www.plu.mx)
EBSCO owns Plum Analytics, which, like ImpactStory, collects metric data and analyses it. Its product aimed at researchers is a subscription system called PlumX. Researchers create a profile where they can categorize, visualize, and analyse research results and impact. Plum Analytics collects data from an ever-expanding list of vendors such as EBSCO, PLOS One, Facebook, Twitter, WorldCat, YouTube, Scopus, PubMed, Wikipedia, Mendeley, and Amazon, to name a few. The company divides metric data into five categories: usage (clicks, downloads, library loans); captures (bookmarks, saved favourites); mentions (blog posts, comments, Wikipedia links); social media (likes, shares, tweets); and citations (Scopus, PubMed). Plum Analytics labels all of these downloads, blog posts, library loans, and so on as “artifacts”; an artifact is any research product available online (Roemer & Borchardt, 2015).
What are the advantages of altmetrics?
Supporters of this new approach to measuring the impact of research believe that altmetrics has many advantages compared to conventional bibliometric methods (Hammarfelt, 2014). The following list of the benefits of altmetrics is based on a categorization of the benefits mentioned in the literature by Wouters and Costas (2012). These authors recognized four benefits of altmetrics as compared to traditional metrics:
1. Broadness: the measurement encompasses impact beyond the academic scientific community;
2. Speed: altmetrics measure impact soon after publication of the paper;
3. Diversity: altmetrics cover non-paper material;
4. Openness: altmetrics data is more accessible than traditional bibliometric data (Bornmann, 2014a, p. 898).
Most summaries of the benefits of altmetrics emphasize their potential for measuring the broader impact of research (Priem, Parra, Piwowar, & Waagmeester, 2012; Weller, Dröge, & Puschmann, 2011) with the hope that this more encompassing approach will result in a greater understanding of outside interest in and use of academic materials (Fausto et al., 2012). In contrast to a reliance on citations, altmetrics offers an opportunity to measure the engagement of a larger group outside the academic world (Adie, 2014; Hammarfelt, 2014). Furthermore, the breadth of altmetrics could support more holistic evaluation efforts; a range of altmetrics may help to solve the reliability problems of individual measures by triangulating scores from easily-accessible “converging partial indicators” (Priem, 2015, p. 274).
Citation counts do provide a reliable and valid measurement, but can only be obtained several years after an article’s publication (Wang, 2013). In comparison, altmetrics provides impact data within a very short time after publication (Haustein et al., 2014). Many social web tools offer real-time access to structured altmetric data via application programming interfaces (APIs) (Priem & Hemminger, 2010), with which the impact of a paper can be tracked at any time after publication. Consequently, the use of altmetric methods could be a practical solution in fields where publication, and thus citation, processes are slow (Hammarfelt, 2014).
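The speed advantage described above can be illustrated with a simple polling sketch. Here `fetch_mention_count` is a hypothetical stand-in for any API call returning the current mention count for a DOI; no real service is assumed.

```python
import time

def track_attention(fetch_mention_count, doi, polls=3, interval_s=0.0):
    """Poll an altmetrics source and record how mentions accumulate over time.

    `fetch_mention_count` is any callable returning the current mention
    count for a DOI; in practice it would wrap an API request.
    """
    history = []
    for _ in range(polls):
        history.append(fetch_mention_count(doi))
        time.sleep(interval_s)
    # Per-interval growth: with altmetrics this becomes visible within days
    # of publication, rather than the years citation counts require.
    deltas = [after - before for before, after in zip(history, history[1:])]
    return history, deltas

# Simulated source: mention counts grow 5 -> 8 -> 12 across three polls.
fake_counts = iter([5, 8, 12])
history, deltas = track_attention(lambda doi: next(fake_counts), "10.1234/example")
print(history, deltas)  # [5, 8, 12] [3, 4]
```

In a real setting the polling interval would be hours or days, and the deltas would show how quickly attention accrues after publication.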
Altmetrics are not simply another set of data; rather, they can provide a basis for evaluating the importance of scientific artifacts beyond text publications, e.g. databases or statistical analyses (Bornmann, 2014a, p. 898).
The current demand for this broader approach shows that other forms of scholarly products now play a crucial role in research evaluation (Piwowar, 2013; Rousseau & Ye, 2013). Altmetrics therefore provides an opportunity to measure the impact of these research products both within science (Priem, 2015) and beyond it (Galloway et al., 2013). As well as determining the impact of various kinds of scientific material, altmetrics can also be used to trace a diversity of scholarly activities, such as teaching and service activities (Rodgers & Barbrow, 2013). Accordingly, fields with diverse national and international scholars and a large public audience should benefit from an approach that takes various publication channels into account (Hammarfelt, 2014).
Altmetrics provide a fascinating opportunity for measuring societal impact beyond the confines of a case study, given free access to data through Web APIs, which provide quick feedback about a large publication set (Galloway et al., 2013). In this context, data collection is rarely troublesome (Thelwall, Haustein, Larivière, & Sugimoto, 2013). In addition, altmetrics data is currently based on platforms with clearly defined boundaries and data types, as is the case with Twitter or Mendeley (Priem, 2015), which makes the analysis of data and the interpretation of results easier. This accessibility of information allows researchers to see their own impact statistics as well as the data for other publications. As Wouters and Costas (2012) observe, however, many altmetrics analysis services are not open to the public due to the secretive practices of large companies such as Twitter and Mendeley.
In light of the above, altmetrics has recently received more attention in studies evaluating the impact of research publications. Proponents of this new method note that altmetrics has many advantages (as explained above) over bibliometric methods for measuring research impact. Moreover, there are several ways to access altmetric data, and this study has presented the three most significant tools currently used in the field. It can therefore be concluded that altmetrics offers great potential for research, which is why the current study also uses it as a basis. Among these tools, Altmetric.com is the most suitable for blogs, news outlets, tweets, and Mendeley readers. It was also chosen because the information provided by this service not only meets all the requirements of this study but is also accessible.
Table of contents:
1.1 WHAT ARE ALTMETRICS?
1.2 RESEARCH QUESTIONS AND LIMITATIONS
1.2.1 RESEARCH QUESTIONS
1.2.3 OUTLINE OF THE THESIS
2.1 FROM BIBLIOMETRICS TO ALTMETRICS
2.2 ALTMETRICS TOOLS
2.3 WHAT ARE THE ADVANTAGES OF ALTMETRICS?
3 LITERATURE REVIEW
3.1 LIBRARY AND INFORMATION SCIENCE: TOWARDS A DEFINITION OF THE FIELD, A DEBATE
3.2 THE DEVELOPMENT OF LIS
3.3 ALTMETRICS IN DIFFERENT DISCIPLINES
4.1 INTRODUCTION TO THE THEORETICAL PERSPECTIVES
4.2 CITATION THEORIES
4.2.1 THE NORMATIVE THEORY OF MERTON´S NORMS
4.2.2 SOCIAL CONSTRUCTIVIST THEORY
4.2.3 SOCIAL CAPITAL
4.2.4 ATTENTION ECONOMICS
4.2.5 IMPRESSION MANAGEMENT
4.3 APPLICATION OF THEORIES TO ALTMETRICS
5.1 METHOD OF CHOICE
5.2 DATA COLLECTION
5.3 DATA ANALYSIS
5.4 ETHICAL ISSUES
5.5 RELIABILITY AND VALIDITY
6.1 ALTMETRICS COVERAGE OF ARTICLES
6.2 THE DEMOGRAPHIC INFORMATION OF PEOPLE WHO SHARE AND READ LIS ARTICLES
6.2.1 WHO READS LIS ARTICLES IN SWEDEN?
6.2.2 WHO SHARES LIS ARTICLES IN SWEDEN?
6.3 WHAT TOPICS ARE MOST OFTEN MENTIONED
7 DISCUSSION AND CONCLUSION
7.1 ALTMETRIC SCORE OF LIS PAPERS
7.2 DEMOGRAPHIC INFORMATION OF WHO READS AND SHARES PAPERS
7.2.1 WHO READS LIS PAPERS IN SWEDEN
7.2.2 WHO SHARES LIS PAPERS IN SWEDEN?
7.3 WHICH TOPICS GOT THE MOST ATTENTION IN SWEDEN?
7.5 SUGGESTIONS FOR FUTURE RESEARCH
8.2 METHODOLOGY AND THEORY
8.3 RESULTS AND CONCLUSIONS