Blog


09/15/2023
Liselotte Brandstrup

Written by Lene Janussen Gry

Are you part of the academic community on Twitter (now X)? Recent changes to the platform are affecting the academic social media arena and altmetrics tools.

Elon Musk’s takeover of the social media platform Twitter (now X) has brought in changes that affect the academic community, both in terms of where academic news and publications are shared and discussed, and now also in terms of altmetrics.

ALTMETRICS: ‘Alternative article-level metrics’ are metrics based on a research article’s mentions as harvested from social media platforms, news media, and policy documents, Twitter having been one of the main providers. The main altmetrics tools are altmetric.com (by Digital Science) and PlumX Metrics (by Elsevier).
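To give a sense of where these numbers come from, here is a minimal sketch of fetching mention counts for a single article via Altmetric's free Details Page API. The DOI is a placeholder, and the response field names are assumptions to verify against Altmetric's own documentation:

```python
import json
import urllib.error
import urllib.request

# Minimal sketch: look up altmetric mention counts for one DOI.
# The DOI below is a placeholder; swap in a real one to try it.
doi = "10.1000/example"
url = f"https://api.altmetric.com/v1/doi/{doi}"

try:
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    # Field names assumed from Altmetric's public docs; inspect the
    # raw JSON to confirm which counts are present for your article.
    print("Altmetric score:", data.get("score"))
    print("Tweets:", data.get("cited_by_tweeters_count"))
    print("News stories:", data.get("cited_by_msm_count"))
except urllib.error.HTTPError as err:
    # The API answers 404 for articles it has not tracked.
    print("No altmetric data found (HTTP status", err.code, ")")
```

Note that free access to endpoints like this is typically rate-limited and subject to the provider's terms of use.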

Elsevier recently announced that its altmetrics tool PlumX would discontinue all Twitter (X) metrics as of August 31st 2023, the stated reason being: “changes in market conditions” (plumanalytics.com, 31 August 2023).

Altmetric.com has announced that they are “observing ongoing developments with the platform closely”. They are still tracking tweets as usual but are “working on a number of scenarios as to how we might adjust to any developments that may occur in the future” (altmetric.com, 8 June 2023).

A large number of academic Twitter users have already fled the platform or are planning or considering doing so due to dissatisfaction with recent changes to the platform, the most problematic of which are discussed here: Social media has changed – Will academics catch up?

Some have moved on to alternative platforms, including Mastodon, Threads, Spoutible, etc., and many are, of course, still hanging on to Twitter (X), perhaps simultaneously maintaining a presence on other platforms, to be able to keep up with their peers. So far, former Twitter users have sought out different alternatives – and there is no consensus on where the “new Twitter” is.

Elsevier has not made it clear whether they will harvest mentions from other social media platforms instead, and altmetric.com are still monitoring developments, as mentioned.

The CBS Library newsletter will keep you posted on future changes. If you have questions or comments on this matter, please contact Lene Janussen Gry.

09/01/2021
Liselotte Brandstrup

Illustration: https://bit.ly/3zfSwyf

Written by Lene Hald

You may be familiar with ‘JIF’, the Journal Impact Factor used in Web of Science and published by Clarivate. This metric is often used to compare the quality of journals and is based on the average number of citations received by articles in a journal in a 2-year window (find out more about JIF right here).

JIF is not field-normalized, which means that different citation patterns in different scientific fields are not taken into account. This makes it less than optimal if you want to compare JIFs across scientific disciplines. Enter Journal Citation Indicator, the latest toolbox addition from Clarivate.

Journal Citation Indicator (JCI) is a field-normalized metric that can help you measure the citation impact of journals across disciplines and as such supplements JIF. The metric is based on the average number of citations that articles from a journal receive in a 3-year window, with each article’s citation count field-normalized (check out Clarivate’s discussion paper “Introducing the Journal Citation Indicator” to find out more about the computation of JCI).
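To make the idea of field normalization concrete, here is a simplified sketch. This is not Clarivate's actual algorithm, which also normalizes by publication year and document type, and the baseline figures are invented for illustration:

```python
from statistics import mean

# Invented world-average citation baselines per field (illustration only).
field_baseline = {"economics": 4.0, "cell biology": 20.0}

# Papers from one hypothetical journal, with raw citation counts.
journal_papers = [
    {"field": "economics", "citations": 6},
    {"field": "economics", "citations": 2},
    {"field": "economics", "citations": 8},
]

# Each paper's count is divided by its field's expected count, then
# averaged: 1.0 means world average, 2.0 means twice the average.
normalized = [p["citations"] / field_baseline[p["field"]] for p in journal_papers]
print(f"Field-normalized journal score: {mean(normalized):.2f}")  # 1.33
```

Because each paper is scored against its own field's baseline, a score of 1.3 in economics and a score of 1.3 in cell biology mean roughly the same thing, which is what makes cross-discipline comparison reasonable.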

Careful judgement required
In the blog post Journal Citation Indicator. Just Another Tool in Clarivate’s Metrics Toolbox? on The Scholarly Kitchen, JCI is criticized for the way it aims to achieve field normalization, because in the process it risks distorting the impression of how multi-field journals perform.

As is the case with all bibliometric indicators, you should use JCI with caution, and as Clarivate itself puts it: “The normalization steps make it more reasonable to compare journals across disciplines, but careful judgement is still required” (Introducing the Journal Citation Indicator, p. 5).

Find a journal's JCI
To find the JCI for a journal, simply access the Journal Citation Reports and enter the journal name.

If you have any questions about the Journal Citation Indicator or about bibliometrics in general, please contact the CBS Library bibliometrics team at: metrics.lib@cbs.dk

03/15/2021
Liselotte Brandstrup

By Claus Rosenkrantz Hansen & Lene Hald

The Danish National Research Database (DNRD) has served as a joint discovery service for the local research databases of Danish research institutions and has been the place to go for an overview of Danish research results. DNRD was discontinued in January 2021, and while a permanent replacement has not yet emerged, one is under development.

As a temporary alternative, DNRD refers to NORA, a prototype of a national Open Research Analytics platform.
 

NORA provides an overview of Danish research results and can also be used as a research analytics platform. Just to give you an example: in NORA, you will find a summary of Danish research in relation to the Sustainable Development Goals. This information can be narrowed down to a specific university:

Illustration: Snapshot of SDGs in relation to CBS publications

NB! It is important to keep in mind that the underlying data in NORA is not pulled from the local research databases of Danish universities but is primarily based on data from Dimensions (a database from Digital Science). This may distort the representation of research results, as the share of research publications actually indexed in Dimensions varies across universities.

Find links to local research databases, read more about the discontinuation of DNRD, and check for updates.

If you have any questions, please contact Lene Hald or Claus Rosenkrantz Hansen.

06/11/2020
Liselotte Brandstrup

Written by Dicte Madsen

If your research project involves large publication data sets, you will probably need an API to retrieve information from major publishers or bibliographic databases.

What is an API?
An API (Application Programming Interface) is a tool used to share data between applications. APIs can be used for extracting data from a database; they are often used to embed content from one application in another or to dynamically post content from one system to another.

Where do I find APIs?
Many scholarly publishers and database vendors provide APIs for the extraction of data for research purposes. While the choice of APIs is plentiful, this post will zoom in on the ones that relate to scholarly research specifically. Check out the list of APIs for Scholarly Resources compiled by MIT Libraries.
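To illustrate what working with one of these looks like, here is a minimal sketch that queries Crossref's free REST API, one of the scholarly APIs typically included in such lists. The query parameters follow Crossref's public documentation, but treat the details as something to verify there:

```python
import json
import urllib.request
from urllib.parse import urlencode

# Minimal sketch: search Crossref for works matching a bibliographic query.
params = urlencode({"query.bibliographic": "journal impact factor", "rows": 3})
url = f"https://api.crossref.org/works?{params}"

with urllib.request.urlopen(url) as response:
    items = json.load(response)["message"]["items"]

for item in items:
    # Records do not always carry every field, hence the defaults.
    title = (item.get("title") or ["<no title>"])[0]
    print(item.get("DOI"), "-", title)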

How much data can I access?
Although APIs are often freely available, it does not necessarily follow that you have unlimited access to all the data contained in the databases.

You can divide the scholarly APIs into three broad categories: open APIs that anyone can query, free APIs that require registration for an API key, and subscription-based APIs tied to an institutional licence.

For more examples, check out the SMU Libraries guide.

In many cases, researchers are interested in getting access to full data sets. Some of the companies behind the subscription-based APIs are happy to share data with non-commercial research projects. However, nothing usually comes for free, so please make sure to read the fine print before applying for access, as they may require that you share with them all research outputs resulting from the use of their data and tools as soon as possible after dissemination. Companies that provide full data sets are:

  • Dimensions: “Free data access for scientometric research projects”
  • Elsevier: ICSR Lab: “Access rich datasets on a powerful computational platform, free for research use”
  • Microsoft Academic: “Research more, search less”
  • Web of Science: “Our APIs are better because our data is better”

If you have any questions, please reach out to the library bibliometrics team at metrics.lib@cbs.dk

03/13/2020
Liselotte Brandstrup

Written by Dicte Madsen

SciVal (scival.com) is a bibliometric tool for analyzing and visualizing the relative research positioning of individuals, institutions, and countries.
The results can be used in decision-making, strategic planning, and recruitment, or for identifying collaborators. It offers access to research performance indicators for more than 12,000 research institutions and provides built-in metrics based on citations, publications, and usage data that may be used to measure productivity, citation impact, or subject disciplinarity.
SciVal is based on data from Scopus (1996 - present), one of the largest abstract and citation databases available.
SciVal’s four modules enable analyses for different purposes:

Overview:
Get an overview of research performance for a fixed period for a single entity, e.g. a researcher or an institution. The metrics, e.g. Field-Weighted Citation Impact, citations per publication, or h-index, are based on either publications or citations.

Benchmark:
Evaluate your institution against its research field or peer institutions. Use the benchmark module to compare or monitor progress over time. Identify your institution’s strengths and weaknesses. This module utilizes the full dataset from 1996-2020. 

Collaboration:
Explore institutional collaboration and co-authorships. Who are the primary collaborators at your institution, in each country, etc.? Data is presented as a world map that can be drilled down to the level of individual collaborators.

Trends:
Analyze the research trends of any research area, topic or topic cluster using citation and usage data to discover top performers, rising stars, and current developments within research fields.

SciVal offers report templates that may be populated with the results of analyses from all four modules and presented as both tables and visualizations. Report definitions can be saved and shared.
You can access SciVal on campus by using your Elsevier login from Scopus or Mendeley. Alternatively, you can create an account using your CBS e-mail.

SciVal is easy to use, but be mindful when interpreting the results: size, discipline, publication type, coverage, manipulation, and period will all affect the metrics.

Visit SciVal

For further information, please contact the Research Metrics Team or consult the “Research Metrics Guidebook”.

 

11/22/2018
Liselotte Brandstrup

By Lars Nondal and Dicte Madsen

Research impact, research performance, research quality – whatever you call it, it is at the top of everybody’s agenda.
Join our introduction to SciVal, a new tool that makes it possible for us to analyze how CBS research publications (journal articles) perform compared to other universities and business schools. We do this primarily by looking at citation patterns and citation numbers, but also by analyzing other indicators.
Example of SciVal comparison chart:

What does this figure tell us?
Apparently, in 2012, CBS had a lower Field-Weighted Citation Impact (FWCI) than INSEAD and London Business School – but in 2015 and 2017, CBS had the highest FWCI of the three.
Is that a good thing, is it bad, or is it just about acceptable? And what exactly is an FWCI? Curious?

Register for a SciVal introduction on Thursday 13 December 2018, 9.30-10.30, in SP1.03

What is SciVal?
In their own words: “SciVal offers quick, easy access to research performance of more than 10,400 research institutions and their associated researchers from 230 nations worldwide”.
SciVal is based on Scopus data and while Scopus is ideal for analyzing the citation performance of individual articles and researchers, SciVal is an analytical tool that allows for easy analysis of research performance (citations) for aggregated levels of publication sets and for groups of authors (departments, research groups etc.). 
SciVal comprises separate modules for benchmarking against other institutions or groups of authors and collaboration analysis (co-authorship) etc.

Access to SciVal
All CBS researchers and staff have access to SciVal. Simply sign in with your CBS credentials. If you already have an account with Scopus or any other Elsevier product, you do not need to register again. On-campus access only.

NB. Important!
SciVal is mostly relevant for departments, research groups, and research topics dominated, totally or at least substantially, by journal publication. If your discipline is dominated by scientific publication channels other than journal articles, then SciVal is not very useful for you.

If you have any questions, please contact Lars Nondal or Dicte Madsen.

02/20/2018
Liv Bjerge Laursen

Journal Citation Reports from Thomson Reuters has now been updated with 2014 data about the journals, so it is now possible to retrieve, analyze, and compare 2014 Journal Impact Factors (JIFs) for your favourite journals.

What exactly does a JIF tell us?
Well, the journal with the highest JIF in 2014 in the categories ‘Business’ and ‘Finance’ (which together include a total of 201 journals!) is the Academy of Management Review, with a JIF of 7.475.

The calculation behind this figure: first we find the number of articles published in this journal in the years 2012 and 2013 (29 + 30 = 59), then we count how many times these 59 articles were cited by other articles in Web of Science in 2014 (441), and then we divide 441 by 59 = 7.475.
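The same arithmetic as a tiny script, using the numbers from the example above:

```python
# JIF 2014 = citations received in 2014 by items published in 2012-2013,
# divided by the number of items published in 2012-2013.
articles_2012 = 29
articles_2013 = 30
citations_2014 = 441  # citations counted in Web of Science during 2014

jif_2014 = citations_2014 / (articles_2012 + articles_2013)
print(f"JIF 2014: {jif_2014:.3f}")  # 7.475
```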

JIFs are by far the most heavily used journal citation indicator – and often considered to be a proxy for quality.
But they are not undisputed, and a number of competing indicators have been developed by competitors to Thomson Reuters/JCR, such as the SCImago Journal Rank, which is based on Scopus data.

Please contact Lars Nondal, ext. 3691, if you have any questions.

02/20/2018
Liv Bjerge Laursen

An updated version of the ABS journal quality list is now available from http://www.bizschooljournals.com/.
(NB! Personal registration is mandatory, and the site does not allow printing/downloading - on-screen reading only.)

The new list is much larger, as it contains 1401 journals, against 823 in ABS 2010. The list still applies a 5-category classification system, or ranking, of the journals. In the 2012-2014 Development Contract, CBS has been concerned with the top 2 categories, levels 4 and 4*, 94 journals in total, as priority publication targets. The number of journals in levels 4 and 4* has increased to 118: 32 new journals have been added, and 8 journals have been downgraded to category 3.

Among the new category 4/4* journals are titles such as Journal of World Business, Business History Review, International Journal of Operations Management, and Public Administration - An International Quarterly.

And among those that are no longer categorized as 4/4* are Business History and Harvard Business Review.
For the 2015-2017 Development Contract, DIR/IL has decided that our goal for this performance indicator should be to ’maintain the level’ of published articles, defined as the average number of articles published in 2012-2014 in the same journals, which amounts to 56 articles yearly.

A file with all ABS-2015 (4, 4*) journals (and also all ABS-2010 (4, 4*), FT45, and UTD journals) is available on CBS Share.

If you have further questions please contact Lars Nondal, ln.lib@cbs.dk.

12/22/2017
Mette Bechmann

By Mette Bechmann

Journal, author, and article-level metrics are undergoing change in these years, and PlumX is an alternative method of counting how people engage with research. Moving away from traditional article citation counts, PlumX captures many other types of document exposure as well, from news mentions to likes, tweets, and Mendeley captures, all of which are indications of the level of interest in the document.

With every reference in Scopus, you will find a PlumX “flower” in the upper right-hand corner. By clicking the “flower”, you open the full overview of captured metrics.

The green petal “Usage” records the immediate interest by counting clicks and views and, in the case of books, library holdings. Moving clockwise round the “flower”, next up is “Captures” which tracks any putting-aside of the document for later use by counting bookmarks and favourites. “Mentions” counts appearances in blog posts and news media, and as links in Wikipedia. “Social media” interest is captured by counting Facebook likes and shares, tweets and re-tweets, as well as Google +1. Finally, “Citations” counts mostly traditional academic citations.

Source: https://blog.scopus.com/posts/plumx-metrics-now-on-scopus-discover-how-others-interact-with-your-research
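As a compact summary of the five petals described above, here is the taxonomy as a small data structure. The example signals are taken from the description in this post and are not an exhaustive list of what PlumX tracks:

```python
# The five PlumX categories with example signals from the text above.
plumx_categories = {
    "Usage": ["clicks", "views", "library holdings"],
    "Captures": ["bookmarks", "favourites"],
    "Mentions": ["blog posts", "news media", "Wikipedia links"],
    "Social media": ["likes", "shares", "tweets", "retweets", "Google +1"],
    "Citations": ["traditional academic citations"],
}

for petal, signals in plumx_categories.items():
    print(f"{petal}: {', '.join(signals)}")
```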

This way of quantifying interest in a document is still in its infancy, but you might find it useful to explore. We will keep an eye on how it develops and keep you posted.

12/21/2017
Mette Bechmann

The annual registration of CBS research has now been completed and here are some of the main points.

2016 saw a total of 1721 CBS research contributions, a small drop from the 1766 produced in 2015.

The 2016 output includes 144 monographs, 498 peer-reviewed journal articles, as well as 387 peer-reviewed conference contributions. The complete catalogue of 2016 statistics is published in Research Statistics / Publications 2016 available from the Annual Statistics tab on CBSshare, which is also where you will find historical research statistics starting in 1999.

If you wish to explore CBS research in greater detail, please visit Research@CBS.

Please direct all queries and comments to Dicte Madsen, ext. 3692.
