Abstract
Journal lists play an important role in current journal evaluation, publication reference, research assessment and promotion decisions. Considering the current orientation of science and technology development and the need for transformative change in academic evaluation, this study examines the Nature Index journal list. The focus is on exploring the coverage of research topics, assessing the innovation level of the published papers, and identifying underrepresented research areas. The outcome will facilitate the identification of high-level science and technology journals that can serve as valuable references for the Nature Index team and for relevant management and research institutions in their research evaluation processes. The results show that of the 3029 research topics in the science fields defined by OpenAlex, 2217 (73.19%) are covered by Nature Index journals. Across all 18 research disciplines other than Dentistry and Veterinary, Nature Index journals show a clear advantage in academic quality (the average share of dominant research topics in each disciplinary category is around 90%). However, their advantage is larger for academic impact than for disruptive innovation. Taking the level of disruptive innovation as the core indicator, and the academic influence and long-term development of journals in the related research fields as references, two journals were also selected to supplement the latest Nature Index list. Overall, the Nature Index journal list is relatively complete, but there is still room for adjustment. Nature Index journals show high academic impact and a high level of disruptive innovation in most of the research topics they cover, but they are more oriented towards academic impact. Appropriately adding representative journals in uncovered research areas would enhance the application value of the Nature Index in academic evaluation.
Introduction
Journal lists play an important role in journal evaluation (Y. Huang et al., 2021), publication reference (Pölönen et al., 2021; Sasvári and Urbanovics, 2019), research assessment (Piazza et al., 2017), and promotion decisions (Bales et al., 2019). Current journal lists are often created on the basis of expert peer review and bibliometrics (Mikhailenko and Goncharov, 2017; Smith, 2010a, 2010b); Web of Science and Scopus can also be regarded as a special category of journal list. However, many journal lists fail to achieve broad consensus (Brezis and Birukou, 2020; Jiang and Liu, 2022b; Stachura et al., 2024). In practical application, some researchers hold complex attitudes towards journal lists (Fassin, 2021; Serenko and Bontis, 2024), questioning their inclusiveness (Li et al., 2019), fairness (Grossmann et al., 2019), interdisciplinarity (Kushkowski and Shrader, 2013), institutional applicability (Beets et al., 2015; Jaafar et al., 2021), and scientific validity (Chavarro et al., 2018; Moosa, 2016). They have also reflected on the added effects (Pons-Novell and Tirado-Fabregat, 2010) and the impact (Walker et al., 2019) of journal lists. Related studies have shown that broader journal lists can be misleading in scientific assessment (Borrego, 2021; J. Li et al., 2023) and do a poor job of highlighting innovative outcomes (Jiang et al., 2025a). Nevertheless, journal lists continue to be created (Wang et al., 2023) and applied (Baskerville, 2008; George, 2019) to meet different needs in the practice of research evaluation.
At present, most journal lists are published by research organizations or professional associations; comparatively few are proposed by academic publishers. As a product line and brand extension of the Nature journal series (Khelfaoui and Gingras, 2022), the Nature Index journal list (‘Introducing the index’, 2014) has gained attention and been widely used in global research management and research evaluation (Lin et al., 2015), including but not limited to national research assessment (Jiao and Yu, 2020; Lin et al., 2015; Silva, 2016), disciplinary evaluation (Lin et al., 2017; Yang et al., 2019), institutional evaluation (Chen, 2018; Y. Liu et al., 2018), journal evaluation (Li and Wang, 2016; Hayatdavoudi et al., 2023), and research model analysis (Cai and Han, 2020; Zhu et al., 2021). At the same time, the evaluation effectiveness (Bornmann and Haunschild, 2017; Haunschild and Bornmann, 2015b), the scientific soundness of the criteria (Campbell and Grayson, 2015; Haunschild and Bornmann, 2015a) and the evaluation orientation (Waltman and Traag, 2017) of the Nature Index have been questioned to some extent.
How, then, can the academic quality of journal lists be assessed effectively? Traditional research assessment relies heavily on the binary system of bibliometrics and peer review, but both have significant limitations. The validity of bibliometric assessment at the individual level has been questioned (Giovanni, 2017), and some scholars have even adopted a stance of ‘bibliometric denialism’. Relying exclusively on peer review faces structural problems such as reviewer subjectivity and resource constraints (Ferguson, 2020). The hierarchical assessment framework of NOR-CAM offers a new perspective: scientometrics is well suited to aggregate analyses at the subject-area level, a meso-level need that peer-review mechanisms find difficult to cover. In terms of evaluation, the selection mechanism of the Nature Index systematically integrates expert review and impact indicators, but lacks an assessment of the innovation level of research papers. Although academic impact and peer recognition can indirectly reflect the value of research, they remain essentially different from innovativeness itself. This study therefore introduces the evaluation perspective of disruptive innovation to better assess the academic quality of journal lists.
Since Wu et al. (2019) published the article Large teams develop and small teams disrupt science and technology and proposed the disruption index, the evaluation of disruptive innovation based on citation networks has received extensive attention from scholars in various fields. In previous research, we proposed a method for calculating the disruption index based on open citation data (Jiang and Liu, 2024b), and have successively carried out evaluation studies at both the paper and journal level (Jiang and Liu, 2022a, 2023b, 2024b; Jiang et al., 2023, 2024a, 2025a, 2025b; Zhang et al., 2024c; Liu and Jiang, 2025) as well as domain analysis in practice (Jiang and Liu, 2023a), so that disruptive-innovation evaluation at these levels has gradually matured.
Considering the realistic role that journal lists play at present, this study takes the Nature Index as its research object and conducts a scientometric study by linking multiple data sources. The study explores the coverage of research topics by Nature Index journals, assesses the innovation level of Nature Index journal articles across topics, and analyzes the research topics not yet covered by Nature Index journals in order to identify high-quality journals in those topics. The results will serve as a valuable reference for the Nature Index team, as well as for relevant management departments and research institutes in their academic evaluation work.
Method
Research object
In this study, the 145 journals in the latest Nature Index collection, together with all other journal papers indexed by OpenAlex for the same publication year (2020), were selected as the research objects. The specific list can be obtained from https://www.nature.com/nature-index/faq. All citation data are subject to the common restrictions of the OpenAlex and COCI databases, with the citation window ending at the end of 2023. Specifically, a total of 92708 research articles from Nature Index journals and 1553210 research articles from non-Nature Index journals were included. OpenAlex classifies each article under multiple topics but assigns a single primary topic based on the confidence of the classification; the disciplinary classification of research articles in this paper is based on this primary topic only.
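To illustrate how this single primary topic is exposed, the minimal sketch below queries OpenAlex through the pyalex client that is also used later in the data-processing pipeline; the work ID is the example record from the OpenAlex documentation and is used here purely for illustration, not as part of the study sample.

```python
from pyalex import Works

# Example OpenAlex work ID (taken from the OpenAlex documentation); any valid ID works.
work = Works()["W2741809807"]

# OpenAlex attaches several topics to each work, ordered by classification score,
# and exposes the highest-confidence one separately as "primary_topic".
primary = work.get("primary_topic") or {}
print(primary.get("display_name"), primary.get("score"))
print(primary.get("field", {}).get("display_name"))          # broader field of the primary topic
print([t["display_name"] for t in work.get("topics", [])])   # all topics assigned to the work
```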
Data sources
The data required for this study comprise bibliographic information on journal articles and citation relationships. In practice, they were obtained from OpenAlex and COCI, respectively.
OpenAlex is an academic infrastructure built by OurResearch. It provides the academic community with alternatives similar to Scopus and Web of Science, but free and open (Scheidsteger and Haunschild, 2023), with the possibility of using APIs for data access (Harder, 2024) and also has a mature utility toolkit, openalexR (Aria et al., 2024). OpenAlex is now widely used in the field of scientometrics (Foderaro and Gunnarsson Lorentzen, 2024; Okamura, 2023; Ortega and Delgado-Quirós, 2024; Sued, 2024; Tu Le et al., 2024; Xu et al., 2024; Yan et al., 2024). Compared with similar paid services, OpenAlex has significant advantages in terms of inclusiveness, affordability and usability.
COCI, the OpenCitations Index of Crossref open DOI-to-DOI citations (Heibi et al., 2019), is the largest citation dataset that OpenCitations has released to date and has since been integrated into its unified citation index. It contains over 116 million bibliographic resources and over 2,013 million citation links (as of July 2024), and aims to provide a disruptive alternative to traditional proprietary citation indexes (Peroni and Shotton, 2020). As a public digital infrastructure, OpenCitations has become one of the primary sources of scholarly data for publishers, authors, librarians, funders, and researchers (Hendricks et al., 2020). The metadata it contains is expanding at an average rate of 11% per year, the functionality and support of its APIs have been enhanced and expanded (Lammey, 2016), and it is widely used in real-world research (Bologna et al., 2022; Borrego et al., 2023; Heibi and Peroni, 2022; Jiang and Liu, 2023a; Spinaci et al., 2022; Y. Zhu et al., 2020). The latest version of the dataset maps PIDs such as DOIs and PMIDs onto the OpenCitations Meta Identifier (OMID), which enhances the inclusiveness of different citation data sources (Massari et al., 2024). This also lays a solid foundation for linking the OpenAlex and COCI data sources in this study.
Despite known problems with OpenAlex metadata and COCI citation links at the institutional (L. Zhang et al., 2024), linguistic (Céspedes et al., 2025) and coverage (Martín-Martín et al., 2021; Ortega and Delgado-Quirós, 2024) levels, they remain the best available options for scientometric researchers conducting large-scale studies once accessibility, availability, and data quality are weighed together.
Data processing
Since this study links multiple data sources to conduct a relatively large-scale scientometric study, this section details the specific data processing and usage steps (as shown in Fig. 1); a minimal code sketch of the pipeline is given after the list:
(1) Based on the API provided by OpenAlex and the pyalex library on GitHub (https://github.com/J535D165/pyalex), we used Python scripts to obtain metadata from OpenAlex for all research papers published in journals in 2020.

(2) Obtain the latest list of Nature Index journals from the official Nature Index website.

(3) Download all the dump data (the PID-OMID relationship dataset and the OMID-OMID citation relationship dataset) provided by OpenCitations from figshare.

(4) Import these three datasets into a local SQLite3 database to complete the data acquisition work.

(5) After completing the data import, and based on the practical needs of this study, slice the OMID-OMID citation relationship dataset and add appropriate indexes to all datasets in order to improve the efficiency of data use.

(6) Based on the PID-OMID relationship table in the local database, extract the OMIDs corresponding to the papers obtained from OpenAlex (papers for which no OMID could be obtained were excluded from the subsequent study).

(7) Based on the extracted OMID table, calculate the impact and the disruptive innovation level of the selected research papers, respectively. The method of calculating the disruptive innovation level of papers based on the COCI dataset is described in the article "A new method of calculating the disruption index based on open citation data" published in the Journal of Information Science (Jiang and Liu, 2024b).

(8) Based on the obtained results and the Nature Index journals list, carry out a multi-level data analysis.
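The following is a minimal sketch of steps (1), (5) and (6), assuming a single hypothetical Nature Index journal identified by an OpenAlex source ID and a simplified two-column PID-OMID table; in the actual study the OpenCitations dumps were bulk-loaded and all journals were processed together, so the snippet is illustrative rather than a reproduction of the pipeline.

```python
import sqlite3
from pyalex import Works

# Hypothetical OpenAlex source ID standing in for one Nature Index journal.
JOURNAL_SOURCE_ID = "S137773608"

conn = sqlite3.connect("nature_index_study.db")
conn.execute("""CREATE TABLE IF NOT EXISTS works
                (doi TEXT PRIMARY KEY, title TEXT, year INTEGER, cited_by INTEGER)""")
# Normally bulk-loaded from the OpenCitations dump; created empty here so the sketch runs.
conn.execute("CREATE TABLE IF NOT EXISTS pid_omid (pid TEXT, omid TEXT)")

# Step (1): fetch 2020 research articles of the journal through the OpenAlex API.
pages = (
    Works()
    .filter(primary_location={"source": {"id": JOURNAL_SOURCE_ID}},
            publication_year=2020, type="article")
    .paginate(per_page=200)
)
for page in pages:
    rows = [(w.get("doi"), w.get("title"), w.get("publication_year"),
             w.get("cited_by_count")) for w in page if w.get("doi")]
    conn.executemany("INSERT OR IGNORE INTO works VALUES (?, ?, ?, ?)", rows)
conn.commit()

# Step (5): index the PID-OMID table so joins stay fast on the full dump.
conn.execute("CREATE INDEX IF NOT EXISTS idx_pid_omid ON pid_omid(pid)")

# Step (6): map each DOI to its OMID; papers without an OMID are dropped later.
cur = conn.execute("""SELECT w.doi, p.omid FROM works w
                      JOIN pid_omid p
                      ON p.pid = 'doi:' || REPLACE(w.doi, 'https://doi.org/', '')""")
print(cur.fetchmany(5))
```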
Evaluation indicators
The evaluation indicators involved in this study fall into two main categories, disruptive innovation indicators and impact indicators, which are used to comprehensively assess the academic quality of journal lists. We believe that if the research papers covered by a journal list rank better, in terms of both academic impact and disruptive innovation level, than those not covered, this reflects to a certain extent the academic quality of the journal list.
In the actual study, the disruptive innovation indicator is the absolute disruption index Dz (Eq. 1) and the impact indicator is the cumulative citation count of a paper. Both indicators are used to calculate, for each topic, the average/median rankings of impact and of disruptive innovation. Since the calculation of the disruption index is relatively complicated and the volume of data is large (Song et al., 2022), and we do not have access to massive commercial citation data resources, we used open citation data for the calculation and struck a balance between the time window and the computational workload (uniformly choosing the citation window 2020–2023). Because the focus papers included in this study are all from the same publication year, there is no problem of large differences in citation windows between papers.
The reason for using the absolute disruption index Dz rather than the original D index in this study is that related studies have confirmed that the original D index has limited effectiveness in evaluating disruptive innovation (Bornmann et al., 2020), whereas the absolute disruption index outperforms the original D index (Jiang and Liu, 2024b) and resolves its inconsistency problem (X. Liu et al., 2020). Therefore, although the optimal variant of the D index is still being explored by the scientometrics community, the Dz index is currently the better choice.
In Eq. 1, NF refers to papers that cite only the focus paper, NB refers to papers that cite both the focus paper and its references, and NR refers to papers that cite only the references of the focus paper without citing the focus paper itself.
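For orientation, the sketch below counts NF, NB and NR from a toy citation edge list and evaluates the original D index of Wu et al. (2019), D = (NF - NB)/(NF + NB + NR). The absolute disruption index Dz actually used in this study is the variant defined in Jiang and Liu (2024b), so the function here is an illustrative stand-in rather than the exact formula of Eq. 1.

```python
def disruption_components(focal, references, citing_edges):
    """Count NF, NB, NR for one focal paper.

    citing_edges: iterable of (citing, cited) pairs covering the citation window.
    """
    refs = set(references)
    citers_of_focal, citers_of_refs = set(), set()
    for citing, cited in citing_edges:
        if cited == focal:
            citers_of_focal.add(citing)
        elif cited in refs:
            citers_of_refs.add(citing)
    nf = len(citers_of_focal - citers_of_refs)   # cite the focal paper only
    nb = len(citers_of_focal & citers_of_refs)   # cite the focal paper and its references
    nr = len(citers_of_refs - citers_of_focal)   # cite the references only
    return nf, nb, nr


def d_index(nf, nb, nr):
    """Original D index of Wu et al. (2019); returns 0 for papers with no relevant citers."""
    total = nf + nb + nr
    return (nf - nb) / total if total else 0.0


# Toy example: paper P cites R1 and R2; later papers A-D cite various of them.
edges = [("A", "P"), ("B", "P"), ("B", "R1"), ("C", "R2"), ("D", "R1")]
nf, nb, nr = disruption_components("P", ["R1", "R2"], edges)
print(nf, nb, nr, d_index(nf, nb, nr))   # 1 1 2 0.0
```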
Results
Assessment of topic coverage in Nature Index journals
In order to better assess the topic coverage of Nature Index journals, the extent of research-topic coverage by Nature Index journals was comparatively analyzed at different levels, and the results are shown in Figs. 2 and 3.
From Fig. 2 we can see that Nature Index journals cover 2217 (73.19%) of the 3029 research topics in the scientific fields defined by OpenAlex. In the fields of health sciences, physical sciences, and life sciences, Nature Index journals cover 91.58%, 60.21%, and 81.11% of the research topics, respectively.

From Fig. 3 we can see that the coverage of Nature Index journals varies across the 20 specific disciplinary categories. Coverage is highest in chemical engineering, reaching 100%, and lowest in computer science, at only 33.11%.
Assessment of academic quality of Nature Index journals in different topics
In order to better assess the academic quality of the Nature Index journal list, this study conducted a comparative analysis, at the level of research topics, of the academic impact and the disruptive innovation level of papers published in Nature Index journals. Considering that topics with a limited number of published papers may distort the results of the analysis (Li, 2025), this section only analyses topics with at least 10 published research papers (>1% of the Nature Index papers included in this research). In addition, because most low-quality papers have a disruptive innovation level of zero, directly comparing raw means and medians would not discriminate well between papers in Nature Index and non-Nature Index journals. We therefore used the mean and median rankings of the absolute disruption index and of the cumulative citation count of research papers to assess the academic quality of Nature Index journals in different topics. The results are shown in Figs. 4, 5 and Tables 1–4.
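As a sketch of this ranking comparison, assume a table with one row per paper containing its topic, its group (Nature Index vs. non-Nature Index), its citation count and its Dz value; the snippet below ranks papers within each topic and compares mean ranks between the two groups. Column names and values are illustrative, not the study data.

```python
import pandas as pd

# Illustrative table: one row per 2020 research article.
papers = pd.DataFrame({
    "topic":     ["T1", "T1", "T1", "T1", "T2", "T2", "T2"],
    "ni":        [True, True, False, False, True, False, False],  # Nature Index journal?
    "citations": [40, 12, 8, 3, 25, 30, 2],
    "dz":        [5, 0, 1, 0, 3, 2, 0],
})

for metric in ("citations", "dz"):
    # Rank within each topic; a higher value gets a better (smaller) rank.
    papers[f"rank_{metric}"] = papers.groupby("topic")[metric].rank(ascending=False)

# Mean and median ranks per topic for Nature Index vs. non-Nature Index papers.
summary = papers.groupby(["topic", "ni"])[["rank_citations", "rank_dz"]].agg(["mean", "median"])
print(summary)

# A topic counts as dominated by Nature Index journals on a metric when their
# mean rank is better (smaller) than that of non-Nature Index journals.
mean_ranks = papers.groupby(["topic", "ni"])["rank_dz"].mean().unstack("ni")
print((mean_ranks[True] < mean_ranks[False]).sum(), "topics dominated on Dz")
```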
In Table 1, we can find: (1) in terms of average impact level, Nature Index journal papers outperform non-Nature Index journal papers in 1124 (99.73%) research topics; (2) in terms of average disruptive innovation level, Nature Index journal papers outperform non-Nature Index journal papers in 1010 (89.62%) research topics; (3) in terms of both average impact and average disruptive innovation level, Nature Index journal papers outperform non-Nature Index journal papers in 1009 (89.53%) research topics and underperform them in 2 (0.18%) research topics.

In Table 2, we can find: (1) in terms of median impact level, Nature Index journal papers outperform non-Nature Index journal papers in 1124 (99.73%) research topics; (2) in terms of median disruptive innovation level, Nature Index journal papers outperform non-Nature Index journal papers in 1038 (92.10%) research topics; (3) in terms of both median impact and median disruptive innovation level, Nature Index journal papers outperform non-Nature Index journal papers in 1038 (92.10%) research topics and underperform them in 3 (0.27%) research topics.
From Tables 3 and 4, we can see that Dentistry and Veterinary are not main research disciplines that Nature Index journals currently focus on. Nature Index journals exhibit academic quality strengths in all 18 research disciplines other than Dentistry and Veterinary, and their average proportion of dominant research topics across disciplines is around 90%. However, their advantage is larger for academic impact than for disruptive innovation.
An assessment of high-quality journals in areas not covered by Nature Index journals
In order to provide references and suggestions for relevant research institutions and the Nature Index management team, this study conducted an in-depth analysis of the 812 research topics not covered by the Nature Index, exploring the average academic impact and disruptive innovation level of the journals active in these topics. On this basis, inclusion in the Web of Science collection and the latest JCI quartile were used as references for the compliance and long-term level of the journals, and the selection results are shown in Table 5.
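A compact sketch of this selection step is given below, under the assumption of a per-paper table restricted to the uncovered topics plus hypothetical journal-level flags for Web of Science inclusion and JCI quartile; the data and thresholds are illustrative and are not those used to produce Table 5.

```python
import pandas as pd

# Papers from the uncovered topics (illustrative rows only).
papers = pd.DataFrame({
    "journal":   ["J1", "J1", "J2", "J2", "J3"],
    "citations": [15, 22, 4, 6, 30],
    "dz":        [3, 4, 0, 1, 2],
})
# Hypothetical journal-level reference information.
journals = pd.DataFrame({
    "journal": ["J1", "J2", "J3"],
    "in_wos":  [True, True, False],   # indexed in the Web of Science collection?
    "jci_q":   [1, 3, 1],             # latest JCI quartile
})

# Journal-level averages of disruptive innovation (core indicator) and impact.
profile = papers.groupby("journal").agg(mean_dz=("dz", "mean"),
                                        mean_cit=("citations", "mean"))
profile = profile.join(journals.set_index("journal"))

# Keep Web of Science journals in the top JCI quartile, ranked by mean Dz first.
candidates = (profile[profile["in_wos"] & (profile["jci_q"] == 1)]
              .sort_values(["mean_dz", "mean_cit"], ascending=False))
print(candidates)
```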
Discussion
Nature Index journals list is quite complete, but there is still room for adjustment
From the results of the above study, we can find that although Nature Index covers most of the research topics, the degree of coverage in specific disciplinary categories shows large differences. In some disciplines, Nature Index journals cover only 50 to 60% of the research topics. Therefore, for research institutions and researchers focusing on these research themes, the use of Nature Index for S&T evaluation work may not be comprehensive enough.
Nature Index journals have a high academic level in most of the research topics they cover, but focus more on academic impact
Nature Index journals show a high level of academic excellence in most of the research topics they cover, both in terms of academic impact and academic innovation. On closer inspection, however, Nature Index journals better represent high-impact research. In a previous study of mega-journals, we found that, owing to country bias in the peer review process (Thelwall et al., 2021), specific journals have developed a distinct preference for the country of origin of their authors (Zhu, 2021). Given the pre-eminence of developed countries in science and technology, these countries host large pools of senior scholars in various fields, and the results of senior scholars are often recognized beyond their intrinsic quality in contemporary peer review and related processes (Kardam et al., 2016). Similarly, in a study of virology papers we found that expert peer review was more likely to identify high-impact papers than highly innovative papers (Jiang and Liu, 2023c).
Increasing the number of representative journals in uncovered research areas will enhance the application value of Nature Index in academic evaluation
In this study, two journals were selected as additions to the latest Nature Index journal list, taking the level of disruptive innovation as the core indicator and referring to the academic influence and long-term development of the journals in the relevant research fields. Adding these two journals may enhance the rationality and comprehensiveness of the Nature Index list to a certain extent.
Conclusion
Based on the OpenAlex and COCI databases, this study evaluates the coverage of research topics by the Nature Index journal list and the level of disruptive innovation of Nature Index journal articles within the covered topics. In addition, the study analyzes the research topics not covered by Nature Index journals and identifies representative journals in them, which can provide a reference for the Nature Index and other related institutions in optimizing journal lists and subsequent assessment models. The results show that: (1) the Nature Index journal list needs to be further improved and adjusted; (2) Nature Index journals have high academic standards in most of the research topics they cover, but focus more on academic impact; (3) adding representative journals in uncovered research areas would enhance the application value of the Nature Index in academic evaluation.
Because this study involves the processing of massive amounts of data, open bibliographic and open citation databases were used in practice to carry it out. Although the methodological validity of disruptive innovation evaluation based on open citation data has been verified in our previous research (Jiang and Liu, 2024b), the quality of open bibliographic databases still needs improvement: addressing the loss of information when integrating different sources (Cioffi et al., 2022; Delgado-Quirós and Ortega, 2024), promoting data connectivity across different dimensions (Jiao et al., 2023), and opening data at the national level (Moretti et al., 2024) would all be effective in improving the robustness of research evaluation based on open citation data (C.-K. Huang et al., 2020). In fact, these problems also exist to some extent in commercial citation databases such as Web of Science and Scopus (Chinchilla-Rodríguez et al., 2024; Kramer and de Jonge, 2022; Samanta and Rath, 2023), and further improving data quality will require the joint efforts of bibliometricians and the providers of the relevant data sources.
In addition, since the Nature Index was not designed to cover the humanities and social sciences, we did not include the OpenAlex fields related to the humanities and social sciences in our study. Therefore, the findings of this study are not directly applicable to those fields.
Data availability
All data can be obtained from opencitations.net and openalex.org.
References
Aria M, Le T, Cuccurullo C, Belfiore A, Choe J (2024) openalexR: an R-Tool for collecting bibliometric data from OpenAlex. R J 15(4):167–180. https://doi.org/10.32614/RJ-2023-089
Bales S, Hubbard DE, vanDuinkerken W, Sare L, Olivarez J (2019) The use of departmental journal lists in promotion and tenure decisions at American research universities. J Acad Librariansh 45(2):153–161. https://doi.org/10.1016/j.acalib.2019.02.005
Baskerville R (2008) For better or worse: How we apply journal ranking lists. Eur J Inf Syst 17(2):156–157. https://doi.org/10.1057/ejis.2008.7
Beets SD, Kelton AS, Lewis BR (2015) An assessment of accounting journal quality based on departmental lists. Scientometrics 102(1):315–332. https://doi.org/10.1007/s11192-014-1353-0
Bologna F, Di Iorio A, Peroni S, Poggi F (2022) Open bibliographic data and the Italian National Scientific Qualification: Measuring coverage of academic fields. Quant Sci Stud 3(3):512–528. https://doi.org/10.1162/qss_a_00203
Bornmann L, Haunschild R (2017) An empirical look at the nature index. J Assoc Inf Sci Technol 68(3):653–659. https://doi.org/10.1002/asi.23682
Bornmann L, Devarakonda S, Tekles A, Chacko G (2020) Are disruption index indicators convergently valid? The comparison of several indicator variants with assessments by peers. Quant Sci Stud 1(3):1242–1259. https://doi.org/10.1162/qss_a_00068
Borrego Á (2021) Are mega-journals a publication outlet for lower quality research? A bibliometric analysis of Spanish authors in PLOS ONE. Online Inf Rev 45(2):261–269. https://doi.org/10.1108/OIR-04-2018-0136
Borrego Á, Ardanuy J, Arguimbau L (2023) Crossref as a bibliographic discovery tool in the arts and humanities. Quant Sci Stud 4(1):91–104. https://doi.org/10.1162/qss_a_00240
Brezis ES, Birukou A (2020) Arbitrariness in the peer review process. Scientometrics 123(1):393–411. https://doi.org/10.1007/s11192-020-03348-1
Cai X, Han T (2020) Analysis of the division of labor in China’s high-quality life sciences research. Scientometrics 125(2):1077–1094. https://doi.org/10.1007/s11192-020-03582-7
Campbell N, Grayson M (2015) A response to ‘Discussion about the new Nature Index. Scientometrics 102(2):1831–1833. https://doi.org/10.1007/s11192-014-1516-z
Céspedes L, Kozlowski D, Pradier C, Sainte-Marie MH, Shokida NS, Benz P, Poitras C, Ninkov AB, Ebrahimy S, Ayeni P, Filali S, Li B, Larivière V (2025) Evaluating the linguistic coverage of OpenAlex: an assessment of metadata accuracy and completeness. J Assoc Inf Sci Technol 76(6):884–895. https://doi.org/10.1002/asi.24979
Chavarro D, Ràfols I, Tang P (2018) To what extent is inclusion in the Web of Science an indicator of journal ‘quality’? Res Eval 27(3):284–284. https://doi.org/10.1093/reseval/rvy015
Chen X (2018) Analysis and reflection on discipline competitiveness of Chinese Universities Based on Nature Index. Chin Univ Sci Technol Z1:90–95. https://doi.org/10.16209/j.cnki.cust.2018.z1.026
Chinchilla-Rodríguez Z, Costas R, Robinson-García N, Larivière V (2024) Examining the quality of the corresponding authorship field in Web of Science and Scopus. Quant Sci Stud 5(1):76–97. https://doi.org/10.1162/qss_a_00288
Cioffi A, Coppini S, Massari A, Moretti A, Peroni S, Santini C, Shahidzadeh Asadi N (2022) Identifying and correcting invalid citations due to DOI errors in Crossref data. Scientometrics 127(6):3593–3612. https://doi.org/10.1007/s11192-022-04367-w
Delgado-Quirós L, Ortega JL (2024) Completeness degree of publication metadata in eight free-access scholarly databases. Quant Sci Stud 5(1):31–49. https://doi.org/10.1162/qss_a_00286
Fassin Y (2021) Does the Financial Times FT50 journal list select the best management and economics journals? Scientometrics 126(7):5911–5943. https://doi.org/10.1007/s11192-021-03988-x
Ferguson CL (2020) Open peer review. Ser Rev 46(4):286–291. https://doi.org/10.1080/00987913.2020.1850039
Foderaro A, Gunnarsson Lorentzen D (2024) Traditional, dialogical and complex scholarly communication: towards a renewed trust in science. J Doc 80(6):1313–1332. https://doi.org/10.1108/JD-12-2023-0252
George JF (2019) Journal lists are not going away: a response to Fitzgerald et al. Commun Assoc Inform Syst 45:134–138. https://doi.org/10.17705/1CAIS.04508
Giovanni A (2017) Bibliometric evaluation of research performance: where do we stand? Vopr Obrazovaniya / Educ Stud Mosc 1:112–127. https://doi.org/10.17323/1814-9545-2017-1-112-127
Grossmann A, Mooney L, Dugan M (2019) Inclusion fairness in accounting, finance, and management: an investigation of A-star publications on the ABDC journal list. J Bus Res 95:232–241. https://doi.org/10.1016/j.jbusres.2018.10.035
Harder R (2024) Using Scopus and OpenAlex APIs to retrieve bibliographic data for evidence synthesis. A procedure based on Bash and SQL. MethodsX 12:102601. https://doi.org/10.1016/j.mex.2024.102601
Haunschild R, Bornmann L (2015a) Criteria for Nature Index questioned. Nature 517(7532):21–21. https://doi.org/10.1038/517021d
Haunschild R, Bornmann L (2015b) Discussion about the new Nature Index. Scientometrics 102(2):1829–1830. https://doi.org/10.1007/s11192-014-1505-2
Hayatdavoudi J, Goltaji M, Haghighat M (2023) WAAI: a weighted author affiliation index for journal evaluation. J Sch Publ 54(3):379–400. https://doi.org/10.3138/jsp-2022-0074
Heibi I, Peroni S (2022) A quantitative and qualitative open citation analysis of retracted articles in the humanities. Quant Sci Stud 3(4):953–975. https://doi.org/10.1162/qss_a_00222
Heibi I, Peroni S, Shotton D (2019) Software review: COCI, the OpenCitations Index of Crossref open DOI-to-DOI citations. Scientometrics 121(2):1213–1228. https://doi.org/10.1007/s11192-019-03217-6
Hendricks G, Tkaczyk D, Lin J, Feeney P (2020) Crossref: the sustainable source of community-owned scholarly metadata. Quant Sci Stud 1(1):414–427. https://doi.org/10.1162/qss_a_00022
Huang CK (2020) Comparison of bibliographic data sources: implications for the robustness of university rankings. Quant Sci Stud 1(2):445–478. https://doi.org/10.1162/qss_a_00031
Huang Y, Li R, Zhang L, Sivertsen G (2021) A comprehensive analysis of the journal evaluation system in China. Quant Sci Stud 2(1):300–326. https://doi.org/10.1162/qss_a_00103
Introducing the index (2014) Nature 515(7526):S52–S53. https://doi.org/10.1038/515S52a
Jaafar R, Pereira V, Saab SS, El-Kassar A-N (2021) Which journal ranking list? A case study in business and economics. EuroMed J Bus 16(4):361–380. https://doi.org/10.1108/EMJB-05-2020-0039
Jiang Y, Liu X (2022b) Open peer review: mode, technology, problems, and countermeasures. Chin J Sci Tech Periodicals 33(9):1196–1205. https://doi.org/10.11946/cjstp.202202280120
Jiang Y, Liu X (2022a) An innovative evaluation index of scientific journals—Journal Disruption Index (JDI) and its empirical research. Chin J Sci Tech Periodicals 33(7):965–972. https://doi.org/10.11946/cjstp.202202280120
Jiang Y, Liu X (2023c) The relationship between absolute disruption index, peer review index and CNCI: a study based on virology papers. Libr Inf Serv 67(3):96–105. https://doi.org/10.13266/j.issn.0252-3116.2023.03.009
Jiang Y, Liu X (2023a) A bibliometric analysis and disruptive innovation evaluation for the field of energy security. Sustainability 15(2):969. https://doi.org/10.3390/su15020969
Jiang Y, Liu X (2023b) A construction and empirical research of the journal disruption index based on open citation data. Scientometrics 128(7):3935–3958. https://doi.org/10.1007/s11192-023-04737-y
Jiang Y, Wang L, Li X (2023) Innovative evaluation of Chinese SCIE-indexed scientific journals: an empirical study based on the disruption index. Chin J Sci Tech Periodicals 34(8):1060–1068. https://doi.org/10.11946/cjstp.202302190096
Jiang Y, Liu X, Wang L (2025a) Evaluation and comparison of the academic quality of open-access mega journals and authoritative journals: disruptive innovation evaluation. J Med Internet Res 27:e59598. https://doi.org/10.2196/59598
Jiang Y, Liu X, Wang L (2025b) The disruptive innovation evaluation and empirical analysis of Chinese, Japanese, Indian, and South Korean scientific journals. J Sch Publ 56(1):60–78. https://doi.org/10.3138/jsp-2023-0056
Jiang Y, Liu X, Zhang Z, Yang X (2024a) Evaluation and comparison of academic impact and disruptive innovation level of medical journals: bibliometric analysis and disruptive evaluation. J Med Internet Res 26:e55121. https://doi.org/10.2196/55121
Jiang Y, Liu X (2024b) A new method of calculating the disruption index based on open citation data. J Inform Sci 01655515241263545 https://doi.org/10.1177/01655515241263545
Jiao C, Li K, Fang Z (2023) How are exclusively data journals indexed in major scholarly databases? An examination of four databases. Sci Data 10(1):737. https://doi.org/10.1038/s41597-023-02625-x
Jiao Y, Yu Z (2020) Analysis of the performance degree of China’s Editorial Board Members in International High-Impact Journals—Taking the Journals Listed in Nature Index Database as an Example. Sci Technol Publ 9:130–136. https://doi.org/10.16510/j.cnki.kjycb.20200915.006
Kardam K, Kejriwal A, Sharma K, Kaushal R (2016) Ranking scholarly work based on author reputation. 2016 International Conference on Advances in Computing, Communications and Informatics (ICACCI), 2084–2088. https://doi.org/10.1109/ICACCI.2016.7732358
Khelfaoui M, Gingras Y (2022) Expanding nature: product line and brand extensions of a scientific journal. Learned Publ 35(2):187–197. https://doi.org/10.1002/leap.1422
Kramer B, de Jonge H (2022) The availability and completeness of open funder metadata: case study for publications funded by the Dutch Research Council. Quant Sci Stud 3(3):583–599. https://doi.org/10.1162/qss_a_00210
Kushkowski JD, Shrader CB (2013) Developing a Core List of Journals in an Interdisciplinary Area. Libr Resour Tech Serv 57(1):51–65. https://doi.org/10.5860/lrts.57n1.51
Lammey R (2016) Using the Crossref Metadata API to explore publisher content. Sci Editing 3(2):109–111. https://doi.org/10.6087/kcse.75
Li H (2025) Global or regional: the hidden truth behind ShanghaiRanking’s global university ranking by the subject of Law. Scientometrics 130(1):515–530. https://doi.org/10.1007/s11192-024-05214-w
Li J, Lu X, Li J, Wu D (2019) Evaluating journal quality by integrating department journal lists in a developing country: are they representative? J Acad Librariansh 45(6):102067. https://doi.org/10.1016/j.acalib.2019.102067
Li J, Long Q, Lu X, Wu D (2023) Citation beneficiaries of discipline-specific mega-journals: who and how much. Hum Soc Sci Commun 10(1):1–10. https://doi.org/10.1057/s41599-023-02050-w
Li W, Wang Y (2016) The Scientometric analysis of nature index journals based on earth and environmental science field. J Mod Inf 36(10):171–177. https://doi.org/10.3969/j.issn.1008-0821.2016.10.029
Lin L, Zhang G, He S (2017) Analysis on Nature Index in Asia region: examples of China, India, Japan, South Korea, and Singapore. Bull Chin Acad Sci 32(12):1379–1383. https://doi.org/10.16418/j.issn.1000-3045.2017.12.012
Lin L, He S, Zhu X, Wang D (2015) Thoughts on significance of scientific innovation evaluation based on Nature Index. Chin J Sci Tech Periodicals 26(2):191–197. https://doi.org/10.11946/cjstp.201412191218
Liu X, Jiang Y (2025) Journal disruption index based on citation data source optimization and its empirical study. Chin J Sci Tech Periodicals 36(4):512–521. https://doi.org/10.11946/cjstp.202411071210
Liu X, Shen Z, Liao Y, Yang L (2020) The Research about the Improved Disruption Index and Its Influencing Factors. Libr Inf Serv 64(24):84–91. https://doi.org/10.13266/j.issn.0252-3116.2020.24.010
Liu Y, Lin D, Xu X, Shan S, Sheng QZ (2018) Multi-views on Nature Index of Chinese academic institutions. Scientometrics 114(3):823–837. https://doi.org/10.1007/s11192-017-2581-x
Martín-Martín A, Thelwall M, Orduna-Malea E, Delgado López-Cózar E (2021) Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: A multidisciplinary comparison of coverage via citations. Scientometrics 126(1):871–906. https://doi.org/10.1007/s11192-020-03690-4
Massari A, Mariani F, Heibi I, Peroni S, Shotton D (2024) OpenCitations Meta. Quant Sci Stud 5(1):50–75. https://doi.org/10.1162/qss_a_00292
Mikhailenko I, Goncharov M (2017) The practices of creating the lists of recommended scientific journals. Sci Tech Libraries 0(10):47–53. https://doi.org/10.33186/1027-3689-2017-10-47-53
Moosa IA (2016) A critique of the bucket classification of journals: the ABDC List as an Example. Econ Rec 92(298):448–463. https://doi.org/10.1111/1475-4932.12258
Moretti A, Soricetti M, Heibi I, Massari A, Peroni S, Rizzetto E (2024) The integration of the Japan Link Center’s bibliographic data into OpenCitations. J Open Humanit Data 10(1):21. https://doi.org/10.5334/johd.178
Okamura K (2023) A half-century of global collaboration in science and the “Shrinking World”. Quant. Sci Stud 4(4):938–959. https://doi.org/10.1162/qss_a_00268
Ortega JL, Delgado-Quirós L (2024) The indexation of retracted literature in seven principal scholarly databases: a coverage comparison of dimensions, OpenAlex, PubMed, Scilit, Scopus, The Lens and Web of Science. Scientometrics 129(7):3769–3785. https://doi.org/10.1007/s11192-024-05034-y
Peroni S, Shotton D (2020) OpenCitations, an infrastructure organization for open scholarship. Quant Sci Stud 1(1):428–444. https://doi.org/10.1162/qss_a_00023
Piazza S, Mori S, Gallevi SB (2017) Research evaluation in Humanities: an empirical study on the main journal lists about History and philosophy of science. JLIS 8(1):93–113. https://doi.org/10.4403/jlis.it-388
Pölönen J, Guns R, Kulczycki E, Sivertsen G, Engels TCE (2021) National lists of scholarly publication channels: an overview and recommendations for their construction and maintenance. J Data Inf Sci 6(1):50–86. https://doi.org/10.2478/jdis-2021-0004
Pons-Novell J, Tirado-Fabregat DA (2010) Is there life beyond the ISI Journal lists? The international impact of Spanish, Italian, French and German economics journals. Appl Econ 42(6):689–699. https://doi.org/10.1080/00036840701720804
Samanta DKS, Rath DDS (2023) Retrieval effectiveness of Scopus and Web of Science based on three types of subject metadata: A comparative study in economics. Ann Libr Inf Stud 70(3):132–145. https://doi.org/10.56042/alis.v70i3.2492
Sasvári P, Urbanovics A (2019) The journals on the domestic lists of the IX Section of the Hungarian Academy of Sciences in light of the requirements of international journal selection. Pénzügyi Szle = Public Financ Q 64(3):369–392. https://doi.org/10.35551/PFQ_2019_3_4
Scheidsteger T, Haunschild R (2023) Which of the metadata with relevance for bibliometrics are the same and which are different when switching from Microsoft Academic Graph to OpenAlex? Profesional de la información 32(2):e320209. https://doi.org/10.3145/epi.2023.mar.09
Serenko A, Bontis N (2024) Dancing with the devil: the use and perceptions of academic journal ranking lists in the management field. J Document 80(4):773–792. https://doi.org/10.1108/JD-10-2023-0217
Silva V (2016) Scientometrics: Nature Index and Brazilian science. An Acad Bras Cienc 88(3):1597–1599. https://doi.org/10.1590/0001-3765201620150054
Smith DR (2010a) Identifying a Set of ‘Core’ Journals in Occupational Health, Part 1: Lists Proposed by Others. Arch Environ Occup Health 65(2):106–110. https://doi.org/10.1080/19338241003730945
Smith DR (2010b) Identifying a Set of ‘Core’ Journals in Occupational Health, Part 2: Lists Derived by Bibliometric Techniques. Arch Environ Occup Health 65(3):173–175. https://doi.org/10.1080/19338241003730952
Song P, Feng C, Long C, Yang Z, Song Y (2022) Study on discovery of outstanding scientific and technological talents in specific domains based on optimized disruptive index. J Intell 41(5):61–65. https://doi.org/10.3969/j.issn.1002-1965.2002.05.010
Spinaci G, Colavizza G, Peroni S (2022) A map of Digital Humanities research across bibliographic data sources. Digital Scholarsh Human 37(4):1254–1268. https://doi.org/10.1093/llc/fqac016
Stachura A, Banaszek Ł, Włodarski PK (2024) Expert‐guided evaluation of medical research may promote publishing low‐quality studies and increase research waste: a comparative analysis of Journal Impact Factor and Polish expert‐based journal ranking list. J Evid Based Med 17(2):256–258. https://doi.org/10.1111/jebm.12615
Sued GE (2024) La producción científica mexicana en Inteligencia Artificial: Un análisis bibliométrico. Investigación Bibliotecológica: archivonomía, bibliotecología e información 38(100):87–105. https://doi.org/10.22201/iibi.24488321xe.2024.100.58893
Thelwall M, Allen L, Papas E-R, Nyakoojo Z, Weigert V (2021) Does the use of open, non-anonymous peer review in scholarly publishing introduce bias? Evidence from the F1000Research post-publication open peer review publishing model. J Inf Sci 47(6):809–820. https://doi.org/10.1177/0165551520938678
Tu Le OT, Hong Le AT, Thanh Vu TT, Cam Tran TT, Van Nguyen C (2024) Management control systems for sustainable development: a bibliographic study. Cogent Bus Manag 11(1):2296699. https://doi.org/10.1080/23311975.2023.2296699
Walker JT, Fenton E, Salter A, Salandra R (2019) What Influences Business Academics’ Use of the Association of Business Schools (ABS) List? Evidence From a Survey of UK Academics. Br J Manag 30(3):730–747. https://doi.org/10.1111/1467-8551.12294
Waltman L, Traag V (2017) Is the Nature Index at odds with DORA? Nature 545(7655):412–412. https://doi.org/10.1038/545412a
Wang J, Halffman W, Zhang YH (2023) Sorting out journals: the proliferation of journal lists in China. J Assoc Inf Sci Technol 74(10):1207–1228. https://doi.org/10.1002/asi.24816
Wu L, Wang D, Evans JA (2019) Large teams develop and small teams disrupt science and technology. Nature 566(7744):378–382. https://doi.org/10.1038/s41586-019-0941-9
Xu H, Liu M, Bu Y, Sun S, Zhang Y, Zhang C, Acuna DE, Gray S, Meyer E, Ding Y (2024) The impact of heterogeneous shared leadership in scientific teams. Inf Process Manag 61(1):103542. https://doi.org/10.1016/j.ipm.2023.103542
Yan J, Monlong J, Cougoule C, Lacroix-Lamandé S, Wiedemann A (2024) Mapping the scientific output of organoids for animal and human modeling infectious diseases: a bibliometric assessment. Vet Res 55(1):81. https://doi.org/10.1186/s13567-024-01333-7
Yang Y, Xu D, Chen S, Han S, Xu S (2019) Analysis of research hotspots in the field of global medical research based on Natural Index. J China Soc Sci Tech Inf 38(11):1129–1137. https://doi.org/10.3772/j.issn.1000-0135.2019.11.001
Zhang L, Cao Z, Shang Y, Sivertsen G, Huang Y (2024) Missing institutions in OpenAlex: possible reasons, implications, and solutions. Scientometrics 129(10):5869–5891. https://doi.org/10.1007/s11192-023-04923-y
Zhang Z, Yang X, Liu X, Jiang Y (2024c) Effects of author’s internationalization level on journal impact and disruptive innovation in China journals indexed in SCI. Chin J Sci Tech Periodicals 35(11):1611–1618. https://doi.org/10.11946/cjstp.202405210538
Zhu H (2021) Home country bias in academic publishing: a case study of the New England Journal of Medicine. Learned Publ 34(4):578–584. https://doi.org/10.1002/leap.1404
Zhu Y, Yan E, Peroni S, Che C (2020) Nine million book items and eleven million citations: a study of book-based scholarly communication using OpenCitations. Scientometrics 122(2):1097–1112. https://doi.org/10.1007/s11192-019-03311-9
Zhu Y, Kim D, Yan E, Kim MC, Qi G (2021) Analyzing China’s research collaboration with the United States in high-impact and high-technology research. Quant Sci Stud 2(1):363–375. https://doi.org/10.1162/qss_a_00098
Author information
Authors and Affiliations
Contributions
YJ proposed the topic of this article, determined the content framework, and wrote the paper; MJ reviewed and revised the article.
Corresponding author
Ethics declarations
Competing interests
The authors declare no competing interests.
Ethical approval
This study does not involve human participants or their data.
Informed consent
This study does not involve human participants or their data.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Jiang, Y., Ma, J. Are Nature Index journals a valid basis for academic assessment: a study of academic impact and disruptive innovation assessment based on open bibliographic metadata and citation data. Humanit Soc Sci Commun 12, 1062 (2025). https://doi.org/10.1057/s41599-025-05387-6