Abstract
The responsible research assessment initiative aims to broaden the scope of what can be recognised in science evaluation. As a scientometric approach, the open educational resources (OER) statistics framework contributes to this mission by rewarding academic teaching as a performance class via OER. Specialised OER infrastructures can be considered data providers for OER statistics. The analysis of selected OER infrastructures shows that improvements in the provided metadata are needed in order to obtain comprehensive OER-related datasets that can be used to determine the OER statistics without significant additional effort. This applies to the completeness of databases, the use of persistent identifiers, and the references and attributions/citations of OER, but also to details such as information on quality, year of creation, version management, or granularity of OER.
1 Introduction
Science is a complex system that serves to gain new insights and to enhance human knowledge systematically, and it combines various performance areas and tasks to achieve this. Research, with its pursuit of knowledge, is indisputably at the core of science and represents a central area of activity. Inseparably linked to this is higher-education teaching, which is indispensable for training future scientists and thus for high-quality research. Like almost all performance areas, science is subject to continuous evaluation. The focus of science evaluation has so far been mainly on research, and here in particular on research publications as a medium for documenting research processes and disseminating scientific findings. By favouring research and research outputs, other key achievements of the scientific system, such as teaching, are not sufficiently taken into account in science evaluation. Researchers and institutions that are active in less recognised areas risk suffering disadvantages compared to researchers with a strong scientific publication record. This inequality between performance areas is addressed by the responsible research assessment movement (RRA; Owen, Macnaghten, & Stilgoe, 2012). Various initiatives such as the Coalition for Advancing Research Assessment (CoARA, 2022) and projects like the Open and Universal Science project research assessment framework (OPUS RAF; O’Neill, n.d.), the SCOPE Framework for Research Evaluation (SCOPE; INORMS, 2021) or the GraspOS Open Science Research Assessment Framework[1] aim to broaden the performance categories considered in science evaluation and to include activities that are not directly research-related, such as teaching or efforts to transfer research results into practice.
Science evaluation can be carried out in various ways. Both qualitative and quantitative methods can be applied. Scientometrics, alongside Informetrics, Bibliometrics, Webometrics, and Altmetrics, can be categorised as a quantitative method to examine structures and interrelationships between information objects. Scientometrics focuses specifically on scientific work results and thus on the core of science. It has established itself over decades as a method for science evaluation. Science can be understood as the inseparable combination of scientific research and teaching. In the context of current developments in the field of RRA, it makes sense to also recognise teaching as a scientific performance area and to understand it as an object of science evaluation using scientometric methods.
This study is part of a research programme that is based on this perspective. It continues the previous work on a scientometric-based framework for open educational resources (OER), the OER statistics framework. This framework conceives of OER as the teaching-related counterpart to scientific publications, which, under certain conditions, are suitable as a subject of measurement for science evaluation using scientometric methods. The preceding work consists of a preliminary study on the feasibility of scientometric indicators based on OER (Kullmann & Weimer, 2023), a detailed analysis of OER as a scientometric measurement object (Kullmann, 2025), a main study on the evaluation of the first version of the OER statistics framework as a theoretical concept by scientometric experts (Kullmann & Weimer, 2024) and a comprehensive presentation of the OER statistics framework in its revised second version as a result of the evaluation process (Weimer & Kullmann, 2025, in press). In addition, the programme includes the investigation of the specific problem of authorship and quality of OER as basic requirements for recognisable scientific achievements (Kullmann & Rasulzade, 2025). The present study serves to identify the requirements for data about OER, which are needed to create OER statistics based on the OER statistics framework. Furthermore, it serves to survey the current status of the availability of this data in specific OER infrastructures. It thus bridges the gap between the theoretical concept of the OER statistics framework and scientometric practice.
The OER statistics framework is a theoretical framework for the scientometric consideration of OER. It contributes to the visualisation and recognition of teaching achievements in science evaluation and complements previous efforts within the above-mentioned initiatives as well as, for example, ACUMEN (ACUMEN; European Commission, 2023) and OS-CAM (OS-CAM; European Commission, Directorate-General for Research and Innovation, Cabello Valdes, Rentier, & Kaunismaa, 2017). Within the OER statistics framework, a clear distinction between the individual and institutional level is made. At the individual level, the teaching services provided by an individual in the form of OER are analysed. The institutional level refers to the activities of larger organisational units such as departments, research institutes/chairs, or entire universities. The division into an individual and an institutional metric is largely due to the desire to counter some of the frequently discussed problems with the scientometric evaluation of individuals, as summarised in the Leiden Manifesto (Hicks, Wouters, Waltman, Rijcke, & Rafols, 2015). One consequence is the demand to involve the evaluated person at the individual level in the creation of the OER statistics. The institutional-level indicators have their roots in established scientometric evaluation procedures for institutions (DZHW, 2021). These have been adapted to the special characteristics of OER. During development, the question of current feasibility, which is significantly influenced by the availability of data, was deliberately not taken into account. In the following, a bridge to practice is built by examining what data are specifically needed for the OER statistics framework and whether they are already available in a suitable form.
To apply the OER statistics framework at both individual and institutional levels, a solid data basis is necessary. In recent years, many OER repositories have emerged worldwide (Marín & Villar-Onrubia, 2023; Santos-Hermosa, 2023). The aim of these OER infrastructures is to make educational artefacts publicly available for reuse by teachers and learners. The key feature is the largely unrestricted access to OER, which distinguishes them from university Learning Management Systems with access barriers (e.g. required user accounts). OER infrastructures typically hold many different types of open teaching/learning artefacts with different granularities, ranging from graphics, photos, or exercise sheets up to videos or whole courses. Open Textbook Libraries, with their specific focus on openly licensed books for higher education purposes, are a special form of OER infrastructure. Open Textbooks play an important role, particularly at universities in the United States, Canada and South Africa (Ashman, 2023; Farrow, Pitt, & Weller, 2020; Pitt, 2023; Stagg & Partridge, 2019). Due to the presentation of open teaching/learning artefacts via comprehensible metadata and their search function, OER infrastructures can basically be compared to bibliographic databases in the field of scientific literature. In their function as reference systems for OER, OER infrastructures can be considered as data providers for OER statistics. The common practice of indexing open educational artefacts with the help of extensive metadata supports this argument.
This study concentrates on the data which would be necessary to compile OER statistics on the institutional level and possible data sources. The following research questions are examined:
Which data are required to compile OER statistics on the institutional level?
To what extent can OER infrastructures be used as possible data sources to serve as a basis for OER statistics?
By answering these two research questions, the contribution of this study consists of a breakdown of the data requirements of the OER statistics framework. Furthermore, selected OER infrastructures are used to show the challenges currently faced in the area of data provision for OER in the context of scientometric analyses.
The theoretical background of OER statistics is explained in the Research background section. The methodology chosen to answer the research questions is described in the Methods section. The Data requirements for the OER statistics framework section addresses the question of the specific data requirements arising from the OER statistics indicators. This is followed by the section on the Suitability of OER infrastructures as data sources for the OER statistics framework, which presents the results of the analyses of selected OER infrastructures for the creation of OER statistics. The sections Discussion and Conclusion provide an overall assessment and summary of the work.
2 Research Background
2.1 OER Statistics Framework
The OER statistics framework is a tool for the scientometric consideration of OER (Kullmann & Weimer, 2024; Weimer & Kullmann, 2025, in press). It forms the basis of the present work; the data requirements discussed in the following are derived from it (see Section 4). The aim of the OER statistics framework is to recognise teaching performance in higher education by expanding the recognition and rewards system of science to include freely available and possibly openly licensed teaching/learning resources as special scientometric measurement objects. These are categorised as achievements worthy of recognition alongside other scientific outputs such as publications. Due to their special characteristics and their diversity, OER represent a very complex object of measurement. Thus, OER statistics are based on a specific understanding of OER, which is summarised in the following OER definition for scientometric purposes (Kullmann & Rasulzade, 2025; Kullmann, 2025):
OER are publicly available, freely accessible materials that have been created specifically for teaching/learning purposes and are of sufficient quality and level of personal contribution. OER are divided into the categories of Dedicated Learning Content for learning materials primarily intended for learners, and Learning Design Content for supporting materials for teachers. OER also include contributions to a supportive OER ecosystem that facilitates the creation, use and dissemination of OER. This includes, for example, infrastructure elements developed for open teaching/learning purposes, such as OER repositories, OER-supportive working environments, editorial work, consulting services, training and other support services for OER authors.
To create the broadest possible basis for the recognition of teaching achievements, the concept of openness is not only related to legal openness and far-reaching possibilities for subsequent use but is extended to include the possibility of free access, i.e. public availability of free-of-charge teaching/learning materials. In order to achieve a distinction from other scientific output, only artefacts that were created specifically for teaching/learning purposes or that contribute to the promotion of OER through various organisational measures within an OER ecosystem are eligible for OER statistics. Other important characteristics of OER in the context of the OER statistics framework are a sufficient level of personal contribution by OER authors and sufficient content-related and didactical quality. With regard to the consideration of performances, it is specifically relevant that, as part of the OER lifecycle (Figure 1), an author’s OER can be edited by other authors and published as a new version under their name without major effort worth rewarding. A sufficient quality of OER plays a crucial role especially for learners who rely on the provided open artefacts. Apart from that, from a scientometric view, only high-quality material is eligible for consideration, as is the case with scientific publications. Ensuring that these two entry requirements are met is no easy task. For this reason, before the OER statistics are compiled, it must be determined what a sufficient level of personal contribution and sufficient quality look like in concrete terms and how their existence is to be checked in each case. Due to the wealth of possible representations of OER (worksheets, slide sets, videos, courses, etc.), instead of considering every single type and format, the OER statistics framework classifies resources into three categories: Dedicated Learning Content, Learning Design Content, and OER ecosystem. The Dedicated Learning Content category comprises OER created specifically for teaching/learning purposes, which can be freely and ideally openly reused by other teachers or used by learners to acquire knowledge. The Learning Design Content category includes OER that support teachers in the creation of their own teaching/learning materials. The OER ecosystem category is used to present services for establishing an OER-friendly environment, which includes, for example, counselling services as well as the operation of OER infrastructures.

Figure 1: OER lifecycle (Glahn, Kalz, Gruber, & Specht, 2010).
The OER statistics framework consists of indicators at the level of individuals (individual level) and the institutional level for the assessment of organisational units like universities. Table 1 illustrates these two levels and their indicators.
OER statistics framework – indicators on individual and institutional level
| Individual level | Institutional level |
|---|---|
| Productivity indicators | |
| Total OER count | Total OER count |
| Number of “Dedicated Learning Content” | Number of “Dedicated Learning Content” |
| Number of “Learning Design Content” | Number of “Learning Design Content” |
| Year of first publication (academic age) | — |
| Number of active years of OER publication (NAY) | — |
| Number of OER per year (arithmetic average) | — |
| — | OER publication dynamics |
| — | Indexed growth rate (trend) of OER |
| — | Subject profile |
| Resonance indicators | |
| Total attributions (TA) | — |
| Number of attributed OER (NA-OER) | — |
| Proportion of attributed OER (PA-OER) | — |
| Number of attributions per OER (arithmetic average) (CA-OER) | — |
| Proportion of self-attributions in total attributions (%) | — |
| — | Mean normalised attribution score (MNAS) |
| — | Highly attributed OER |
| — | Unattributed OER |
| Cooperation indicators | |
| Number of contributing authors (NCA) | — |
| Number of OER publications as first author | — |
| Sole-authored OER (SA) | — |
| Co-authored OER (CA) | — |
| — | Sector cooperation |
| — | Most important co-publishing institutions |
| — | Most important co-publishing partner countries |
| Openness indicators | |
| Number of OER with a high degree of openness | Number of OER with a high degree of openness |
| Number of OER with a medium degree of openness | Number of OER with a medium degree of openness |
| Number of OER with a low degree of openness | Number of OER with a low degree of openness |
| Number of free of charge material | Number of free of charge material |
| Altmetrics indicators | |
| Downloads of OER | — |
| Views of OER | — |
| Bookmarks of OER | — |
| Social media conversations about OER | — |
| Number of blogposts | — |
| Number of mentions in syllabi | — |
| Transfer indicators | |
| Attribution in an OER to research material | Attribution in an OER to research material |
| Citation in a research material to OER | Citation in a research material to OER |
| OER material resulting from research projects | OER material resulting from research projects |
| Support indicators | |
| — | OER policy |
| OER certification | OER certifications |
| — | OER infrastructures |
| — | OER funding |
| — | OER services |
| — | OER community support |
There is partial congruence between the two levels, but there are also significant differences to consider. For example, OER are categorised into granularity classes at the institutional level but not at the individual level. This is because a more qualitative assessment of teaching/learning performance is sought at the individual level in order to counteract the typical criticisms of scientometric assessments of individuals (cf. Leiden Manifesto; Hicks et al., 2015). Thus, the person to be evaluated should be integrated into the process and the output should be evaluated individually. The indicators also differ between the levels in other areas, which is due to the different scientometric evaluation of individuals and institutions.
2.2 OER Infrastructures
Publication, findability, and accessibility are important prerequisites for the reception of OER in the educational field. To ensure this, specific OER infrastructures have emerged around the globe in recent years that can be regarded as a special case of Digital Knowledge Infrastructures. OER infrastructures include open courseware (OCW), thematic websites, Wikimedia platforms, and, in particular, open textbook libraries and OER repositories (Marín & Villar-Onrubia, 2023; Figure 2).

Figure 2: OER infrastructures as specialised Digital Knowledge Infrastructures (Marín & Villar-Onrubia, 2023).
OER infrastructures all serve to make OER discoverable and to disseminate them among interested reusers. Due to their aggregative nature, OER repositories are particularly noteworthy. As specialised infrastructures, in many cases, they provide OER from a range of different subject areas, in various representations (e.g. slides, exercises, and videos), created at various universities. With regard to the provided functions and services, besides enabling OER authors to publish their materials, they ensure that the resources are adequately described by appropriate metadata and often also provide quality assurance measures in terms of content and formal aspects. In addition, they enable a qualified search for OER and the provision of the OER themselves for download. Another important reason for placing OER infrastructures at the centre of investigations into their suitability for the OER statistics framework is that OER do not currently fall within the collection mandate of literature databases.
OER infrastructures are discussed in the literature from various perspectives. The desirable and actually provided functions (Heck et al., 2020; Hiebl, Kullmann, Heck, & Rittberger, 2023; Santos-Hermosa, Ferran, & Abadal, 2017) and quality aspects (Atenas & Havemann, 2013; Romero-Pelaez, Segarra-Faggioni, Piedra, & Tovar, 2019) are important topics. OER metadata also play a central role (Menzel, 2023; Segarra-Faggioni & Romero-Pelaez, 2022; Simão de Deus & Barbosa, 2020; Tavakoli, Elias, Kismihók, & Auer, 2020; Tischler, Heck, & Rittberger, 2022). The results of these metadata studies make different statements about the quality, comprehensiveness, and completeness of the metadata used, with different focuses, for example, on findability, didactic description, or quality. In the context of the scientometric analysis of OER, which is relevant for the present study, they can only be used to a very limited extent, since the OER metadata requirements posed by scientometric analyses are not discussed. Since large-scale OER infrastructures aggregate OER and can thus be compared in many ways to scientific literature databases or Open Access repositories, they represent a suitable starting point for investigating the current availability of data for creating OER statistics based on the OER statistics framework. For this reason, in the following, the data requirements from the OER statistics framework will first be operationalised and their feasibility will be checked in a second step using selected OER infrastructures. The focus is not on the general quality of OER metadata, but rather on identifying the data requirements of the OER statistics framework and the extent to which these are already provided by selected OER infrastructures.
3 Method
In the first step, the indicators at the institutional level of the OER statistics framework were examined for the data required to determine them. The data requirements resulting from this analysis are summarised in Table 4. In a further step, OER-related data sources are examined for their suitability to provide the required data for the OER statistics institutional level. As data sources, specialised OER infrastructures are considered. These serve as aggregating instances for the collection of a considerable amount of OER metadata and materials. The selection of appropriate OER infrastructures for this study was based on the following criteria:
The OER infrastructure provides materials for higher education teaching. The OER statistics framework is concerned with the recognition of OER for higher education. For this reason, only those OER infrastructures that provide OER for higher education are of interest for the analysis of the framework’s data requests.
The OER infrastructure has an interdisciplinary collection mandate. For the analysis of the data requirements of the OER statistics, aggregating infrastructures with different types of OER from different disciplines are of particular interest. These are suitable for scientometric analyses due to the large number of different data sets on OER.
The OER infrastructure collects materials from different institutions. For the analysis of the data requirements of the institutional level of the OER statistics framework, it is particularly important that the OER infrastructures examined aggregate data on OER from different institutions.
The OER infrastructure is mentioned in relevant scientific literature and listings of the OER community.[2]
The OER infrastructure is findable via a Google Search.
The analysis includes OER infrastructures from Germany, where there is a strong OER movement (Marín et al., 2022; Santos-Hermosa, 2023). In addition, infrastructures in the USA are considered, where OER in higher education also play an important role (Marín et al., 2022). Open textbooks as a special form of OER are taken into account against the background of their importance in countries with high university fees (Bethel, 2020; Fisher, 2018). In addition, open textbooks are highly granular OER and an established form of publication that can be expected to have complete reference records, which play a role in the scientometric OER statistics. The selected OER infrastructures have also been examined in other studies (Marín & Villar-Onrubia, 2023; Perifanou & Economides, 2022; Simão de Deus & Barbosa, 2020).
As a result, MERLOT and OER Commons were selected from the US-OER infrastructures that met the selection criteria. MERLOT offers a comprehensive collection of open teaching/learning resources for schools and higher education. In addition to its own collection of materials (MERLOT Collection), MERLOT allows users to search other digital libraries for OER and the web, all from a single interface. MERLOT has a sophisticated peer review process for many of the materials provided in the MERLOT collection. OER Commons provides OER for schools and higher education. The comprehensive and curated collection for higher education includes complete university courses, open textbooks, and various other open materials for teachers and students. For Germany, instead of single OER infrastructures, the metadata standards published by OER Repo AG, which are used in OER infrastructures operated at the federal state level like ZOERR,[3] were included in the evaluation. OER Repo AG[4] has developed and published comprehensive metadata recommendations. These include the metadata profiles LOM[5] for Higher Education OER Repositories[6] and the LRMI[7]-based Allgemeines Metadatenprofil für Bildungsressourcen (AMB),[8] which each contain central metadata for the formal and content-related description of OER, as well as the controlled vocabularies Hochschulfächersystematik[9] and Hochschulcampus Ressourcentypen.[10] The Open Textbook Library (OTL)[11] and Open Stax[12] were considered as prominent Open Textbook providers for the higher education sector. They cover a wide range of disciplines and topics with their mostly peer-reviewed books. Table 2 provides an overview of the selected OER infrastructures.
Selected OER infrastructures for the exemplary examination of current data availability for the OER statistics framework
| OER infrastructure | URL |
|---|---|
| MERLOT | https://merlot.org/merlot/ |
| OER Commons | https://oercommons.org/ |
| Open Textbook Library (OTL) | https://open.umn.edu/opentextbooks |
| Open Stax | https://openstax.org/ |
| LOM for Higher Education OER Repositories | https://dini-ag-kim.github.io/hs-oer-lom-profil/latest/ |
| AMB | https://dini-ag-kim.github.io/amb/draft/ |
In preparation for this study, information on the available metadata was collected through my own research and by contacting the operators. MERLOT made reference to the entry mask for adding new OER resources as the representation of metadata used.[13] For OER Commons, the publicly available metadata profile was examined.[14] The same applies to the German metadata profiles LOM for Higher Education OER Repositories and AMB. For OTL a MARC-based metadata schema was provided. Additionally, on the OTL webpage, the metadata of all OER presented by OTL was available for download.[15] Open Stax refers to the central GitHub repository.[16] In addition, the search masks and OER material pages were taken into account to get a comprehensive picture of the data that are gathered for the provided OER. It should be emphasised that the analysis does not focus on an audit of the individual infrastructures, but rather on fundamental strengths and weaknesses in the provision of data for the OER statistics by specialised OER infrastructures.
4 Data Requirements for the OER Statistics Framework
In the following, the first research question is addressed. For this purpose, the data needed to create the OER statistics framework will be derived and described. On the one hand, data requirements arise from the OER definition (Kullmann, 2025) on which the OER statistics framework is based. This definition considers the special features of OER in the context of scientometric analyses. On the other hand, the data requirements are derived from the indicators of the OER statistics framework. The focus here is on the institutional level, which is shown in detail in Table 1.
4.1 Data Requirements Derived from the OER Definition
The OER statistics are based on a specific understanding of OER, which results in entry requirements for every teaching/learning artefact to be considered scientometrically. Firstly, it must be checked whether it has been created specifically for teaching/learning purposes (requirement 001). A sufficient level of personal contribution by the authors involved in an OER (requirement 002) and a sufficient quality of the OER (requirement 003) must be ensured. Furthermore, information about the resource type of an OER (requirement 004) is important to enable the classification into one of the two categories Dedicated Learning Content or Learning Design Content. In addition, the artefact under consideration has to be either openly licensed or, as a minimum requirement, publicly available and accessible, and free of charge. With regard to the examination of the degree of openness, there is an overlap with the openness indicators, which is why the data requirements regarding this issue are discussed there (requirements 012, 013, 014, 015).
4.2 Data Requirements of the Productivity Indicators
The productivity indicators consist of the total OER count in the categories Dedicated Learning Content, which contains materials created for teaching/learning purposes, and the category Learning Design Content, which contains artefacts that support teachers in the creation of teaching materials or lessons. At the institutional level of the OER statistics, the Dedicated Learning Content is categorised according to four levels of granularity. In addition, the OER publication dynamics and the indexed growth rate are calculated, which make it possible to recognise trends regarding the publication of OER. The compilation of productivity indicators is rounded off by the determination of a subject profile.
For the total counts, the number of published OER by or in cooperation with an institution must be determinable. To make this possible, all OER under consideration must be explicitly assigned to a uniquely identifiable institution (requirement 005). The materials should be categorised in terms of granularity (requirement 006). The resource type, which is essential for the categorisation of Dedicated Learning Content or Learning Design Content, is also necessary. This is already required by the OER definition (requirement 004).
The publication dynamics at the institutional level can be illustrated by the average annual growth rate of OER in a given observation period. With $P_t$ denoting the number of OER published in year $t$, the publication dynamics can be calculated as follows:

$$\mathit{PD} = \left(\frac{P_{t_T}}{P_{t_0}}\right)^{\frac{1}{|T|}} - 1$$

The values $t_0$ and $t_T$ refer to the first and last years of the observation period. $|T|$ indicates the number of observation years. For the OER publication dynamics, information on the publication year of all OER under consideration is required (requirement 007). It should be noted that the OER statistics do not use fractional counting but work with full counting (DZHW, 2021).
In order to determine the indexed growth rate of OER, the publication figures determined for the first year under consideration are equated with 100. It is then determined for every subsequent year of interest how these compare to the first year respectively. Values above 100 are interpreted as growth, while values below 100 are interpreted as a decline. For the indexed growth rate, the publication years of the OER under consideration are crucial (requirement 007).
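As an illustration of these two calculations, the following minimal Python sketch computes the publication dynamics and the indexed growth rate from yearly full counts. The data, function names, and the compound-growth reading of the average annual growth rate are assumptions for illustration only.

```python
from typing import Dict

def publication_dynamics(counts: Dict[int, int]) -> float:
    """Average annual growth rate of OER output over the observation period.

    `counts` maps publication years to full-counted OER numbers. The
    compound-growth form used here is one plausible reading of the
    framework's average annual growth rate; t_0, t_T and |T| follow
    the definitions given in the text.
    """
    years = sorted(counts)
    t_0, t_T = years[0], years[-1]
    n = len(years)  # |T|, the number of observation years
    return (counts[t_T] / counts[t_0]) ** (1 / n) - 1

def indexed_growth_rate(counts: Dict[int, int]) -> Dict[int, float]:
    """Index each year's OER output against the first year (set to 100)."""
    years = sorted(counts)
    base = counts[years[0]]
    return {year: 100 * counts[year] / base for year in years}

counts = {2019: 12, 2020: 15, 2021: 21}
print(round(publication_dynamics(counts), 3))  # 0.205, i.e. roughly 20% growth per year
print(indexed_growth_rate(counts))             # {2019: 100.0, 2020: 125.0, 2021: 175.0}
```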
For the assignment of OER to a subject area, appropriate categorisation via suitable metadata is necessary (requirement 008).
4.3 Data Requirements of the Resonance Indicators
The resonance of OER is measured via attributions, which can be compared with citations in the research publication sector. At the institutional level, the indicators MNAS, highly attributed OER, and unattributed OER are relevant.
As a field-normalised rate, the MNAS is based on the average attribution rate in the scientific community under consideration. The calculation is based on the following formula, analogous to the MNCS method for scientific publications (DZHW, 2021):

$$\mathrm{MNAS} = \frac{1}{P_o} \sum_{i=1}^{P_o} \frac{a_i}{e_i}$$

where $P_o$ represents the number of OER of the institution $o$ in a selected period, $a_i$ the attributions of the $i$th OER in this timeframe, and $e_i$ the expected number of attributions, i.e. the average number of attributions of all OER published in the same subject area and year.
Highly attributed OER indicates how many OER of an institution belong to the most frequently attributed 10% of OER of a subject area in a defined observation period. To determine this, the OER must be assigned to an institution (requirement 005) and subject area (requirement 008). It is also necessary to know the respective publication year (requirement 007). Furthermore, the current attributions for each OER of a subject area must be available for the corresponding observation period (requirement 009). The same data are necessary to determine the unattributed OER.
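A minimal Python sketch of how these resonance indicators could be computed from such data is given below. The record structure and field names are assumptions, and the reference populations per subject area and year would have to be supplied by the data source.

```python
from statistics import mean
from typing import Dict, List, Tuple

# reference_sets[(subject, year)] holds the attribution counts of all OER
# in that subject area and publication year (the reference population).
ReferenceSets = Dict[Tuple[str, int], List[int]]

def mnas(institution_oer: List[dict], reference_sets: ReferenceSets) -> float:
    """Mean normalised attribution score, analogous to the MNCS."""
    ratios = []
    for oer in institution_oer:
        expected = mean(reference_sets[(oer["subject"], oer["year"])])
        ratios.append(oer["attributions"] / expected)  # assumes expected > 0
    return mean(ratios)

def highly_attributed(institution_oer: List[dict], reference_sets: ReferenceSets) -> List[dict]:
    """OER of the institution among the 10% most attributed of their field and year."""
    selected = []
    for oer in institution_oer:
        field = sorted(reference_sets[(oer["subject"], oer["year"])], reverse=True)
        top_decile = max(1, len(field) // 10)  # size of the top 10% group
        threshold = field[top_decile - 1]      # lowest count still in the top 10%
        if oer["attributions"] >= threshold:
            selected.append(oer)
    return selected
```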
4.4 Data Requirements of the Cooperation Indicators
The OER statistics measure cooperation between different sectors (sector cooperation) as well as institutions and countries (most important co-publishing institutions/partner countries). First of all, the allocation of the OER to an institution is fundamentally important (requirement 005). In addition, the affiliation of an institution to a specific sector is crucial. Besides universities, cooperation partners can, for example, include non-university research institutes, and partners from industry or the public sector (administration, government). Information on sector (requirement 010) and country affiliation (requirement 011) are therefore necessary for the cooperation indicators at the level of the institutions involved in the creation of OER.
4.5 Data Requirements of the Openness Indicators
The openness indicators for OER provide support at the institutional level in visualising the openness of teaching services to the outside world at a specific point in time as well as over time. The measurement is based on open licences and their allocation to the categories outlined in Table 3. The allocation to the first three categories is based on the licence information provided for an OER under consideration (requirement 012). The OER statistics focus on Creative Commons (CC) licences, which are very common in the OER sector. If OER are provided under other licences, a mapping between the licence systems is required.
Openness categories (Kullmann, 2025)
| Category | Openness | OER statistics | Licensing |
|---|---|---|---|
| 1 | High | Number of OER with a high degree of openness | CC-0, CC-BY, CC-BY-SA or equivalent |
| 2 | Medium | Number of OER with a medium degree of openness | CC-BY-NC, CC-BY-NC-SA or equivalent |
| 3 | Low | Number of OER with a low degree of openness | CC-BY-ND, CC-BY-NC-ND or equivalent |
| 4 | Free of charge | Number of free of charge material | Full copyright protection, but free of charge |
While the classification of openly licensed OER is easy if the respective licence is known, category (4) requires explicit information on the mode of usability. Here, it must be ensured that the OER is actually freely available, freely accessible, and usable free of charge. Metadata that provide information on availability, accessibility, and cost (requirements 013, 014, 015) are therefore required for materials with this level of openness.
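The mapping from licence information to the openness categories of Table 3 can be illustrated by the following minimal Python sketch. The licence-string normalisation is simplified, and non-CC licences would first have to be mapped to their CC equivalents.

```python
from typing import Optional

# Mapping of CC licences to the openness categories of Table 3; the key
# strings are illustrative and assume already-normalised licence labels.
OPENNESS_CATEGORY = {
    "CC-0": 1, "CC-BY": 1, "CC-BY-SA": 1,  # category 1: high
    "CC-BY-NC": 2, "CC-BY-NC-SA": 2,       # category 2: medium
    "CC-BY-ND": 3, "CC-BY-NC-ND": 3,       # category 3: low
}

def openness_category(licence: Optional[str], free_of_charge: bool = False) -> Optional[int]:
    """Return openness category 1-3 for a known CC licence, 4 for merely
    free-of-charge material, or None if the artefact does not qualify."""
    if licence is not None:
        key = licence.strip().upper()
        if key in OPENNESS_CATEGORY:
            return OPENNESS_CATEGORY[key]
    return 4 if free_of_charge else None

print(openness_category("CC-BY-SA"))                 # 1
print(openness_category(None, free_of_charge=True))  # 4
```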
4.6 Data Requirements of the Transfer Indicators
The transfer indicators attribution in an OER to research material and citation in a research material to OER can be understood as an extension of the resonance indicators due to their reference to attributions and citations. In the first case, attributions of OER in a research output are recorded. In the second case, referencing of research output in OER artefacts is depicted. Linking output from the areas of research and teaching via attributions or citations requires a clear assignment of artefacts to one of the two areas; this can be solved by a separate document type for OER (requirement 016). With regard to the indicator OER material resulting from research projects, a direct and clearly documented provenience of OER is crucial (requirement 017). In addition, the publication years (requirement 007), subject areas (requirement 008), as well as attributions/references per OER and year (requirement 009) are required.
4.7 Data Requirements of the Support Indicators
The support indicators aim to present activities and services for the creation and design of an OER-promoting ecosystem. Thus, the support indicators do not focus on OER material but on activities and initiatives of institutions that support OER and their typical lifecycle as outlined in Figure 1. This means, for example, providing administrative and consulting services as well as technical infrastructures that help OER authors to create, publish, find and access, reuse in unadapted or enriched form, and republish OER artefacts. Accordingly, the support indicators deal with OER policies (requirement 018), OER-related certifications (requirement 019), the development and operation of OER infrastructures (requirement 020), funding in the context of OER (requirement 021), the provision of support services to strengthen OER authors (requirement 022), or measures to educate, promote and support OER communities (e.g. legal advice, provision of human or technical resources to create OER; requirement 023). In the OER statistics themselves, the existence of these measures is recorded dichotomously (“available” or “not available”). One exception is the OER funding indicator, where the amount of funding granted can be specified as a total.
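As an illustration of this dichotomous recording, the support indicators of an institution could be captured in a simple profile such as the following sketch; all field names and values are hypothetical.

```python
# Hypothetical per-institution support profile for requirements 018-023;
# only the funding indicator carries a non-dichotomous value.
support_profile = {
    "institution": "Example University",
    "oer_policy": True,                # requirement 018: policy document exists
    "oer_certifications": False,       # requirement 019
    "oer_infrastructures": True,       # requirement 020
    "oer_funding_total_eur": 250_000,  # requirement 021: recorded as a total
    "oer_services": True,              # requirement 022
    "oer_community_support": False,    # requirement 023
}
```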
4.8 General Data Requirements
In addition to the requirements needed specifically for the determination of the indicators on the institutional level, general requirements have to be fulfilled. It is very important that OER can be assigned to an institution (requirement 005). At a minimum, the individual OER should be uniquely identifiable. Thus, at least a title (requirement 024) and a persistent identifier such as the Digital Object Identifier (DOI; requirement 025) are crucial.
Table 4 summarises all identified data requirements.
Data requirements derived from the OER statistics framework (Kullmann & Weimer, 2024)
ID | Requirement | Derived from OER statistics framework, institutional level (Kullmann & Weimer, 2024) |
---|---|---|
001 | Teaching/learning as the purpose of creation | OER definition |
002 | Sufficient level of personal contribution by OER authors | OER definition |
003 | Sufficient level of quality | OER definition |
004 | Resource type | OER definition, productivity indicators, resonance indicators, transfer indicators |
005 | Institutional affiliation | Productivity indicators |
006 | Granularity of OER | Productivity indicators |
007 | Publication year | Productivity indicators, resonance indicators, transfer indicators |
008 | Subject area | Productivity indicators, resonance indicators, transfer indicators |
009 | Attributions/references per OER and year | Resonance indicators, transfer indicators |
010 | Sector affiliation of institution | Cooperation indicators |
011 | Country affiliation of institution | Cooperation indicators |
012 | Licence information | OER definition, openness indicators |
013 | Free availability | OER definition, openness indicators |
014 | Free accessibility | OER definition, openness indicators |
015 | Free of charge | OER definition, openness indicators |
016 | OER as document type in bibliographic databases | Transfer indicators |
017 | Provenience of OER | Transfer indicators |
018 | OER policy document | Support indicators |
019 | OER certifications | Support indicators |
020 | Development and operation of OER infrastructures | Support indicators |
021 | Information on OER fundings | Support indicators |
022 | Information on OER related service offers | Support indicators |
023 | Information on OER community support | Support indicators |
024 | OER title | General requirement |
025 | OER persistent identifier | General requirement |
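To make this requirement list concrete, the following minimal Python sketch shows what a harmonised OER record covering the machine-checkable requirements of Table 4 might look like, together with a simple completeness check. All field names are hypothetical and not taken from any existing metadata profile.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class OERRecord:
    """Hypothetical harmonised metadata record covering the machine-checkable
    requirements of Table 4; all field names are illustrative."""
    title: str                            # requirement 024
    pid: Optional[str]                    # requirement 025, e.g. a DOI
    resource_type: str                    # requirement 004
    granularity: Optional[str] = None     # requirement 006
    creation_year: Optional[int] = None   # requirement 007
    subject_area: Optional[str] = None    # requirement 008
    institutions: List[dict] = field(default_factory=list)  # requirements 005/010/011
    licence: Optional[str] = None         # requirement 012
    free_of_charge: bool = False          # requirements 013-015, simplified to one flag
    attributions: List[str] = field(default_factory=list)   # requirement 009

def missing_requirements(record: OERRecord) -> List[str]:
    """Return the requirement IDs that a given record cannot serve yet."""
    gaps = []
    if record.pid is None:
        gaps.append("025")
    if record.granularity is None:
        gaps.append("006")
    if record.creation_year is None:
        gaps.append("007")
    if record.subject_area is None:
        gaps.append("008")
    if not record.institutions:
        gaps.append("005")
    return gaps
```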
5 Suitability of OER Infrastructures as Data Sources for the OER Statistics Framework
In the following, selected OER infrastructures will be examined to determine the extent to which they can be used as data providers for OER statistics. This addresses the second research question.
5.1 Requirement 001: Teaching/Learning as the Purpose of Creation
All OER infrastructures analysed clearly state their collection mandate for OER created for teaching/learning purposes. In the case of MERLOT, the vocabulary comprises the value “Open Access Journal Article,” which reflects a broader understanding of OER that includes openly licensed scientific publications. The explicit labelling of OA articles in the MERLOT collection makes it easy to sort them out and only include dedicated teaching and learning material. In summary, due to the well-described collection mandates, the requirement that the artefact under consideration must be created specifically for teaching/learning purposes can be fulfilled by every OER infrastructure analysed.
5.2 Requirement 002: Sufficient Level of Personal Contribution by OER Authors
Requirement 002 is very complex, as the first step would be to determine the criteria for a sufficient level of personal contribution. Another prerequisite for scrutinising the personal contribution of reusing OER authors is the identification of connections between OER versions. The direct mapping of a relationship between different OER versions via metadata modelling is not strictly necessary to find the source version of an OER derivative, but it is very helpful. OER versioning already plays a role in some of the OER infrastructures analysed. OER Commons connects versions via relationships (CR_isAdaptationOf, CR_hasAdaption). In addition, it is possible to indicate via the CR_Parent_Modified element if an OER from another author has been changed. In the LOM for Higher Education OER Repositories metadata profile, versions of OER can be identified via the Lifecycle element. The AMB metadata profile uses the relationships isBasedOn, isPartOf, and hasPart to provide a reference to another resource under consideration. The establishment of relationships between OER that are provided in different OER infrastructures is not realised anywhere yet. With regard to the sufficient level of personal contribution of OER authors, no content-related information is provided to answer the question of whether the author of the new version has made a sufficient personal contribution to enrich the previous version. Requirement 002 can therefore not yet be fulfilled by any of the OER infrastructures analysed.
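To illustrate how such version relationships could support these checks, the following minimal Python sketch traces the lineage of an OER via AMB-style isBasedOn links. The records and identifiers are invented for illustration, and resolving the referenced identifiers across infrastructures is exactly what is still missing in practice.

```python
from typing import Dict, List, Optional

# Hypothetical records keyed by persistent identifier; `isBasedOn` follows
# the AMB relationship of the same name. The DOIs are invented examples.
records: Dict[str, dict] = {
    "doi:10.1234/oer.v2": {"name": "Statistics course, revised", "isBasedOn": "doi:10.1234/oer.v1"},
    "doi:10.1234/oer.v1": {"name": "Statistics course", "isBasedOn": None},
}

def lineage(pid: Optional[str]) -> List[str]:
    """Follow `isBasedOn` links back to the original version of an OER."""
    chain: List[str] = []
    while pid is not None and pid in records:
        chain.append(pid)
        pid = records[pid]["isBasedOn"]
    return chain

print(lineage("doi:10.1234/oer.v2"))  # ['doi:10.1234/oer.v2', 'doi:10.1234/oer.v1']
```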
5.3 Requirement 003: Sufficient Level of Quality
The documentation of the quality of the provided OER plays a particularly important role in the US OER infrastructures. MERLOT has various options for assessing the quality of OER. These include formal peer and editor reviews, as well as user ratings and the option for users to leave comments. Materials with quality ratings can be filtered using the search mask. However, the documentation of quality assessment results is not an overarching mandatory requirement. OER Commons takes a user-based approach to quality assessment. On the one hand, there are artefacts that have been evaluated by one or more users according to the Educators Evaluating the Quality of Instructional Products (EQuIP) or AchieveOER criteria. On the other hand, users can rate the OER they use by stars. It is also possible for users to leave comments. However, the quality assessment is not yet documented within the metadata profile. The German metadata profiles LOM for Higher Education OER Repositories and AMB as well as those of OTL and Open Stax do not provide quality characteristics of OER. However, for most of the books in OTL and all books in Open Stax, it is stated that they have been formally peer-reviewed by educators in the field. Overall, the metadata schemes do not record formal quality assessment results on individual materials. However, there are often processes that serve the formal quality assurance of materials, which are documented alongside the metadata profiles.
5.4 Requirement 004: Resource Type
With regard to the resource type of the OER provided, all the OER infrastructures analysed have suitable vocabularies for categorisation. Although these vocabularies overlap, they are not identical. The metadata profiles LOM for Higher Education OER Repositories and AMB use the same terminology. In the case of the Open Textbook Libraries OTL and Open Stax, a resource type is not explicitly provided but could be added easily as additional metadata. Overall, requirement 004 is fulfilled by all OER infrastructures analysed.
5.5 Requirement 005: Institutional Affiliation
A fundamental requirement is the documentation of a relationship between an OER and the issuing institution. This can be done directly or by assigning the authors to an institution. Of great importance here is the unique identifiability of the institution, which can be implemented via unique identifiers like those of the Research Organisation Registry (ROR). Basic institutional allocation is provided by all OER infrastructures analysed. In the case of MERLOT, it is possible to identify the institution via the author. However, it should be noted that it is not mandatory for institutions to specify an author and fill in the field. Furthermore, MERLOT does not work with persistent identifiers for authors and institutions. OER Commons requires the mandatory entry of an institution (CR_Provider field) but also refrains from using unique identifiers. The German metadata profile LOM for Higher Education OER Repositories demands the documentation of the relationship between contributors (authors, other contributors, and institutions). The use of unique identifiers is also provided for in principle: a URI reference to the Integrated Authority File (GND), Wikidata, or a ROR ID is recommended for use. The mandatory documentation of institutional affiliation in the metadata is also planned for AMB, and the use of unique identifiers analogous to the LOM-based metadata profile is likewise recommended. For OTL, the books must be in use at multiple higher education institutions, or affiliated with a higher education institution, scholarly society, or professional organisation. This optional requirement and the resulting relationship of a book to an affiliated institution are only partially reflected in the metadata. In the case of Open Stax, the associated institution is usually specified in addition to the author. However, unique identifiers are not used. Overall, information on affiliated institutions is largely available, but from a scientometric point of view, there is clear potential for improvement with regard to the provision of independent metadata and the use of persistent identifiers.
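As a sketch of what such identification could look like in practice, the following Python dictionary models an author statement with a persistently identified affiliation, loosely following the schema.org conventions on which AMB builds. The field names are illustrative, and the ROR ID shown (ETH Zürich) merely serves as an example of a real persistent institutional identifier.

```python
# Illustrative author statement with a persistently identified affiliation;
# the structure loosely follows schema.org conventions and is not taken
# from any specific metadata profile.
creator = {
    "type": "Person",
    "name": "Jane Doe",
    "affiliation": {
        "type": "Organization",
        "name": "ETH Zürich",
        "id": "https://ror.org/05a28rw58",  # ROR ID uniquely identifying the institution
    },
}
```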
5.6 Requirement 006: Granularity of OER
Information on granularity can currently only be found in the metadata profile LOM for Higher Education OER Repositories, where this is explicitly mapped via the aggregation-level element. However, by specifying the resource type, it is possible to categorise the granularity using the available metadata in the case of OER Commons and MERLOT. This also applies to OTL and Open Stax, as they only offer open textbooks and the granularity can be derived from the book form. Overall, not all OER infrastructures provide a granularity categorisation. However, due to the information already available on the individual OER, such a categorisation is basically possible.
5.7 Requirement 007: Publication Year
The publication years of the OER under consideration are required to calculate the productivity indicators. The creation year of an OER must be distinguished from the publication date in a repository, which is naturally more recent. In the case of MERLOT, the publication date in the repository is specified instead of the creation year. OER Commons allows the creation year of an artefact to be documented but does not make this mandatory. As with MERLOT, the publication date in OER Commons can be found on the individual pages of an artefact. The metadata profile LOM for Higher Education OER Repositories contains an option for specifying OER-related dates within the contribute element in the metadata section. The contribute element can be used to specify persons with different roles in relation to the metadata for the respective OER. In this context, it has been specified that, for the creator role, the date contains the date of the last change of the OER, which represents the creation date. For the validator role, on the other hand, the publication date of the OER in a repository is documented. The AMB metadata profile provides the creation date (dateCreated), modification date (dateModified), and publication date (datePublished); this information is not mandatory. The Open Textbook Library records the publication year of a book. In the case of Open Stax, the publication date and the date of the last update of the version of a book published on the Internet are given. Overall, not all OER infrastructures provide the original creation date within the applied metadata profile yet.
5.8 Requirement 008: Subject Area
For the productivity indicators, it is also necessary to specify the subject area to which the OER under consideration belongs. The picture here is uniformly positive. In all OER infrastructures analysed, the OER is assigned to specific subject areas. It should be noted that the German metadata profile based on LOM and the German AMB both use the subject classification system for students at higher education institutions used in Germany by the Federal Statistical Office.[17] However, overall, the vocabularies used are different.
5.9 Requirement 009: Attributions/References per OER and Year
The attributions of OER in other OER and/or scientific publications represent important information that is comparable to citations in scientific publications. Mapping the relationships between attributing and attributed OER is essential for determining the resonance indicators. Access to the references of the OER under consideration is crucial in order to be able to analyse these relationships. References in scientific publications can be evaluated via citation databases such as Web of Science, Scopus, or Open Citations for the majority of artefacts. However, this is not the case for references in OER. In none of the OER infrastructures analysed are the references stored separately and thus made evaluable.
5.10 Requirements 010 and 011: Sector and Country Affiliation of Institutions
With regard to the cooperation indicators, the sector and country affiliation of an institution are necessary. None of the OER infrastructures analysed currently provide explicitly corresponding data for the inclusion of this information.
5.11 Requirements 012, 013, 014 and 015: Information on Openness
The OER statistics not only consider openly licensed teaching/learning artefacts but also OER that are freely accessible, freely available, and free of charge. The recording of the degree of openness of artefacts via metadata varies between the different OER infrastructures. MERLOT records whether a material is licensed with a CC licence of the CC-0 type or another CC licence with or without permission to use the material commercially. If a material has a different licence, it is recorded whether it can be used free of charge. In OER Commons, the individual CC licences are stored in detail in the metadata of an artefact. Other possible values are ‘Public Domain’ and information on more restricted usage rights in the categories ‘Read the fine Print’ and ‘Copyright Restricted’. Information on the free usability of materials without an open licence is not recorded. In the LOM for Higher Education OER Repositories and AMB metadata profiles, detailed legal information on the usability of an artefact is provided. With AMB, additional information on free access and free usability is provided. In general, the two Open Textbook Libraries only provide for the collection of openly licensed books in their guidelines. In the OTL metadata record, the information on the copyright status is explicitly documented. In the case of Open Stax, the licence information is stored in the GitHub repository with every single textbook. Overall, the information for requirement 012 is available in all infrastructures. However, information on requirements 013 to 015 is not available everywhere.
5.12 Requirement 016: OER as Document Type in Bibliographic Databases
The identification of OER in reference lists of publications or other scientific output is not yet possible without problems, as the systematic recording of OER as an artefact/document type in citation databases is not yet comprehensive. To determine the resonance indicators of the OER statistics, separate documentation of the attributions/references of OER has to be made. In addition, the document type OER has to be introduced in common citation databases.
5.13 Requirement 017: Provenience of OER
The provenience of the OER provided at the level of smaller units (e.g. projects) within institutions is not currently documented by any of the OER infrastructures analysed.
5.14 Requirements 018–023: OER Ecosystem
The information required to determine the support indicators is currently not mapped by any OER infrastructure.
5.15 Requirements 024–025: OER Identification
A title is always provided (requirement 024). The assignment of persistent identifiers is explicitly provided for in the metadata profiles of OER Commons (CR_Native_ID) as well as LOM for Higher Education OER Repositories and AMB. In particular, the identification systems DOI, Handle, International Standard Book Number (ISBN), and Uniform Resource Name (URN) are proposed for use. In the case of OTL and Open Stax, the books have ISBNs. Thus, requirement 025 can already be met by the majority of the OER infrastructures analysed.
Table 5 breaks down the requirements and current data availability by indicator.
Table 5: Data availability in selected OER infrastructures for the indicators of the OER statistics framework (institutional level)

| OER statistics framework, institutional level (Kullmann & Weimer, 2024) | Derived requirements | Data availability |
|---|---|---|
| Productivity indicators | | |
| Number of “Dedicated Learning Content” | 004 resource type | Yes |
| | 005 institutional affiliation | Partly |
| | 006 granularity | Partly |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Number of “Learning Design Content” | 004 resource type | Yes |
| | 005 institutional affiliation | Partly |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| OER publication dynamics | 005 institutional affiliation | Partly |
| | 007 creation year | Partly |
| | 008 subject area | Yes |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Indexed growth rate (trend) of OER | 005 institutional affiliation | Partly |
| | 007 creation year | Partly |
| | 008 subject area | Yes |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Subject profile | 005 institutional affiliation | Partly |
| | 007 creation year | Partly |
| | 008 subject area | Yes |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Resonance indicators | | |
| MNAS | 005 institutional affiliation | Partly |
| | 007 creation year | Partly |
| | 008 subject area | Yes |
| | 009 attributions/references | No |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Highly attributed OER publications | 005 institutional affiliation | Partly |
| | 007 creation year | Partly |
| | 008 subject area | Yes |
| | 009 attributions/references | No |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Unattributed OER publications | 005 institutional affiliation | Partly |
| | 007 creation year | Partly |
| | 008 subject area | Yes |
| | 009 attributions/references | No |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Cooperation indicators | | |
| Sector cooperation | 005 institutional affiliation | Partly |
| | 010 sector affiliation | No |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Most important co-publishing institutions | 005 institutional affiliation | Partly |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Most important co-publishing partner countries | 005 institutional affiliation | Partly |
| | 011 country affiliation | No |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Openness indicators | | |
| Number of OER with a high degree of openness | 012 license information | Yes |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Number of OER with a medium degree of openness | 012 license information | Yes |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Number of OER with a low degree of openness | 012 license information | Yes |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Number of free of charge material | 013 free availability | Partly |
| | 014 free accessibility | Partly |
| | 015 free of charge | Partly |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Transfer indicators | | |
| Attribution in an OER to research material | 004 resource type | Yes |
| | 005 institutional affiliation | Partly |
| | 007 creation year | Partly |
| | 008 subject area | Yes |
| | 009 attributions/references | No |
| | 016 OER as document type | No |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Citation in a research material to OER | 004 resource type | Yes |
| | 005 institutional affiliation | Partly |
| | 007 creation year | Partly |
| | 008 subject area | Yes |
| | 009 attributions/references | No |
| | 016 OER as document type | No |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| OER material resulting from research projects | 005 institutional affiliation | Partly |
| | 016 OER as document type | No |
| | 017 provenience of OER | No |
| | 024 OER title | Yes |
| | 025 persistent identifier | Partly |
| Support indicators | | |
| OER policy | 005 institutional affiliation | Partly |
| | 018 OER policy document | No |
| OER certifications | 005 institutional affiliation | Partly |
| | 019 OER certifications | No |
| OER infrastructures | 005 institutional affiliation | Partly |
| | 020 OER infrastructures | No |
| OER funding | 005 institutional affiliation | Partly |
| | 021 OER fundings | No |
| OER services | 005 institutional affiliation | Partly |
| | 022 OER related services | No |
| OER community support | 005 institutional affiliation | Partly |
| | 023 OER community support | No |
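To illustrate how such metadata would feed into the indicators in Table 5, the following minimal Python sketch computes two productivity indicators from hypothetical records. The field names and values (resource_type, institution, creation_year) are illustrative assumptions, not the schema of any analysed infrastructure.

```python
# Minimal sketch: deriving two productivity indicators from hypothetical
# OER metadata records. Field names are illustrative, not an actual schema.
from collections import Counter

records = [
    {"resource_type": "Dedicated Learning Content", "institution": "Univ A", "creation_year": 2021},
    {"resource_type": "Dedicated Learning Content", "institution": "Univ A", "creation_year": 2022},
    {"resource_type": "Learning Design Content",    "institution": "Univ A", "creation_year": 2022},
    {"resource_type": "Dedicated Learning Content", "institution": "Univ B", "creation_year": 2022},
]

# Number of "Dedicated Learning Content" per institution (requirements 004, 005).
dlc_per_institution = Counter(
    r["institution"] for r in records
    if r["resource_type"] == "Dedicated Learning Content"
)

# Indexed growth rate: publications per year relative to a base year = 100.
# This depends on reliable creation-year metadata (requirement 007).
per_year = Counter(r["creation_year"] for r in records)
base = min(per_year)
indexed = {year: 100 * per_year[year] / per_year[base] for year in sorted(per_year)}

print(dlc_per_institution)  # Counter({'Univ A': 2, 'Univ B': 1})
print(indexed)              # {2021: 100.0, 2022: 300.0}
```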
6 Discussion
The OER statistics framework is a theoretical framework that was developed without taking into account the actual availability of the data required for the addressed indicators. In this work, the data necessary for OER statistics were derived from the underlying OER statistics framework, and the general suitability of OER infrastructures to cover these data requirements was analysed. The analysis has shown that part of the data required to determine the OER statistics is already available. Requirements 001 (teaching/learning as the purpose of creation), 004 (resource type), 008 (subject area) and 012 (licence information) can be considered fulfilled for all OER infrastructures analysed. Requirement 001 can be derived from the collection strategies of the OER infrastructures, so no explicit metadata are necessary in this regard. For requirements 004 and 008, it should be noted that the vocabularies differ between the OER infrastructures analysed, which makes it necessary to map the values to a common vocabulary when collecting data from different OER infrastructures to determine the OER statistics. With regard to requirement 012, sufficient information on the openness of resources is generally available. However, not every OER infrastructure specifies the individual licences of the OER provided, which makes a precise assignment to the openness categories of the OER statistics (Table 3) difficult. With regard to requirement 024, a title as a basic requirement for the identification of an OER is generally available.
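As a minimal sketch of the vocabulary mapping step mentioned above, the following Python fragment normalises resource-type values from two hypothetical infrastructures to one target vocabulary before counting. All source vocabulary values here are invented examples; the actual vocabularies of the analysed infrastructures differ.

```python
# Minimal sketch: normalising source-specific resource types to the
# OER statistics vocabulary. Source names and values are invented.
TYPE_MAPPING = {
    "infrastructure_a": {"course": "Dedicated Learning Content",
                         "syllabus": "Learning Design Content"},
    "infrastructure_b": {"lecture_slides": "Dedicated Learning Content",
                         "curriculum_plan": "Learning Design Content"},
}

def normalise_type(source: str, raw_value: str) -> str | None:
    """Map a source-specific resource type to the target vocabulary;
    None signals an unmapped value that needs manual review."""
    return TYPE_MAPPING.get(source, {}).get(raw_value)

assert normalise_type("infrastructure_a", "course") == "Dedicated Learning Content"
assert normalise_type("infrastructure_b", "unknown_type") is None
```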
For another part of the requirements, the initial situation can be described as positive but not yet sufficient. These partially fulfilled requirements include 003 (sufficient quality), 005 (institutional affiliation), 006 (granularity), 007 (creation year), 013 (free availability), 014 (free accessibility), 015 (free of charge) as well as 025 (persistent identifier). Here, the metadata of all resources need to be supplemented or improved in order to fulfil the requirements of the OER statistics and make the data easy to use. This could be achieved by harmonising and enhancing the metadata profiles and by a more comprehensive use of mandatory metadata. In this context, the mandatory use of persistent identifiers such as DOIs would be of great value, as it would allow each OER to be uniquely identified. Persistent identifiers would also provide a solid basis for mapping relationships between different versions of an OER, which is of great importance for determining a sufficient level of personal contribution as one central quality aspect.
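A completeness check addressing these partially fulfilled requirements could look as follows. The field names, the set of mandatory fields, and the DOI pattern are simplifying assumptions for illustration, not a prescribed profile.

```python
# Minimal sketch: testing a record for mandatory fields and a
# syntactically valid DOI. Field names are illustrative assumptions.
import re

MANDATORY_FIELDS = ["title", "institution", "creation_year", "granularity"]
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$")

def report_gaps(record: dict) -> list[str]:
    """Return the names of missing mandatory fields, plus 'doi' if the
    record has no syntactically valid persistent identifier."""
    gaps = [f for f in MANDATORY_FIELDS if not record.get(f)]
    if not DOI_PATTERN.match(record.get("doi", "")):
        gaps.append("doi")
    return gaps

record = {"title": "Intro to Statistics", "institution": "Univ A",
          "creation_year": 2023, "doi": "10.1234/example-oer"}
print(report_gaps(record))  # ['granularity'] -- granularity metadata missing
```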
There are also requirements that have not yet been met by any of the OER infrastructures analysed. On the one hand, these are the very specific requirements resulting from the OER definition underlying the OER statistics. This first group includes requirement 002 (sufficient level of personal contribution by OER authors), which results from the OER lifecycle and is particularly relevant for the scientometric recording of OER. A check as to whether an OER is a version of another OER, and whether a sufficient personal contribution by the author of this version is recognisable after the revision, is not yet supported in any of the cases analysed. The requirements derived from the support indicators (requirements 018–023) should also be mentioned here. OER infrastructures focus on the provision of materials; the data relevant to the support indicators are strongly institution-related and not part of the collection mandate of OER infrastructures. Nevertheless, the introduction of institutional profiles in the OER infrastructures, providing the relevant information for OER-supplying institutions, could be considered in the medium term. On the other hand, there are requirements resulting from the resonance and transfer indicators, for which references must be analysed. Essentially, there is a lack of explicit and analysable information on the references of OER (requirement 009), on the classification of OER as a document type (requirement 016), and on the provenance of OER below the institutional level (requirement 017). Furthermore, information on the sector and country affiliation of the institutions supplying OER (requirements 010 and 011) is missing.
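How persistent identifiers combined with an explicit version relation could support the check required for requirement 002 is sketched below. The is_version_of field is an assumed relation for illustration; no such metadata exist in the analysed infrastructures yet.

```python
# Minimal sketch: a PID-based version relation flags derived OER so that
# the revision can be inspected for a sufficient personal contribution.
# The "is_version_of" field is a hypothetical metadata element.
versions = {
    "10.1234/oer.v1": {"is_version_of": None, "author": "A"},
    "10.1234/oer.v2": {"is_version_of": "10.1234/oer.v1", "author": "B"},
}

def needs_contribution_check(pid: str) -> bool:
    """Originals count directly; a derived OER would only count once a
    recognisable personal contribution in the new version is confirmed."""
    return versions[pid]["is_version_of"] is not None

print([pid for pid in versions if needs_contribution_check(pid)])
# ['10.1234/oer.v2']
```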
7 Conclusion
This study serves to answer the question of the extent to which existing OER infrastructures can currently be used as data sources for determining the indicators of the OER statistics framework (see Section 2). In a first step, the detailed data requirements of the OER statistics framework were identified, with a focus on the institutional level (Table 1). The results of the analysis of the OER statistics data requirements for the institutional indicators are presented in Section 4 and summarised in Table 4. In a second step, selected OER infrastructures in Germany and the USA, as major players in the OER movement, were examined (Section 5), focusing on the question of whether the OER statistics data requirements can already be fulfilled. Table 5 summarises the results of this analysis.
Overall, the selected OER infrastructures fulfil their role as aggregating instances for OER and as suppliers of OER metadata. In terms of their functional scope and metadata, they are basically comparable to scientific literature databases. However, like many literature databases, OER infrastructures have not yet been optimised for use as data sources for scientometric analyses, and there is still considerable room for improvement before scientometric analyses of OER can be carried out simply and labour-efficiently. With regard to the completeness of the database, the large number of OER infrastructures with differing OER inventories is challenging for scientometric purposes. In many cases, these individual infrastructures are already networked through the exchange of resources. Meta-infrastructures that make OER from different OER infrastructures accessible at one point, such as the German Open Educational Resources Search Index (OERSI)[18] or the US-based Openly Available Sources Integrated Search (OASIS)[19], already exist. However, from a scientometric point of view, optimisation of data availability is still necessary. This is particularly true for the provision of attribution and citation data, which are crucial for determining the resonance indicators.
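A harvesting step against such a meta-infrastructure might look like the following sketch. The endpoint URL and response layout are hypothetical placeholders, not the documented OERSI or OASIS API; a real client would follow the respective infrastructure's API specification.

```python
# Minimal harvesting sketch. URL and response keys are placeholders,
# not an actual infrastructure API.
import json
from urllib.request import urlopen

SEARCH_URL = "https://example.org/oer-index/search?subject=mathematics"  # placeholder

def fetch_records(url: str) -> list[dict]:
    """Fetch a JSON payload and return its list of metadata records."""
    with urlopen(url) as response:      # network call, may fail offline
        payload = json.load(response)
    return payload.get("records", [])   # assumed response key

# records = fetch_records(SEARCH_URL)   # left commented out: placeholder endpoint
```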
Overall, none of the OER statistics indicators can currently be determined without major data-collection effort. Furthermore, none of the OER infrastructures is suitable as a sole data source for the creation of OER statistics. However, there is a good data foundation on which to build. Expanding and standardising the available metadata in the direction of scientometric use cases would be an important step, and a joint effort by all OER infrastructure operators to harmonise metadata and vocabularies as part of a scientometric adaptation of data structures would be particularly helpful. In the context of this metadata optimisation, specialised specifications such as those provided by the OpenCitations initiative[20] for references, as well as the methodology for the FAIR-by-design production of learning materials (Filiposka et al., 2023), could serve as role models. The overarching goal should be to develop an OER data infrastructure that enables the selection of all OER published by OER authors and institutions.
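To make the reference-data gap concrete, the following sketch stores OER attributions as citing/cited PID pairs, loosely inspired by the OpenCitations idea of treating citations as first-class, openly available data. The CSV layout is a deliberate simplification for illustration, not the official OpenCitations data model.

```python
# Minimal sketch: OER attributions as citing/cited pairs, then counting
# how often each OER is attributed (a basis for resonance indicators).
# DOIs and the CSV layout are invented examples.
import csv
import io

attribution_csv = """citing,cited
10.1234/derived-oer,10.5678/original-oer
10.9999/paper,10.5678/original-oer
"""

counts: dict[str, int] = {}
for row in csv.DictReader(io.StringIO(attribution_csv)):
    counts[row["cited"]] = counts.get(row["cited"], 0) + 1

print(counts)  # {'10.5678/original-oer': 2}
```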
Acknowledgments
The author would like to express sincere thanks to the anonymous reviewers for their very helpful reviews.
Funding information: The author states no funding involved.
Author contribution: Sylvia Kullmann: conceptualisation, methodology, formal analysis, validation, investigation, project administration, resources, visualisation, writing – original draft, writing – review & editing.
Conflict of interest: The author states no conflict of interest.
References
Ashman, M. (2023). A framework for evaluating the creation and adaptation of open textbooks. OTESSA Conference Proceedings 2023. doi: 10.18357/otessac.2023.3.1.244.
Atenas, J., & Havemann, L. (2013). Quality assurance in the open: An evaluation of OER repositories. The International Journal for Innovation and Quality in Learning, 2, 24–32. http://eprints.rclis.org/20517/1/30-288-1-PB.pdf.
Bethel, E. (2020). Open textbooks: Quality and relevance for postsecondary study in the Bahamas. International Review of Research in Open and Distributed Learning, 21(2), 61–80. doi: 10.19173/irrodl.v21i2.4598.
Coalition for Advancing Research Assessment (CoARA). (2022). Agreement on reforming research assessment. https://coara.eu/app/uploads/2022/09/2022_07_19_rra_agreement_final.pdf.
Deutsches Zentrum für Hochschul- und Wissenschaftsforschung (DZHW). (2021). Methodik. Bibliometrischer Indikatorbericht für Institutionen. Version 20211007. https://bibliometrie.info/downloads/Methodikanhang.pdf.
European Commission. (2023). Academic careers understood through measurement and norms (ACUMEN). Better measures for evaluating researchers. https://cordis.europa.eu/article/id/159979-better-measures-for-evaluating-researchers.
European Commission, Directorate-General for Research and Innovation, Cabello Valdes, C., Rentier, B., Kaunismaa, E., Metcalfe, J., Esposito, F., McAllister, D., … O’Carroll, C. (2017). Evaluation of research careers fully acknowledging Open Science practices – Rewards, incentives and/or recognition for researchers practicing Open Science. Publications Office. https://data.europa.eu/doi/10.2777/75255.
Farrow, R., Pitt, R., & Weller, M. (2020). Open textbooks as an innovation route for open science pedagogy. Education for Information, 36(2020), 227–245. doi: 10.3233/EFI-190260.
Filiposka, S., Green, D., Mishev, A., Kjorveziroski, V., Corleto, A., Napolitano, E., … Lazzeri, E. (2023). Draft methodology for FAIR-by-design learning materials (1.2). Zenodo. doi: 10.5281/zenodo.7875540.
Fisher, M. (2018). Evaluation of cost savings and perceptions of an open textbook in a community college science course. The American Biology Teacher, 80(6), 410–415. doi: 10.1525/abt.2018.80.6.410.
Glahn, C., Kalz, M., Gruber, M., & Specht, M. (2010). Supporting the reuse of open educational resources through open standards. In T. Hirashima, A. F. Mohd Ayub, L. F. Kwok, S. L. Wong, S. C. Kong, & F. Y. Yu (Eds.), Workshop Proceedings of the 18th International Conference on Computers in Education (pp. 308–315). Asia-Pacific Society for Computers in Education.
Heck, T., Kullmann, S., Hiebl, J., Schröder, N., Otto, D., & Sander, P. (2020). Designing open informational ecosystems on the concept of open educational resources. Open Education Studies, 2(1), 252–264. doi: 10.1515/edu-2020-0130.
Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). The Leiden Manifesto for research metrics. Use these ten principles to guide research evaluation. Nature, 520(7548), 429–431. doi: 10.1038/520429a.
Hiebl, J., Kullmann, S., Heck, T., & Rittberger, M. (2023). Reflecting open practices on digital infrastructures: Functionalities and implications of knowledge. In D. Otto, G. Scharnberg, M. Kerres, & O. Zawacki-Richter (Eds.), Distributed learning ecosystems. Wiesbaden: Springer VS. doi: 10.1007/978-3-658-38703-7_11.
INORMS The International Network of Research Management Societies. Research Evaluation Group (REG). (2021). The SCOPE framework: A five-stage process for evaluating research responsibly. https://inorms.net/wp-content/uploads/2022/03/21655-scope-guide-v10.pdf.
Kullmann, S. (2025). Teaching counts! Open Educational Resources as an object of measurement for scientometric analysis. Quantitative Science Studies, 6, 216–237. doi: 10.1162/qss_a_00346.
Kullmann, S., & Rasulzade, S. (2025). What is a recognizable contribution? On the characteristics of OER authorship. 18. Internationales Symposium für Informationswissenschaft (ISI 2025), Chemnitz, Germany. doi: 10.5281/zenodo.14925594.
Kullmann, S., & Weimer, V. (2023). Teaching as part of open scholarship – scientometric indicators for open educational resources. Proceedings of ISSI 2023 – the 19th International Conference of the International Society for Scientometrics and Informetrics (Vol. 1, pp. 667–683). doi: 10.5281/zenodo.8246995.
Kullmann, S., & Weimer, V. (2024). Teaching as part of open scholarship: Developing a scientometric framework for Open Educational Resources. Scientometrics, 129(10), 6065–6087. doi: 10.1007/s11192-024-05007-1.
Marín, V., & Villar-Onrubia, D. (2023). Online infrastructures for open educational resources. In O. Zawacki-Richter & I. Jung (Eds.), Handbook of open, distance and digital education. Singapore: Springer. doi: 10.1007/978-981-19-2080-6_18.
Marín, V., Zawacki-Richter, O., Aydin, C., Bedenlier, S., Bond, M., Bozkurt, A., … Zhang, J. (2022). Institutional measures for supporting OER in higher education: An international case-based study. Open Education Studies, 4(1), 310–321. doi: 10.1515/edu-2022-0019.
Menzel, M. (2023). Developing a metadata profile for higher education OER repositories. In D. Otto, G. Scharnberg, M. Kerres, & O. Zawacki-Richter (Eds.), Distributed learning ecosystems. Wiesbaden: Springer VS. doi: 10.1007/978-3-658-38703-7_14.
O’Neill, G. (n.d.). Open and Universal Science Project (OPUS). Deliverable 3.1. Indicators and Metrics to Test in the Pilots. https://opusproject.eu/wp-content/uploads/2023/09/OPUS_D3.1_IndicatorsMetrics_FINAL_PUBLIC.pdf.
Owen, R., Macnaghten, P., & Stilgoe, J. (2012). Responsible research and innovation: From science in society to science for society, with society. Science and Public Policy, 39, 751–760. doi: 10.1093/scipol/scs093.
Perifanou, M., & Economides, A. A. (2022). Measuring quality, popularity, demand and usage of repositories of open educational resources (ROER): A study on thirteen popular ROER. Open Learning: The Journal of Open, Distance and e-Learning, 38(4), 315–330. doi: 10.1080/02680513.2022.2033114.
Pitt, R. (2023). Open textbooks in higher education teaching. In D. Otto, G. Scharnberg, M. Kerres, & O. Zawacki-Richter (Eds.), Distributed learning ecosystems. Wiesbaden: Springer VS. doi: 10.1007/978-3-658-38703-7_6.
Romero-Pelaez, A., Segarra-Faggioni, V., Piedra, N., & Tovar, E. (2019). A proposal of quality assessment of OER based on emergent technology. 2019 IEEE Global Engineering Education Conference (EDUCON) (pp. 1114–1119). Dubai, United Arab Emirates. doi: 10.1109/EDUCON.2019.8725067.
Santos-Hermosa, G. (2023). The role of institutional repositories in higher education: Purpose and level of openness. In D. Otto, G. Scharnberg, M. Kerres, & O. Zawacki-Richter (Eds.), Distributed learning ecosystems. Wiesbaden: Springer VS. doi: 10.1007/978-3-658-38703-7_4.
Santos-Hermosa, G., Ferran, N., & Abadal, E. (2017). Repositories of open educational resources: An assessment of reuse and educational aspects. The International Review of Research in Open and Distributed Learning (IRRODL), 18(5), 85–120. doi: 10.19173/irrodl.v18i5.3063.
Segarra-Faggioni, V., & Romero-Pelaez, A. (2022). Automatic classification of OER for metadata quality assessment. 2022 International Conference on Advanced Learning Technologies (ICALT) (pp. 16–18). doi: 10.1109/ICALT55010.2022.00011.
Simão de Deus, W., & Barbosa, F. E. (2020). The use of metadata in open educational resources repositories: An exploratory study. 2020 IEEE 44th Annual Computers, Software, and Applications Conference (COMPSAC) (pp. 123–132). doi: 10.1109/COMPSAC48688.2020.00025.
Stagg, A., & Partridge, H. (2019). Facilitating open access to information: A community approach to open education and open textbooks. Proceedings of the Association for Information Science and Technology, 56, 477–480. doi: 10.1002/pra2.76.
Tavakoli, M., Elias, M., Kismihók, G., & Auer, S. (2020). Quality prediction of open educational resources: A metadata-based approach. 2020 IEEE 20th International Conference on Advanced Learning Technologies (ICALT) (pp. 29–31). doi: 10.1109/ICALT49669.2020.00007.
Tischler, F., Heck, T., & Rittberger, M. (2022). Nützlichkeit und Nutzbarkeit von Metadaten bei der Suche und Bereitstellung von offenen Bildungsressourcen. Information – Wissenschaft & Praxis, 73(5–6), 253–263. doi: 10.1515/iwp-2022-2238.
Weimer, V., & Kullmann, S. (2025, in press). Open educational resources (OER) in science evaluation – recognizing and rewarding academic teaching. Frankfurt am Main: DIPF Leibniz Institute for Research and Information in Education.
© 2025 the author(s), published by De Gruyter
This work is licensed under the Creative Commons Attribution 4.0 International License.