Abstract
This study examines the relationship between self-regulated learning (SRL) and learning transparency in digital learning environments. Using a digital learning environment on the topic of chemical equilibrium as an example, the study examines how to promote the planning, execution, and self-reflection of learning processes in line with Zimmerman’s SRL model by making the learning objectives explicit to secondary school students. The aim is to support learners in SRL within digital learning environments, on the one hand, and with complex chemistry topics, on the other hand, as both pose significant demands on (meta-)cognitive abilities. An initial practicality check with first-year primary school teacher trainees showed that structured learning objectives can help learners organize themselves and assess their own learning. At the same time, potential for optimization was identified concerning the design of the learning environment and its functionality. The results highlight the potential of learning objectives as a central tool for fostering metacognitive, cognitive, and motivational strategies in digital chemistry education. In the next step, the study will be conducted with a larger sample of students to comprehensively evaluate the effects of the learning objectives.
1 Introduction
Transparency in the learning process is a theoretical construct that can help learners in applying SRL skills within digital learning environments. The ability to learn in a self-regulated way is considered a key competence in modern society 1 and an important requirement for lifelong learning. 2 To foster these skills, it is essential to support students in developing and applying cognitive, motivational, and metacognitive strategies. However, upper secondary school students often struggle with setting their own goals and maintaining motivation. 3 At the same time, teachers at this level rarely promote SRL. 4
Two central aspects arise in the context of SRL with digital learning environments: On the one hand, there is the potential to promote SRL through digital learning environments; on the other hand, such learning environments also place high demands on learners. 5 , 6 One challenge that learners face when using digital learning environments is, for example, selecting suitable cognitive or metacognitive strategies. 5 , 6 , 7 In the context of learning with digital learning environments, aspects of time management, effort regulation, or metacognitive strategies (e.g. self-monitoring, self-management, planning) are particularly relevant. Therefore, promoting these skills in digital learning environments is important for more effective digital learning. 8 , 9 , 10 , 11 In addition to its importance for digital learning, promoting SRL is especially crucial in chemistry education. The content of this subject is often cognitively demanding. Topics such as chemical equilibrium require a high degree of SRL skills. 12 Integrating learning transparency within SRL environments offers an opportunity to provide targeted support to students. 13 To address this problem and support secondary school students in chemistry, a digital learning environment for chemical equilibrium was developed that includes learning objectives, questions for self-reflection and a transparent design (for example, about the sequence of the learning environment). These measures are intended to create transparency in learning in order to make it clear to learners what demands or expectations are placed on them within the learning process, what knowledge is required of them and what the goals are. 13 , 14
2 Theoretical background
2.1 Self-regulated learning
Self-regulated learners are able to activate, change and maintain cognitions, affects and behaviors that are designed to achieve their own learning goals. 15 , 16 , 17 , 18 The (meta-)cognitive, emotional and motivational dimensions of learning are critical components of SRL. 19 SRL is influenced by a variety of regulatory mechanisms and regulatory judgments, such as self-efficacy, knowledge of cognitive and metacognitive strategies, or the ability to reflect. 19 , 20 In recent decades, a large number of models of SRL have been developed. 19 Zimmerman’s 17 , 18 model is one of the most influential and, due to its processual nature, serves as the basis for this study. Zimmerman 17 , 18 distinguishes between the forethought, performance and self-reflection phases. In the forethought phase, tasks are analyzed, for example by selecting appropriate strategies or setting goals. At the same time, motivational aspects play an important role. The performance phase focuses on metacognitive strategies such as self-monitoring and self-observation. These are used to carry out the planned steps, monitor the learning process and make adjustments when necessary. Finally, in the self-reflection phase, learners evaluate both their goal achievement and overall learning process. This evaluation helps them draw conclusions that can inform future learning endeavors. 17 , 18 For this reason, it is important that learners acquire knowledge about motivational, metacognitive and cognitive strategies 20 and are also enabled to apply this knowledge in a meaningful way. 21 , 22 , 23 Measures to promote (meta-)cognitive skills, for example, can improve academic performance. 20 , 24 , 25 , 26 , 27 Another way to promote aspects of SRL (e.g., goal setting, planning, monitoring), 28 , 29 , 30 , 31 , 32 as well as academic performance, is assessment. 28 , 33 In this context, the way in which teachers organize the assessment measure is an important factor, but the exact conditions are not yet entirely clear. 21 , 28 , 34 SRL has also already been investigated in the context of chemistry lessons. Feldman-Maggor et al. 35 and Feldman-Maggor 36 showed through their research that integrating SRL skills into the classroom is important, for example by giving learners choices. However, teachers should also focus on promoting learning strategies. Additionally, metacognitive support tools can facilitate self-regulated problem-solving in chemistry lessons. 37 The use of multi-touch experiment instructions also has the potential to promote SRL indirectly. 38
2.2 Transparency of learning
The second theoretical construct is learning transparency. The “Transparency in Learning and Teaching” (TILT) framework provides a helpful foundation for the conceptual categorization of learning transparency. This framework emphasizes three central steps that should be clarified before the learning process begins; they follow a structured approach and specifically support the processing of tasks: 13
Learners should know the purpose of the task and be informed about the actual benefit of the task before working on it.
It should be made clear to learners in the task design what they are supposed to do and how they are supposed to do it.
Learners should know the criteria that make a task successful.
The aim of these steps is to support teachers in the design of tasks and to encourage learners on a metacognitive level when working on the tasks. 13 However, tasks are not the only important element of teaching; communication with the learners, for example, also plays a role.
Veenman 39 addresses metacognitive processes with his WWW&H rule. He emphasizes that it is important for learners to be guided and trained to answer the questions ‘What to do when, why is it necessary and how to do it?’. Both approaches, Winkelmes’ and Veenman’s, offer solutions to a problem learners often face in school: not using the best strategies in the right places. 2 , 3 , 39 Therefore, students should be supported, for example with support programs that focus on metacognitive skills, 24 , 25 even before they work on the tasks.
In order to implement a broad concept of learning transparency, it is also relevant to include aspects of assessment research. Black and Wiliam 40 make it clear in their theory of formative assessment that it is important to gain insights into student performance so that the next steps in teaching/learning can be planned. In the context of SRL, however, the focus is particularly on the learners and their own learning process, which can and should be supported by the teachers in the context of formative assessment. 21 By making learning intentions and learning criteria transparent, teachers can ensure that learners take responsibility for their own learning process. 32 , 40 , 41 The assumption of responsibility can also be promoted through self-assessment by the learners. 28 , 42 , 43 This encourages learners to actively reflect on and evaluate their own learning process so that they can better assess the quality of their work. 43 The promotion of these skills is particularly important in the context of SRL. 17 , 44 On this basis, it is logical to link SRL with formative and self-assessment, as learners can be supported explicitly in the forethought and self-reflection phases and implicitly in the performance phase through transparent learning intentions, learning criteria and self-reflection.
In the concepts presented so far, a direct method for the actual realization of learning transparency has not played a role. Learning objectives are one way of linking all concepts together. Learning objectives are statements that use an operator to communicate the purpose of the lesson, the expected performance and the conditions under which the performance is to be achieved. 14 , 45 , 46 , 47 , 48 Three central aspects are attributed to them: 14 , 46 , 47 , 48
Behavior: Learning objectives should reflect an observable action, be measurable and specific, and have appropriate action verbs.
Conditions: Learning objectives should describe the circumstances under which learners are expected to demonstrate the behavior.
Criterion: Learning objectives should specify the benchmark or standard that learners must meet in order to demonstrate acceptable performance.
Orr et al. 14 found that there is much literature on learners perceiving learning objectives as useful for their learning, but little on their effectiveness with regard to, for example, achievement. Furthermore, the results of studies on learning objectives are inconsistent. 14 , 49 , 50 , 51 , 52 Some studies show that learning objectives that are too specific have smaller effects on performance than non-specific ones, 50 while Wirth et al. 52 found no differences between specific and general learning objectives. Some studies have also found that transparent learning objectives can help learners to better organize their time and effort, 53 but learners may also have difficulties in assessing the expected depth of learning. 54 For this reason, it is all the more important that learners receive clear instructions on the use of learning objectives and the benefits of their use. 27 , 39 , 51 These results show that learning objectives can be suitable for promoting more effective learning as well as aspects of SRL. However, it is clear that it is necessary to make transparent to learners what is expected of them and how learning objectives can be used.
Based on these research approaches, we have developed a definition for transparency of learning to clarify which aspects teachers should consider when planning lessons in order to support learners in initiating, implementing and reflecting on their learning processes: 55
Learning transparency includes all measures that a teacher can take to design learning situations in such a way that learners know
why they are learning something and how they can use the knowledge they have acquired to achieve situation-specific goals that go beyond the learning situation.
what they have to work on and what demands are placed on them.
how they can adequately assess their own learning achievements.
Compared to existing approaches, the definition of learning transparency presented here goes beyond the previous focus on particular elements, such as task processing 13 and formative assessment. 40 These facets all contribute to metacognitive skills and can promote SRL. Our definition emphasizes a comprehensive, process-oriented approach. This approach integrates task and learning objective transparency, linking them to the overarching goal of enabling learners to actively shape, control, and critically reflect on their learning processes. Thus, the definition provides teachers with a structured guide for incorporating learning transparency as a holistic concept into lesson planning and implementation.
3 Interaction of SRL and learning transparency in digital learning environments
Digital learning environments present a number of challenges for learners. Students often find it difficult to cope with the particular autonomy of such learning environments. 6 A common challenge is the non-linear structure of digital learning environments, 5 , 56 which often require learners to navigate multiple media formats and complex organizational structures. This complexity demands constant self-monitoring. 56 Additionally, learners must make numerous decisions within such environments, necessitating skills like goal setting and effective coordination of their learning processes. 5 , 10 , 56 , 57 , 58 Metacognitive, cognitive and motivational strategies are necessary to deal with these demands. 5 , 6 , 10 , 58 Studies show that learners who are equipped with better SRL skills are more able to cope with such learning environments. Consequently, SRL is widely regarded as an effective approach to support learning in such contexts. 6 , 56 The literature identifies two main ways to promote SRL: 6 , 59 , 60
Direct support by integrating measures that teach learners SRL strategies through training.
Indirect support by creating a learning environment that requires learners to apply SRL skills that they already know.
Digital learning environments not only place high demands on learners in terms of applying SRL, but conversely, they can also be used to promote SRL. 6 At this point, therefore, we will focus on the indirect promotion of SRL through a digital learning environment. However, it should be noted that studies have shown no or even negative effects when using only indirect support. 60 In order to indirectly promote SRL, it is useful to design a learning environment that includes complex, challenging learning tasks, clear learning objectives and at the same time encourages learners to self-assess. 61 For this reason, we combine the indirect promotion of SRL with the implementation of learning transparency measures in order to meet the challenges and potentials mentioned. We are guided by Zimmerman’s 17 process model and combine this with suitable support measures:
In the forethought phase, learners are made aware of why, what and how they can learn in relation to the task analysis through appropriate measures (for example, learning objectives; explanations of functions). This enables them to decide on relevant strategies and sub-goals. 13 Learners are also encouraged to actively extend their knowledge by applying their prior knowledge. In addition, learners are given transparent learning objectives and criteria. 60
In the performance phase, learners are supported in their metacognitive perception of their own learning process (for example, scaffolding or step-by-step exploration of the learning environment). Equally important, learners have the opportunity to navigate the learning environment relatively freely. 60 , 62 However, learners are also free to decide how fast they work and what level of support they need. 63
In the self-reflection phase, learners can check themselves using sample solutions and actively reflect on their learning progress using self-assessment sheets. Learners can then decide whether they need to explore the topic more deeply, for example by choosing more in-depth tasks.
There is already some research on the effects of transparency of learning in the context of SRL. The use of learning objectives or assessment guidelines can be relevant on several levels: on the one hand, learners can be supported in the forethought phase in organizing their learning and selecting adequate learning strategies; 51 , 53 , 64 on the other hand, they can also serve as a self-assessment tool in the self-reflection phase. 51 , 53 Most research in this area deals with higher education rather than secondary school students and digital learning environments. 10 , 65
4 Research goal and design
The overarching goal of this study is to investigate how SRL can be promoted through learning transparency in digital learning environments. To this end, we have developed a digital learning environment for secondary school students that contains transparent learning objectives. Before the learning environment can be evaluated quantitatively and qualitatively in the pilot and main study, the materials must first be tested in a practical check. During the practicality check, we address the following questions:
Q1:
How do learners use the learning environment, the functions of the learning environment, the learning objectives and the self-assessment?
Q2:
How do learners evaluate the learning environment, the functions of the learning environment, the learning objectives and the self-assessment?
The practicality check was carried out with first-year primary school teacher trainees. These students were chosen because they are similar to 11th grade school students in terms of age and prior knowledge and can therefore be used to determine whether the learning objectives and support measures in the learning environment are helpful. The primary school teacher trainees worked on a part of the learning environment for 60 min. Their work on the tablets was recorded using screen capture software. Finally, the students were interviewed to gain a more detailed insight into their attitudes toward the presented approach.
Before presenting the results, the learning environment will be described below.
5 The digital learning environment
5.1 Structure and content
The basis for the indirect promotion of SRL is a digital learning environment on chemical equilibrium. The digital learning environment was created using genially. 66 The environment is organized into three different modules, each with the same structure but different content. The first module focuses on chemical equilibrium and the law of mass action, the second module focuses on Le Chatelier’s principle and the last module focuses on applying what has been learnt in the context of the environment and industry.
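For readers less familiar with the chemistry content of the first module, the law of mass action can be illustrated with a standard textbook formulation; the notation here is generic and not necessarily the one used in the learning environment itself:

```latex
% Generic reversible reaction:  aA + bB <=> cC + dD
% Law of mass action: at equilibrium, the equilibrium constant K_c
% relates the concentrations of products and reactants, each raised
% to the power of its stoichiometric coefficient.
K_c = \frac{[\mathrm{C}]^{c}\,[\mathrm{D}]^{d}}{[\mathrm{A}]^{a}\,[\mathrm{B}]^{b}}
```

Le Chatelier’s principle, the focus of the second module, then describes how this equilibrium shifts when concentration, pressure, or temperature is changed.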
In addition, there will be three different treatment groups (Figure 1). Students will either receive learning objectives at the beginning and at the end (G1), only at the beginning (G2) or none at all (G3).
Figure 1: Overview of the groups.
Figure 2 shows the process of the digital learning environment for each module. First, all groups receive an explanation of how the learning environment works. Two groups (G1 and G2) are then given learning objectives to help them plan their learning process before working on the tasks. G3 does not receive any further support before they complete the tasks. This is followed by the learning unit using text, experimental videos and models (see Figure 3 for an example). At the same time, the learners should apply the acquired knowledge with the help of learning tasks. After the learners in G1 have worked through all the learning units and tasks and compared their solutions with the sample solution, they carry out a self-assessment based on the learning objectives. The learners can then decide whether to repeat the tasks, choose more in-depth tasks or move on to the next module. In the next module, the cycle starts all over again. The aim of this modularization, in addition to the division into parts, is to find out whether the learners approach the topics differently in the course of their learning process and therefore apply different learning strategies.
Figure 2: Procedure of the digital learning environment.
Figure 3: Example page for chemical equilibrium.
Figure 3 gives an insight into an example page of the digital learning environment. At the beginning, the learners are given the learning objective and are then asked to click through the learning environment step by step. Several experiment videos and a model are integrated on this page. Additional explanations of specialist terms are provided in some places. At the end, learners are given a learning task to apply the knowledge they have acquired on this page. The aim is to provide a clearly structured learning environment that also offers enough freedom to make independent decisions.
5.2 Formulating transparent learning objectives
It can be assumed that the use of learning objectives is an effective way to implement learning transparency in the classroom while promoting SRL in the context of digital learning. Orr et al. 67 rightly point out that it is not yet clear how teachers can help their students to better understand their performance, the conditions and criteria of learning objectives to enable effective learning. For this reason, we present an approach that makes learning objectives transparent to help secondary school students understand them.
When formulating transparent learning objectives, we focus on the three central attributes (behavior, condition and criterion) of learning objectives. In addition, suggestions for the formulation of learning objectives are made in the literature. Accordingly, learning objectives should be adapted to the students’ prior knowledge, 68 be clear, understandable and concrete 14 , 68 , 69 and clarify the purpose of the learning objective. 51 , 64 However, they should also describe a clearly observable behavior, 14 , 47 , 70 use clear action verbs 14 , 45 and be linked to tasks. 47 Last but not least, it is important that they are realistic, challenging 47 , 71 and measurable, 47 , 48 , 69 make the time limits transparent 69 and are universally accessible. 47 , 71
Our approach attempts to include these components in order to support learners in understanding the learning objectives. The example in Figure 4 illustrates how this approach is structured.
Figure 4: Structure of the learning objective with icons for purpose, prior knowledge and goal.
At the centre of the approach is the learning objective, which relates to a thematic focus in the learning environment and can therefore be linked to explicit tasks. In this example it is:
“I can explain what a chemical equilibrium is and what specific characteristics it has.”
This learning objective covers the understanding of chemical equilibrium and the characteristics behind it. Learners should be able to explain these aspects after they have worked in the learning environment. Our aim was to make the objective as understandable, clear, and concrete as possible. However, we suspected that this alone would not be enough for learners to really understand what is expected of them, especially if they have never encountered the topic before. At this point, we thought it was important to give the learners additional explanations. The first thing we did was to add the purpose component:
“Expand your knowledge of chemical reactions here.”
With these measures, we wanted to address the learners directly and clarify the purpose of this knowledge and the context in which it is needed. Secondly, we focused on linking the learning objective to the students’ prior knowledge:
“So far, you can explain that chemical reactions have certain characteristics (for example, that chemical reactions are associated with energy conversion). Chemical equilibria also have specific characteristics.”
In this case, the objective refers explicitly to the content of chemistry lessons, although to some extent we also make references to mathematics lessons. All learners have already dealt with chemical reactions and know that chemical reactions have specific characteristics. Our aim is to break down barriers by helping learners to remember what they already know and plan their learning at the same time. Finally, we wanted to give them a goal to work towards:
“At the end of this module, you will be able to explain why this is a chemical equilibrium based on the reaction shown.”
In this example, learners are shown a video of a chemical reaction that illustrates the influence of a chemical equilibrium. Here, learners are given a time frame (“At the end of this module”) and prepared for what to expect so they can choose appropriate learning strategies. At the beginning of each module, but also in parallel with the completion of the tasks, learners are given these learning objectives.
6 Checking the practicality
The sample group for the practicality check comprises three female university students. The first results are presented below. In this context, the two research questions Q1 and Q2 are answered. Because the sample size is small, the results should be treated with caution. Nevertheless, the findings offer valuable insights into the practical suitability of the learning environment and how well it is accepted by students.
Screen recordings
The screen recordings provided a more detailed insight into the use of the learning environment (Q1) to understand which of the implemented features were actually being used. The students needed about 60 min each to complete the first module. With this in mind, it seems reasonable for us to plan around 90 min per module in the pilot study, as not all tasks were fully completed by the students.
The data indicate that students accessed all the basic pages of the learning environment; however, only one student attempted the more in-depth tasks. The student completed one of the three tasks completely, one partially and the third not at all. In the context of SRL, this suggests that the student worked on the tasks according to her individual needs. In particular, the in-depth task was carried out with great care. Furthermore, we could see that all the students made full use of the presented approach and not only read the four learning objectives, but also looked at the purpose, prior knowledge and goal aspects. However, we suspect that these aspects were still not emphasized enough, especially for secondary school students. We therefore decided that learners can only move on to the next learning objective once they have read all three aspects.
Some of the tasks, such as the observation tasks for the implemented experiment videos and the task of adjusting the chemical equilibrium, were completed correctly and thoroughly by all students. This suggests that the content is suitable for the pilot study. However, the tasks for calculating the equilibrium constants require revision, because none of the students completed them correctly and in full. Based on the video recordings, it can be concluded that the students had difficulty working through the task. Additional support elements and minor changes to the task design will therefore be implemented for the pilot study.
Certain features were used more than others. For example, the implemented help for individual technical terms was used regularly by all students, whereas the buttons for the outline and learning objectives of the module were not used by any of the students. It would therefore seem sensible to review the frequency of use of these features in the pilot study to avoid an overload of features. Some tip cards also need to be revised because they were used by learners but did not provide as much support as hoped. The option to compare the results with the sample solution was also used only sporadically. For this reason, the relevance of using the sample solution should be emphasized more strongly in the pilot study.
Finally, self-assessment questions at the end of each module were completed in full by all learners. Two students reported achieving two of the four learning objectives. One student reported not achieving any of the learning objectives. All students agreed that they had difficulty understanding the content related to the law of mass action and the equilibrium constant. The analysis of the screen recording confirms this self-assessment. First, it is encouraging that all students used the self-assessment, and that our analysis and the students’ self-assessments are consistent. Based on these findings, no changes are necessary for the subsequent pilot study.
The screen recordings provided a more detailed insight into the students’ approach than interviews or completed tasks alone would have. Based on this, it is possible to provide an answer to the first research question, as the students basically used the learning environment as expected. At the same time, the functions used appear to be useful, though minor changes are necessary for more effective learning. The extensive and sensible use of learning objectives and self-assessments is positive. However, it remains to be seen how effective these measures are.
Interviews
The interviews provided a more focused understanding of learners’ attitudes towards the digital learning environment (Q2). Learners particularly praised the structure and presentation of the learning environment (“I thought it was very well prepared. Everything was easy to read and it was well-structured. I liked that very much.”). They appreciated that it was consistently clear “what I have to be able to do at the end”. At the same time, learners highlighted the usefulness of the help cards, which they only needed at the end of the module. They said that they were helpful because “you could look at them if you weren’t quite sure”. It also seems to make sense to emphasize the tasks more when introducing them, as one of the students stated that she had problems with this. On the other hand, everything worked on a technical level. This is a good basis, given the complexity of the digital learning environment. As observed in the screen recordings and confirmed during interviews, some features were not utilized by students. For example, learners stated that they did not need the outline button in the learning environment (“I think I pressed that at the end for the demanding tasks. I didn’t need that.”). Due to the small sample size, we will wait for the results of the pilot study before making any modifications to the features.
Learners found the learning objectives themselves clear and comprehensible and appreciated their structured design, as it clarified what was expected of them. One student found the three-part structure of the learning objectives helpful, because it showed how they built on each other. This structure also showed her that she needed to understand one objective before moving to the next. Another student said that she understood better what was expected of her because of the three-part structure (“The first learning objective seemed easy at first, but then I saw it again with the second learning objective and thought, ‘Okay, you really have to make sure that you check that.’ I thought that was quite good.”). However, two learners noted that the procedure for working with the learning objectives was not entirely clear at the beginning of the module. They remarked that these objectives did not significantly impact their approach to work (“I just paid more attention to the fact that this is important right now. I didn’t plan in advance that this is exactly what I have to do now.”). For this reason, we will do even more to explain how the learning objectives are structured (purpose, prior knowledge, goals) and how they work so that learners can actually utilize them to their full potential. In addition, one learner stated that they would prefer more learning objectives within university courses, “because then I can see what we’ve covered, mark it as complete, and that helps me.” Another student added that having more learning objectives helps them “categorize the content more quickly and have a better overview, which makes it a bit more networked”.
Regarding the self-assessment, the learners stated that they would like a third button in addition to ‘I have achieved’ and ‘I have not yet achieved’ (“Well, it helped me, but I would have liked to have had another button in the middle.”). Another student replied, however, that if a third button were added, many children would be more likely to simply say, ‘Yes, I’m in the middle.’ Because we share this concern that learners might default to the middle option, we will not add a third button for now and will wait for the results of the pilot study. At the same time, the learners said that the self-assessment helped them to judge their knowledge better, as they were “honest with themselves.” One student added that the self-assessment made her realize that she needed to take a closer look at the math problems. Finally, the learners stated that they would like more of these digital learning environments in the classroom, as they were able to learn at their own pace and revisit some of the content several times. One student said that this would have helped her in school, as all her chemistry lessons had been teacher-centered lectures, which she found boring and learned little from (“I think something like that would have given me a different approach. So, hey, you really can’t do it now, or I can do it, and I want to do it again.”).
The interviews made it possible to capture learners’ impressions of the digital learning environment and the implemented functions, and thus to answer Q2. It became clear that the learners were satisfied overall and that the learning objectives and the self-assessment in particular were perceived positively. However, it also became clear that some functions of the learning environment are not strictly necessary and may merely overload it.
7 Conclusion and outlook
Within this paper, our aim was to present learning transparency as a way to indirectly promote SRL in a digital learning environment. Furthermore, with our approach to transparent learning objectives, we want to offer a possible answer to the question of how teachers can better clarify to their students the expected performance as well as the conditions and criteria of learning objectives. 64
In this first practicality check of the learning environment and the presented approach to learning transparency, conducted with primary school teacher trainees, we gained helpful insights into the use and evaluation of the digital learning environment. The initial results highlight that learning objectives can effectively support learners’ organization and self-assessment: learners found the learning objectives helpful in organizing their learning and, through the self-assessment measures, were better able to judge whether they had learned something. This shows that implementing learning objectives as part of the forethought and self-reflection phases can be a useful way of stimulating SRL processes. At the same time, opportunities for improvement in the design and functionality of the digital learning environment were identified, and the learning environment will be slightly revised before piloting in schools: The learning objectives themselves proved to be understandable and helpful, but it should be made even clearer at the beginning of the learning environment how to work with them. Because the sample solutions were rarely used, their benefits should be emphasized more strongly. By contrast, we have decided for now not to remove functions that students made little use of, such as the outline button; we will wait for the pilot study in secondary schools and then decide whether to remove them. However, we will make small changes to the learning environment to better support learners, for example by providing additional support for the math problems to help learners understand how to complete a task.
It is important to note that only the first module and only one treatment group were tested in this check. For this reason, the use of the second and third modules in particular needs to be examined in more detail in the pilot study. The next step is a pilot study with around 60 students, in which we aim to examine the use of the learning objectives and the functioning of the learning environment in detail with the help of screen recordings. In addition, the developed test instruments will be used for the first time and optimized where necessary.
This will be followed by a larger main study, which will examine how the learning objectives affect content knowledge, cognitive load, and the use of SRL strategies. In addition, screen recordings will capture the students’ individual actions on the tablets.
Acknowledgments
We would like to thank the students who participated in the practicality test.
- Research ethics: Not applicable.
- Informed consent: The students provided written informed consent to participate in this study. They were given important information about the project, data protection, data security, and further processing of the data. Participation in the study was voluntary, and there were no disadvantages for non-participation.
- Author contributions: NB: Conceptualization, Data curation, Investigation, Methodology, Project administration, Software, Visualization, Writing – original draft. IM: Conceptualization, Methodology, Project administration, Resources, Supervision, Writing – original draft. All authors have accepted responsibility for the entire content of this manuscript and approved its submission.
- Use of Large Language Models, AI and Machine Learning Tools: The authors acknowledge the use of OpenAI’s ChatGPT 4o as well as DeepL and DeepL Write for some translations.
- Conflict of interest: The authors state no conflict of interest.
- Research funding: None declared.
- Data availability: The datasets presented in this article are not readily available because, in the written consent form, we assured the students that their data would only be stored on devices belonging to our university. Hence, we are not permitted to disclose the data publicly. Requests to access the datasets should be directed to IM.
References
1. Foster, N.; Piacentini, M.; Ercikan, K.; Guo, H.; Por, H.-H. Innovating Assessments to Measure and Support Complex Skills, 1st ed.; OECD: Paris, 2023. https://doi.org/10.1787/e5f3e341-en.
2. Stebner, F.; Schuster, C.; Weber, X.-L.; Greiff, S.; Leutner, D.; Wirth, J. Transfer of Metacognitive Skills in Self-Regulated Learning: Effects on Strategy Application and Content Knowledge Acquisition. Metacog. Learn. 2022, 17 (3), 715–744. https://doi.org/10.1007/s11409-022-09322-x.
3. Dörrenbächer-Ulrich, L.; Bregulla, M. The Relationship Between Self-Regulated Learning and Executive Functions—A Systematic Review. Educ. Psychol. Rev. 2024, 36 (3). https://doi.org/10.1007/s10648-024-09932-8.
4. Dignath, C.; Büttner, G. Teachers’ Direct and Indirect Promotion of Self-Regulated Learning in Primary and Secondary School Mathematics Classes – Insights from Video-Based Classroom Observations and Teacher Interviews. Metacog. Learn. 2018, 13 (2), 127–157. https://doi.org/10.1007/s11409-018-9181-x.
5. Greene, J. A.; Moos, D. C.; Azevedo, R. Self-Regulation of Learning with Computer-Based Learning Environments. New Dir. Teach. Learn. 2011, 2011 (126), 107–115. https://doi.org/10.1002/tl.449.
6. Perels, F.; Dörrenbächer, L. Selbstreguliertes Lernen und (technologiebasierte) Bildungsmedien [Self-Regulated Learning and (Technology-Based) Educational Media]. In Handbuch Bildungstechnologie: Konzeption und Einsatz digitaler Lernumgebungen; Niegemann, H. M., Weinberger, A., Eds.; Springer: Berlin, 2020; pp 81–92. https://doi.org/10.1007/978-3-662-54368-9_5.
7. Azevedo, R.; Moos, D. C.; Greene, J. A.; Winters, F. I.; Cromley, J. G. Why Is Externally-Facilitated Regulated Learning More Effective than Self-Regulated Learning with Hypermedia? Educ. Technol. Res. Dev. 2008, 56 (1), 45–72. https://doi.org/10.1007/s11423-007-9067-0.
8. Azevedo, R.; Witherspoon, A. M. Self-Regulated Learning with Hypermedia. In Handbook of Metacognition in Education; Hacker, D. J., Dunlosky, J., Graesser, A. C., Eds.; The Educational Psychology Series; Routledge: New York, 2009; pp 319–339.
9. Bannert, M.; Reimann, P. Supporting Self-Regulated Hypermedia Learning through Prompts. Instr. Sci. 2012, 40 (1), 193–211. https://doi.org/10.1007/s11251-011-9167-4.
10. Broadbent, J.; Poon, W. L. Self-Regulated Learning Strategies & Academic Achievement in Online Higher Education Learning Environments: A Systematic Review. Internet High. Educ. 2015, 27, 1–13. https://doi.org/10.1016/j.iheduc.2015.04.007.
11. Doo, M. Y.; Zhu, M. A Meta-Analysis of Effects of Self-Directed Learning in Online Learning Environments. J. Comput. Assist. Learn. 2024, 40 (1), 1–20. https://doi.org/10.1111/jcal.12865.
12. Hagos, T.; Andargie, D. Effects of Formative Assessment with Technology on Students’ Meaningful Learning in Chemistry Equilibrium Concepts. Chem. Educ. Res. Pract. 2024, 25, 276–299. https://doi.org/10.1039/D2RP00340F.
13. Winkelmes, M.-A. Why It Works. In Transparent Design in Higher Education Teaching and Leadership; Tapp, S., Felten, P., Winkelmes, M.-A., Boye, A., Finley, A., Eds.; Routledge: New York, 2023; pp 17–35. https://doi.org/10.4324/9781003448396-3.
14. Orr, R. B.; Csikari, M. M.; Freeman, S.; Rodriguez, M. C. Writing and Using Learning Objectives. CBE Life Sci. Educ. 2022, 21 (3), 1–6. https://doi.org/10.1187/cbe.22-04-0073.
15. Pintrich, P. R. The Role of Goal Orientation in Self-Regulated Learning. In Handbook of Self-Regulation; Boekaerts, M., Pintrich, P. R., Zeidner, M., Eds.; Elsevier: San Diego, 2000; pp 451–502. https://doi.org/10.1016/B978-012109890-2/50043-3.
16. Winne, P. H.; Hadwin, A. F. Studying as Self-Regulated Learning. In Metacognition in Educational Theory and Practice; Dunlosky, J., Graesser, A. C., Hacker, D. J., Eds.; The Educational Psychology Series; Erlbaum: Mahwah, 1998; pp 277–304.
17. Zimmerman, B. J. Attaining Self-Regulation. In Handbook of Self-Regulation; Boekaerts, M., Pintrich, P. R., Zeidner, M., Eds.; Elsevier: San Diego, 2000; pp 13–39. https://doi.org/10.1016/B978-012109890-2/50031-7.
18. Zimmerman, B. J. Becoming a Self-Regulated Learner: An Overview. Theory Into Pract. 2002, 41 (2), 64–70. https://doi.org/10.1207/s15430421tip4102_2.
19. Panadero, E. A Review of Self-Regulated Learning: Six Models and Four Directions for Research. Front. Psychol. 2017, 8, 422. https://doi.org/10.3389/fpsyg.2017.00422.
20. Sitzmann, T.; Ely, K. A Meta-Analysis of Self-Regulated Learning in Work-Related Training and Educational Attainment: What We Know and Where We Need to Go. Psychol. Bull. 2011, 137 (3), 421–442. https://doi.org/10.1037/a0022777.
21. van der Linden, J.; van der Vleuten, C.; Nieuwenhuis, L.; van Schilt-Mol, T. Formative Use of Assessment to Foster Self-Regulated Learning: The Alignment of Teachers’ Conceptions and Classroom Assessment Practices. J. Form. Des. Learn. 2023, 7 (2), 195–207. https://doi.org/10.1007/s41686-023-00082-8.
22. Winne, P. H. Cognition and Metacognition Within Self-Regulated Learning. In Handbook of Self-Regulation of Learning and Performance, 2nd ed.; Schunk, D. H., Greene, J. A., Eds.; Educational Psychology Handbook Series; Routledge: New York, 2018; pp 36–48. https://doi.org/10.4324/9781315697048-3.
23. Wirth, J.; Stebner, F.; Trypke, M.; Schuster, C.; Leutner, D. An Interactive Layers Model of Self-Regulated Learning and Cognitive Load. Educ. Psychol. Rev. 2020, 32 (4), 1127–1149. https://doi.org/10.1007/s10648-020-09568-4.
24. Dent, A. L.; Koenka, A. C. The Relation between Self-Regulated Learning and Academic Achievement across Childhood and Adolescence: A Meta-Analysis. Educ. Psychol. Rev. 2016, 28 (3), 425–474. https://doi.org/10.1007/s10648-015-9320-8.
25. de Boer, H.; Donker, A. S.; van der Werf, M. P. C. Effects of the Attributes of Educational Interventions on Students’ Academic Performance. Rev. Educ. Res. 2014, 84 (4), 509–545. https://doi.org/10.3102/0034654314540006.
26. de Boer, H.; Donker, A. S.; Kostons, D. D.; van der Werf, G. P. Long-Term Effects of Metacognitive Strategy Instruction on Student Academic Performance: A Meta-Analysis. Educ. Res. Rev. 2018, 24, 98–115. https://doi.org/10.1016/j.edurev.2018.03.002.
27. Dignath, C.; Büttner, G. Components of Fostering Self-Regulated Learning Among Students. A Meta-Analysis on Intervention Studies at Primary and Secondary School Level. Metacog. Learn. 2008, 3 (3), 231–264. https://doi.org/10.1007/s11409-008-9029-x.
28. Andrade, H. L. A Critical Review of Research on Student Self-Assessment. Front. Educ. 2019, 4. https://doi.org/10.3389/feduc.2019.00087.
29. Broadbent, J.; Sharman, S.; Panadero, E.; Fuller-Tyszkiewicz, M. How Does Self-Regulated Learning Influence Formative Assessment and Summative Grade? Comparing Online and Blended Learners. Internet High. Educ. 2021, 50, 100805. https://doi.org/10.1016/j.iheduc.2021.100805.
30. Kostons, D.; van Gog, T.; Paas, F. Training Self-Assessment and Task-Selection Skills: A Cognitive Approach to Improving Self-Regulated Learning. Learn. Instr. 2012, 22 (2), 121–132. https://doi.org/10.1016/j.learninstruc.2011.08.004.
31. Li, J.; Yongqi Gu, P. Formative Assessment for Self-Regulated Learning: Evidence from a Teacher Continuing Professional Development Programme. System 2024, 125, 103414. https://doi.org/10.1016/j.system.2024.103414.
32. Panadero, E.; Andrade, H.; Brookhart, S. Fusing Self-Regulated Learning and Formative Assessment: A Roadmap of Where We Are, How We Got Here, and Where We Are Going. Aust. Educ. Res. 2018, 45 (1), 13–31. https://doi.org/10.1007/s13384-018-0258-y.
33. Yan, Z.; Lao, H.; Panadero, E.; Fernández-Castilla, B.; Yang, L.; Yang, M. Effects of Self-Assessment and Peer-Assessment Interventions on Academic Performance: A Meta-Analysis. Educ. Res. Rev. 2022, 37, 100484. https://doi.org/10.1016/j.edurev.2022.100484.
34. Köppe, C.; Verhoeff, R. P.; van Joolingen, W. Elements for Understanding and Fostering Self-Assessment of Learning Artifacts in Higher Education. Front. Educ. 2024, 9. https://doi.org/10.3389/feduc.2024.1213108.
35. Feldman-Maggor, Y.; Blonder, R.; Tuvi-Arad, I. Let Them Choose: Optional Assignments and Online Learning Patterns as Predictors of Success in Online General Chemistry Courses. Internet High. Educ. 2022, 55, 100867. https://doi.org/10.1016/j.iheduc.2022.100867.
36. Feldman-Maggor, Y. Identifying Self-Regulated Learning in Chemistry Classes – A Good Practice Report. Chem. Teach. Int. 2023, 5 (2), 203–211. https://doi.org/10.1515/cti-2022-0036.
37. Jasper, L.; Melle, I. Scaffolding Self-Regulated Problem Solving: The Influence of Content-Independent Metacognitive Prompts on Students’ General Problem-Solving Skills. Chem. Teach. Int. 2025, 1–10. https://doi.org/10.1515/cti-2025-0012.
38. Seibert, J.; Heuser, K.; Lang, V.; Perels, F.; Huwer, J.; Kay, C. W. M. Multitouch Experiment Instructions to Promote Self-Regulation in Inquiry-Based Learning in School Laboratories. J. Chem. Educ. 2021, 98 (5), 1602–1609. https://doi.org/10.1021/acs.jchemed.0c01177.
39. Veenman, M. V. J. Assessing Metacognitive Deficiencies and Effectively Instructing Metacognitive Skills. Teach. Coll. Rec. 2017, 119 (13), 1–14. https://doi.org/10.1177/016146811711901303.
40. Black, P.; Wiliam, D. Developing the Theory of Formative Assessment. Educ. Assess. Eval. Account. 2009, 21 (1), 5–31. https://doi.org/10.1007/s11092-008-9068-5.
41. Clark, I. Formative Assessment: Assessment Is for Self-Regulated Learning. Educ. Psychol. Rev. 2012, 24 (2), 205–249. https://doi.org/10.1007/s10648-011-9191-6.
42. Brown, G. T. L.; Harris, L. R. Student Self-Assessment. In SAGE Handbook of Research on Classroom Assessment; McMillan, J., Ed.; SAGE: Thousand Oaks, 2013; pp 367–393. https://doi.org/10.4135/9781452218649.n21.
43. Panadero, E.; Jonsson, A.; Strijbos, J.-W. Scaffolding Self-Regulated Learning through Self-Assessment and Peer Assessment: Guidelines for Classroom Implementation. In Assessment for Learning; Laveault, D., Allal, L., Eds.; The Enabling Power of Assessment Series, Vol. 4; Springer: Cham, 2016; pp 311–326. https://doi.org/10.1007/978-3-319-39211-0_18.
44. Panadero, E.; Alonso-Tapia, J. How Do Students Self-Regulate? Review of Zimmerman’s Cyclical Model of Self-Regulated Learning. An. Psicol. 2014, 30 (2), 450–462. https://doi.org/10.6018/analesps.30.2.167221.
45. Anderson, L. W.; Krathwohl, D. R. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives; Longman: New York, 2009.
46. Bloom, B. S.; Engelhart, M. D.; Furst, E. J.; Hill, W. H.; Krathwohl, D. R. Taxonomy of Educational Objectives: The Classification of Educational Goals; Handbook I: The Cognitive Domain; Longmans: New York, 1956.
47. Mager, R. F. Preparing Instructional Objectives: A Critical Tool in the Development of Effective Instruction, 3rd ed., completely rev.; The Mager Six-Pack; Center for Effective Performance: Atlanta, 1997.
48. Rodriguez, M. C.; Albano, A. D. The College Instructor’s Guide to Writing Test Items: Measuring Student Learning, 1st ed.; Routledge: New York, 2017. https://doi.org/10.4324/9781315714776.
49. Duchastel, P. C.; Merrill, P. F. The Effects of Behavioral Objectives on Learning: A Review of Empirical Studies. Rev. Educ. Res. 1973, 43 (1), 53–69. https://doi.org/10.3102/00346543043001053.
50. Marzano, R.; Gaddy, B.; Dean, C. C. What Works in Classroom Instruction; Mid-continent Research for Education and Learning (McREL): Aurora, 2000.
51. Osueke, B.; Mekonnen, B.; Stanton, J. D. How Undergraduate Science Students Use Learning Objectives to Study. J. Microbiol. Biol. Educ. 2018, 19 (2). https://doi.org/10.1128/jmbe.v19i2.1510.
52. Wirth, J.; Künsting, J.; Leutner, D. The Impact of Goal Specificity and Goal Type on Learning Outcome and Cognitive Load. Comput. Hum. Behav. 2009, 25 (2), 299–305. https://doi.org/10.1016/j.chb.2008.12.004.
53. Minbiole, J. Improving Course Coherence & Assessment Rigor: “Understanding by Design” in a Nonmajors Biology Course. Am. Biol. Teach. 2016, 78 (6), 463–470. https://doi.org/10.1525/abt.2016.78.6.463.
54. Brooks, S.; Dobbins, K.; Scott, J. J.; Rawlinson, M.; Norman, R. I. Learning About Learning Outcomes: The Student Perspective. Teach. High. Educ. 2014, 19 (6), 721–733. https://doi.org/10.1080/13562517.2014.901964.
55. Bergander, N.; Melle, I. Lerntransparenz in einer digitalen Lernumgebung [Learning Transparency in a Digital Learning Environment]. In Entdecken, lehren und forschen im Schülerlabor. Jahrestagung in Bochum 2024 [Discovering, Teaching, and Researching in the Student Laboratory. Annual Conference in Bochum 2024]; van Vorst, H., Ed.; Universität Duisburg-Essen, Vol. 45, 2025.
56. Moos, D. C.; Stewart, C. A. Self-Regulated Learning with Hypermedia: Bringing Motivation into the Conversation. In International Handbook of Metacognition and Learning Technologies; Azevedo, R., Aleven, V., Eds.; Springer International Handbooks of Education, Vol. 26; Springer: New York, 2013; pp 683–695. https://doi.org/10.1007/978-1-4419-5546-3_45.
57. Azevedo, R.; Moos, D. C.; Johnson, A. M.; Chauncey, A. D. Measuring Cognitive and Metacognitive Regulatory Processes During Hypermedia Learning: Issues and Challenges. Educ. Psychol. 2010, 45 (4), 210–223. https://doi.org/10.1080/00461520.2010.515934.
58. Scheiter, K.; Gerjets, P. Learner Control in Hypermedia Environments. Educ. Psychol. Rev. 2007, 19 (3), 285–307. https://doi.org/10.1007/s10648-007-9046-3.
59. Schuster, C.; Stebner, F.; Geukes, S.; Jansen, M.; Leutner, D.; Wirth, J. The Effects of Direct and Indirect Training in Metacognitive Learning Strategies on Near and Far Transfer in Self-Regulated Learning. Learn. Instr. 2023, 83, 101708. https://doi.org/10.1016/j.learninstruc.2022.101708.
60. Dignath, C.; Veenman, M. V. J. The Role of Direct Strategy Instruction and Indirect Activation of Self-Regulated Learning—Evidence from Classroom Observation Studies. Educ. Psychol. Rev. 2021, 33 (2), 489–533. https://doi.org/10.1007/s10648-020-09534-0.
61. Perry, N.; Phillips, L.; Dowler, J. Examining Features of Tasks and Their Potential to Promote Self-Regulated Learning. Teach. Coll. Rec. 2004, 106 (9), 1854–1878. https://doi.org/10.1111/j.1467-9620.2004.00408.x.
62. Dyrna, J. Mit digitalen Medien selbstgesteuert lernen? Ansätze zur Ermöglichung und Förderung von Selbststeuerung in technologieunterstützten Lernprozessen [Self-Regulated Learning with Digital Media? Approaches to Enabling and Promoting Self-Regulation in Technology-Supported Learning Processes]. In Selbstgesteuertes Lernen in der beruflichen Weiterbildung: Ein Handbuch für Theorie und Praxis [Self-Regulated Learning in Vocational Training: A Handbook for Theory and Practice]; Dyrna, J., Riedel, J., Schulze-Achatz, S., Köhler, T., Eds.; Waxmann: Münster, 2021; pp 247–261. https://doi.org/10.31244/9783830993643.
63. Perry, N. E. Young Children’s Self-Regulated Learning and Contexts that Support It. J. Educ. Psychol. 1998, 90 (4), 715–729. https://doi.org/10.1037/0022-0663.90.4.715.
64. Barnard, M.; Whitt, E.; McDonald, S. Learning Objectives and Their Effects on Learning and Assessment Preparation: Insights from an Undergraduate Psychology Course. Assess. Eval. High. Educ. 2021, 46 (5), 673–684. https://doi.org/10.1080/02602938.2020.1822281.
65. Wong, J.; Baars, M.; Davis, D.; van der Zee, T.; Houben, G.-J.; Paas, F. Supporting Self-Regulated Learning in Online Learning Environments and MOOCs: A Systematic Review. Int. J. Hum.-Comput. Interact. 2019, 35 (4–5), 356–373. https://doi.org/10.1080/10447318.2018.1543084.
66. Genially Web, S.L. Genially. https://genially.com/ (accessed 2025-08-19).
67. Orr, R. B.; Gormally, C.; Brickman, P. A Road Map for Planning Course Transformation Using Learning Objectives. CBE Life Sci. Educ. 2024, 23 (2), es4. https://doi.org/10.1187/cbe.23-06-0114.
68. Sewagegn, A. A. Learning Objective and Assessment Linkage: Its Contribution to Meaningful Student Learning. Univ. J. Educ. Res. 2020, 8 (11), 5044–5052. https://doi.org/10.13189/ujer.2020.081104.
69. Dresel, M.; Lämmle, L. Motivation [Motivation]. In Emotion, Motivation und selbstreguliertes Lernen [Emotion, Motivation, and Self-Regulated Learning]; Götz, T., Ed.; StandardWissen Lehramt; Schöningh: Stuttgart, 2011; pp 79–142.
70. Felder, R. M.; Brent, R. Teaching and Learning STEM: A Practical Guide, 2nd ed.; John Wiley & Sons: Newark, 2024.
71. Fessl, A.; Maitz, K.; Dennerlein, S.; Pammer-Schindler, V. The Impact of Explicating Learning Goals on Teaching and Learning in Higher Education: Evaluating a Learning Goal Visualization. In Technology-Enhanced Learning for a Free, Safe, and Sustainable World: 16th European Conference on Technology Enhanced Learning; de Laet, T., Klemke, R., Alario-Hoyos, C., Hilliger, I., Ortega-Arranz, A., Eds.; Springer: Cham, 2021. https://doi.org/10.1007/978-3-030-86436-1_1.
© 2025 the author(s), published by De Gruyter, Berlin/Boston
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.