Abstract
The aim of this work is to use the laboratory platform as a unique tool for engaging students and encouraging them to develop critical thinking and guided self-evaluation abilities. Moreover, because teaching in the lab is conducted in small groups, difficulties can be detected at an early stage, allowing immediate feedback and relevant assistance. Taking advantage of this unique environment, we have developed a rubric-score-based interface for students and teaching assistants (TAs), designed to encourage students’ reflection and engagement while enabling a rational and valid grading of students’ performance by the TAs. The rubric consists of eight parameters, which are evaluated in each lab session by both the student and the TA. Following the lab session, the lab manager compares the two evaluations, enabling real-time intervention and formative assessment, along with important insights into the student’s learning process. Our preliminary studies, performed in chemistry laboratories, indicate that this methodology enables efficient formative assessment of the learning process during the course. This tool may help students rapidly improve their performance in laboratory courses by stimulating a continuous reflection process.
1 Theoretical background
In this work we describe a specific use of the chemistry laboratory environment for engineering students, aimed at creating a better interface between the student and the teaching assistant (TA). Laboratory courses have been part of science and engineering curricula for decades, and their learning outcomes at various levels were recently reviewed (Agustian et al., 2022). Over the years, the efficiency of this form of teaching has been widely debated. In particular, a few years ago the need to provide evidence for the importance of laboratory courses and for the learning process of the students was explicitly raised (Bretz, 2019). Laboratory courses require substantial and expensive resources, both in materials and equipment and in staff training, so it is of great importance to be able to evaluate their contribution to the students. We live in an era of increasing need to prepare and equip students, especially science, technology, engineering and mathematics (STEM) students, with 21st century skills, which can also be developed using the laboratory platform (Lavi et al., 2021).
In contrast to frontal courses, which can be taught and learned in large groups and remotely (Leontyev & Baranov, 2013), laboratory courses are unique, mainly because they require the engagement of all participants, both students and instructors, on many levels (Feisel & Rosa, 2005; Hofstein, 2004; Hofstein et al., 2005; Hofstein & Lunetta, 1982, 2004; Hofstein & Mamloc-Naaman, 2007). Focusing specifically on chemistry laboratory courses, a major change in their goals has been observed over the years: from learning factual information (Zuckerman, 1986), to concept development (Abrahams et al., 1997) and scientific processes (Lloyd, 1992). A review published in 2013, dedicated to the goals of laboratory courses in the USA, raised many broader goals, including group work, communication skills and scientific writing ability (Bruck & Towns, 2013). Based on these data, it is easy to see that laboratory courses can serve as an excellent platform for multi-dimensional teaching and engaged learning. As has been suggested, good academic teaching (Hativa, 2001) is characterized by the ability to build high motivation and engagement among students, along with immediate feedback, visualization of theoretical concepts and, finally, personalized teaching in small groups. All of these characteristics exist naturally in laboratory courses. The fact that teaching is conducted in small groups emphasizes the major role of the laboratory instructor or teaching assistant (TA). The TA is responsible both for creating an active learning environment and for clearly defining which main skills are desired and graded in a specific lab session (Reid & Shah, 2007). It is important to mention that all laboratory courses require great effort on the part of the student, and therefore special attention should be given to an ethical and rational grading process.
Moreover, since various skills are developed during the course, it is important to be able to evaluate the students’ learning process and to reflect it back to them through a formative assessment process. In most cases, grading in the laboratory is composed of three components: preparation, performance and the post-lab report. While the preparation and lab report are easily assessed using standard methods of evaluation, the performance in the lab is often evaluated intuitively rather than rationally by the TA. This situation can lead to biased grading due to a lack of clarity and consistency between students and teachers. The laboratory is known to be a complex learning environment (Seery et al., 2018), so specific attention should be given to the assessment process. As suggested by Seery, we want our students to consider the laboratory as “the place to learn how to do Chemistry” (Seery, 2020), so a proper way of evaluating and monitoring the learning process of the experimental work should be developed. Generally, both TAs and students have difficulty defining what constitutes a good accomplishment of experimental work, resulting in three major problems in the evaluation of students’ performance and learning process in chemistry laboratory courses: a lack of rational and ethical grading of students’ performance by the TAs; a lack of formative assessment and immediate feedback; and a lack of insight into the student’s learning process. All of these problems are related to the interface between the TA and the student.
2 Methodology
To address the three issues mentioned above, a brainstorming meeting was conducted with all the lab managers and teachers in order to define parameters that characterize desirable experimental work (Sfez, 2015). After choosing eight parameters, we developed a rubric-score-based assessment for the TAs, in which the parameters for evaluating practical work, as well as their relative contribution coefficients, are clearly defined. The rubric was used only as a pilot, in order to obtain a more ethical grading process. It was applied by the TAs, who are usually M.Sc. or Ph.D. students in chemistry or engineering with no formal teaching background. The rubric consists of the following parameters, each with its coefficient, identified as components of performance evaluation in laboratories:
Time: Time required for the student to actually start the experiment (10 %).
Order: Order and organization of the work environment (10 %).
GLP: Working procedures according to good laboratory practice (GLP) (10 %).
Efficiency: Efficient work in the defined time frame: good planning and division of tasks in a useful way (20 %).
Independence: Independence during the work: self-confidence and understanding of the experimental procedure (15 %).
Result: Evaluation of the experimental results in terms of accuracy and reproducibility (10 %).
Understanding: The ability to analyze and understand the meaning of the experimental results and suggest conclusions based on them (15 %).
Intuitive evaluation: Intuitive qualitative evaluation of the work (10 %): since we try to quantify a qualitative impression, it was important to include an intuition-based parameter as well.
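To make the coefficients above concrete, the weighted combination of the eight parameters can be sketched as follows. This is an illustrative sketch, not the authors’ implementation; the parameter keys and the 0–100 score scale are assumptions for the example.

```python
# Coefficients taken from the rubric described above; each parameter score is
# assumed to lie on a 0-100 scale (an illustrative assumption).
WEIGHTS = {
    "time": 0.10,
    "order": 0.10,
    "glp": 0.10,
    "efficiency": 0.20,
    "independence": 0.15,
    "result": 0.10,
    "understanding": 0.15,
    "intuitive": 0.10,
}

def weighted_grade(scores: dict) -> float:
    """Return the coefficient-weighted overall grade for one lab session."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # coefficients sum to 100 %
    return sum(WEIGHTS[p] * scores[p] for p in WEIGHTS)

# Hypothetical session scores for one student:
example = {"time": 90, "order": 80, "glp": 85, "efficiency": 70,
           "independence": 75, "result": 88, "understanding": 72, "intuitive": 80}
print(weighted_grade(example))
```

Note that the weights sum to one, so the overall grade stays on the same scale as the individual parameter scores.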
The TAs were asked to grade the students both intuitively and based on the rubric; they were asked not to simply sum the coefficients of all the parameters. Moreover, in order to improve the TA/student interface and the efficiency of the learning and teaching process, the same rubric score was also given to the students for self-evaluation. The students could choose whether to participate in this research, and most of them gladly agreed to take part. At the end of each lab session, each student received a questionnaire with the same parameters but without coefficients, in which they filled in their self-evaluation for each parameter. After each lab session, the lab manager assembled both evaluations and compared the TAs’ and students’ results. By comparing the TA and student evaluations, insight into the learning process of each student can be obtained. This self-evaluation process also helped to increase and encourage the students’ engagement in the course.
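The lab manager’s comparison step can be sketched as a simple per-parameter gap check. This is a hypothetical sketch; the threshold value, data layout and parameter names are illustrative assumptions, not part of the authors’ procedure.

```python
# Pair a student's self-evaluation with the TA's rubric evaluation and flag
# parameters where the gap exceeds a chosen threshold, so the lab manager can
# intervene after the session. Threshold and data are illustrative assumptions.
def flag_discrepancies(student_eval: dict, ta_eval: dict, threshold: float = 15.0) -> dict:
    """Return {parameter: student - TA gap} where |gap| exceeds the threshold."""
    return {p: student_eval[p] - ta_eval[p]
            for p in ta_eval
            if abs(student_eval[p] - ta_eval[p]) > threshold}

# Hypothetical scores for three of the eight parameters:
student = {"efficiency": 95, "independence": 90, "glp": 80}
ta = {"efficiency": 70, "independence": 85, "glp": 78}
print(flag_discrepancies(student, ta))
```

A positive flagged gap marks over-estimation by the student, which, as discussed below, tends to persist for weaker students.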
3 Results and discussions
Interesting insights into the learning process during laboratory courses were obtained by comparing the students’ self-evaluation with the TA evaluation after each lab session. Figure 1 shows the comparison between the TA’s and the students’ evaluations in the lab sessions of first-year mechanical engineering students, in a general chemistry course that included three lab sessions per student. Of all the students, we present here two extreme cases, an excellent and a mediocre student, chosen based on their chemistry grades in the midterm and final exams. As can be seen, in the first lab session both students evaluated themselves higher than the TA did on most of the parameters. However, in the third and last session, the excellent student evaluated his work closer to the TA’s evaluation, or even lower, whereas the mediocre student continued to evaluate himself much higher than the TA did. This type of observation is already known and documented (Falchikov & Boud, 1989).

Figure 1: Comparison of the TA’s rubric-score-based evaluation with the self-evaluation of an excellent and a mediocre student for the eight rubric parameters after the first and last lab sessions. The maximal possible value is noted in green, the TA evaluation in red and the student’s estimation at the end of the lab session in blue. A misunderstanding of the requirements is observed, as seen in the inaccurate self-estimation compared to the TA’s, in both sessions. However, for the excellent student, a trend towards the TA’s evaluation can be seen when comparing the first and last sessions.
It can be seen that a sort of non-verbal comprehension was obtained by the excellent student, which was not reached by the mediocre student. Moreover, it seems that the ability to evaluate oneself correctly is one of the characteristics of the excellent student. In any case, these results may also suggest an inefficient student/TA interface, leading to a lack of understanding among mediocre students, who need more explicit instructions. This fact should be taken into account when preparing TAs for teaching in the laboratory.
The next step of this preliminary research was to examine longer laboratory courses in chemistry, consisting of eight sessions. We focused on four representative types of students: excellent, good, mediocre and failing (almost quitting) students. We chose these students based on their grades in previous chemistry courses. This time we chose to follow the relative difference in assessment between the TA and the student, using the relative assessment gap (ΔRAG) to evaluate their learning curves during the course. The ΔRAG for each parameter was calculated as follows:
ΔRAG = (ES − ET)/EM

where ES and ET stand for the student evaluation and the TA evaluation, respectively, and EM is the maximum possible grade for the specific parameter (Figure 1). It is important to note that the relative gap can have a negative value (if the student’s self-evaluation is lower than the TA’s evaluation) or a positive value, while a zero value corresponds to identical evaluations by the student and the TA. Figure 2 presents the ΔRAG for these students and their learning process for each of the rubric-score parameters. As can be seen, each student has a specific learning process, which can be used for monitoring and guidance in real time and not only at the end of the course. For the excellent student, the ΔRAG is almost null, confirming the trend shown in Figure 1. In contrast, the quitting student systematically evaluated himself lower than the TA did. In this case, we were able to identify and help the student in real time, resulting in his successful graduation. Based on this intervention, we propose using these learning curves for immediate feedback and formative assessment during the semester, assisting students in real time (Sadler, 2010; Sfez, 2015). In the few cases in which we chose to intervene, the students were grateful and described the interventions as a very positive experience. They mainly emphasized the personalized and detailed attention they received, which helped them improve even their general learning skills.

Figure 2: ΔRAG values for each rubric parameter for four types of students as a function of the lab session number.
4 Conclusions
In this work we described a rubric-score-based grading method for laboratories that was used in our institution as a pilot project. We suggest using the laboratory platform to develop value-adding, personalized, student-centered teaching and learning abilities, based on real-time feedback along with formative assessment. Using rubric scores for TAs and students’ self-evaluation, which improves the effectiveness of the learning process by providing specific and real-time feedback, we were able to assist students. Although a more in-depth statistical analysis is required, we suggest that this method may provide insights into the learning process in laboratory courses and increase students’ engagement and interest. Since this methodology includes a self-evaluation process and immediate feedback, it might specifically help mediocre students, who can greatly and rapidly improve their performance in all courses through the stimulation of a continuous reflection process. We believe this approach could be generalized to other disciplines and become more user-friendly if a proper interface is provided to the TAs and the students. We are now focusing on the development of a user-friendly interface for both students and TAs, along with machine-learning analysis for finding better and valid parameters for the rubric score.
Acknowledgments
We would like to thank all the TAs, lab directors and our dear students from the Mechanical, Pharmaceutical and Materials Engineering departments at Azrieli College of Engineering who agreed to take part in this project. We also thank A. Harel, E. Hefer and T. Lewinstein from the Industrial Engineering department, and Y. Hassin, B. Rosental and N. Catz from the Software Engineering department at Azrieli College of Engineering, for their contributions.
Research ethics: Not applicable.
Author contributions: The author has accepted responsibility for the entire content of this manuscript and approved its submission.
Competing interests: The author states no conflict of interest.
Research funding: Funding from the Council of Higher Education in Israel and Azrieli College of Engineering research funds is gratefully acknowledged.
Data availability: Not applicable.
References
Abrahams, M. R., Cracolia, M. S., Palmer Graves, A., Aldhamash, A. H., Kihega, J. G., Palma Gil, J. G., & Varghese, V. (1997). The nature and state of general chemistry laboratory courses offered by colleges and universities in the United States. Journal of Chemical Education, 74, 591–594. https://doi.org/10.1021/ed074p591
Agustian, H. Y., Finne, L. T., Jørgensen, J. T., Pedersen, M. I., Christiansen, F. V., Gammelgaard, B., & Nielsen, J. A. (2022). Learning outcomes of university chemistry teaching in laboratories: A systematic review of empirical literature. The Review of Education, 10, 1–41. https://doi.org/10.1002/rev3.3360
Bretz, S. L. (2019). Evidence for the importance of laboratory courses. Journal of Chemical Education, 96, 193–195. https://doi.org/10.1021/acs.jchemed.8b00874
Bruck, A. D., & Towns, M. (2013). Development, implementation, and analysis of a national survey of faculty goals for undergraduate chemistry laboratory. Journal of Chemical Education, 90, 685–693. https://doi.org/10.1021/ed300371n
Feisel, L. D., & Rosa, A. J. (2005). The role of the laboratory in undergraduate engineering education. Journal of Engineering Education, 94, 121–130. https://doi.org/10.1002/j.2168-9830.2005.tb00833.x
Falchikov, N., & Boud, D. (1989). Student self-assessment in higher education: A meta-analysis. Review of Educational Research, 59, 395–430. https://doi.org/10.3102/00346543059004395
Hativa, N. (2001). Teaching for effective learning in higher education. Springer. https://doi.org/10.1007/978-94-010-0902-7
Hofstein, A. (2004). The laboratory in chemistry education: Thirty years of experience with developments, implementation and evaluation. Chemistry Education: Research and Practice, 5, 247–264. https://doi.org/10.1039/b4rp90027h
Hofstein, A., & Lunetta, V. N. (1982). The role of the laboratory in science teaching: Neglected aspects of research. Review of Educational Research, 52, 201–207. https://doi.org/10.3102/00346543052002201
Hofstein, A., & Lunetta, V. N. (2004). The laboratory in science education: Foundation for the 21st century. Science Education, 88, 28–54. https://doi.org/10.1002/sce.10106
Hofstein, A., & Mamloc-Naaman, R. (2007). The laboratory in science education: The state of the art. Chemistry Education: Research and Practice, 8, 105–108. https://doi.org/10.1039/b7rp90003a
Hofstein, A., Navon, O., Kipnis, M., & Mamloc-Naaman, R. (2005). Developing students’ ability to ask more and better questions resulting from inquiry type chemistry laboratories. Journal of Research in Science Teaching, 42, 791–806. https://doi.org/10.1002/tea.20072
Lavi, R., Tal, M., & Dori, Y. J. (2021). Perception of STEM alumni and students on developing 21st century skills through methods of teaching and learning. Studies in Educational Evaluation, 70, 101002. https://doi.org/10.1016/j.stueduc.2021.101002
Leontyev, A., & Baranov, D. (2013). Massive open online courses in chemistry: A comparative overview of platforms and features. Journal of Chemical Education, 90, 1533–1539. https://doi.org/10.1021/ed400283x
Lloyd, B. W. (1992). The 20th century general chemistry laboratory: Its various faces. Journal of Chemical Education, 69, 866–869. https://doi.org/10.1021/ed069p866
Reid, N., & Shah, I. (2007). The role of laboratory work in university chemistry. Chemistry Education: Research and Practice, 8, 172–185. https://doi.org/10.1039/b5rp90026c
Sadler, D. R. (2010). Beyond feedback: Developing student capability in complex appraisal. Assessment & Evaluation in Higher Education, 35, 535–550. https://doi.org/10.1080/02602930903541015
Seery, M. K., Agustian, H. Y., & Zhang, X. (2018). A framework for learning in the chemistry laboratory. Israel Journal of Chemistry, 58, 1–9.
Seery, M. K. (2020). Establishing the laboratory as the place to learn how to do chemistry. Journal of Chemical Education, 97, 1511–1514. https://doi.org/10.1021/acs.jchemed.9b00764
Sfez, R. (2015). Insights on learning and grading processes in laboratory courses. The Israel Chemist and Engineer, 1, 42–46.
Zuckerman, J. J. (1986). The coming renaissance of descriptive chemistry. Journal of Chemical Education, 63, 829–833. https://doi.org/10.1021/ed063p829
© 2023 the author(s), published by De Gruyter, Berlin/Boston
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.