Open Access Article

Spatial constructions of time: exploring Co-speech gestures in lectures on programming

  • Karin Stolpe

    Karin Stolpe is a senior associate professor in science and technology education at the Department of Behavioural Sciences and Learning at Linköping University, Sweden. Her research concerns the interaction between humans and technology, with a special interest in how language develops as humans communicate with artificial intelligence through prompting.

    , Marlene Johansson Falck

    Marlene Johansson Falck is a professor of English linguistics at Umeå University, Sweden. Her main research interests are in the fields of language, spatial cognition, metaphor, and metaphor identification. She was part of the founding committee of the Swedish Association for Language and Cognition (now Scandinavian Association for Language and Cognition) and was a long-time member of the governing board. She is one of the founders of MetNet Scandinavia and associate editor of Metaphor and Symbol.

and Andreas Larsson

    Andreas Larsson is a post-doctoral scholar in technology education at the Department of Behavioural Sciences and Learning at Linköping University, Sweden. He is also the director of the Swedish National Centre for Science and Technology Education. His main research interests are the relationship between humans and cultural artefacts in educational environments and how teachers’ beliefs and contextual knowledge impact teaching.

Published/Copyright: 22 April 2025

Abstract

Spatiotemporal metaphors strongly affect our language about time. Time can be construed as an object that moves through space (time flies) or as a landscape through which we move (we are heading towards the weekend). Evidence of how speakers construe time can be found by observing their gestures. This study explores spatial constructions of time in co-speech gestures during programming lectures in Swedish upper-secondary classrooms. Data were collected from teachers’ co-speech gestures while lecturing on programming, a context rich in temporal and sequential references. The results show that the teachers gesture in three directions, each with a specific function. Gestures along the vertical axis are used to talk about writing code as events on a vertical timeline. The programming convention where code lines are ordered top-down, indicating events in a particular order, is suggested as an explanation. Gestures along the sagittal axis are used when the teachers take an internal perspective. Gestures along the lateral axis are used when discussing events. This is a first exploration of how time concepts are construed in Swedish programming classrooms. The research provides a foundation for more extensive studies on the role of co-speech gestures in conceptualising time, particularly in educational settings involving technological interfaces.

1 Introduction

When speakers of English talk about the weeks that follow, they construe time as an object that moves through space (Lakoff and Johnson 1999: 143), and when they say that they have passed a deadline, time is construed as a landscape that they move through (Lakoff and Johnson 1999). In the latter example, time stands still while we move in the direction of the future. In another subcase of the time passes us metaphor, time is an object that moves toward us, as in the weekend approaches (Lakoff and Johnson 1980). Even though different languages show similarities, for example in speaking of time in terms of space (Bender and Beller 2014), differences have also been identified. For example, Bylund and Athanasopoulos (2017) conducted experiments with Spanish–Swedish bilinguals, showing that Swedish speakers were misled by stimulus length, whereas Spanish speakers were misled by stimulus size/quantity. The authors explain the results by the fact that expressions of duration differ between the languages: in Swedish, one speaks of long/short time, while in Spanish, talking about much/small time is more common (Bylund and Athanasopoulos 2017).

Observations of patterns such as these and results from other scholarly work (e.g., Gentner et al. 2002; Núñez et al. 2006) have shown that language is structured by spatiotemporal metaphors and that spatiotemporal metaphors strongly affect how people represent the order and duration of events (Boroditsky 2000; Casasanto and Boroditsky 2008). Hence, humans conceptualise time primarily in terms of space (Lakoff and Johnson 1999; Núñez and Cooperrider 2013). These spatial constructions appear both in language and in how speakers gesture (e.g., Cooperrider et al. 2022).

Earlier research suggests that systematic analyses of co-speech gestures convey speakers’ spatial constructions of time. McNeill (1992) argues that gestures and spoken language should be regarded as different sides of the same underlying mental process. Research on speakers’ co-speech gestures shows that speakers tend to gesture in different directions when talking about time. They use motion along the sagittal (front-back) axis to locate the past, the present, and the future (Núñez and Sweetser 2006) and motion along the lateral (left-right) axis when gesturing about sequences of events (Casasanto and Jasmin 2012).

People’s conceptualisation of time can be flexible (Santiago et al. 2007) and can be influenced by several different factors, such as the specific context (Casasanto and Bottini 2014), the age of the speaker (Bylund et al. 2020), and reading and writing direction (e.g., Casasanto 2016). According to Gu (2022), “the weight and respective role of different factors in shaping an individual’s space-time mappings are still unclear” (p. 226).

Human construction of time is not only a question of linguistics. Casasanto and Jasmin (2012) suggest that speakers adopt perspectives grounded in interaction with cultural artefacts, such as graphs, calendars, and written text. Since those artefacts emerged later than spoken language, this could explain a mismatch between how time is spatially constructed in spoken language and in gestures, respectively (Casasanto and Jasmin 2012). When spoken language contradicts physical experience (e.g., writing direction), this can cause a dissociation between temporal language and temporal gestures (Casasanto 2016). Thus, the direction of time gestures seems to differ depending on cultural practices (Boroditsky and Gaby 2010; Cooperrider et al. 2022; Gu et al. 2019). Based on her research on how cultural artefacts such as calendars and clocks influence our way of conceptualising time, Duffy (2014) suggests that metaphor research should investigate the role of emerging interactive technologies as cultural artefacts.

Results from previous research have guided us towards the aim of this study: to explore the relationship between Swedish speakers’ gestures along different axes and how time is construed in spoken language in the context of programming education. Therefore, this study seeks to pave the way for more extensive research on human-computer interaction that could validate these first preliminary findings.

To meet the aim, this study concerns how Swedish speakers construe time in a naturalistic setting. We collected data from teachers’ co-speech gestures while lecturing on programming in Swedish upper-secondary classrooms. Programming classes were chosen because they are likely to involve topics that include time and order (Colburn and Shute 2017; Larsson and Stolpe 2023; Manches et al. 2020). The natural setting also means teachers can access cultural artefacts (e.g., computer screen and whiteboard) and abstract computing concepts (e.g., programming syntax and conventionalised programming concepts).

The following research questions are investigated:

  1. In what way are time concepts associated with co-speech gestures along the sagittal, lateral, or vertical axes?

  2. How do the observed time concepts and co-speech gestures of Swedish speakers in a naturalistic setting relate to cultural artefacts?

2 Previous literature

2.1 Spatial construction of time in speech and gesture

Gesture production is intimately linked to spoken language and thought (Núñez 2004). Speakers commonly produce hand gestures to convey meaning that is co-expressive with the meanings they express through language (Kendon 2004). Furthermore, they produce and develop their gestures in synchronicity with or in close proximity to speech (Church et al. 2014; Cienki 2016). Speakers use gestures to talk about time concepts when their language includes spatial metaphors for time. However, research on speakers’ co-speech gestures reveals that speakers tend to gesture in different directions when talking about time, and that this differs between languages. For example, speakers use motion along the sagittal (front-back) axis to locate the past, the present, and the future (Núñez and Sweetser 2006), and motion along the lateral (left-right) axis when gesturing about sequences of events (Casasanto and Jasmin 2012).

English speakers typically use the sagittal axis in spoken language (the meeting is ahead of us). However, when gesturing spontaneously, they often use the lateral axis, gesturing leftward for earlier times and rightward for later times (Casasanto and Jasmin 2012; Valenzuela et al. 2020). In contrast, Mandarin speakers use a variety of spatial axes to gesture about time, including all three: the lateral, the vertical, and the sagittal (Gu 2022).

Gestures along the sagittal axis generally involve a deictic centre that is co-located with the observer (i.e., the individual has an internal perspective on the time series) and may, in part, be motivated by speakers’ experience of forward motion (Núñez and Cooperrider 2013). Gestures along the lateral axis, on the other hand, take an external perspective on time as a series and appear to be shaped by cultural conventions such as writing systems and other graphical practices (Núñez and Cooperrider 2013).

Gestures along the vertical (top-down) axis may similarly be used to construe external perspectives on time (Núñez and Cooperrider 2013). Such gestures have been observed among speakers of Mandarin, who locate earlier times on top of later times (Gu et al. 2014).

Along with these different directional constructions of time, research in Spanish-speaking contexts suggests that time can also be spatially construed as length or quantity (Alcaraz Carrión and Valenzuela 2022). Time can be construed as a length, from one point to another (as in it has been a long time), or as a quantity (as in we have plenty of time). These two constructions are used with different types of gestures: when time is a length, gestures are performed in a lateral (left-to-right) direction, while gestures that co-occur with time as a quantity are performed in an outward direction (Alcaraz Carrión and Valenzuela 2022). In this study, we focus on time as a spatial construction.

2.2 The role of artefacts for spatial construction of time

Within an English-speaking context, the spoken language suggests that time moves along the sagittal axis (front/back), as in, for example, we leave the old times behind. However, when speakers gesture spontaneously, co-speech gestures along the lateral axis (left/right) outnumber sagittal ones (Casasanto and Jasmin 2012). One suggested explanation is that reading direction affects how time is spatialised. Speakers of English or Mandarin (who read from left to right) spatialise earlier times to the left (Casasanto and Jasmin 2012; Gu et al. 2019), whereas speakers of Hebrew (who read from right to left) spatialise earlier times to the right (Fuhrman et al. 2011). Similarly, a study of children’s spatialisation of time shows that English-speaking children preferred a left-to-right direction, Arabic speakers represented time from right to left, and Hebrew speakers were in between (Tversky et al. 1991).

In sum, converging evidence from linguistic analyses, analyses of co-speech gestures, and behavioural tasks has shown that humans use motion through space to think and talk about time (Casasanto and Jasmin 2012; Coulson and Cánovas 2009; de la Fuente et al. 2014; Fuhrman et al. 2011; Núñez and Cooperrider 2013; Santiago et al. 2007). Still, results from previous research suggest cultural differences in how time is spatially construed, both in speech and in co-speech gestures (Gu 2022).

2.3 Spatial construction of time in the context of human-computer interaction

In a technologically intensive world, we are surrounded by digital devices whose graphical interfaces share commonalities with physical objects: folders for storing documents, a floppy-disk symbol for saving work, and a pair of scissors symbolising the cut function (Pitt and Casasanto 2022). Such metaphorical expressions influence how we interact with computers, but also reflect our understanding and conceptualisation of computing concepts (Harper et al. 2024). Moreover, Pitt and Casasanto (2022) argue that spatial metaphors influence human-computer interaction, linking non-spatial conceptual domains with different aspects of space on all three axes: vertical space is used to talk about high and low numbers (big or small quantities), lateral space to talk about the left and right poles of a political scale, and sagittal space to talk about moving meetings forward and back in time.

Temporal logic in programming involves reasoning about the order of events. Software programs have a causal nature, meaning that “their internal state is described as changing over the course of time (e.g., regular, recent, immediate)” (Blackwell 2006: 13). Programming concepts such as recursive functions (Solomon et al. 2020), looping an algorithm (Larsson and Stolpe 2023), or surfing into a certain folder on the computer (Stolpe and Larsson 2023) have an inherent meaning of space and time. Moreover, computer programs generally consist of code lines, each representing a command or a process for the computer to execute. Reading the code from top to bottom can be seen as following a timeline, with each line executed sequentially, creating a progression of events. However, more complex structures in programming, such as loops and conditional statements, affect the order in which the code is executed.
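The point can be illustrated with a minimal sketch of our own (not code from the observed lectures): read top-down, the code lines form a timeline of events, while a loop makes the program revisit the same line at several points in time.

```typescript
// Each statement below executes in top-down order, like points on a timeline.
const events: string[] = [];

events.push("line 1: program starts");        // executed first
events.push("line 2: a value is computed");   // executed second

// A loop breaks the strict top-down timeline: the loop body runs twice,
// so the program returns to the same code line at two different times.
for (let i = 0; i < 2; i++) {
  events.push(`loop body, iteration ${i + 1}`);
}

events.push("last line: program ends");       // executed last

console.log(events.join("\n"));
```

Executed top to bottom, the program records five events; the two loop iterations show that the vertical position of a code line and the time of its execution need not coincide.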

2.4 Spatial construction of time in Swedish

Hitherto, spatial constructions of time in Swedish have not been thoroughly investigated. To our knowledge, there are only a few studies in which Swedish and Spanish bilingual speakers are compared (Athanasopoulos and Bylund 2023; Bylund and Athanasopoulos 2017). Even though there are similarities between the languages, the researchers also identify differences, as mentioned earlier. There are several reasons why Swedish speakers may be expected to gesture in the same direction as speakers of English. First, Swedish is a North Germanic language closely related to English. Second, numerous Swedish expressions suggest that in Swedish, just like in English, the future (Sw. framtiden, lit. forward time) is ahead and the past behind (e.g., lämna det förflutna bakom oss, En. leave the past behind) along the sagittal axis (cf., Núñez and Sweetser 2006). Third, because speakers of Swedish read from left to right, they may be expected to spatialise earlier times to the left and later times to the right, just like speakers of English do. However, although studies have investigated co-speech gestures on time concepts among speakers of many different languages (e.g., English, Spanish, Mandarin, and Yupno), very little is known about how Swedish speakers produce gestures on time concepts.

3 Methods

This study examines co-speech gestures on time by speakers of Swedish in the setting of a programming classroom. The study uses a case-based research design in which two programming teachers are viewed as individual cases. Although the teachers should be understood as individual cases, the lectures concerned similar programming content at the same difficulty level. Thus, the data collected and selected were similar and comparable. The study is a first exploration of how time concepts are construed in a Swedish programming classroom.

3.1 Data collection

Data for this study consist of video-recorded classroom observations of two Swedish-speaking teachers’ lectures on programming. The teachers were video-recorded individually in a naturalistic setting using two tripod-mounted cameras, capturing them from different angles. The teachers were asked to conduct their ordinary lectures without considering the cameras or the research project. To minimise the risk of affecting the teachers’ production of gestures, they were not informed about our exact research aim until after the data collection. After the lesson, we informed them of our interest in gestures, and they gave their renewed informed consent to participate in the research. This research follows the Swedish Research Council’s Good Research Practice (Vetenskapsrådet 2017).

3.2 Context of the study

For this study, sessions where the teachers were active in front of the whole class were selected for further analysis, rendering approximately 30 min of video per teacher. While lecturing, both teachers used whiteboards and screencast technologies to demonstrate code examples and to write and depict central programming concepts and ideas. The text (code lines) was written in horizontal lines (from left to right) and organised vertically (top-down) on the whiteboard or the screen.

3.3 Analytical procedure

Data were transcribed verbatim using MAXQDA Plus 2020. Coding was initially performed collaboratively by two of the researchers; later, parts of the material were coded individually by the same two researchers. In a first step, the video was observed without sound. When a researcher identified a potential co-speech gesture (co-speech gestures will henceforth be referred to as gestures), the video was stopped, and the gesture was observed repeatedly before it was coded as lateral, sagittal, vertical, or other. In a second step, the sound was turned on to determine the character of the gesture and whether it dealt with time (see below). All gestures were annotated in MAXQDA directly from the video. The analysis was performed in two steps: first a gestural analysis, then a linguistic analysis.

3.3.1 Gestural analysis

This study is mainly exploratory and qualitative and employs an inductive approach. Therefore, any co-speech gesture along one of the three axes (lateral, vertical, or sagittal) was included in the study. Furthermore, less distinct yet recurring gestures were included as data. Examples of such gestures include the teachers turning their hands outwards with palms up (henceforth palm-up gestures), gestures depicting container-like entities (metaphorical gestures), and gestures illustrating concrete objects or actions (iconic gestures). All gestures coded for are described and exemplified in Table 1. Palm-up gestures are distinguished by speakers rotating their forearms so that their palms turn upwards (e.g., Cooperrider et al. 2018). Both metaphorical gestures (such as conduit gestures) and iconic gestures are placed in McNeill’s taxonomy of gestures (McNeill 1992). However, to distinguish between, for example, a reference to an object and a reference to a (computing) process, we have chosen to keep them apart. Gestures performed hidden behind the computer screen or a student were not included in the data, nor were deictic gestures, such as when a teacher pointed at the whiteboard, the computer screen, or one of the students. Unclear or small gestures, so-called beats (McNeill 1992) – for example, gestures with an amplitude too low to allow us to determine their direction – were not coded. In all, the coding of the classroom observations resulted in 329 gestures.

Table 1:

The codes of gestures, with verbal descriptions and illustrative examples from the data.

Gesture type | Verbal description
Gesture along the lateral axis | A gesture along a lateral axis is performed from right to left (with respect to the speaker’s body) or in the opposite direction. It could also be performed from the body’s centre and outwards, or vice versa. Each gesture’s height (e.g., waist, chest, shoulders, head, or above head) is noted.
Gesture along the vertical axis | A gesture along the vertical axis is performed top-down or bottom-up, either centred or to the right or left with respect to the centre line of the speaker’s body. The height could vary from a few centimetres to the total reach of the speaker’s arm.
Gesture along the sagittal axis | A gesture along a sagittal axis is performed front to back or in the opposite direction (with respect to the position of the speaker’s body). Each gesture’s height (e.g., waist, chest, shoulders, head, or above head) is noted.
Palm-up gesture | The teacher thrusts one or both hands outwards from the body. Palm-up gestures are distinguished from gestures along the three axes in that they have a less distinct direction and no distinct start and end point.
Container gesture | For example, when the speaker holds one or two hands as a cup or holds the hands as a confined area that can contain an object.
Iconic gesture | For example, when the speaker moves his fingers as if typing on a computer keyboard, moves his arm as if throwing a ball through the air, or holds his index finger in the air as if writing, the finger symbolising a pen, while talking about these specific acts.

(The illustrative examples in the original table are photographs and are not reproduced here.)

3.3.2 Linguistic analysis

When all gestures had been coded, the accompanying words and phrases were analysed linguistically to identify those that code time concepts. Three linguistic constructions were identified: time adverbials, verb tense, and adverbs denoting order. We used linguistic constructions to establish whether the identified directional gestures involve time or not.

In the linguistic analysis, time adverbials such as då (En. then), före (En. before), idag (En. today), sedan (En. later), and med en gång (En. right away), and the adjective asynkron (En. asynchronous), were considered to code time.

Furthermore, because verb tenses provide information on when a given action occurs in time, they were similarly considered to provide information on time (Teleman et al. 1999).

Adverbs such as först (En. first) and sist (En. last), and longer phrases such as hade hänt i tur och ordning (En. had happened in order) and har kört program uppifrån och ner (En. have run programs from top to bottom), were considered to code order, and hence time as well. As stated in earlier work (Stolpe and Larsson 2023), the word “run” has a specific meaning in the programming context, describing a process that has to be carried out in a certain order.

In this way, the gestural and linguistic analyses have guided our systematic exploration of time concepts and the spatial organisation of such concepts as two Swedish-speaking programming teachers construe them.

4 Results

The results section is divided into two subsections. The first provides an overview of the gestures used by the two teachers. The second presents the teachers’ use of time-related gestures as independent case studies, with a more thorough analysis of what the teachers said during the lectures.

4.1 Use of time-related gestures

Teacher 1 used 4,291 words and 115 coded gestures (circa three gestures per 100 words). Teacher 2 used 5,082 words and 214 coded gestures (circa four gestures per 100 words). Both teachers gestured along all three axes, but lateral gestures were more frequent than gestures along the other axes.

The results reveal salient differences in gesture frequency between the teachers (Table 2). The gestures of Teacher 1 are relatively evenly distributed across the lateral, vertical, and sagittal axes; palm-up, metaphorical, and iconic gestures are less frequent. Teacher 2 used lateral, palm-up, and metaphorical gestures more often than vertical, sagittal, and iconic gestures.

Table 2:

Proportion of gestures by the two teachers.

Type of gesture Teacher 1 (n = 115) Teacher 2 (n = 214)
Lateral gestures 20.0 % 17.8 %
Vertical gestures 14.8 % 8.4 %
Sagittal gestures 16.5 % 8.4 %
Palm-up gestures 20.9 % 19.6 %
Metaphorical gestures 15.7 % 22.4 %
Iconic gestures 4.3 % 2.3 %

4.2 In-depth case studies of teachers’ co-speech gestures related to time

The following sections present a more thorough analysis of the two teachers’ use of co-speech gestures related to time. The different gestures are presented in the order from the most common to the least common for each teacher.

4.2.1 Teacher 1

Teacher 1 primarily gestures along the vertical axis when using linguistic constructions of time. We identified seventeen gestures along the vertical axis. Of these, sixteen were accompanied by a verb from which we learn that some action has been taken or will occur. Seven of the gestures were accompanied by time adverbials (e.g., sen, nu, and samtidigt, En. then, now, and simultaneously). One of the gestures was related to the order in which something is done (i.e., kört ett program uppifrån och ner, En. run a program from the top down) (Table 3, Memo 18). Table 3 shows an episode from Teacher 1’s teaching and the accompanying co-speech gestures. As the teacher says Hittills har vi kört program uppifrån och ner (En. Hitherto, we have run programs from the top to the bottom), he gestures from top to bottom along a vertical axis. A program on the screen is ordered from top to bottom, meaning code lines are read top-down. The gesture thus organises code lines in a spatial and temporal order, and it could be interpreted both as iconic, symbolising the code lines on a screen, and as temporal, since it constructs a temporal order in which the program runs. As seen in Memo 18, he gestures from top to bottom (Table 3). We therefore suggest that his gestures function as a vertical, imaginary ‘timeline’ on which he can locate earlier times higher up than those that follow. This interpretation aligns with how Teacher 1 talks about what the students have done so far and what they will continue to do later. For instance, when telling the students that they had activated lines in different places (referring to programming syntax available on their screen, thus an iconic gesture) (Table 3, Memos 20 and 55), he finishes by saying, och sen har ni gått tillbaka och fortsatt va (En. and then you have gone back and continued, right?) (Table 3, Memo 21).
During the episode, the teacher holds his left hand still, as if it were a reference point for the location on the computer screen from which the students once started (Table 3, Memo 20). He then moves his right hand up towards the ‘reference point’ while saying och sen har ni gått tillbaka och fortsatt (En. and then you have returned and continued) (Table 3, Memo 21), indicating that the students have returned up to where they started, which in turn indicates that earlier times are up. Thus, in this specific context, temporality is suggested both as a movement in time and as a physical movement in space. The teacher’s gesture also indicates a movement in the computer program: the vertical movement of his right hand from the bottom up to the starting point indicated by his left hand. Space and time are intertwined, and since spatiality is conceptualised in terms of code lines, time is given a vertical dimension.

Table 3:

An excerpt from Teacher 1’s teaching.

Memo | Verbal utterance by the teacher (Sw/En) | Gesture | Direction
18 | Hittills har vi kört program uppifrån och ner (En. Hitherto, we have run programs from the top to the bottom). Ni har kallat på funktioner som har gjort att (En. You have called on functions which have made that) | He holds a pen in his left hand and moves the pen from the height of his eyes to the height of his chest | Vertical
19 | ni har hoppat i koden (En. you have jumped in the code) | He holds his left hand still, in line with his shoulder. His right hand moves in a bow from the height of his eyes down to the height of his waist. During the movement, he opens his hand, palm directed downwards | Vertical
20 | och aktiverat rader (En. and activated lines) | His right hand moves up and down in small movements at the level of his chest while he holds his left hand at the height of his shoulder | Vertical
55 | på olika platser (En. in different places) | His right hand moves up and down in larger movements at the level of his hip while he holds his left hand still at the level of his shoulder | Vertical
21 | och sen har ni gått tillbaka och fortsatt va? (En. and then you have gone back and continued, right?) | His right hand moves from the bottom up in a bow back to his left hand. His right hand grabs around his left hand | Vertical

Teacher 1 also uses gestures along the lateral axis in reference to time and order. A total of 23 such gestures were identified, 20 of which include a verb that provides information on tense (e.g., ska starta, En. will start). Eight gestures were used together with time adverbials (e.g., för några år sedan, En. a few years ago), and one was used together with an expression indicating order (i.e., i tur och ordning, En. in order). He typically gestures from left to right, but occasionally in the other direction to indicate where some previous action, or something that still applies, is located. That is, in all these gestures, the teacher spatialises earlier times to the left and later times to the right.

In one episode, Teacher 1 uses an analogy between cleaning one’s room and how to use a function[1] (which is here used to illustrate that the behaviour of a program depends on what data has been put in by an end-user or received from a sub-routine or a database). At this stage, the teacher asks the students whether they had received any instructions for cleaning their rooms prior to doing so (“fanns det någon inställning för hur man skulle städa sitt rum?”), for example, whether they were supposed to “do so sloppily, carefully, quickly?” (“Ska man städa det hafsigt? Noga? Snabbt?”) or “before grandma arrives” (“Innan mormor kommer?”). Just as he utters the word innan (En. before), he moves his head, followed by his upper body, from the centre to his left. He holds this position briefly and then returns to his previous position. We suggest that this may indicate a correspondence between a linguistic construction relating to time and a gesture along a lateral axis, where innan (before) is located to his left.
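The teacher’s analogy can be sketched as a function whose behaviour depends on a “setting” supplied by the caller, just as the behaviour of a program depends on the data put in by an end-user or a sub-routine. The function name cleanRoom, its parameter, and its return strings are hypothetical illustrations of ours, not code shown in the lecture:

```typescript
// Hypothetical sketch of the cleaning analogy: the same function behaves
// differently depending on the "setting" (instruction) it receives.
type CleaningStyle = "sloppily" | "carefully" | "quickly";

function cleanRoom(style: CleaningStyle): string {
  // One function body, different behaviour for different inputs.
  switch (style) {
    case "sloppily":
      return "everything shoved under the bed";
    case "quickly":
      return "a five-minute tidy-up";
    case "carefully":
      return "dusted, vacuumed, and sorted"; // e.g., before grandma arrives
  }
}

console.log(cleanRoom("carefully"));
```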

Teacher 1’s sagittal gestures are less closely connected with linguistic constructions of time. Six of these nineteen gestures are not accompanied by a phrase that includes a verb, and two are used together with time adverbials (i.e., före and innan, En. before). While gesturing along this axis, the teacher discusses other real-world situations in which something happens or will happen. He also uses forward gestures in reference to the future (e.g., when something bör ske, En. should happen). Related to this, some of his gestures suggest that programming implies moving forward into the space ahead of the programmer, as if he were moving forward into a computer screen. The teacher holds his right hand close to his body when saying till vad vi gör i programmering och kod är att (En. to what we do in programming and code is to). However, when he finishes the sentence with och kod är att (En. and code is that), he moves his left hand forward in a sagittal gesture to indicate that the code should perform something in the future. One of his sagittal gestures goes in the other direction, when he references his own experience with the phrase jag kan ta lite ur egen erfarenhet (En. I can take some from my own experience). Experience relates to the past and hence has a past temporal connotation.

Teacher 1’s palm-up gestures were particularly frequent when he turned to the class in front of him, asking the students to consider the steps of the processes he teaches or a hypothetical scenario in the future or the past, using phrases such as Kan ni tänka er? (En. Can you imagine?) and hade det varit? (En. would it have been?). Twenty-four palm-up gestures were identified, and nineteen verbs were mentioned together with them. Of the nineteen verbs, thirteen are in the present tense and pertain to present or future time. The remaining six verbs refer to times other than the now – two in the future tense (e.g., vi ska städa, En. we will clean), three in the past (e.g., ni har varit med om, En. you have experienced) and one in the past perfect tense. Only three of his palm-up gestures are used together with time adverbials (e.g., då, En. then; tills, En. until; and så länge, En. as long as), and one with words or phrases for order (först när, En. not until). One of these gestures is used with the adjective asynchronous in reference to a function.

Teacher 1’s metaphorical and iconic gestures are not combined with any words or phrases for time apart from verbs that primarily signal present or future tense. Instead, the eighteen identified container gestures were accompanied by single words such as data, information, retrieve, and fetch, and were used when discussing how to use functions and how to retrieve data, making them conduit gestures. His five iconic gestures were primarily used in reference to an analogy between the form and direction of a curly bracket ‘{’ (i.e., a character that is used to mark where a function starts and ends) and the wings of a seagull formed between his hands (i.e., a connection between a programming concept and a verbal expression accompanied by a gesture).
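For readers unfamiliar with the convention, a minimal sketch (ours, in Java-style syntax) shows the characters the analogy refers to: the pair ‘{’ … ‘}’ marks where the function body starts and ends.

```java
// Illustrative sketch: the curly brackets mark where a function
// (method) body starts and ends -- the characters the teacher likens
// to the wings of a seagull.
class Braces {
    static int square(int x) { // '{' opens the function body
        return x * x;
    }                          // '}' closes it

    public static void main(String[] args) {
        System.out.println(square(4)); // prints 16
    }
}
```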

4.2.2 Teacher 2

The majority of Teacher 2’s identified gestures were performed along the sagittal axis; eighteen out of nineteen are accompanied by verbs, and eight out of nineteen by time adverbials. Nine of the verbs were in the present tense, and six were in the future tense. All but two gestures along the sagittal axis also involve forward bodily motion. The sagittal gestures are paired with language about what the teacher does or what he and the students do while programming (som ni gör nu, En. like you’re doing now) or will do (sen ska vi guida dem, En. then we will guide them). Relatedly, five of the adverbs are instances of sen (En. then), but nu (En. now) and idag (En. today) are also used. The two remaining gestures are accompanied by the phrases då ska vi fånga upp det (En. then we will catch it) and som vi tar in (En. that we take in) – i.e., phrases that involve the motion of objects towards the teacher’s body. These two gestures are also iconic. A third exception is a gesture used while verbally asking the students to be more communicative, that is, one focusing on the students’ future behaviour toward him. These gestures could also be interpreted as conduit gestures in which something is caught or taken, or as a combination of temporality and conduit gestures.

Teacher 2 used 44 gestures that were coded as palm-up gestures. Like Teacher 1, he used his palm-up gestures when turning to the class and asking the students to consider a hypothetical action or how to perform a step in the programming procedure. Furthermore, his palm-up gestures were closely connected with time: they are accompanied by language that includes more verbs (41 out of 44) and time adverbials (13 out of 44) than the gestures coded as performed along the three axes. Twenty-two verbs are primarily used in the present tense, and thirteen are used in the future tense. Twelve of the verbs in the present tense are accompanied by a time adverbial (i.e., då, En. then; sen, En. then; and när, En. when), which either locates the action in the past or the future, while one of them locates the action in the now (i.e., nu, En. now). Five of the verbs referring to the steps of the programming process or something hypothetical include the conjunction om (En. if), which is used together with subclauses that express conditions. Unlike Teacher 1, Teacher 2 does not combine palm-up gestures with language representing order.
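The conjunction om corresponds to the if-statement found in most programming languages; a minimal sketch (ours, with hypothetical names) shows how the subclause expresses a condition that decides whether a step of the program runs.

```java
// Illustrative sketch: "om" (if) introduces a condition that
// determines whether the following step of the program is executed.
class Condition {
    static String check(int value) {
        if (value > 10) {      // "om värdet är större än tio ..."
            return "too large";
        }
        return "ok";
    }

    public static void main(String[] args) {
        System.out.println(check(5)); // prints "ok"
    }
}
```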

Teacher 2’s gestures along the lateral axis are used in reference to time but also to locations on an imaginary computer screen in front of him (e.g., den här mappen, En. this folder), or to places elsewhere in the world (e.g., i flera länder, En. in several countries). In such instances, the gesturing is interpreted as pointing at said locations while facing the students. In contrast to Teacher 2’s sagittal and palm-up gestures, his gestures along the lateral axis are not as closely connected with time (28 of the 38 identified gestures are accompanied by verbs in different tenses, and nine by time adverbials). Generally, Teacher 2’s gestures from left to right are paired with language about what he and the students do, did (och sen skrev jag, En. and then I wrote) or will do (lägg den, En. put it), whilst the gestures paired with language about what he did in the past (e.g., då var det enkelt att gå tillbaka, En. then it was easy to go back) are performed the other way around – an indication that time is spatialised from left to right.

Teacher 2 also performs inward gestures along the sagittal axis, moving both hands toward a location in front of him, or outward gestures with both hands from this position. In one episode, he makes such an inward motion while saying försöker casta (En. are trying to cast), and in another, he makes an outward gesture while saying och så försvann då det där (En. and then that disappeared). Just as Teacher 1’s gestures along the vertical axis appear to be influenced by the fact that he stands in front of a computer screen looking at programming syntax, our interpretation for Teacher 2 is that events do not move in time from left to right but in relation to an imaginary screen. Another example is when he keeps his gesture still while talking about yesterday’s coding in front of the screen (igår satt jag och kodade, En. yesterday I sat coding). By holding the gesture for a while, this activity becomes a point on a spatial timeline for as long as it lasts.
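The verb casta refers to casting, i.e., converting a value from one type to another. A minimal sketch (ours; the lecture’s actual code is not available) illustrates the operation in Java:

```java
// Illustrative sketch: an explicit cast converts a value from one
// type to another; casting a double to an int truncates the decimals.
class Cast {
    static int truncate(double d) {
        return (int) d; // explicit cast from double to int
    }

    public static void main(String[] args) {
        System.out.println(truncate(3.7)); // prints 3
    }
}
```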

Teacher 2 uses twenty gestures along the vertical axis, about half as many as the lateral ones. Eleven gestures along the vertical axis are accompanied by a verb, and four by time adverbials. Only those that include a time adverbial are related to time, and none is related to order. In one of the episodes that pertain to time, he makes a light downward gesture with his left hand while saying tio sekunder (En. ten seconds) – a gesture that represents a limited amount of time. The rest of his gestures along the vertical axis are associated with time concepts that concern locations upwards or downwards, representing future and past times. In two such episodes, Teacher 2 makes upward gestures in relation to future results. In another episode, he makes a downward gesture in relation to what should be done later (i.e., sen anropar du den, En. then you call it).

In addition, Teacher 2 uses 50 container gestures, either opening and closing one hand or holding both hands in front of him as if holding something between them. Twenty of these container gestures are accompanied by verbless phrases (e.g., bra för dig, En. good for you; två olika datatyper, En. two different data types), and only five include time adverbials. Container gestures are the only type of gesture that he uses while talking about the order of something (e.g., of doubles and strings). Teacher 2 uses only seven iconic gestures, none of which refer to time. Instead, the iconic gestures are used primarily in reference to real-world things such as full stops, buttons, and credit card numbers.

5 Discussion and conclusions

This study explored the relationship between Swedish speakers’ gestures along different axes and how time is construed in spoken language in the context of programming education. Teachers giving programming lectures were chosen for this study since such lectures were expected to be naturalistic environments involving talk about time-related concepts (Larsson and Stolpe 2023; Solomon et al. 2020). The case-based design has revealed new insights, paving the way for more extensive research. In the following, we discuss our findings and give some recommendations for future research.

The results show that both teachers gesture along the sagittal, lateral, or vertical axis while using expressions for time, such as verbs in different tenses and time adverbials. However, there were also differences in how much the teachers gesture, which, in line with earlier research, indicates that gesturing varies between individuals.

Both teachers used gestures along the vertical axis when referring to activities performed on a computer screen. The screen may be either physical (real-world) or imaginary, seemingly visualised by the teachers. The convention in computer programming is to organise code lines from top to bottom. Code lines can be conceptualised as events that should be performed in a specific order (Blackwell 2006). Thereby, each code line represents an event on a timeline from the top down, and time is represented as events on a vertical timeline. Presumably, the code lines, visualising a sequence of activities in a vertical orientation, shape the teachers’ temporal gestures; the teachers’ gestures about time might thus be influenced by this context. This interpretation can be compared with earlier research showing that the convention of writing texts from left to right seems to affect the orientation of temporal gestures (Casasanto 2016; Gu 2022). Moreover, Casasanto and Jasmin (2012) suggest that this direction can be traced to the flow of time from left to right in, for example, calendars. More research is needed to validate this finding, but our preliminary conclusion is that artefacts such as computer screens and the orientation of programming languages can influence speakers’ temporal gestures. The introduction of an alternative convention for organising events in a temporal order may influence speakers’ orientation of time. At a time when human-computer interaction is increasing, new perspectives need to be explored from a co-speech gesture perspective (see also Pitt and Casasanto 2022). Furthermore, we suggest that gestures should not be interpreted against too small units of language. Instead, the meaning is carried by the scene painted by a more extended excerpt, as in Table 3. The orientation of time and the spatiality of programming can be hard to distinguish in a single co-speech gesture. However, by investigating a more extended scene, clues from different gestures and utterances can be used to give a more nuanced picture of the conceptualisation of time (cf. Larsson et al. 2022).
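The top-to-bottom convention discussed above can be illustrated with a minimal sketch (ours): the statements are written top to bottom and executed in that order, so each code line is an “event” on a vertical timeline.

```java
// Illustrative sketch: statements run in top-to-bottom order, so the
// vertical position of a code line corresponds to its place in time.
class Timeline {
    static String run() {
        StringBuilder log = new StringBuilder();
        log.append("first;");   // event 1: top of the vertical "timeline"
        log.append("second;");  // event 2
        log.append("third;");   // event 3: bottom, latest in time
        return log.toString();
    }

    public static void main(String[] args) {
        System.out.println(run()); // prints "first;second;third;"
    }
}
```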

Our results suggest that gestures oriented along the lateral axis convey information on the order in which activities should be carried out. Both teachers used gestures along the lateral axis when they talked about the order in which the students should perform certain activities. In contrast to the gestures along the vertical axis, the activities do not pertain to the coding per se. Instead, the teachers seem to employ a lateral timeline for planning the project. The finding that the teachers gesture along a lateral axis aligns with earlier findings in the literature (Casasanto and Jasmin 2012; Núñez et al. 2006; Santiago et al. 2007). The teachers typically place the past to the left and the future to the right.

Gestures along the sagittal axis were typically used when teachers talked about what they and their students did while programming, as in sen ska vi guida dem (En. then we will guide them); the students and teachers were active agents doing things. Hence, these gestures are used when the teacher takes an internal perspective on time (cf. Núñez and Cooperrider 2013): the co-speech gesture indicates that the teacher adopts the perspective of moving along a sagittal timeline (the moving-ego perspective). Casasanto and Jasmin (2012) show that English speakers gesture time along a sagittal axis when asked to deliberately gesture past and future events. Teachers might use gestures more deliberately while teaching to make a pedagogic point. However, since this study was conducted in a natural environment, the teachers’ gestures are spontaneous, with the intention of describing things to their students. From the results, we may conclude that the direction of the gesture could be connected to the internal perspective of the speaker.

The findings indicate that palm-up gestures play a communicative role together with talk about time. Earlier research on palm-up gestures indicates great complexity; two of the main groups are palm-up presentational and palm-up epistemic (Cooperrider et al. 2018). However, neither of these groups aligns with temporality. In contrast to earlier research, our results indicate that the two teachers’ palm-up gestures are sometimes associated with verb tense. This finding suggests that more information is conveyed in palm-up gestures than previously known. It also shows the importance of more in-depth, fine-grained analysis.

As expected, the results indicate that Swedish speakers use gestures along all three axes – lateral, sagittal, and vertical – similar to speakers of English. However, this study traced the vertical orientation of co-speech gestures to talk about events in a specific order. Hence, we suggest further investigating time concepts in programming classrooms and other natural settings to contribute more profound knowledge of the role of artefacts and their impact on the orientation of both verbal and gestural timelines. Considering that earlier studies have shown how cultural artefacts influence how we conceptualise time, it is pivotal that co-speech gesture studies are conducted in naturalistic and diverse environments, with artefacts such as computers available.


Corresponding author: Karin Stolpe, Department of Behavioural Science and Learning, Linköping University, Linköping, Sweden, E-mail:

About the authors

Karin Stolpe

Karin Stolpe is a senior associate professor in science and technology education at the Department of Behavioural Sciences and Learning at Linköping University, Sweden. Her research concerns the interaction between humans and technology, with a special interest in how language develops as humans communicate with artificial intelligence through prompting.

Marlene Johansson Falck

Marlene Johansson Falck is a professor of English linguistics at Umeå University, Sweden. Her main research interests are in the fields of language, spatial cognition, metaphor, and metaphor identification. She was part of the founding committee of the Swedish Association for Language and Cognition (now Scandinavian Association for Language and Cognition) and was a long-time member of the governing board. She is one of the founders of MetNet Scandinavia and associate editor of Metaphor and Symbol.

Andreas Larsson

Andreas Larsson is a post-doctoral scholar in technology education at the Department of Behavioural Sciences and Learning at Linköping University, Sweden. He is also the director of the Swedish National Centre for Science and Technology Education. His main research interests are the relationship between humans and cultural artefacts in educational environments and how teachers’ beliefs and contextual knowledge impact teaching.

Acknowledgment

The authors thank the reviewers for their insightful comments and suggestions for revisions, which helped improve the final version of this paper.

References

Alcaraz Carrión, Daniel & Javier Valenzuela. 2022. Time as space vs. time as quantity in Spanish: A co-speech gesture study. Language and Cognition 14(1). 1–18. https://doi.org/10.1017/langcog.2021.17.

Athanasopoulos, Panos & Emanuel Bylund. 2023. Cognitive restructuring: Psychophysical measurement of time perception in bilinguals. Bilingualism: Language and Cognition 26(4). 809–818. https://doi.org/10.1017/S1366728922000876.

Bender, Andrea & Sieghard Beller. 2014. Mapping spatial frames of reference onto time: A review of theoretical accounts and empirical findings. Cognition 132(3). 342–382. https://doi.org/10.1016/j.cognition.2014.03.016.

Blackwell, Alan F. 2006. Metaphors we program by: Space, action and society in Java. In PPIG18: 18th workshop of the psychology of programming interest group. University of Sussex.

Boroditsky, Lera. 2000. Metaphoric structuring: Understanding time through spatial metaphors. Cognition 75(1). 1–28. https://doi.org/10.1016/S0010-0277(99)00073-6.

Boroditsky, Lera & Alice Gaby. 2010. Remembrances of times East: Absolute spatial representations of time in an Australian aboriginal community. Psychological Science 21(11). 1635–1639. https://doi.org/10.1177/0956797610386621.

Bylund, Emanuel & Panos Athanasopoulos. 2017. The Whorfian time warp: Representing duration through the language hourglass. Journal of Experimental Psychology: General 146(7). 911–916. https://doi.org/10.1037/xge0000314.

Bylund, Emanuel, Pascal Gygax, Steven Samuel & Panos Athanasopoulos. 2020. Back to the future? The role of temporal focus for mapping time onto space. Quarterly Journal of Experimental Psychology 73(2). 174–182. https://doi.org/10.1177/1747021819867624.

Casasanto, Daniel. 2016. Temporal language and temporal thinking may not go hand in hand. In Barbara Lewandowska-Tomaszczyk (ed.), Conceptualizations of time, 169–186. Amsterdam: John Benjamins. https://doi.org/10.1075/hcp.52.04cas.

Casasanto, Daniel & Lera Boroditsky. 2008. Time in the mind: Using space to think about time. Cognition 106(2). 579–593. https://doi.org/10.1016/j.cognition.2007.03.004.

Casasanto, Daniel & Roberto Bottini. 2014. Mirror reading can reverse the flow of time. Journal of Experimental Psychology: General 143(2). 473–479. https://doi.org/10.1037/a0033297.

Casasanto, Daniel & Kyle Jasmin. 2012. The hands of time: Temporal gestures in English speakers. Cognitive Linguistics 23(4). 643–674. https://doi.org/10.1515/cog-2012-0020.

Church, Ruth B., Spencer Kelly & David Holcombe. 2014. Temporal synchrony between speech, action and gesture during language production. Language, Cognition and Neuroscience 29(3). 345–354. https://doi.org/10.1080/01690965.2013.857783.

Cienki, Alan. 2016. Analysing metaphor in gesture. In Elena Semino & Zsófia Demjén (eds.), The Routledge handbook of metaphor and language, 131–152. London: Routledge.

Colburn, Timothy & Gary Shute. 2017. Type and metaphor for computer programmers. Techné: Research in Philosophy and Technology 21(1). 71–105. https://doi.org/10.5840/techne20174662.

Cooperrider, Kensy, Natasha Abner & Susan Goldin-Meadow. 2018. The palm-up puzzle: Meanings and origins of a widespread form in gesture and sign. Frontiers in Communication 3. https://doi.org/10.3389/fcomm.2018.00023.

Cooperrider, Kensy, James Slotta & Rafael Núñez. 2022. The ups and downs of space and time: Topography in Yupno language, culture, and cognition. Language and Cognition 14(1). 131–159. https://doi.org/10.1017/langcog.2021.25.

Coulson, Seana & Cristobal Pagán Cánovas. 2009. Understanding timelines: Conceptual metaphor and conceptual integration. Cognitive Semiotics 5(1–2). 198–219. https://doi.org/10.1515/cogsem.2013.5.12.198.

de la Fuente, Juanma, Julio Santiago, Antonio Román, Cristina Dumitrache & Daniel Casasanto. 2014. When you think about it, your past is in front of you: How culture shapes spatial conceptions of time. Psychological Science 25(9). 1682–1690. https://doi.org/10.1177/0956797614534695.

Duffy, Sarah E. 2014. The role of cultural artifacts in the interpretation of metaphorical expressions about time. Metaphor and Symbol 29(2). 94–112. https://doi.org/10.1080/10926488.2014.889989.

Fuhrman, Orly, Kelly McCormick, Eva Chen, Heidi Jiang, Dingfang Shu, Shuaimei Mao & Lera Boroditsky. 2011. How linguistic and cultural forces shape conceptions of time: English and Mandarin time in 3D. Cognitive Science 35(7). 1305–1328. https://doi.org/10.1111/j.1551-6709.2011.01193.x.

Gentner, Dedre, Mutsumi Imai & Lera Boroditsky. 2002. As time goes by: Evidence for two systems in processing space → time metaphors. Language & Cognitive Processes 17(5). 537–565. https://doi.org/10.1080/01690960143000317.

Gu, Yan. 2022. Time in Chinese hands: Gesture and sign. In Anna Piata, Adriana Gordejuela & Daniel Alcaraz Carrión (eds.), Time representations in the perspective of human creativity, 209–232. Amsterdam: John Benjamins. https://doi.org/10.1075/hcp.75.10gu.

Gu, Yan, Lisette Mol, Marieke Hoetjes & Marc Swerts. 2014. Does language shape the production and perception of gestures? A study on late Chinese-English bilinguals’ conceptions about time. Proceedings of the annual meeting of the Cognitive Science Society, 547–552. Austin, TX: Cognitive Science Society.

Harper, Colton, Keith Tran & Stephen Cooper. 2024. Conceptual metaphor theory in action: Insights into student understanding of computing concepts. SIGCSE 2024 – Proceedings of the 55th ACM technical symposium on computer science education, 463–469. Portland, OR: ACM. https://doi.org/10.1145/3626252.3630812.

Kendon, Adam. 2004. Gesture: Visible action as utterance. Cambridge: Cambridge University Press. https://doi.org/10.1017/CBO9780511807572.

Lakoff, George & Mark Johnson. 1980. Metaphors we live by. Chicago and London: University of Chicago Press.

Lakoff, George & Mark Johnson. 1999. Philosophy in the flesh: The embodied mind and its challenge to Western thought. New York: Basic Books.

Larsson, Andreas & Karin Stolpe. 2023. Hands on programming: Teachers’ use of metaphors in gesture and speech make abstract concepts tangible. International Journal of Technology and Design Education 33(3). 901–919. https://doi.org/10.1007/s10798-022-09755-0.

Larsson, Andreas, Karin Stolpe & Marlene Johansson Falck. 2022. Analysing the elements of a scene – an integrative approach to metaphor identification in a naturalistic setting. Cognitive Semiotics 15(2). 223–248. https://doi.org/10.1515/cogsem-2022-2014.

Manches, Andrew, Peter E. McKenna, Gnanathusharan Rajendran & Judy Robertson. 2020. Identifying embodied metaphors for computing education. Computers in Human Behavior 105. 105859. https://doi.org/10.1016/j.chb.2018.12.037.

McNeill, David. 1992. Hand and mind: What gestures reveal about thought. Chicago and London: The University of Chicago Press.

Núñez, Rafael. 2004. Do real numbers really move? Language, thought, and gesture: The embodied cognitive foundations of mathematics. In Fumiya Iida, Rolf Pfeifer, Luc Steels & Yasuo Kuniyoshi (eds.), Embodied artificial intelligence (Lecture notes in computer science, vol. 3139), 54–73. Berlin, Heidelberg: Springer. https://doi.org/10.1007/978-3-540-27833-7_4.

Núñez, Rafael & Kensy Cooperrider. 2013. The tangle of space and time in human cognition. Trends in Cognitive Sciences 17(5). 220–229. https://doi.org/10.1016/j.tics.2013.03.008.

Núñez, Rafael, Benjamin A. Motz & Ursina Teuscher. 2006. Time after time: The psychological reality of the ego- and time-reference-point distinction in metaphorical construals of time. Metaphor and Symbol 21(3). 133–146. https://doi.org/10.1207/s15327868ms2103_1.

Núñez, Rafael & Eve Sweetser. 2006. With the future behind them: Convergent evidence from Aymara language and gesture in the crosslinguistic comparison of spatial construals of time. Cognitive Science 30(3). 401–450. https://doi.org/10.1207/s15516709cog0000_62.

Pitt, Benjamin & Daniel Casasanto. 2022. Spatial metaphors and the design of everyday things. Frontiers in Psychology 13. 1019957. https://doi.org/10.3389/fpsyg.2022.1019957.

Santiago, Julio, Juan Lupiáñez, Elvira Pérez & María Jesús Funes. 2007. Time (also) flies from left to right. Psychonomic Bulletin & Review 14(3). 512–516. https://doi.org/10.3758/BF03194099.

Solomon, Amber, Miyeon Bae, Betsy DiSalvo & Mark Guzdial. 2020. Embodied representations in computing education: How gesture, embodied language, and tool use support teaching recursion. In The interdisciplinarity of the learning sciences, 14th International Conference of the Learning Sciences (ICLS) 2020, vol. 4. Nashville, Tennessee, USA.

Stolpe, Karin & Andreas Larsson. 2023. “Vi går in och tittar i mappen” – en lärares strukturering av rumslighet i programmeringsundervisning. [“Let’s go look inside the folder”: A teacher’s way of structuring spatiality when teaching programming]. Nordic Studies in Science Education 19(2). 134–147. https://doi.org/10.5617/nordina.9722.

Teleman, Ulf, Staffan Hellberg & Erik Andersson. 1999. Svenska Akademiens grammatik: 4 Satser och meningar. [The Swedish Academy’s grammar: 4 clauses and sentences]. Stockholm: Svenska Akademien.

Tversky, Barbara, Sol Kugelmass & Atalia Winter. 1991. Cross-cultural and developmental trends in graphic productions. Cognitive Psychology 23(4). 515–557. https://doi.org/10.1016/0010-0285(91)90005-9.

Valenzuela, Javier, Cristóbal Pagán Cánovas, Inés Olza & Daniel Alcaraz Carrión. 2020. Gesturing in the wild. Review of Cognitive Linguistics 18(2). 289–315. https://doi.org/10.1075/rcl.00061.val.

Vetenskapsrådet. 2017. God forskningssed. [Good research practice]. Stockholm: Vetenskapsrådet/Swedish Research Council.

Published Online: 2025-04-22

© 2025 the author(s), published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
