Non-anthropogenic mind and complexes of cultural codes

  • Sergey Kulikov
Published/Copyright: October 6, 2016

Abstract

The aim of this research is to clarify the connections between non-anthropogenic mind and culture understood as sign systems. Such an investigation opens perspectives for constructing a generalized model of mind and can help to build a bridge between the traditional and the digital humanities. The subject of the traditional humanities is natural human activity; the subject of the digital humanities is computer-based forms of activity and communication. Identifying signs created not only by humans but also by natural circumstances helps to define a sign system that unites natural (non-anthropogenic) and artificial kinds of mind. The methodology of the research rests on the principles of semiotics developed by Charles Peirce and Ferdinand de Saussure and expanded by Yuri Lotman and Boris Uspensky. A semiotic interpretation of mind as an object of culture allows a generalized model of mind to be built as one of many textual constructions, presenting the history of humankind as the replacement of natural events by secondary models. The author concludes that revealing a generalized model of mind opens new opportunities for constructing intelligent activity strictly interpreted as a special sign system. Semiotic studies interpret culture as a rationality-making machine, and the activity of mind is caused by the work of such a machine. Consequently, whereas sign systems were traditionally regarded as human-made complexes of primary signs, contemporary arguments show that there is no insuperable barrier to interpreting such complexes as nature-made but non-anthropogenic phenomena.

1 Introduction

Wittgenstein’s idea that language defines the limits of the mind, together with the theory of sign systems according to which culture is a complex of codes of communication and behavior, can provide a basis for building a model of mind. Yet even when we have a natural language, for example English or German, a problem remains: language must be represented as a basic kind of mind system. Many more problems arise in the case of the artificial languages used in mathematics and formal logic to construct models of artificially intelligent activity. Each time, we need a justification of the basic forms of mind that we represent through the resources of artificial languages. We therefore believe that Peirce’s and Saussure’s theories of signification, as adapted by Lotman and Uspensky, can help to find common ground for natural and artificial ways of constructing mind models, and that this leads to a generalized model of the linguistic determination of mind.

Discussion of the problems of mind modeled on the basis of semiotics began in the 1960s. The first main branch in this field was the reconstruction of intelligent activity as sign systems (Zemanek 1966; Pospelov 1975). The second has been the description of interactions between human and computer on the basis of so-called “computing semiotics” (van der Lubbe and Egbers 1997; Malcolm and Goguen 1998; Rieger 2001; Goguen and Harrell 2004; de Souza 2005; Gudwin and Queiroz 2006; Tanaka-Ishii 2010). Third, we can note discussions on the interaction of culture, intelligent activity, and computer programming (Lotman and Uspensky 1993; Bedau 1998; Parisi and Cangelosi 2002).

Our work continues this third branch of research. The article elaborates methods for building bridges between the traditional humanities and the digital humanities. The digital humanities constitute a rapidly developing branch of modern research, for example in computational linguistics (Niehaus and Young 2014) and in cognitive science (Humphreys 2004). Semiotics should identify common ground for the traditional humanities and the digital humanities.

2 Modern philosophy of mind and principles of semiotics

2.1 Mind in general and the limits of philosophy

Mind in general is a complex of rules of communication and behavior that supports the process of cognition on the basis of perception, memory, and other structures. Mathematicians define formal algorithms for such activities; logicians elaborate the basic principles for constructing those algorithms (Hutter 2005). As a result, the most important contemporary models of mind are mathematics-based or logic-based structures within the so-called digital humanities (Gent 2013; Feldman and Domshlack 2014; Kotthoff 2014). Such models can help us understand how the mind works with complex axioms, for example moral axioms, that underlie decision-making in different situations. But none of these models can ultimately determine whether the mind is anything like a duplicate of natural intelligent activity expressed in mathematical or logical form. The foundations of such models require new judgment, and neither mathematics nor logic, as technical disciplines, can explicate the ontological presuppositions of mind, in particular the human or non-human basis of intelligent action. This is one of the tasks of philosophy, which must clarify the basic notions of the spheres constituted by concrete knowledge, such as knowledge of the chemical structure of things or knowledge of the moral nature of human actions. Some modern authors regard this as the ontology of philosophy itself (Grenon and Smith 2009).

Philosophers discuss extremely general notions, and they should be able to answer the question of whether a non-human kind of intelligent activity is possible (see, for example, Landor 1981). But philosophy by itself has no real means of constructing strictly rigorous knowledge. Even discussions within the philosophy of mathematics can provide only phenomenological models of human activity. Thus the features of natural mind are approximately known within cognitive science, but modern researchers have no clear idea of how an alternative kind of mind could be constructed. Kripke (1980), Rorty (1979), Chalmers (1996), and others consider how an alternative kind of mind might exist. But an absolutely different kind of mind or intelligent activity, comparable, for example, to the way God’s intelligent actions were contrasted with human intelligent actions in the Middle Ages, has not been demonstrated. Modern philosophical investigations each time exhibit an alternative only to distinctly human phenomena (Kripke 1980: 144–145). Rorty (1979: 71), for instance, analyzes the hypothetical case of aliens who have no concept of mind yet have built a well-developed civilization. The building of a civilization does not depend on the existence or non-existence of “mind,” “intelligent activity,” and so on. Other structures of cognition can always support the building of a civilization, and such a civilization will resemble one built by the human mind.

The construction of a generalized model of mind within philosophy needs the assistance of some formal discipline if a strict theory of intelligent activity is to be elaborated. Mathematics and logic depend on philosophical presuppositions and cannot play such a role; their dependence follows from the need to clarify their basic notions, for example deduction or logical consequence, within philosophy. By contrast, the generalization of philosophical knowledge in terms of sign systems has been provided by semiotics since the work of Charles Peirce (1868). We do not mean that semiotics is opposed to mathematics or formal logic; semiotics simply encompasses more general knowledge about the signs that mathematics and logic use in their own spheres.

2.2 Semiotic model of mind

The key role in this research is played by the studies of Lotman and Uspensky (1993), which build on ideas previously developed by Peirce (1868) and Ferdinand de Saussure (1997). Peirce (1868: 287–298) proposes principles of sign cognition, while Saussure (1997: 32) puts forward three especially important theses: language is a social phenomenon; language is a sign system; and the existence of a common science for both fields implies an actual unity of knowledge about language and knowledge about society.

Semiotics opens new opportunities for constructing a generalized phenomenological model of mind. The semiotic interpretation of culture allows the history of humankind to be presented as the replacement of natural events by secondary models. Modeling cultural and historical developments demonstrates the functioning of the textual expression of a world outlook based on some kind of intelligent activity. A generalized model of mind can be built as one such textual construction.

It is no secret that human culture is the result of human activity. The possibility of a linguistic description of culture opens up perspectives for applying a semiotics-based approach to it. From this point of view, the more specialized structures, namely morality, politics, and so on, are constituted by culture as sign systems composed of more primitive programs of communication and behavior (Caws 1988; Deely 2010). Understanding culture as sign systems makes it possible to see that fundamental qualities and properties of human communication are designated by a certain position within the world outlook of a concrete historical era. Variant significations of a world outlook represent kinds of world understanding, and systematizing them allows the creation of a generalized image of the ways in which world understanding is replaced by concrete sets of signs and sign systems. Understanding social and cultural phenomena as sign systems allows the specifics of modern rationality to be interpreted in toto. This path leads to a new understanding of mind through the generalization of intelligent activity.

Semiotic studies have some interesting parallels with epistemological studies (Hintikka and Hintikka 1989). Derrida (1967: 12) notes that formal logic traditionally constituted the representation of mind as the basis of knowledge. Modern studies place the question of mind, and the problem of rationality in general, in the context of the “problem of writing” (Derrida 1967: 18–19). Thus modern researchers consider logical questions as part of another fundamental problem, the “problem of language.” Derrida (1967) draws three conclusions concerning the basis of a general science of writing, or grammatology. Firstly, he maintains that the role of general logic is nowadays played by the phenomenon of “writing” (Derrida 1967: 43). Secondly, he suggests that grammatology adopts the idea of the nonlinearity of time. Thirdly, Derrida (1967: 130) notes the difficulties of transforming grammatology into a positive science. This means that scientific thought, the human mind, intelligent activity, and rationality as their general background, determined as they are by a form of writing, can be interpreted as kinds of sign systems.

The claims made by Barthes (1957, 1984) and Deleuze (1969) are of special importance for constructing semiotic models of mind. Firstly, Barthes takes the transformation of “Einstein’s brain” into a mythical object as a characteristic example: scientific thought in the collective consciousness is signified as a kind of mythology (Barthes 1957: 85–87). Secondly, Barthes supposes that modern society is subordinated to a special kind of rationality, the rationality of Fiction and Rhetoric, which determines the total power of mythology (Barthes 1984: 127–132). Only through the force of primary signs denoting primitive things is it possible to overcome such rationality. Thirdly, the logical completion of Barthes’s views is the idea of the mutual conversion of sense and nonsense put forward by Deleuze (1969: 85). The sphere of the body determines every kind of signification, and sense, in its depths, is from this point of view the product of unconscious structures such as sexual experience and psychotic disorders. Thus the center of modern rationality, and the center of intelligent activity in toto, is a set of logical operations only from the outside; from the inside it is a kind of “myth” or “artistic creativity.” The modern signification of intelligent activity helps to understand the pathways toward semiotics-based models of mind.

3 The generalized model of mind

3.1 Model of mind and key role of culture signs

The semiotic interpretation of intelligent activity coincides with the understanding of culture offered by Lotman and Uspensky as a “device for developing information,” whose material realization coincides with natural language (Lotman and Uspensky 1993: 328). Lotman and Uspensky (1993) employ technological metaphors to describe the main functions of culture in its historical development, but such metaphors can also be realized as sets of signification variants within modern communications. This interpretation should not suggest that semiotics merely affirms an affinity between culture and an age dominated by technical equipment. During the pre-technological era, culture structured the understanding of the world through language, so that texts played the role of “devices” in the Middle Ages, antiquity, and other periods (Lotman and Uspensky 1993: 332–333).

The idea of a “device” or “mechanism” operating according to some technological scheme coincides with the attempt to correlate traditional and modern ways of interpreting culture. Technological metaphors open the prospect of representing any phenomenon of culture as an order born from chaos. “Chaos” here can be understood not only as the absence of order but also as the separation of types of order in view of their relative randomness. Structuring does not necessarily proceed from real chaos: any mismatch between two ways of structuring can be represented so that whatever precedes the emergence of a new order appears as chaos. Any new structure is an order only in relation to its predecessor and may itself be associated with chaos in contrast with its own “successor.”

For example, at the end of the fourth century BC there were at least two main centers or “kernels” of cultural life in Egypt, namely traditional Egyptian culture and Hellenistic culture (Manning 2012: 3). After the Macedonian conquest of Egypt, a new cultural center appeared in Alexandria, and a new order was constituted for Egyptian society and culture, politics and economy; yet from the Hellenistic point of view, the forms of traditional Egyptian cultural life seemed a disorder or even “chaos.” The Macedonians and Greeks were not ready to see a divine essence in the king; originally they tried to build the state on rather secular principles. For that very reason, to the Egyptians the Hellenistic “device of culture” worked like an orderless and even senseless “machine,” like a body without a soul. Perhaps to soften this impression, from the epoch of Ptolemy I Soter onward the kings were given titles such as “pharaoh,” i.e., “son of Ra,” in official Egyptian documents (Bevan 1927).

Applying the semiotic concept of a “cultural code” can help to describe culture as working simultaneously as “order” and as “chaos.” A situation of contradiction between ciphered and non-ciphered information helps to identify such codes. Each of them is a codification only relative to another kind of coding, because in each case there exists an addressee who owns a key to the original codification; for that addressee the ciphered message is non-ciphered, since he or she can receive it within the specifically made sign systems in their standard significations. The ciphered or non-ciphered status of a message is thus relative, and one form of cultural order can look like chaos to another. Moreover, the concept of a “cultural code” supports the interpretation of culture as a kind of “memory” (Lotman and Uspensky 1993: 328–329). Culture can thereby be understood not only by analogy with information-making devices but also as a complex of information-storing devices. This interpretation suggests that the rules for translating cultural experience into textual forms play the role of a kind of programming (Lotman and Uspensky 1993: 329–330).

Applying technological metaphors requires Lotman and Uspensky to build an image of cultural life by analogy with computer programming. This almost erases the line between natural mind and artificial intelligent activity, and culture signs begin to play the key role in building a generalized phenomenological model of mind.

3.2 Discussion on mind as culture sign systems

This section discusses mind as a natural but non-anthropogenic product of culture sign systems. Such a kind of mind has two important features. Firstly, if this kind of intelligent activity is really possible, then from the outside it can look like a formal algorithm, while from the inside it is precisely a complex of culture codes, a kind of culture programming. Secondly, a non-anthropogenic mind may already be at work around us, yet the human race can detect only the effects and not the essence of such activity. The feasibility of discovering such a version of mind as a culture sign system, one that appears as a natural but non-anthropogenic product, shows that the ground for a bridge between the traditional humanities and the digital humanities is already prepared. The section discusses some historical examples and then some modern events.

History offers examples of human behavior governed by non-human or “superhuman” programs grounded in religion. These are prototypes of artificial intelligent activity that have no fixed appearance as a formal algorithm. Every religion bears, in its mystical doctrine, signs of the working of mind, since every mystical doctrine consistently coincides with a special system of behavior and communication. A mystical doctrine presents itself as a reasonable code to the adept and as nonsense to everyone else. The behavior of adherents of mystical doctrines always appears ambiguous to the people around them and is perhaps perceived by most as a kind of madness. For example, Christians were perceived as criminals in ancient Rome. They maintained that people should not support the cult of the Emperor because, from their point of view, it was a sign of betraying the true God. The Romans did not understand why Christians could not behave like respectable citizens and observe certain “formalities” (Fox 1986). A different programming code, a different code of intelligent activity, organized the behavior of the early Christians: they were ready even to die for their beliefs, but they would not follow pagan “habits.”

Religion-based behavior has an analogy in some kinds of modern computer-mediated activity governed by not-quite-human programming. For example, 2008 saw the first US presidential election conducted under the influence of web-based social networks: Barack Obama’s supporters spread extensive propaganda about the need for change, and Obama won. Due (2014) demonstrates a similar case in the processes by which new ideas are elaborated, and similar effects can be observed in the self-organization of Internet groups, for example groups campaigning for human rights or for animal rights. These were arguably modern manifestations of a non-anthropogenic mind bearing the signification of a culture sign. Social networks have special value as a cultural background with computer support, but their use of digital devices is carried out under the influence of the special coding and decoding of group activity that sets the direction of these groups. Most contemporaries do not fully understand these culture codes or the directions of these groups.

Culture codes, i.e., versions of “culture programming” or special “culture scripts,” operated in ancient religious communities, within mystical doctrine, as complexes of rules for reading sacred texts and for communication inside the community. Almost the same codification, which now looks like computer programming or “web service protocols,” can be found in modern political activity. In 2008 Americans did not know whether Obama’s politics would really change anything; people simply believed it and voted on the basis of their beliefs. This does not mean that members of political communities believe in computer programming the way religious people believe in God. Rather, computer programming and digital devices play an organizing role within political activity just as mystical doctrine once did within religious activity. The cases discussed were examples of real manifestations of non-anthropogenic mind, because activities that social groups tried to launch outside the “computer,” for example in North Africa and the Middle East in 2011–2012, led to fatal disorders. Only real intelligent activity, only a real mind, has creative abilities; the mere simulation of mind yields only losses and tragedies.

4 Conclusion

We can now maintain that a generalized model of mind has a semiotics-based character. Such a model reveals new opportunities for constructing intelligent activity that can be strictly interpreted as a special sign system. Whereas in the traditional understanding such systems were regarded only as human-made complexes of signs, contemporary arguments show that there is no insuperable barrier to interpreting such a complex as a nature-made but non-anthropogenic phenomenon. This produces a conflict between the logical or formal organization of intelligent activity and its inner qualities in toto. Semiotic studies interpret culture as a rationality-making machine, and the activity of mind is caused by the work of such a machine. As a result, the presentation of mind as a special kind of culture discloses methods for building bridges between the traditional and the digital humanities.

Award Identifier / Grant number: 15-18-10002

Funding statement: Russian Science Foundation (Grant/Award Number: 15-18-10002)

References

Barthes, Roland. 1957. Mythologies. Paris: Éditions du Seuil.

Barthes, Roland. 1984. Le bruissement de la langue: Essais critiques IV. Paris: Éditions du Seuil.

Bedau, Mark. 1998. Philosophical content and method of artificial life. In Terrell Ward Bynum & James H. Moor (eds.), The digital phoenix: How computers are changing philosophy, 135–152. Oxford: Blackwell.

Bevan, Edwyn. 1927. The house of Ptolemy. London: Methuen.

Caws, Peter. 1988. Structuralism: The art of the intelligible. Atlantic Highlands, NJ: Humanities Press.

Chalmers, David. 1996. The conscious mind: In search of a fundamental theory. New York: Oxford University Press.

Deely, John. 2010. Theses on semiology and semiotics. American Journal of Semiotics 26(1–4). 17–25.

Deleuze, Gilles. 1969. Logique du sens. Paris: Les Éditions de Minuit.

Derrida, Jacques. 1967. De la grammatologie. Paris: Les Éditions de Minuit.

de Souza, Clarisse. 2005. The semiotic engineering of human-computer interaction. Cambridge, MA: MIT Press.

Due, Brian. 2014. The development of an idea in a context of rejection. Semiotica 202(1/4). 207–240.

Feldman, Zohar & Carmel Domshlack. 2014. Simple regret optimization in online planning for Markov decision processes. Journal of Artificial Intelligence Research 51. 165–205.

Fox, Robin Lane. 1986. Pagans and Christians: In the Mediterranean world from the second century AD to the conversion of Constantine. London: Viking.

Gent, Ian Philip. 2013. Optimal implementation of watched literals and more general techniques. Journal of Artificial Intelligence Research 48. 231–252.

Goguen, Joseph & D. Fox Harrell. 2004. Information visualization and semiotic morphisms. In Grant Malcolm (ed.), Multidisciplinary approaches to visual representations and interpretations, 93–106. Amsterdam: Elsevier.

Grenon, Pierre & Barry Smith. 2009. Foundations of an ontology of philosophy. Synthese 182(2). 185–204.

Gudwin, Ricardo & Joao Queiroz (eds.). 2006. Semiotics and intelligent systems development. Hershey, PA: Idea Group.

Hintikka, Jaakko & Merrill B. P. Hintikka. 1989. The logic of epistemology and the epistemology of logic. Berlin: Springer.

Humphreys, Paul. 2004. Extending ourselves: Computational science, empiricism, and scientific method. Oxford: Oxford University Press.

Hutter, Marcus. 2005. Universal artificial intelligence: Sequential decisions based on algorithmic probability. Berlin: Springer.

Kotthoff, Lars. 2014. Algorithm selection for combinatorial search problems: A survey. AI Magazine 35(3). 48–60.

Kripke, Saul. 1980. Naming and necessity. Oxford: Basil Blackwell.

Landor, Blake. 1981. Definitions and hypotheses in Posterior Analytics 72a19–25 and 76b35–77a4. Phronesis 26(3). 308–318.

Lotman, Juri & Boris Uspensky. 1993. O semioticheskom mekhanizme kul'tury [On the semiotic mechanism of culture]. Izbrannye stat'i 3. 326–342.

Malcolm, Grant & Joseph Goguen. 1998. Signs and representations: Semiotics for user interface design. In Ray Paton & Irene Nielson (eds.), Visual representations and interpretations, 163–172. Liverpool: Springer.

Manning, Joe. 2012. The last pharaohs: Egypt under the Ptolemies, 305–30 BC. Princeton, NJ: Princeton University Press.

Niehaus, James & Michael R. Young. 2014. Cognitive models of discourse comprehension for narrative generation. Literary and Linguistic Computing 29(4). 561–582.

Parisi, Domenico & Angelo Cangelosi (eds.). 2002. Simulating the evolution of language. Berlin: Springer.

Peirce, Charles. 1868. On a new list of categories. Proceedings of the American Academy of Arts and Sciences 7. 287–298.

Pospelov, Dmitry. 1975. Semiotic models in artificial intelligence problems. In International Joint Conference on Artificial Intelligence, vol. 1, 65–70. San Francisco: Morgan Kaufmann.

Rieger, Burghard. 2001. Computing granular word meanings: A fuzzy linguistic approach to computational semiotics. In Paul P. Wang (ed.), Computing with words, 147–208. New York: John Wiley.

Rorty, Richard. 1979. Philosophy and the mirror of nature. Princeton, NJ: Princeton University Press.

Saussure, Ferdinand de. 1997. Cours de linguistique générale. Paris: Payot & Rivages.

Tanaka-Ishii, Kumiko. 2010. Semiotics of programming. Cambridge: Cambridge University Press.

van der Lubbe, Jan & E. Egbers. 1997. An integrated approach to fuzzy learning and reasoning in hierarchical knowledge structures. In International Conference on Intelligent Systems and Semiotics, 37–43. Gaithersburg, MD.

Zemanek, Heinz. 1966. Semiotics and programming languages. Communications of the ACM 9(3). 139–143.

Published Online: 2016-10-6
Published in Print: 2016-11-1

©2016 by De Gruyter Mouton
