On being a Victorian Companion
Yorick Wilks
Abstract
I have argued or suggested:
- English Common Law already, in dogs, has a legal category of entities that are not human but are in some degree responsible for their actions and have "characters" that can be assessed.
- Users may not want Companions prone to immediately expressed emotions, and a restrained personality, like that of a Victorian Lady's Companion, might provide a better model.
- Language behavior is a complex repository of triggers for emotion, both expressed and causal, and this is often under-rated in the world of ECAs (embodied conversational agents) and theories of emotion based on them.
- Companion-to-Companion communications will be important and helpful to a user, and there is nothing in principle to make one believe that "secrets" cannot be handled sensitively in such an environment.
- It is easy to underestimate the role of a user's preference in selecting the personality appropriate to a Companion: it is not even clear that users want Companions to be polite or agreeable; it may depend on personal choice or their functional role.
- For many it may be appropriate for a Companion to become progressively more like its owner in voice, face, personality, memories, etc. (exaggerating the way dogs are believed to adapt to owners), and if and when this becomes possible, for the Companion to become a self-avatar of its owner, there may well be other unseen consequences after the owner's death.
Chapters in this book
- Prelim pages i
- Table of contents vii
- Foreword xi
- Acknowledgements xii
- Contributors xiii
Section I. Setting the scene
- In good company? 3
- Introducing artificial Companions 11
Section II. Ethical and philosophical issues
- Artificial Companions and their philosophical challenges 23
- Conditions for Companionhood 29
- Arius in cyberspace 35
Section III. Social and psychological issues
- Conversationalists and confidants 59
- Robots should be slaves 63
- Wanting the impossible 75
- Falling in love with a Companion 89
- Identifying your accompanist 95
- Look, emotion, language and behavior in a believable virtual Companion 101
- New Companions 107
- On being a Victorian Companion 121
Section IV. Design issues
- The use of affective and attentive cues in an empathic computer-based Companion 131
- GRETA 143
- A world-hybrid approach to a conversational Companion for reminiscing about images 157
- Companionship is an emotional business 169
- Artificial Companions in society 173
- Requirements for Artificial Companions 179
- You really need to know what your bot(s) are thinking about you 201
Section V. Special purpose Companions
- A Companion for learning in everyday life 211
- The Maryland virtual patient as a task-oriented conversational Companion 221
- Living with robots 245
Section VI. Afterword
- Summary and discussion of the issues 259
- References 287
- Index 309