Coercion and deception in persuasive technologies
Umeå University, Faculty of Science and Technology, Department of Computing Science (User Interaction and Knowledge Modelling, UIKM). ORCID iD: 0000-0002-6458-2252
Umeå University, Faculty of Science and Technology, Department of Computing Science (User Interaction and Knowledge Modelling, UIKM). ORCID iD: 0000-0003-4072-8795
Umeå University, Faculty of Science and Technology, Department of Computing Science. ORCID iD: 0000-0002-8430-4241
2018 (English). In: Proceedings of the 20th International Trust Workshop / [ed] Robin Cohen, Murat Sensoy, Timothy J. Norman, CEUR-WS, 2018, p. 38-49. Conference paper, Published paper (Refereed).
Abstract [en]

Technologies that shape human behavior are of high societal relevance, both in terms of their current impact and their future potential. In information systems research and in behavioral psychology, such technologies are typically referred to as persuasive technologies. Traditional definitions, such as those by Fogg and by Harjumaa and Oinas-Kukkonen, limit the scope of persuasive technology to non-coercive, non-deceptive technologies that are explicitly designed for persuasion. In this paper, we analyze existing technologies that blur the line between persuasion, deception, and coercion. Based on the insights of this analysis, we lay out an updated definition of persuasive technologies that includes coercive and deceptive forms of persuasion. Our definition also accounts for persuasive functionality that was not designed by the technology developers. We argue that this definition helps highlight ethical and societal challenges related to technologies that shape human behavior and encourages research that addresses problems of technology-driven persuasion. Finally, we suggest multidisciplinary research directions that can help address the challenges our definition implies; these suggestions range from empirical studies to multi-agent system theory.

Place, publisher, year, edition, pages
CEUR-WS, 2018. p. 38-49
Series
CEUR Workshop Proceedings, ISSN 1613-0073 ; 2154
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:umu:diva-151230
Scopus ID: 2-s2.0-85051397631
OAI: oai:DiVA.org:umu-151230
DiVA id: diva2:1243121
Conference
20th International Trust Workshop (co-located with AAMAS/IJCAI/ECAI/ICML 2018), Stockholm, Sweden, 14 July, 2018
Available from: 2018-08-30. Created: 2018-08-30. Last updated: 2018-08-30. Bibliographically approved.

Open Access in DiVA

fulltext (442 kB), 114 downloads
File information
File name: FULLTEXT01.pdf
File size: 442 kB
Checksum (SHA-512):
f00dc9f65172bb3460f2991e01c4b4c4123f58ff9e01a27fba56cfab62fe0030a9b5519d4d642ef8738855123d1746f837f1eb2565a555bc66bc5806d1fc39d0
Type: fulltext. Mimetype: application/pdf

Other links

Scopus

Authority records

Kampik, Timotheus; Nieves, Juan Carlos; Lindgren, Helena

Total: 114 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 751 hits