2025 (English). In: Memory, Mind & Media, E-ISSN 2635-0238, Vol. 4, article id e18. Article in journal (Refereed). Published.
Abstract [en]
Artificial Intelligence (AI) has reached memory studies in earnest. This partly reflects the hype around recent developments in generative AI (genAI), machine learning, and large language models (LLMs). But how can memory studies scholars handle this hype? Focusing on genAI applications, in particular so-called ‘chatbots’ (transformer-based instruction-tuned text generators), this commentary highlights five areas of critique that can help memory scholars to critically interrogate AI’s implications for their field. These are: (1) historical critiques that complicate AI’s common historical narrative and historicize genAI; (2) technical critiques that highlight how genAI applications are designed and function; (3) praxis critiques that centre on how people use genAI; (4) geopolitical critiques that recognize how international power dynamics shape the uneven global distribution of genAI and its consequences; and (5) environmental critiques that foreground genAI’s ecological impact. For each area, we highlight debates and themes that we argue should be central to the ongoing study of genAI and memory. We do this from an interdisciplinary perspective that combines our knowledge of digital sociology, media studies, literary and cultural studies, cognitive psychology, and communication and computer science. We conclude with a methodological provocation and by reflecting on our own role in the hype we are seeking to dispel.
Place, publisher, year, edition, pages
Cambridge University Press, 2025
Keywords
critical AI studies, generative AI, chatbots, mnemonic agency, socio-technical assemblages, interdisciplinarity, ethics, methodology
National Category
Information Systems, Social aspects
Identifiers
urn:nbn:se:umu:diva-245934 (URN); 10.1017/mem.2025.10018 (DOI); 2-s2.0-105019926382 (Scopus ID)
Funder
Swedish Research Council; EU, Horizon 2020, 677955
2025-10-27; 2025-10-27; 2025-11-24. Bibliographically approved.