Computational approaches to language modeling and translation, an active area of research for more than 75 years, have hit the mainstream. Writers, teachers, and readers have encountered these systems in wide deployment: automated writing evaluators since the late 1990s, readily available machine translation since the mid-2000s, procedurally generated text adventures since the mid-2010s, and now, over the last eight months, an explosion of medium-crossing generators, with names like DALL-E, GPT, and Bard, that move among text, image, video, and speech. This embedding of algorithms across representational praxis has the capacity to change the ways we write, teach and evaluate writing, and, most fundamentally, think about the nature of language. By considering the history of computational models of language alongside contemporary, machine-readable representations, scholars can query theoretically driven understandings of concepts like metaphor, models, and narrative. In this talk, I will describe, for a humanistic audience, the history of contemporary transformer-based language architectures, some experiments they enable, and some implications of these models for prior work on conceptual metaphor, testimony, and radicalization.
Assoc. Prof. Ben Miller (Emory) is a visiting Canterbury Fellow in the Digital Humanities programme.