Date of Award

Spring 2022

Document Type

Open Access Thesis

First Advisor

Amit Almor


The current study is an attempt to empirically test the predictions of Talmy’s (1988) force dynamics. Specifically, Talmy argues that causal sentences are understood by reference to basic image schemas, such as Starting and Stopping. While many of the predictions of other cognitive linguistic models, such as Lakoff and Johnson’s (1980) conceptual metaphor theory, have been tested empirically by psycholinguists (Fischer & Zwaan, 2008; Gibbs, 2006; Boroditsky & Ramscar, 2002), force dynamics has received very little empirical attention (I am only aware of Wolff & Song, 2003), in spite of its productivity in formal linguistics.

The current study aims to fill this gap by employing a priming paradigm in which participants experienced a force dynamic prime followed by self-paced reading of a sentence with a causative verb that referenced a compatible or incompatible force dynamic schema. In order to strengthen the priming manipulation, the primes used were short interactive 2D computer games created using Unity and the Unity Experiment Framework (Brookes et al., 2019). In each game, participants had to either prevent or cause an object to move against the force of an antagonist object. Following the prime, participants read sentences that described events with the same or the opposite force dynamic schema as the prime, as well as neutral sentences that did not describe events with a force dynamic schema. Each sentence was presented region by region, and the response time for each region was recorded. It was predicted that participants would respond more quickly when the target sentence matched the interactive force dynamic prime. However, the results show no priming effect from force dynamic primes to causal sentences, which suggests either that force dynamic schemas are not used in online sentence comprehension or that their influence is weak enough that an even stronger manipulation is required to uncover it.


© 2022, Dawson Petersen

Included in

Linguistics Commons