Neuro(a)typicality and the Google Home Mini
Radical Embodied Cognition (REC) vs. Computationalist Cognition
(I originally published this essay on Medium as a 2-part series in November 2021 in response to Erik Hoel’s Substack post Futurists have their head in the clouds. This version compresses the two parts into one and includes only minor grammar and spelling corrections.)
Nov 21, 2021
The boundary-line of the mental is certainly vague. It is better not to be pedantic, but to let the science be as vague as its subject.
- William James, 1890
The experience of neuroatypical individuals as users of products is not an edge case. Design for inclusivity is central to responsible, future-proof design methodology. When a voice assistant like the Google Home Mini misunderstands its user, the failure can easily be dismissed as a downside of early adoption ("the tech just isn’t there yet"), but only by neurotypicals.
As environments are increasingly automated by voice control alone, the neuroatypical individual could become increasingly alienated and excluded from their environment. Consider conditions associated with autism, such as nonverbal autism, apraxia of speech, and monotone speech, all of which limit the accessibility of voice-automated products. And when voice assistance is the only way to turn on the lights, agency over one’s spatial conditions is denied.
Historical Perspectives on Cognition
The mid-twentieth century was a time when machinery was radically reconstituting its meaning in society, which allowed a fuzzily defined[1] “cognitive science” to adopt first the mind-as-machine metaphor[2] and then, with the ubiquity of computing in the 70s, the mind-as-computer metaphor.
REC scientists and Enactivists reject the idea that the brain is an analog of a computer, simply ingesting symbols and printing out actions in the form of speech, movement, etc. While cognitive scientists are increasingly moving away from a computational understanding of cognition,[3] the prominence of the mind-as-computer metaphor has irreversibly affected how we design products and the interfaces between devices, people, and our environment.
[The] simple fact that [mechanized ideas of the mind] already exist in people’s minds affects how we see the world and how we see ourselves.
- Jean-Pierre Dupuy
An Argument for Radical Embodied Cognition
If you were asked to draw what a human looked like based only on the primary devices that contemporary humans use daily (phones, computers, keyboards, mice), would you draw legs?
What we see is that many of our primary tools-for-living don’t require us to use our whole body, nor do they afford us the opportunity to. One arena where this is being challenged is in voice assistants.
When critically examined, even while affording the use of an underutilized modal function (voice), contemporary voice-activated technology presupposes a computationalist take on cognition.
When I awake to my Google Home Mini sounding my alarm, I must effectively shout a piece of code, a command line, “Hey Google, stop.” {Object; Action}. What I can tell you from using my Mini each morning, is that I notice a stark confrontation with my own inability to externalize my thoughts into the desired input form the device has to receive in order to work. I think something vague, about wanting some ambient sound?, music?, white noise?, a podcast?, but the thought is barely formed in my mind by the time a parallel and competing realization begins to gel: I am going to have to choose, with only my words. No browsing through records, CDs, mp3 titles so that I might formalize it into an input that my device understands. And stat. Lest I hear in return, from my silence as I try to form words, a devastating, “I’m sorry, I didn’t get that.”
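The {Object; Action} rigidity described above can be sketched in a few lines. This is a hypothetical toy parser, not Google’s actual pipeline: any utterance that fails to resolve to a known object/action pair falls through to the dreaded fallback.

```python
# A minimal, hypothetical sketch of a rigid {Object; Action} command
# grammar. Nothing here reflects Google's real implementation; it only
# illustrates how a half-formed thought parses to nothing.

KNOWN_COMMANDS = {
    ("alarm", "stop"): "stopping alarm",
    ("lights", "on"): "turning on lights",
    ("music", "play"): "playing music",
}

def handle_utterance(utterance: str) -> str:
    words = utterance.lower().replace(",", "").split()
    # Require the wake phrase first; otherwise the device never wakes.
    if words[:2] != ["hey", "google"]:
        return ""
    rest = words[2:]
    # Accept only an exact, known object/action pairing.
    for (obj, action), response in KNOWN_COMMANDS.items():
        if obj in rest and action in rest:
            return response
    # Anything vaguer ("some ambient... sound?") hits the dead end.
    return "I'm sorry, I didn't get that."

print(handle_utterance("Hey Google, stop the alarm"))
print(handle_utterance("Hey Google, some ambient sound maybe?"))
```

The point of the sketch is the asymmetry: the device’s grammar is fixed in advance, so the burden of formalization falls entirely on the waking human.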
Klemmer et al. suggest this phenomenon of struggling to articulate a desired interaction and outcome happens because “reflective reasoning is too slow.”[4] In their seminal article, How Bodies Matter: Five Themes for Interaction Design, their second theme, performance, discusses how engaging physical artifacts makes cognitive processes more fluid and tacit. Flipping a light switch thus carries a more reliable, embodied trust in what will happen with the lights than my current “Hey Google, turn on the lights.”
Tangible interfaces that engage the body can leverage bodycentric experiential cognition.
- Klemmer et al.[5]
Instead of offering only voice activation and buttons to change the volume of media already playing, makers of voice-activated devices might consider adding QR codes that let users control central functions of their environment from their own devices. Instead of, “I’m sorry, I didn’t get that,” Google might respond, “My apologies, I didn’t get that. But I’d like to help. Please find a unique QR code on the bottom of this device that will allow you to control the ambient features of this environment yourself.”
This is not a perfect or foolproof solution (and I am not blind to the fact that my suggestion recommends yet another interface with the unembodied smartphone), but rather an idea that expands accessibility and doesn’t presume the user will be able to, or interested in, interacting unimodally with the device.
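The proposed fallback can be sketched as a small branch in the response logic. The control URL and phrasing here are illustrative assumptions, not a real Google API: the only substance is that a failed parse opens a second modality instead of a dead end.

```python
# A hedged sketch of a multimodal fallback: when the voice parse fails,
# point the user at a QR-linked control page rather than stopping at
# "I didn't get that." The URL below is purely hypothetical.

CONTROL_URL = "https://example.com/device/ABC123/controls"

def respond(parsed_ok: bool) -> str:
    if parsed_ok:
        return "Okay."
    # Failed parse: offer an alternate, non-voice path to the same controls.
    return (
        "My apologies, I didn't get that. But I'd like to help. "
        "Please find a unique QR code on the bottom of this device "
        "that will allow you to control this environment yourself: "
        + CONTROL_URL
    )
```

The design choice worth noting is that the fallback degrades across modalities (voice to touch/screen) rather than simply repeating the request for better speech.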
With that, may your most-used devices operate in your preferred modality,
As always,
Caseysimone
Dupuy — 1994 — On the Origins of Cognitive Science: The Mechanization of the Mind
Chemero — 2009 — Radical Embodied Cognitive Science
McGann et al. — 2013 — Enaction and Psychology
O’Sullivan & Igoe — 2004 — Physical Computing
Klemmer et al. — 2006 — How Bodies Matter: Five Themes for Interaction Design