

The MIT Press Podcast
The MIT Press
Interviews with authors of MIT Press books.
Episodes
Mentioned books

May 6, 2013 • 1h 3min
Jonathan Bobaljik, “Universals of Comparative Morphology” (MIT Press, 2012)
Morphology is sometimes painted as the ‘here be dragons’ of the linguistic map: a baffling domain of idiosyncrasies and irregularities, in which Heath Robinson contraptions abound and anything goes. In his new book, Universals of Comparative Morphology: Suppletion, Superlatives, and the Structure of Words (MIT Press, 2012), Jonathan Bobaljik reassesses the terrain, and argues that there are hard limits on the extent to which languages can vary in the morphological domain.

The book is a comparative study of comparatives and superlatives with a broad typological base. Bobaljik’s contention is that, at an abstract cognitive level, the representation of the comparative is contained within that of the superlative. From this hypothesis, couched within the theoretical framework of Distributed Morphology, a number of generalizations immediately follow: for instance, in a language which, like English, has forms of the type “good” and “better”, the superlative cannot be of the type “goodest”. As he shows, these generalizations are solid candidates for the status of exceptionless linguistic universals.

In this interview, Jonathan outlines the generalizations and their evidential basis, and we go on to discuss apparent counterexamples (including the mysterious Karelian quantifiers), why the comparative should be contained within the superlative, how the generalizations extend to change-of-state verbs, and how similar generalizations can be found in domains as diverse as verbal person marking and pronominal case.

Apr 30, 2013 • 1h 14min
Alexandra Hui, “The Psychophysical Ear: Musical Experiments, Experimental Sounds, 1840-1910” (MIT Press, 2013)
In The Psychophysical Ear: Musical Experiments, Experimental Sounds, 1840-1910 (MIT Press, 2013), Alexandra Hui explores a fascinating chapter in the history of listening: a period when musical aesthetics and natural science came together in the psychophysical study of sound in nineteenth-century Germany. Though we tend to consider the performing arts and sciences as occupying different epistemic and disciplinary realms, Hui argues that the scientific study of sound sensation not only was framed in terms of musical aesthetics, but became increasingly so over time. The book traces a series of arguments by practitioners of the study of sound sensation as they sought to uncover universal rules for understanding the sonic world: How much epistemic weight ought to be placed on the experiences of an individual listener? What sorts of expertise were relevant or necessary for a sound scientist’s experimental practice? Did musical training matter? Was there a proper way to listen to music? The Psychophysical Ear follows sound scientists as they grappled with these and other questions, struggling with the consequences of understanding the act of listening as a practice fundamentally grounded in particular historical contexts, at a moment when phonographic technology and the growing number of performances of non-Western music were transforming the sonic world of Europe. Hui often draws the reader’s own sensorium into the story, urging us to imagine or play sequences of musical notes that prove crucial to some of the arguments of the actors in the story. Enjoy!

Apr 13, 2013 • 1h 3min
Stephen E. Nadeau, “The Neural Architecture of Grammar” (MIT Press, 2012)
Although there seems to be a trend towards linguistic theories getting more cognitively or neurally plausible, there doesn’t seem to be an imminent prospect of a reconciliation between linguistics and neuroscience. Network models of various aspects of language have often been criticised as theoretically simplistic, custom-made to solve a single problem (such as past tense marking), and/or abandoning their neurally-inspired roots.

In The Neural Architecture of Grammar (MIT Press, 2012), Stephen Nadeau proposes an account of language in the brain that goes some way towards answering these objections. He argues that the sometimes-maligned Parallel Distributed Processing (PDP) approach can genuinely be seen as a way of modelling the brain. Combining theoretical, experimental and biological perspectives, he proposes a model of language function that is based upon these principles, proceeding concisely all the way from concept meaning to high-level syntactic organisation. He proposes that this model offers a plausible account of a wealth of data from studies of normal language functioning and, at the same time, a convincing perspective on how language breaks down as a consequence of brain injury.

Within an hour, it’s hard to do justice to the full complexity of the model. However, we do get to discuss much of the background and motivation for this approach. In particular, we talk about the emergence of PDP models of concept meaning and of phonological linear order. We consider the relations between this concept of meaning and the increasingly well-studied notion of ’embodied cognition’. And we look at the aphasia literature, which, Nadeau argues, provides compelling support for a view of language that is fundamentally stochastic and susceptible to graceful degradation – two automatic consequences of adopting a PDP perspective. We conclude by touching on the potential relevance of this type of account for treatments for aphasia.

Feb 26, 2013 • 1h 9min
Matthew Wisnioski, “Engineers for Change: Competing Visions of Technology in 1960s America” (MIT Press, 2012)
In his compelling and fascinating account of how engineers navigated new landscapes of technology and its discontents in 1960s America, Matthew Wisnioski takes us into the personal and professional transformations of a group of thinkers and practitioners who have been both central to the history of science and technology, and conspicuously under-represented in its historiography. Between 1964 and 1974, engineers in America wrestled with the ethical and intellectual implications of an “ideology of technological change.” Engineers for Change: Competing Visions of Technology in 1960s America (MIT Press, 2012) takes us into the debates among engineers over their responsibilities for crafting a future in a world where nuclear weapons and chemical pollutants were now facts of life, as citizens were rising in support of environmental and civil rights, and in protest of war and violence. Wisnioski introduces us to the changing resonances of and debates over key concepts in the print culture of engineers in mid-century America, key experiments in the pedagogy and training of engineers at major US institutions, and key efforts to promote creativity in the profession by collaborating with artists, social activists, and others. The book situates all of this within a wonderful introduction to the classic historiography of social studies of technology and engineering, and is illustrated with striking images from the visual culture of engineering in the 1960s. Readers interested in how these issues extend into the more recent history of technology will also find much of interest in Wisnioski’s accounts of Engineers Without Borders and the Engineering, Social Justice, and Peace (ESJP) Network. Enjoy!

Sep 15, 2012 • 1h 6min
Kristin Andrews, “Do Apes Read Minds?: Toward a New Folk Psychology” (MIT Press, 2012)
The ability to figure out the mental lives of others – what they want, what they believe, what they know – is basic to our relationships. Sherlock Holmes exemplified this ability by accurately simulating the thought processes of suspects in order to solve mysterious crimes. But folk psychology is not restricted to genius detectives. We all use it: to predict what a friend will feel when we cancel a date, to explain why a child in a playground is crying, to deceive someone else by saying less than the whole story. Its very ubiquity explains why it is called folk psychology.

But how in fact does folk psychology work? On standard views in philosophy and psychology, folk psychology just is the practice of ascribing or attributing beliefs and desires to people in order to explain and predict their behavior. A folk psychologist is someone who has this “theory of mind”. In her new book, Do Apes Read Minds?: Toward a New Folk Psychology (MIT Press, 2012), Kristin Andrews, associate professor of philosophy at York University in Toronto, argues that the standard view is far too narrow a construal of what’s going on. It leaves out a wide variety of other mechanisms we use to understand the mental lives of others, and a wide variety of other reasons we have for engaging in this social competence. Moreover, what’s necessary to be a folk psychologist is not a sophisticated metacognitive ability for ascribing beliefs, but an ability to sort the world into agents and non-agents – an ability that greatly expands the class of creatures that can be folk psychologists. Andrews draws on empirical work in psychology and ethology, including her own field work observing wild primates, to critique the standard view and ground her alternative pluralistic view.

Aug 15, 2012 • 1h 12min
Lee Braver, “Groundless Grounds: A Study of Wittgenstein and Heidegger” (MIT Press, 2012)
Ludwig Wittgenstein and Martin Heidegger are both considered among the most influential philosophers of the twentieth century. Both were born in 1889 in German-speaking countries; both studied under leading philosophers of their day – Bertrand Russell and Edmund Husserl, respectively – and were considered their philosophical heirs; and both ended up critiquing their mentors and thereby influencing the direction of thought in both the Analytic and Continental traditions. In Groundless Grounds: A Study of Wittgenstein and Heidegger (MIT Press, 2012), Lee Braver, associate professor of philosophy at Hiram College, attempts to build what he calls a “load-bearing bridge” between these often polarized traditions. He argues that both thinkers have similar arguments for similar conclusions on similar fundamental issues. Both blame the disengaged contemplation of traditional philosophy for confusion about the nature of language, thought and ontology, and both hold that attention to normal, ongoing human activity in context yields alternative fundamental insights into their nature. The “groundless grounds” of the title refers to the idea that finite human nature gives us everything we need to understand meaning, mind and being, and that to insist that this ground itself requires justification betrays confusion.

Jul 2, 2012 • 1h 5min
David A. Kirby, “Lab Coats in Hollywood: Science, Scientists, and Cinema” (MIT Press, 2011)
First things first: this was probably the most fun I’ve had working through an STS monograph. (Really: Who doesn’t like reading about Jurassic Park and King Kong?) In addition to being full of wonderful anecdotes about the film and television industries, David Kirby’s Lab Coats in Hollywood: Science, Scientists, and Cinema (MIT Press, 2011) is also a very enlightening exploration of the role of science consultants on television and in film, and the negotiations of expertise involved in relationships between scientists and the cinema. Scholars of STS will recognize some of the major themes that Kirby raises in the course of a fascinating look behind the scenes of the cinematic production of “science”: negotiated definitions of accuracy and plausibility, technologies of virtual witnessing, the social construction of knowledge. Many of the chapters will change the way you see representations of scientists and their work in the movies and on TV, and Kirby’s description of the filmic use of “diegetic prototypes,” or cinematic depictions of future technologies, is a stand-alone contribution in itself. This is a must-read for anyone interested in popular representations of science. Kirby describes the ways that visual media interpret, naturalize, and engage with scientific theories (be they well-accepted, controversial, or fantastical), and how some scientists in turn manipulate cinematic depictions for their own ends. Plus, have I mentioned how much fun it is?

Check out David’s recent discussion of the film Prometheus!

May 15, 2012 • 1h 7min
Paul Thagard, “The Cognitive Science of Science: Explanation, Discovery, and Conceptual Change” (MIT Press, 2012)
We’ve all heard about scientific revolutions, such as the change from the Ptolemaic geocentric universe to the Copernican heliocentric one. Such drastic changes are the meat-and-potatoes of historians of science and philosophers of science. But another perspective on them is from the point of view of cognition. For example, how do scientists come up with breakthroughs? What happens when a scientist confronts a new theory that conflicts with an established one? In what ways does her belief system change, and what factors can impede her acceptance of the new theory?

In his latest book, The Cognitive Science of Science (MIT Press, 2012), Paul Thagard considers the nature of science from this cognitive scientific perspective. Thagard, who is a professor of philosophy at the University of Waterloo, presents a comprehensive view of such aspects of scientific thinking as the process of discovery and creativity, the nature of change in scientific beliefs, and the role of emotions and values in these processes. He defends an explanatory coherence model of belief revision, proposes a model for explaining resistance to new scientific ideas, and even suggests why so much creative thinking goes on in the shower.

Apr 27, 2012 • 1h 12min
Michael Lynch, “In Praise of Reason” (MIT Press, 2012)
Modern society seems in awe of the advances of science and technology. We commonly praise innovations that enable us to live longer and more comfortable lives, we look forward to the release of new gadgets, we seek out new ways to employ technology in our everyday lives. These developments depend upon a set of intellectual practices that are commonly associated with the methods of the natural sciences. We are able to invent and create precisely because we are able to gather evidence and reason competently.

But this fascination with technology and science is accompanied by various forms of skepticism about reason itself. Some hold that reason is a kind of Promethean hubris. Others claim that what passes for reason is really just rationalization or power. Still others contend that reason is at best of limited value, and that other, non-rational, sources of cognitive guidance are more authoritative than reason.

Michael Lynch’s new book, In Praise of Reason (MIT Press, 2012), launches a compelling and deeply engaging defense of the idea that our cognitive lives are properly managed when they are aimed at believing in accordance with reason. In making his case for reason, Lynch emphasizes the importance of reason for the maintenance of a democratic society. In Praise of Reason resides at the intersection of political philosophy and epistemology, and for this reason will be of interest to a wide range of philosophers and non-philosophers alike.

Apr 16, 2012 • 1h 2min
Lawrence Busch, “Standards: Recipes for Reality” (MIT Press, 2011)
As Lawrence Busch reminds us, standards are all around us, governing seating arrangements, medicine, experimental objects and subjects, and even romance novels. In Standards: Recipes for Reality (MIT Press, 2011), Busch provides a wide-ranging and accessible analysis of the ways that standards structure the world. More than simply providing a typology of standards, Busch shows how the impetus to standardization and standardized differentiation has transformed as part of historical and political changes. Under contemporary neo-liberalism, the drive to standardization has generated sophisticated relationships between standards, certified professional bodies, and accrediting agencies – relationships that Busch provides the resources for thinking about politically. Drawing on plenty of accessible and insightful examples, and clearly in contact with much of the literature in Science and Technology Studies, Busch’s book is a great read and a great entry into thinking about technoscience, power, and neo-liberalism.

Give it a read.