E44: The offloaded brain, part 4: an interview with David Chapman
Dec 4, 2023
David Chapman, an AI researcher, discusses his work on the Pengi program. He and the host explore ecological software design, indexical expressions, and the implementation of a game-playing controller. They also discuss the behavior of bee wolves and the maintenance of the Pengi program.
Pengi showcased how observing external behavior and affordances can provide insights into intelligent behavior without relying on mental representations.
Focusing on affordances, rather than comprehensive representations, lets AI systems interact effectively with their environments.
Deep dives
The Offloaded Brain: Minimalist Approach to AI
David Chapman and Phil Agre pioneered an ecological, embodied-cognition approach to AI. They developed Pengi, a program that played a video game using a minimalistic, indexical, purpose-laden model. Instead of relying on mental representations, they emphasized observing external behavior and affordances. Drawing on phenomenology and ethnomethodology, they showed how just looking at actions and interactions can provide insights into human and non-human behavior. Their work challenged the traditional rationalist view and demonstrated that simple combinatorial logic can produce interesting behavior. Pengi, though never extensively developed, showed superhuman visual tracking abilities but had a limited repertoire of routines and proved hard to maintain.
Indexical Functional Representations in Pengi
Pengi used indexical functional representations, in which meaning and action depended on context and affordances. This contrasted with the objective, purpose-independent mental representations assumed by cognitive science. Pengi's visual system played a crucial role, registering and tracking markers for objects of interest, such as enemy bees and ice blocks. The central system, driven by combinatorial logic, responded to affordances detected by the visual system, not as steps in a predetermined plan but as routine patterns of interaction. Because the central system had no memory, it could adapt flexibly to changing situations without ever needing to revise a plan.
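The marker-plus-stateless-logic architecture described above can be sketched in a few lines. This is my own illustrative toy, not Chapman and Agre's actual code; the indexical marker names ("the-bee-chasing-me", etc.) and the specific rules are assumptions made up for the example.

```python
from dataclasses import dataclass

@dataclass
class Marker:
    """A visual marker pinned to an object of current interest.
    The name is indexical ("the-bee-chasing-me"), not an objective
    identity like "bee #7"."""
    name: str
    x: int
    y: int

def central_system(markers: dict[str, Marker]) -> str:
    """Pure combinational logic: no memory, no plan.
    The action is recomputed from scratch every frame from
    whatever the visual system currently registers."""
    me = markers.get("the-penguin-i-am")
    bee = markers.get("the-bee-chasing-me")
    block = markers.get("the-block-beside-me")
    if bee and me and abs(bee.x - me.x) + abs(bee.y - me.y) <= 1:
        return "flee"                 # the bee is adjacent right now
    if block and bee and block.x == bee.x:
        return "kick-block"           # an affordance, not a plan step
    return "wander"

# One frame of input from the (hypothetical) visual system:
frame = {
    "the-penguin-i-am": Marker("the-penguin-i-am", 2, 2),
    "the-bee-chasing-me": Marker("the-bee-chasing-me", 2, 3),
}
print(central_system(frame))  # -> flee
```

Because the central system holds no state, a changed world never requires "replanning": the next frame of markers simply produces a different answer.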
The Role of Affordances in AI Behavior
AI behavior in Pengi and similar systems relied on the concept of affordances: perceived possibilities for action, specific to the current context and purpose. By focusing on affordances rather than comprehensive representations, AI systems can interact effectively with their environments. In Pengi, coordinated interaction between the visual system and the central system identified affordances, such as aligning ice blocks to trap enemy bees. The absence of explicit planning showed how routines can emerge from exploiting affordances, without conscious decision-making or step-by-step plans.
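An affordance like "a block I could kick into the bee" can be expressed as a predicate over the current frame rather than a query against a stored map. Again, this is a hypothetical illustration in the Pengi spirit, not the original implementation; positions are simplified to grid coordinates.

```python
def block_affords_trapping(agent, block, bee):
    """True when kicking the block from the agent's position would
    send it along the row or column the bee currently occupies.
    Positions are (row, col) grid coordinates."""
    same_row = agent[0] == block[0] == bee[0]
    same_col = agent[1] == block[1] == bee[1]
    # The affordance is indexical: it holds for *this* agent, *this*
    # block, and *this* bee, right now. Nothing remembers the block
    # or the bee from one frame to the next.
    return same_row or same_col

print(block_affords_trapping((0, 1), (0, 3), (0, 5)))  # -> True
print(block_affords_trapping((0, 1), (1, 3), (0, 5)))  # -> False
```

The point of the sketch is that the "trap the bee" routine needs no plan data structure: whenever the predicate holds, kicking is simply what the situation affords.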
Implications and Limitations of Pengi
While Pengi demonstrated the power of a minimalist approach to AI, it had limitations. The program exhibited superhuman visual tracking abilities but lacked a large repertoire of routines, owing to time constraints during development. Pengi was also hard to maintain and extend, since software engineering principles were largely disregarded at the time. Despite these limitations, Pengi highlighted the potential of ecological and embodied cognition, where just looking at, and responding to, affordances can provide valuable insights into intelligent behavior. Further research in this direction could yield more nuanced models and expand the practical applications of minimalist AI.
In the '80s, David Chapman and Phil Agre were doing work within AI that was very compatible with the ecological and embodied cognition approach I've been describing. They produced a program, Pengi, that played a video game well enough (given the technology of the time) even though it had nothing like an internal representation of the game board and barely any persistent state at all. In this interview, David describes the source of their crazy ideas and how Pengi worked.
Pengi is more radically minimalist than what I've been thinking of as ecologically-inspired software design, so it makes a good introduction to the next episode.
Sources
Philip E. Agre, Computation and Human Experience, 1997, contains a description of Pengi, but is much more about the motivation behind it and also a discussion of "critical technical practice" that I think is nicely compatible with Schön's "reflective practice". I intend to cover both eventually.
The foundational text of ethnomethodology is notoriously (and, some – waves – think, gratuitously) opaque. I found Heritage's Garfinkel and Ethnomethodology far more readable. I've enjoyed the Em does CA (conversation analysis) YouTube series. The episode on turn-construction units hits me where I live. She talks about how people know when, in a conversation, they're allowed to talk. I'm mildly bad at that in person. I'm somewhat worse when talking to a single person over video. I'm horrible at it when on a multiple-person conference call, with or without postage-stamp-sized video images of faces.