Metamuse

5 // Gesture programming for the iPad

May 26, 2020
Julia Rogatz, an iOS engineer who works on gesture handling and touch interactions for the Muse app, discusses the process of designing a rich gesture space for the iPad. She covers defining gestures within Apple's guidelines, the challenges of gesture ambiguity, and the trade-off between responding to input instantly and waiting long enough to disambiguate it. Julia also shares insights on testing techniques, conflicts with OS-level gestures, and the need for a centralized gesture state machine, revealing the technical craft behind seamless interactions.
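
To make the "centralized gesture state machine" idea concrete, here is a minimal Swift sketch of what such a coordinator could look like. The GestureState cases and the GestureCoordinator type are illustrative assumptions, not Muse's actual design.

  import Foundation

  // Illustrative sketch of a centralized gesture state machine:
  // one enum owns the current interaction mode, so competing
  // recognizers resolve their ambiguity in a single place.
  enum GestureState {
      case idle
      case panningCanvas
      case draggingCard(id: UUID)
      case inking
  }

  final class GestureCoordinator {
      private(set) var state: GestureState = .idle

      // Every recognizer asks here before acting, so ambiguous
      // input is resolved in exactly one place.
      func begin(_ requested: GestureState) -> Bool {
          guard case .idle = state else { return false } // canvas already claimed
          state = requested
          return true
      }

      func end() {
          state = .idle
      }
  }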
INSIGHT

Rely On The iOS Gesture Toolkit

  • iOS provides a rich SDK, including gesture recognizers, that saves developers from reimplementing raw touch-handling logic.
  • Muse relies mostly on these built-in tools, with minimal external dependencies, to retain control and protect user privacy (see the sketch below).
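
As a concrete illustration of leaning on the built-in toolkit: UIPanGestureRecognizer and UIPinchGestureRecognizer are standard SDK types, while CanvasViewController and the handler names here are hypothetical, not Muse's code.

  import UIKit

  class CanvasViewController: UIViewController {
      override func viewDidLoad() {
          super.viewDidLoad()

          // The SDK's recognizers handle touch tracking, movement
          // thresholds, and velocity math; no raw touch code needed.
          let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
          let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:)))
          view.addGestureRecognizer(pan)
          view.addGestureRecognizer(pinch)
      }

      @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
          // Incremental drag delta; reset so each callback reports a fresh delta.
          let delta = gesture.translation(in: view)
          gesture.setTranslation(.zero, in: view)
          print("pan by \(delta)")
      }

      @objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
          // `scale` accumulates across the gesture; apply it, then reset to 1.
          let scale = gesture.scale
          gesture.scale = 1.0
          print("pinch scale \(scale)")
      }
  }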
ADVICE

Prioritize User Data Ownership

  • Avoid sending user data to third parties by choosing local or trusted storage solutions.
  • Replace external backends if they risk exposing user data to large providers such as Google (a local-persistence sketch follows below).
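
A hedged sketch of the "local storage" option: writing user data to the app's own Documents directory so it never touches a third-party backend. The BoardDocument type and the file name are made up for illustration.

  import Foundation

  struct BoardDocument: Codable {
      var title: String
      var strokes: [[Double]]   // hypothetical payload; stands in for real content
  }

  func saveLocally(_ document: BoardDocument) throws {
      // Write into the app's own Documents directory; the data
      // never leaves the device or touches a third-party service.
      let url = try FileManager.default
          .url(for: .documentDirectory, in: .userDomainMask,
               appropriateFor: nil, create: true)
          .appendingPathComponent("board.json")
      let data = try JSONEncoder().encode(document)
      try data.write(to: url, options: .atomic)
  }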
INSIGHT

Gestures Are Time-Based Signals

  • Gestures are temporal sequences, not single events, and require interpreting movement over time.
  • Disambiguation between gestures often depends on accumulated positions and timings, not just the instantaneous input (see the sketch below).
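
To make the "temporal sequence" idea concrete, here is an illustrative Swift sketch that accumulates touch samples and only commits to a classification once enough distance or time has built up. The 10-point and 0.3-second thresholds are invented for the example, not Muse's actual values.

  import UIKit

  struct TouchSample {
      let location: CGPoint
      let timestamp: TimeInterval
  }

  enum GestureKind { case undecided, tap, drag }

  struct GestureInterpreter {
      private var samples: [TouchSample] = []

      mutating func add(_ sample: TouchSample) {
          samples.append(sample)
      }

      // Classification looks at the whole accumulated signal:
      // total distance travelled and total elapsed time.
      func classify(touchEnded: Bool) -> GestureKind {
          guard let first = samples.first, let last = samples.last else {
              return .undecided
          }
          let dx = last.location.x - first.location.x
          let dy = last.location.y - first.location.y
          let distance = (dx * dx + dy * dy).squareRoot()
          let elapsed = last.timestamp - first.timestamp

          if distance > 10 { return .drag }               // moved far enough: commit to drag
          if touchEnded && elapsed < 0.3 { return .tap }  // quick lift, little movement: a tap
          return .undecided                               // too early to commit either way
      }
  }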