From natural language to UI tests: A deep dive into Journeys for Android Studio
Oct 14, 2025. Adarsh Fernando, a Product Manager leading the Journeys initiative, and Ray Buse, a Software Engineer on the Core Developer team, unveil an AI-driven approach that simplifies UI testing in Android Studio. They discuss how Journeys converts natural-language test descriptions into executed UI tests, significantly easing the testing process. Real-world applications like Google Maps showcase its potential, and integration with Gemini powers the intelligent interactions. The duo closes by looking at the platform's future and inviting developers to try the feature and share feedback on this testing approach.
AI Snips
Natural Language Tests Run By An AI Agent
- Journeys uses Gemini to turn plain English test specs into adaptive UI actions without writing code.
- The agent reasons over screenshots and the UI hierarchy to carry out and verify end-to-end user journeys.
Multimodal Context Powers Reliable Actions
- Journeys combines pixels and the UI hierarchy as multimodal inputs to Gemini for better decisions.
- That multimodal context is essential for the agent to know what actions to take next on a screen.
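The idea above, combining pixels and the UI hierarchy into one multimodal input before asking the model for the next action, can be sketched as follows. This is an illustrative Python sketch, not the Journeys implementation; the `ScreenContext` and `build_prompt` names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ScreenContext:
    """Hypothetical per-step context an agent could capture from the device."""
    screenshot_png: bytes   # raw pixels: what the screen actually looks like
    ui_hierarchy: str       # structure: e.g., a dumped view/accessibility tree

def build_prompt(goal: str, ctx: ScreenContext) -> list:
    """Assemble a multimodal prompt: the natural-language goal, the UI
    hierarchy as text, and the screenshot as an image part."""
    return [
        {"type": "text", "text": f"Goal: {goal}"},
        {"type": "text", "text": f"UI hierarchy:\n{ctx.ui_hierarchy}"},
        {"type": "image", "data": ctx.screenshot_png},
    ]

# Example: one step of a journey, before the model decides what to tap.
ctx = ScreenContext(
    screenshot_png=b"\x89PNG...",  # placeholder bytes, not a real image
    ui_hierarchy="<node class='Button' text='Add city'/>",
)
prompt = build_prompt("Add Tokyo to the world clock list", ctx)
assert len(prompt) == 3 and prompt[2]["type"] == "image"
```

The point of sending both modalities is redundancy: the hierarchy gives the model precise, clickable structure, while the screenshot captures visual state the hierarchy may not express.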
Thirty-Second Clock App Demo
- Adarsh wrote a Tokyo-clock test in plain English; Gemini performed the steps and validated the time difference.
- The results included Gemini's reasoning showing the math used to assert the Tokyo time was accurate.
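The kind of arithmetic behind that assertion can be sketched with standard timezone math (illustrative Python, not Journeys code or Gemini's actual reasoning):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Compare the current UTC time with Tokyo time and check the offset,
# roughly the check a "Tokyo time is accurate" assertion rests on.
now_utc = datetime.now(timezone.utc)
tokyo = now_utc.astimezone(ZoneInfo("Asia/Tokyo"))

# Japan Standard Time is UTC+9 year-round (no daylight saving).
offset_hours = tokyo.utcoffset().total_seconds() / 3600
assert offset_hours == 9, f"unexpected offset: {offset_hours}"
print(f"Tokyo is UTC+{offset_hours:.0f}; local Tokyo time: {tokyo:%H:%M}")
```

Because JST has no daylight-saving shifts, the nine-hour offset from UTC holds at any time of year, which makes it a stable thing for an agent to verify against the clock shown on screen.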
