The hosts discuss the challenges of recording podcasts over Zoom and spring a surprise on one of the hosts. They talk about viewing screens at different distances, the AirPods Max battery-drain issue, and the developer strap for the Vision Pro. They explore debugging and hardware-design challenges for Apple devices, shooting films in 3D versus converting them to 3D, and the implications of the Digital Markets Act for iMessage. They share experiences with Guest Mode on the Vision Pro, trying out a 3D demo app, and VR headset features. They discuss Fitts's Law in the Mac interface and their project and library experiences.
Podcast summary created with Snipd AI
Quick takeaways
Creating 3D movies often means converting 2D footage to 3D through painstaking manual editing, which makes it difficult to maintain consistency between shots and achieve convincing depth.
The new Apple Developer Strap for the Apple Vision Pro may have Thunderbolt capabilities, enabling wired debugging for app development.
The Vision Pro headset's pass-through looks sharper when the wearer is stationary but blurs during motion, leaving clear room for improvement.
When used in public settings, the Vision Pro limits interaction with the surrounding environment and other people, making for a more isolating experience than working on a laptop.
Deep dives
The Process of Creating 3D Movies
This episode discusses the process of creating 3D movies. The hosts explore the difference between real and fake 3D, noting that many 3D films, including the recent Star Wars trilogy, are actually 3D conversions. In a 3D conversion, the film is shot in 2D with a single camera, and artists then manually cut out each element of every frame and build 3D objects and textures from them. This approach allows control and adjustments after the film is shot, but it also poses challenges around interaxial distance, lens flares, and shot-to-shot consistency. The hosts also discuss the potential Thunderbolt capabilities of the new Apple Developer Strap for the Vision Pro and the benefits of wired debugging for app development. The episode concludes by explaining how filming in 3D differs from rendering a 3D scene, highlighting the complexity of achieving convincing depth while avoiding abrupt changes in perspective.
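To make the conversion idea concrete, here is a minimal sketch (not from the episode, and far simpler than a studio pipeline) of the core trick behind depth-based 2D-to-3D conversion: shifting each pixel horizontally by a disparity derived from an assumed per-pixel depth map. The function name and parameters are illustrative.

```python
import numpy as np

def stereo_pair_from_depth(image, depth, max_disparity=8):
    """Create crude left/right views from a 2D image plus a depth map.

    image: (H, W) grayscale array.
    depth: (H, W) values in [0, 1], where 1.0 is nearest to the camera.
    Nearer pixels get a larger horizontal shift (disparity), which the
    viewer's brain reads as depth when the views are shown to each eye.
    """
    h, w = image.shape
    left = np.zeros_like(image)
    right = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            # Split the disparity between the two eyes: the left view
            # shifts the pixel right, the right view shifts it left.
            d = int(round(depth[y, x] * max_disparity / 2))
            left[y, min(w - 1, x + d)] = image[y, x]
            right[y, max(0, x - d)] = image[y, x]
    return left, right
```

Note that this naive shift leaves unfilled "holes" where foreground pixels move away from the background (disocclusions), which is precisely why conversion artists must hand-paint the revealed regions, one source of the shot-to-shot consistency problems mentioned above.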
Mixed reactions to Vision Pro from demos
During demos, some people found the Vision Pro impressive, particularly the 3D video content. The pass-through feature received mixed reviews: resolution was better when stationary but blurrier in motion. The EyeSight feature, which displays the wearer's eyes on the outside of the headset, struck most adults as creepy, though some think it could be refined to feel less uncanny. The Guest Mode and IPD-adjustment process were found somewhat cumbersome. Overall, demo-goers noted the lack of gaming content and wanted more interactive experiences.
Challenges with the EyeSight Feature
The EyeSight feature, which displays the wearer's eyes on the outside of the headset, received mixed feedback. It usefully signals whether the wearer can see others or is in an immersive environment, but many found it creepy, while others felt it had room for improvement. The effect is not convincing enough for most adults, though it may fool younger users. Simpler alternatives were suggested, such as monochrome googly eyes or a text display, that could serve the same purpose in a lighter and more cost-effective way.
Motion Blur and Pass-Through Quality
Users noted motion blur in the Vision Pro: pass-through quality is better when stationary but blurrier in motion. Some attributed this to the offset between the cameras' position and the wearer's actual eye position. The dimness of the displayed eyes also made them hard to see from certain angles. Reducing motion blur and improving the pass-through experience will be important for future iterations.
Impact and Utility of Vision Pro Demos
The 3D video content and immersive experiences like the dinosaur encounter impressed demo attendees. However, some attendees cut the immersive experiences short due to motion discomfort or disinterest in non-gaming content. The limited amount of demo content was also noted, along with a desire for more interactive experiences to showcase the Vision Pro's capabilities. The Solo Knit Band proved more practical than the Dual Loop Band for quick adjustments.
Vision Pro in Public Settings
Using the Vision Pro in public settings, like coffee shops or libraries, provides an immersive working experience, but it can be isolating and limit interaction with the environment and other people. The headset occupies both the eyes and the ears, especially when paired with AirPods for audio, which limits the ability to see and be seen by others and reduces the value of being in a public space.
Preferred Device for Work in Public
For working in public settings, a laptop, optionally with headphones, is often the better choice: it leaves vision unobstructed and maintains interaction with the environment and other people. With the Vision Pro covering both eyes and ears, the wearer can come across as antisocial and feel disconnected from the surroundings. A laptop makes for a more social, connected working experience in public places.
Benefits of Varying Eye Focus
Working on a computer for extended periods can cause eye strain. Taking breaks to focus at different distances, such as looking out a window at a distant object, helps relax the eyes. This is harder with a headset like the Vision Pro, whose displays sit at a fixed focal distance. Varying eye focus matters for eye health and is more easily achieved with traditional monitors or laptops.