Exploring the Apple Vision Pro: Essential Developer Tools and Resources

Explore the Apple Vision Pro SDK and unleash the power of mixed reality development. Create immersive apps with advanced tools and APIs for Apple's groundbreaking headset.

Apple has released the SDK for its Vision Pro mixed reality headset. Interested parties can apply for developer kits starting in July.

The software development kit for visionOS is now available on Apple's developer portal. It includes not only new tools for building applications for the Apple Vision Pro's operating system; existing tools such as Xcode, SwiftUI, RealityKit, ARKit, and TestFlight have also received updates for Apple's upcoming mixed-reality headset.

"These tools enable developers to create new types of apps that span a spectrum of immersion," the company explains, "including windows, which have depth and can showcase 3D content; volumes, which create experiences that are viewable from any angle; and spaces, which can fully immerse a user in an environment with unbounded 3D content."
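The three scene types Apple describes correspond to SwiftUI scene declarations in the visionOS SDK. A minimal sketch, assuming the SwiftUI and RealityKit APIs from Apple's visionOS documentation (app name, identifiers, and asset names here are illustrative):

```swift
import SwiftUI
import RealityKit

// Hypothetical app showing the three levels of immersion Apple
// describes: a window, a volume, and a fully immersive space.
@main
struct ImmersionDemoApp: App {
    var body: some Scene {
        // A window: a familiar 2D surface that can also host depth.
        WindowGroup(id: "main-window") {
            Text("Hello, visionOS")
        }

        // A volume: bounded 3D content viewable from any angle.
        WindowGroup(id: "model-volume") {
            Model3D(named: "Heart")  // loads a named 3D asset
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)

        // A space: fully immerses the user in unbounded 3D content.
        ImmersiveSpace(id: "immersive-space") {
            RealityView { content in
                // RealityKit entities are added to the scene here.
            }
        }
    }
}
```

At runtime, an app would open these scenes via the `openWindow` and `openImmersiveSpace` environment actions; this fragment only declares them.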

SDK available for the Apple headset

Reality Composer Pro, included in Xcode, lets developers preview and prepare 3D models, animations, images, and sounds for use with Apple Vision Pro. It also includes accessibility support and the visionOS Simulator for testing different room layouts and lighting conditions. Starting next month, applications created with the Unity engine will also be supported.

"Next month, Apple will open developer labs in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo to provide developers with hands-on experience to test their apps on Apple Vision Pro hardware and get support from Apple engineers. Development teams will also be able to apply for developer kits to help them quickly build, iterate, and test right on Apple Vision Pro."

Preview applications for the visionOS SDK and its APIs include Complete HeartX, which provides realistically animated 3D heart models for medical students. Apple also showed off JigSpace, which displays spatial CAD files, and an app from PTC for virtual assembly-line collaboration.

On the other hand, a video clip from Algoriddim's DJ app "djay" already suggests that hand-tracking latency could become an issue. At least in the shown animation, the delay between finger and fader movement still seems far too high for usable cuts or scratches. Unlike the competition, Apple relies entirely on hand and eye tracking as well as voice input for its headset.
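For context, apps on visionOS receive hand-tracking data through ARKit. A minimal sketch, assuming the `ARKitSession` and `HandTrackingProvider` APIs from Apple's visionOS documentation (the function name and fader mapping are illustrative):

```swift
import ARKit

// Sketch: stream hand-tracking anchors from ARKit on visionOS.
// Requires user authorization and a running immersive space.
func trackHands() async {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()

    do {
        try await session.run([handTracking])
    } catch {
        print("Failed to start hand tracking: \(error)")
        return
    }

    // Each update delivers a HandAnchor with per-joint transforms.
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        if let skeleton = anchor.handSkeleton {
            let indexTip = skeleton.joint(.indexFingerTip)
            // An app like djay would map this world-space transform
            // onto a virtual fader or turntable.
            let worldTransform = anchor.originFromAnchorTransform *
                                 indexTip.anchorFromJointTransform
            _ = worldTransform
        }
    }
}
```

Any perceived latency comes from the whole pipeline, from camera capture through this anchor stream to rendering, so it is not something an individual app can easily tune away.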

The visionOS SDK, the updated Xcode, the Simulator, and Reality Composer Pro are available to members of the Apple Developer Program. Reality Composer Pro is essentially an engine editor. The documentation also mentions VR games for the first time; however, these may suffer from the restrictive playfield limitations of the boundary system.
