
Inside VisionOS: 17 things developers need to know right now

June 6, 2023
Jason Hiner

Apple's new Vision Pro headset is a completely new device in the Apple ecosystem. While the company provided details about the device itself during its two-hour-plus WWDC 2023 keynote, it saved the developer details for a later event, the Platforms State of the Union.

I watched that presentation, and in this article, I've cataloged the key takeaways about developing for VisionOS that coders need to know. Let's kick it off with a bit of background.

Also: Apple Vision Pro first take: 3 reasons this changes everything

VisionOS is the operating system designed for what Apple calls "spatial computing." The company distinguishes this paradigm from the two we're most familiar with: desktop and mobile computing. The idea with spatial computing is that your work environment floats in the space in front of you.

The "shared space" is where apps float side by side. Think of it as multiple side-by-side windows, but instead of on a desktop, they're in midair.

The shared space

Screenshot by David Gewirtz

Users can open one or more windows, which exist as planes in space. Windows support traditional views and controls, but they can also host 3D content alongside 2D content. In a CAD program, for example, the model might be 3D while the toolbar remains 2D.

Screenshot by David Gewirtz

Beyond windows, apps can create three-dimensional volumes, which can contain objects and scenes. The key difference is that volumes can be moved around in 3D space and viewed from all angles. It's the difference between looking into a store's window display and walking around a car, peering in through the front, back, and side windows.

Also: Meet your Digital Persona: Apple's Vision Pro users to get real-time animated avatars

For developers who want full immersion, it's possible to create a dedicated full space. This is like a game taking over the entire screen, except that in VisionOS the "screen" is fully immersive. Here, an app's windows and volumes operate inside the fully immersive environment.
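To make these three paradigms concrete, here's a minimal sketch of how a VisionOS app might declare a window, a volume, and a full space, using the SwiftUI scene types Apple showed (WindowGroup, the volumetric window style, and ImmersiveSpace). The identifiers and the "Globe" asset name are hypothetical.

```swift
import SwiftUI
import RealityKit

struct ContentView: View {
    var body: some View {
        Text("Hello, spatial computing")
    }
}

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Build the fully immersive RealityKit scene here.
        }
    }
}

@main
struct SpatialDemoApp: App {
    var body: some Scene {
        // A traditional 2D window floating in the shared space.
        WindowGroup(id: "main") {
            ContentView()
        }

        // A volume: bounded 3D content users can move and view from any angle.
        WindowGroup(id: "globe") {
            Model3D(named: "Globe") // hypothetical .usdz asset in the app bundle
        }
        .windowStyle(.volumetric)

        // A dedicated full space for complete immersion.
        ImmersiveSpace(id: "immersion") {
            ImmersiveView()
        }
    }
}
```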

Now that you understand the virtual paradigms used by VisionOS, let's look at the 17 things developers need to know about developing for it.

1. Development tools and libraries used for VisionOS will be familiar to many Apple developers.

Development is based on SwiftUI, RealityKit, and ARKit, existing frameworks that have been around for a while. Apple has extended them for VisionOS, adding support for the new hardware and the full-space paradigm.

Apple

2. VisionOS is fundamentally an extension of iOS and iPadOS development. 

Developers will use SwiftUI and UIKit to build the user interface. RealityKit is used to display 3D content, animations, and visual effects. ARKit gives apps an understanding of the real-world space around the user and makes that understanding available to code within an app.
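As a rough illustration of that division of labor, here's a hedged sketch of a single SwiftUI view hosting RealityKit content via the RealityView container. The view and its contents are invented for illustration; a real app would load a .usdz model rather than generate a sphere in code.

```swift
import SwiftUI
import RealityKit

// 2D SwiftUI controls and RealityKit 3D content sharing one window.
struct MixedContentView: View {
    var body: some View {
        VStack {
            // Ordinary SwiftUI handles the 2D interface...
            Text("A 2D label above a 3D model")

            // ...while RealityView hosts RealityKit entities alongside it.
            RealityView { content in
                // Generate a simple sphere instead of loading an asset.
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.1),
                    materials: [SimpleMaterial(color: .white, isMetallic: false)]
                )
                content.add(sphere)
            }
        }
    }
}
```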

3. All apps will need to exist in 3D space. 

Even basic 2D apps ported over from iOS or iPadOS will float in space. Whether that space is a passthrough view of the room the user is actually in or a simulated environment that blocks out the real world, even traditional apps will "float" in 3D space.

4. VisionOS adds a new destination for building apps. 

Previously, Xcode developers could choose iPhone, iPad, and Mac as destinations (i.e., where the app would run). Now, developers can add VisionOS as a destination. Once the app is rebuilt for that destination, it gains VisionOS features, including resizable windows and the system's adaptive translucency.

5. Older UIKit apps (not built with Swift and SwiftUI) can be recompiled for VisionOS. 

When they're recompiled, they'll pick up some highlight and 3D presence features from VisionOS. So while UIKit and Objective-C-based apps may not be able to provide a fully immersive 3D experience, they will gain a native VisionOS look and feel, and be able to coexist reasonably seamlessly with more modern SwiftUI-based applications.

6. Traditional UI elements (like controls) get a new Z-offset option. 

This lets developers push panes and controls into 3D space, so certain interface elements float in front of, or behind, others. It's a handy way to draw the user's attention to a particular element.
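Here's a minimal, hypothetical sketch of the idea in SwiftUI, assuming the offset(z:) modifier Apple has added for depth; all of the view names are made up for illustration.

```swift
import SwiftUI

// Hypothetical panes used to illustrate depth layering.
struct DocumentPane: View {
    var body: some View {
        Color.gray.opacity(0.4).frame(width: 400, height: 300)
    }
}

struct AlertBanner: View {
    var body: some View {
        Text("Unsaved changes")
            .padding()
            .glassBackgroundEffect() // VisionOS's translucent "glass" look
    }
}

struct LayeredView: View {
    var body: some View {
        ZStack {
            DocumentPane()

            // Push the banner toward the viewer to draw attention to it.
            AlertBanner()
                .offset(z: 40) // points along the z-axis, out of the window plane
        }
    }
}
```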

7. VisionOS uses eye tracking to enable dynamic foveation. 

Foveation is an image-processing technique in which certain areas of an image get more detail than others. With VisionOS, the Vision Pro uses eye tracking to render the area of the scene being looked at in very high resolution, while reducing resolution in the user's peripheral vision. This cuts processing cost in areas where the user isn't focusing attention. Developers don't need to code for this; it's built into the OS.

8. Object lighting is derived from current spatial conditions. 

By default, objects floating in 3D space gain the lighting and shadow characteristics of the space where the user is wearing the headset. Developers can provide an image-based lighting asset if they want to customize how objects are lit in virtual space.
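Here's a rough sketch of what supplying that asset might look like using RealityKit's image-based-light components; the "SunsetEnvironment" resource name is hypothetical, and the exact API in the shipping SDK may differ.

```swift
import RealityKit

// Apply a custom image-based light to an entity.
// "SunsetEnvironment" is a hypothetical environment asset in the app bundle.
func applyCustomLighting(to entity: Entity) async throws {
    let environment = try await EnvironmentResource(named: "SunsetEnvironment")

    // The light component supplies illumination from the image...
    entity.components.set(ImageBasedLightComponent(source: .single(environment)))
    // ...and the receiver component opts this entity in to that light.
    entity.components.set(ImageBasedLightReceiverComponent(imageBasedLight: entity))
}
```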

9. ARKit provides apps with a usable model of the real-world room where the device is being used. 

It uses plane estimation to identify flat surfaces in the real room. Scene reconstruction builds a dynamic 3D model of the room's space that apps can interact with. Image anchoring allows 2D graphics to be locked to a location in 3D space, making them appear to be part of the real world.
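As a hedged sketch, here's roughly how an app might subscribe to plane detection through the ARKit session API Apple described for VisionOS; treat the details as illustrative rather than definitive.

```swift
import ARKit

// Watch for flat surfaces (tables, floors, walls) in the user's room.
func detectPlanes() async throws {
    let session = ARKitSession()
    let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical])

    // Running the session prompts the user for world-sensing permission.
    try await session.run([planes])

    // Anchor updates stream in as the device maps the room.
    for await update in planes.anchorUpdates {
        print("Plane \(update.anchor.id) classified as \(update.anchor.classification)")
    }
}
```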

10. ARKit on VisionOS adds skeletal hand tracking and accessibility features. 

This provides apps with positioning data and joint mapping, so that gestures can more fully control the virtual experience. Accessibility features allow users to interact with eye movement, voice, and head movement in addition to hand movements.
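And here's a similar hedged sketch for skeletal hand tracking, again assuming the VisionOS-style ARKit providers; the joint and transform names follow Apple's published API, but the flow is illustrative.

```swift
import ARKit

// Stream skeletal hand data for gesture handling.
func trackHands() async throws {
    let session = ARKitSession()
    let hands = HandTrackingProvider()

    try await session.run([hands])

    for await update in hands.anchorUpdates {
        let hand = update.anchor
        // Each anchor carries a full joint skeleton for one hand.
        if let tip = hand.handSkeleton?.joint(.indexFingerTip) {
            print("\(hand.chirality) index fingertip: \(tip.anchorFromJointTransform)")
        }
    }
}
```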

11. Unity has been layered on top of RealityKit. 

Apple has partnered with Unity so that Unity developers can target VisionOS directly from within Unity, allowing existing Unity content to migrate into VisionOS apps without much conversion effort. This is actually a big deal: it lets developers with deep Unity experience bring Unity-built apps to VisionOS alongside native ones.

Reality Composer Pro

Apple

12. Reality Composer Pro is a new development tool for previewing and preparing 3D content. 

Reality Composer Pro is essentially an asset manager for 3D and virtual content. It also lets developers create custom materials, test out shaders, integrate those assets into the Xcode development process, and preview them on the Vision Pro.
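For example, Xcode's VisionOS app template pairs the project with a generated Swift package (typically named RealityKitContent) that holds Reality Composer Pro assets. A minimal, hedged sketch of loading a scene from that package looks something like this; "Scene" is the template's default scene name.

```swift
import SwiftUI
import RealityKit
import RealityKitContent // the package Xcode generates for Reality Composer Pro assets

struct ComposedSceneView: View {
    var body: some View {
        RealityView { content in
            // Load the authored scene; the name may differ in your project.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
    }
}
```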

13. Shared-space processing takes place on-device. 

This means the room's visuals and mapping are kept private. Cloud processing is not used for 3D mapping; all personal information and room spatial dynamics are managed entirely on the Vision Pro device.

14. For those without devices, Xcode provides previews and a simulator. 

These let you see what your app will look like and put it through its paces. The preview mode shows your layout inside Xcode, while the simulator is a dedicated on-screen environment for testing overall app behavior. You can simulate gestures using a keyboard, trackpad, or game controller.

Xcode simulator

Apple

15. For those with a Vision Pro, it's possible to code entirely in virtual space. 

The Vision Pro extends Mac desktops into virtual space, which means you can have your Xcode development environment side-by-side with your Vision Pro app.

16. There will be a dedicated app store for Vision Pro. 

Apps, complete with in-app purchases, will be available to buy and download from the Vision Pro's dedicated app store. Additionally, TestFlight works with Xcode and VisionOS as expected, so developers will be able to distribute app betas in exactly the same way as for iPhone and iPad.

17. Apple is preparing a number of coding support resources. 

The VisionOS SDK, updated Xcode, simulator, and Reality Composer Pro will be available later this month. Apple is also setting up Apple Vision Pro developer labs in London, Munich, Shanghai, Singapore, Tokyo, and Cupertino, where developers will be able to visit and test their applications. Developers who can't travel to an Apple site can submit requests for Apple to evaluate and test app builds and provide feedback. Apple made no mention of turnaround time on these requests.

What do you think?

More information on developing for the Vision Pro and VisionOS is available on Apple's developer website.

Also: Will companies use low code to run their businesses?

So what do you think? Are you a developer? If so, are you planning on developing for the Vision Pro? Are you a user? Do you see an immediate use for this device, or do the $3,500 price tag and goggle-like usage experience put you off? Please let us know in the comments below.


You can follow my day-to-day project updates on social media. Be sure to follow me on Twitter at @DavidGewirtz, on Facebook at Facebook.com/DavidGewirtz, on Instagram at Instagram.com/DavidGewirtz, and on YouTube at YouTube.com/DavidGewirtzTV.


