
At its recent "Create Immersive" event, Apple revealed for the first time its detailed production process for the "Apple Immersive Video" format designed for Apple Vision Pro.
According to reports, because no publicly available, standardized production workflow existed in the industry, producing Apple Immersive Video was extremely difficult and was driven almost entirely by Apple itself. Apple's internal teams relied on pre-release Blackmagic cameras, custom shooting rigs, and proprietary software still in testing to finish content. The whole pipeline was effectively "glued together with tape": complex, unstable, and wholly unsuitable for handing off to third-party creators, which left content on the platform extremely scarce.
This situation has now fundamentally changed. Apple's published production process clearly defines three key stages: capture hardware, post-production software, and distribution.
On the hardware side, the $33,000 Blackmagic URSA Cine Immersive camera is the core capture device for Apple Immersive Video content.
On the software side, DaVinci Resolve Studio is deeply integrated with the Apple Immersive Video post-production workflow, providing native support for media management, editing, color grading, visual effects (Fusion), and spatial audio mixing (Fairlight). On the distribution side, tools such as Apple's own Compressor, along with third-party solutions like ColorFront and SpatialGen, are adding support for the format, together forming a complete pipeline from capture to cloud delivery. In addition, the video platform Vimeo will let creators upload and share Apple Immersive Video.
Technically, Apple Immersive Video is an 8K, 180-degree video format, and one of its core technologies is "static foveation."
This technique non-linearly remaps the original fisheye image, preserving high pixel density at the center of the frame while reducing detail toward the edges, so that a 4K distribution resolution retains close to the camera's native 8K sharpness. Apple has also defined two tiers of the format: a "production format" that carries RAW data for professional post-production, and a "delivery format" (.aivu) for streaming distribution that uses efficient MV-HEVC compression, balancing flexibility in the production pipeline against the end viewer's experience.
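For app developers, the practical implication of the .aivu delivery format is that playback is meant to ride on the system video stack rather than a custom renderer. The sketch below is a minimal, hypothetical Swift illustration of that idea: it assumes a bundled sample.aivu file and assumes visionOS's standard AVKit path (an AVPlayer inside AVPlayerViewController) accepts the container. The file name and the exact playback surface are assumptions for illustration, not details confirmed by Apple's announcement.

```swift
import SwiftUI
import AVKit

// Hypothetical visionOS playback sketch for an Apple Immersive Video delivery file.
// Assumption: the standard AVKit path handles the MV-HEVC-based .aivu container,
// and "sample.aivu" is bundled with the app.
struct ImmersivePlayerView: UIViewControllerRepresentable {
    let url: URL

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = AVPlayer(url: url)   // system decodes and presents the stream
        controller.player?.play()
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {}
}

struct ContentView: View {
    var body: some View {
        if let url = Bundle.main.url(forResource: "sample", withExtension: "aivu") {
            ImmersivePlayerView(url: url)
        } else {
            Text("sample.aivu not found in the app bundle")
        }
    }
}
```

The point is the scale of effort: if the delivery format rides on MV-HEVC and the standard media pipeline, the app-side code stays small and the heavy lifting (foveated projection, stereo decode) lives in the operating system.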
At the creative level, Apple has summarized four cornerstones of high-quality immersive video: Presence, Authenticity, Proximity, and Connection.
Apple emphasizes that traditional film composition rules (such as the rule of thirds) are no longer applicable in a 180-degree field of view. Creators need to focus more on guiding the viewer's eye and utilizing spatial audio to enhance the narrative.
To this end, Apple has introduced the new Apple Spatial Audio Format (ASAF) and its companion compression codec, APAC, which support mixing higher-order Ambisonics with independent audio objects. This allows sounds to be positioned far more precisely, greatly enhancing the realism of the immersive experience.
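ASAF authoring itself happens in professional tools such as Fairlight, but the core idea of an "audio object", a mono source with an explicit position in 3D space relative to the listener, can be sketched with Apple's existing AVAudioEngine API. The example below only illustrates object-based positioning; it is not ASAF or APAC authoring, and "footsteps.caf", the coordinates, and the function name are placeholders.

```swift
import AVFoundation

// Illustrative sketch of object-based spatial audio: a single mono "object"
// placed at an explicit 3D position relative to the listener. This uses the
// standard AVAudioEngine / AVAudioEnvironmentNode API and is NOT an ASAF/APAC
// workflow; "footsteps.caf" and the coordinates are placeholder values.
func playSpatialObject() throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let environment = AVAudioEnvironmentNode()
    let player = AVAudioPlayerNode()

    engine.attach(environment)
    engine.attach(player)

    // Placeholder asset: any mono audio file.
    let file = try AVAudioFile(forReading: URL(fileURLWithPath: "footsteps.caf"))

    // Spatialization requires a mono connection into the environment node.
    let monoFormat = AVAudioFormat(standardFormatWithSampleRate: file.processingFormat.sampleRate,
                                   channels: 1)
    engine.connect(player, to: environment, format: monoFormat)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    // Binaural rendering for headphone playback.
    player.renderingAlgorithm = .HRTF

    // Place the object to the listener's right and slightly behind them:
    // the kind of cue that can pull a viewer's attention around a 180° scene.
    environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
    player.position = AVAudio3DPoint(x: 1.5, y: 0, z: 2.0)

    try engine.start()
    player.scheduleFile(file, at: nil, completionHandler: nil)
    player.play()
    return engine   // keep a reference so the engine stays alive during playback
}
```

Keeping a reference to the returned engine matters, since audio stops as soon as the engine is deallocated; a real mix would manage many such objects and animate their positions over time.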
Now that the production workflow is public, a wave of third-party immersive content is expected to follow. Apple has already begun working with the NBA to offer immersive live streams of select Los Angeles Lakers games, and more sporting events (such as MLB and F1) as well as short films from independent studios are likely to reach Vision Pro in the future.