Apple has been granted a patent covering immersive video streaming for its future mixed reality headset

Today, the United States Patent and Trademark Office officially granted Apple a patent relating to the streaming of immersive video content intended for presentation to a user wearing a headset.

Immersive video streaming

According to Apple, immersive video content can be presented to a user in three dimensions using a wearable display device, such as a virtual reality headset or augmented reality headset. Further, different portions of the immersive video content may be presented to a user, depending on the position and orientation of the user’s body and/or user inputs.

Apple’s patent FIG. 1 below shows an example system #100 for presenting immersive video content to a user #102. System #100 includes a video content source #104 communicatively coupled to a portable display device #106 via a network #108.

The portable display device can be any device that is configured to be worn by a user and to display visual data to the user. For example, the wearable display device can be a wearable headset, such as a virtual reality headset, augmented reality headset, mixed reality headset, or wearable holographic display.

(Apple patent FIGS. 1, 2A and 2B)

Apple’s patent FIG. 2A above is a diagram of an example window for presenting immersive video content; FIG. 2B is a diagram of exemplary degrees of freedom of movement of a user’s body.

(Image: Apple HMD window)

Following patent FIG. 2A, Apple notes that immersive video content #200 may include visual data that can be presented in a range of viewing directions and/or viewing locations relative to a user. A window #202 can be selected to present a portion of the immersive video content to the user (e.g., based on the position and/or orientation of the user’s head) to give the user the impression that they are viewing the visual data according to a particular field of view and/or viewing perspective.

Further, the window (viewport) may be continuously updated based on the user’s movements to give the user the impression that they are moving their gaze through a visual environment.
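To make the window-selection idea concrete, here is a minimal, hypothetical Swift sketch, not Apple’s implementation and not code from the patent. It assumes the immersive content #200 is stored as a full 360° x 180° equirectangular frame and picks a window #202 from the head’s yaw and pitch; every type name, function name, and field-of-view value below is illustrative.

```swift
import Foundation

// Hypothetical sketch only: a head pose reduced to the two rotations that
// matter for choosing a viewing direction, and a window expressed as a
// sub-rectangle of the full equirectangular frame (coordinates in [0, 1]).
struct HeadPose {
    var yaw: Double    // rotation about the vertical axis, in radians
    var pitch: Double  // rotation about the side-to-side axis, in radians
}

struct Window {
    var centerU: Double   // horizontal center within the full frame
    var centerV: Double   // vertical center within the full frame
    var width: Double     // fraction of the frame covered horizontally
    var height: Double    // fraction of the frame covered vertically
}

// Select the portion of the immersive frame to present for the current pose.
// The field-of-view defaults are made-up values, not figures from the patent.
func selectWindow(for pose: HeadPose,
                  horizontalFOV: Double = Double.pi / 2,   // 90 degrees
                  verticalFOV: Double = Double.pi / 3) -> Window {
    // Wrap yaw into [0, 2π) so turning all the way around pans across the frame.
    let twoPi = 2 * Double.pi
    let wrappedYaw = (pose.yaw.truncatingRemainder(dividingBy: twoPi) + twoPi)
        .truncatingRemainder(dividingBy: twoPi)
    // Clamp pitch to [-π/2, π/2] (looking straight down to straight up).
    let clampedPitch = max(-Double.pi / 2, min(Double.pi / 2, pose.pitch))

    return Window(centerU: wrappedYaw / twoPi,
                  centerV: (clampedPitch + Double.pi / 2) / Double.pi,
                  width: horizontalFOV / twoPi,
                  height: verticalFOV / Double.pi)
}
```

In a player built along these lines, selectWindow would be re-evaluated each time the sensors report a new head pose, which is what produces the continuously updated window described above.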

The headset’s sensors can also be configured to detect the position and/or orientation of a user’s head in multiple dimensions. For example, referring to FIG. 2B, a Cartesian coordinate system can be defined so that the x-axis, y-axis, and z-axis are orthogonal to each other and intersect at an origin point O (for example, corresponding to the position of the user’s head).

The sensors (#120, FIG. 1) can detect the user translating their head along one or more of these axes and/or rotating their head around one or more of these axes (for example, with six degrees of freedom, or 6DoF).

For example, the sensors can detect when a user moves their head forward or backward (e.g., along the x-axis), sometimes referred to as a “surge” motion. As another example, the sensors can detect when a user translates their head left or right (e.g., along the y-axis), sometimes referred to as a “sway” motion. As another example, the sensors can detect when a user translates their head up or down (e.g., along the z-axis), sometimes referred to as a “heave” motion.

As another example, the sensors can detect when a user turns their head around the x-axis, sometimes called a “roll” motion; around the y-axis, sometimes called a “pitch” motion; or around the z-axis, sometimes called a “yaw” motion.
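As a purely illustrative companion to the description above, the following Swift sketch models the six degrees of freedom as three translations and three rotations and folds sensor-reported deltas into a running pose. The struct and method names are assumptions for this article, not APIs from the patent.

```swift
// Hypothetical representation of a 6DoF head pose, using the axis
// conventions described above: x forward/back, y left/right, z up/down.
struct SixDoFPose {
    // Translations along each axis (e.g., in metres).
    var surge: Double   // forward/backward along the x-axis
    var sway: Double    // left/right along the y-axis
    var heave: Double   // up/down along the z-axis
    // Rotations about each axis (in radians).
    var roll: Double    // rotation about the x-axis
    var pitch: Double   // rotation about the y-axis
    var yaw: Double     // rotation about the z-axis

    static let zero = SixDoFPose(surge: 0, sway: 0, heave: 0,
                                 roll: 0, pitch: 0, yaw: 0)

    // Fold one sensor-reported change into the running pose estimate.
    func applying(delta: SixDoFPose) -> SixDoFPose {
        SixDoFPose(surge: surge + delta.surge,
                   sway: sway + delta.sway,
                   heave: heave + delta.heave,
                   roll: roll + delta.roll,
                   pitch: pitch + delta.pitch,
                   yaw: yaw + delta.yaw)
    }
}
```

Pose updates of this kind are what would drive the window selection shown in FIG. 2A, with the rotations (roll, pitch, yaw) determining the viewing direction and the translations shifting the viewing location.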

Developers and/or engineers can dig deeper into the details of this invention in Apple’s granted patent US 11570417 B2.

Apple inventors

  • Fanyi Duanmu: Video Coding and Processing Engineer
  • Jun Xin: Director of Engineering, Coding and Video Processing
  • Xiaosong Zhu: Senior Software QA Engineer (many years of experience in broadcasting, digital video encoding, and the IPTV industry)
  • Hsi-Jung Wu: no LinkedIn profile found
