A new Apple patent reveals that their work on Lenticular Displays now extends through to a possible future HMD Display

Jun 12, 2023

In September 2021, Patently Apple posted a report titled "Apple Invents what could be a Revolutionary next-gen Lenticular Display for Macs, HMDs & iDevices."

In November 2021, Patently Apple posted a second report about a possible future lenticular display, this time in the context of an actual Apple TV display being able to present 3D-style content without 3D glasses. On paper alone, I found it a little difficult to wrap my head around the concept, so I added a video that could help anyone understand what Apple was trying to achieve. The video below relates to a lenticular display that Sony describes as a "Spatial Reality Display." Apple already has Spatial Audio, so calling their technology a Spatial Display wouldn't be a stretch. Whatever they call it, it's an exciting leap for displays.

Today Patently Apple discovered a patent application from Apple in the World Intellectual Property Organization's database titled "Lenticular Image Generation." The filing, published on May 03, 2023, relates to methods and apparatus for generating images to be displayed on lenticular displays. This time the focus isn't a television: the term HMD is listed some 30 times. While the patent isn't limited to an HMD, head-mounted displays are a large focus of it.

In these methods, a fixed mesh is generated offline, and in real-time texture information is mapped to the fixed mesh. In an offline process, texture and 3D mesh information for an object is used to render UV map views for multiple viewpoints of the object, view maps are generated from display calibration data, and a lenticular to UV map is generated from the UV map views and view maps.
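The patent doesn't publish code, but the offline step above amounts to building one big lookup table. As a rough sketch (all names, array layouts, and the use of NumPy are my assumptions, not Apple's implementation): given per-viewpoint renders of the object's UV coordinates and a calibration "view map" that says which viewpoint each display pixel serves, the lenticular-to-UV map can be assembled with one gather:

```python
import numpy as np

def build_lenticular_to_uv_map(uv_views, view_map):
    """Offline step (an illustrative sketch, not Apple's actual method).

    uv_views: (n_views, H, W, 2) array -- rendered UV coordinates of the
              object as seen from each of n_views viewpoints.
    view_map: (H, W) integer array from display calibration, giving the
              viewpoint index that each display pixel is visible from.
    Returns an (H, W, 2) lookup table: for every display pixel, the
    texture (u, v) coordinate to sample during real-time compositing.
    """
    n_views, H, W, _ = uv_views.shape
    rows, cols = np.indices((H, W))
    # For each display pixel, take the UV value from the viewpoint
    # render that this pixel is calibrated to serve.
    return uv_views[view_map, rows, cols]
```

Because this table is fixed (the mesh and calibration don't change per frame), it can be computed once offline and reused every frame.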

In real-time, texture information is captured, and a composite process is performed that generates a lenticular image for multiple viewpoints by sampling pixels from the texture based on the lenticular to UV map.

The lenticular image is then displayed on the lenticular display. Detected positions of persons in the environment may be used to limit the number of viewpoints that are generated during the real-time composite process.
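The real-time composite described above can then be reduced to a per-pixel texture fetch. A minimal sketch, assuming the precomputed map holds normalized (u, v) coordinates and using nearest-neighbor sampling (the patent doesn't specify the filtering):

```python
import numpy as np

def composite_lenticular_image(texture, lenticular_to_uv):
    """Real-time step (illustrative sketch only).

    texture: (TH, TW, 3) color texture captured for the current frame.
    lenticular_to_uv: (H, W, 2) precomputed map with normalized (u, v)
                      coordinates in [0, 1] for every display pixel.
    Returns the (H, W, 3) lenticular image via nearest-neighbor sampling.
    """
    TH, TW, _ = texture.shape
    # Convert normalized UVs to integer texel indices, clamped to bounds.
    u = np.clip((lenticular_to_uv[..., 0] * (TW - 1)).round().astype(int), 0, TW - 1)
    v = np.clip((lenticular_to_uv[..., 1] * (TH - 1)).round().astype(int), 0, TH - 1)
    return texture[v, u]
```

This is why the scheme is fast at runtime: all the mesh rendering happens offline, and each frame costs only one gather per display pixel. Limiting the composite to viewpoints where people are actually detected, as the patent describes, would shrink that work further.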

Apple's patent FIGS. 1A and 1B below illustrate an example lenticular display #120. FIG. 1A shows a 3D front view of the example lenticular display and FIG. 1B shows an example top view of it.

As shown in FIG. 1A, a lenticular display may include a display panel #122, which could be an LCD, OLED, DLP, or LCoS (liquid crystal on silicon) panel. As shown in FIGS. 1A and 1B, the lenticular lens #126 may be a sheet or array of magnifying lenses (also referred to as lenticules) #128 configured so that, when the lenticular display is viewed from slightly different angles, different views of the lenticular image being displayed on the display panel are visible from different viewpoints or viewing angles (e.g., V1, V2, and V3) in front of the display.

Further, the viewing radius may, for example, be 30 degrees; however, wider or narrower viewing radii may be used, for example within a range of 15 to 65 degrees. As a non-limiting example, a lenticular display #120 may provide 22 viewing angles spread through a 30-degree viewing radius, with a different view every 1.6 degrees.
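To make the example concrete, the display effectively bins a viewer's horizontal angle into one of a fixed number of discrete views. A tiny sketch of that mapping (the function and its binning rule are my assumptions; only the 22-view/30-degree figures come from the patent):

```python
def view_index(angle_deg, n_views=22, viewing_radius_deg=30.0):
    """Map a viewer's horizontal angle (0 = one edge of the viewing
    cone) to a discrete view index, clamped to the available views.
    Parameter defaults follow the patent's non-limiting example.
    """
    spacing = viewing_radius_deg / n_views   # angular width per view
    idx = int(angle_deg // spacing)
    # Viewers outside the cone see the nearest edge view.
    return max(0, min(n_views - 1, idx))
```

A head-tracking camera could feed such a function to decide which views need rendering at all.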

Apple's patent FIG. 1C above shows a top view of an example device #100 that includes a lenticular display #120 as illustrated in FIGS. 1A and 1B, according to some embodiments. The device may include the lenticular display, one or more sensors #140, and a controller #160. The sensors may collect information about an object #190 to be imaged on the lenticular display.

Sensors #140 may include, but are not limited to, one or more cameras that capture depth information for the object #190. The controller may include one or more processors that process data captured by the sensors to generate texture (e.g., coloring and shading) and mesh (e.g., a 3D representation) information for the object. From the texture and mesh information, the controller may generate lenticular images to be displayed on the lenticular display.

The patent gets more interesting when Apple reveals in patent FIG. 2A that device #200 may be an HMD worn on a user's head so that the lenticular display #210 is disposed in front of the user's eyes.

As another example, display system #210 may be a direct retinal projector system that scans left and right images, pixel by pixel, to the subject's eyes. To scan the images, projectors generate beams that are directed to reflective components that redirect the beams to the user's eyes.

Apple's patent FIG. 5 above graphically illustrates a method for offline generation of a lenticular to UV map.

For more details, review Apple's patent application number EP4173281. The application, published this week, was originally filed in June 2021. This is a forward-looking patent, and the technology discussed isn't likely to be part of Apple's first-gen XR headset debuting at WWDC23.

Posted by Jack Purcher on May 06, 2023 at 02:24 PM in Patent Applications, Display Technology, HMDs, Apple Vision Pro, Smartglasses.