The upcoming Vision Pro may be able to see the invisible, and ‘smell’ objects
Forget virtual reality, extended reality, and mixed reality. Apple has been granted a patent (US 11715301 B2) for “Visualization of Non-Visible Phenomena” that suggests the company wants the upcoming Apple Vision Pro to “see” the invisible, and even have the ability to smell.
The invisible in question includes electrical currents, radio signals from Wi-Fi, and other “non-visible phenomena.” Even more interesting is Apple’s idea that the Vision Pro could be equipped with features allowing it to detect scents.
The Vision Pro is Apple’s US$3,499 (and up) “spatial computer.” It’s due in early 2024, though it will apparently only be available in limited quantities at first.
About the patent
In the patent Apple notes that extended reality technology aims to bridge a gap between virtual environments and a physical environment by providing an enhanced physical environment that is augmented with computer-generated content that is not part of the physical environment. As a result, the computer-generated content that is not part of the physical environment appears to be part of the physical environment as perceived by a user.
Implementations of the subject technology described in Apple’s patent provide an extended reality (XR) system that displays a virtual representation of non-visible features of a physical environment, such that a user of the XR system perceives the non-visible features at the location of the non-visible features in the physical environment.
FIG. 6 from the patent illustrates a flow chart of an example process for providing computer-generated visualizations of non-visible phenomena.
For example, a device may detect, and/or receive information about, one or more non-visible features within a direct or pass-through field of view of a physical environment, and display a visualization of those features at their correct locations in the physical environment.
For example, responsive to detection of a non-visible feature of a physical environment, the device may display a visualization of the non-visible feature overlaid on the view of the physical environment at a location that corresponds to the detected feature. The non-visible features may correspond to, for example, electromagnetic signals such as Wi-Fi signals, airflow from an HVAC system, temperatures of physical objects, fluids or gases, an audible fence created for a pet (e.g., using ultrasonic pitches), sounds generated by a musical instrument, and/or hidden physical objects such as objects with known locations that are obscured from view by other physical objects (as examples).
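To make that detect-then-overlay idea concrete, here’s a minimal Swift sketch of such a pipeline. The patent contains no code, so every type and name below is hypothetical; it only illustrates mapping each detected phenomenon to a visualization anchored at its position in the room.

```swift
// Hypothetical kinds of non-visible phenomena named in the patent.
enum NonVisibleFeature {
    case wifiSignal(strengthDBm: Double)
    case airflow(metersPerSecond: Double)
    case surfaceTemperature(celsius: Double)
    case ultrasonicFence
    case hiddenObject(label: String)
}

// A detection pairs a feature with its position in the room,
// so the visualization can be anchored at the right spot.
struct Detection {
    let feature: NonVisibleFeature
    let position: SIMD3<Float> // world-space coordinates from the headset's sensors
}

// Sketch of the detect-then-overlay loop the patent describes.
func renderOverlays(for detections: [Detection]) {
    for detection in detections {
        switch detection.feature {
        case .wifiSignal(let strength):
            // e.g. a heat-map blob whose opacity tracks signal strength
            print("Overlay Wi-Fi glyph (\(strength) dBm) at \(detection.position)")
        case .airflow(let speed):
            print("Overlay airflow streamlines (\(speed) m/s) at \(detection.position)")
        case .surfaceTemperature(let celsius):
            print("Overlay temperature tint (\(celsius)°C) at \(detection.position)")
        case .ultrasonicFence:
            print("Overlay pet-fence boundary at \(detection.position)")
        case .hiddenObject(let label):
            print("Overlay outline of hidden \(label) at \(detection.position)")
        }
    }
}

// Example: one detected Wi-Fi signal, two meters in front of the user.
renderOverlays(for: [
    Detection(feature: .wifiSignal(strengthDBm: -42), position: SIMD3(0.5, 1.2, -2.0))
])
```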
What’s really interesting (to me, anyway) is the patent’s statement that non-visible features can be added to a physical environment to provide scent experiences for Vision Pro users. For example, an electronic device such as the Vision Pro may be provided with an artificial scent device (a device configured to release one or a combination of gases, vapors, or particulates that mimic one or more predefined scents).
The non-visible features added to the physical environment to trigger a scent experience may be detected by the Vision Pro, causing the artificial scent device to generate a corresponding scent. For example, a tea shop that links fruits and/or seasons to a scent may generate (e.g., using non-visible light and/or ultrasonic signals) non-visible depictions of fruit that can be visualized by a user of a Vision Pro who is passing by or is inside the tea shop.
The detection of the non-visible depictions of fruit can also trigger generation of a scent corresponding to the depicted fruit, by the artificial scent device, in one or more implementations. This raises the question of how many folks will wear a Vision Pro when they’re out and about in public. But that’s a topic for another article.
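As a sketch of how that trigger path might look, here’s a purely hypothetical Swift example. It assumes a decoded payload string and an abstract scent-emitter interface; neither comes from the patent, which describes the hardware only at the level of releasing gases, vapors, or particulates.

```swift
// Hypothetical scent identifiers a shop might encode in its
// non-visible light or ultrasonic signals (names are illustrative).
enum ScentCue: String {
    case strawberry, bergamot, jasmine
}

// Assumed interface to an artificial scent device, per the patent's
// description: release a mix that mimics a predefined scent.
protocol ScentEmitter {
    func emit(_ cue: ScentCue, durationSeconds: Double)
}

// Stand-in implementation that just logs what a real device would do.
struct LoggingScentEmitter: ScentEmitter {
    func emit(_ cue: ScentCue, durationSeconds: Double) {
        print("Releasing \(cue.rawValue) scent for \(durationSeconds)s")
    }
}

// When the headset decodes a non-visible depiction (e.g. a fruit),
// it triggers the matching scent alongside the visual overlay.
func handleDecodedDepiction(_ payload: String, emitter: ScentEmitter) {
    guard let cue = ScentCue(rawValue: payload) else { return }
    emitter.emit(cue, durationSeconds: 5)
}

handleDecodedDepiction("strawberry", emitter: LoggingScentEmitter())
```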
Summary of the patent
Here’s Apple’s abstract of the patent: “Implementations of the subject technology provide visualizations of non-visible features of a physical environment, at the location of the non-visible features in the physical environment. The non-visible features may include wireless communications signals, sounds, airflow, gases, subsonic and/or ultrasonic waves, hidden objects, or the like. A device may store visual contexts for visualizations of particular non-visible features.
“The device may obtain a depth map that allows the device to determine the location of the non-visible feature in the physical environment and to overlay the visualization on a user’s view of that location. In this way, the non-visible feature can be visualized at its correct location, orientation, direction and/or strength in the physical environment.”
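The depth-map step in the abstract is what pins an overlay to the right spot in the room. Below is a minimal Swift sketch of that idea, assuming a simple pinhole camera model; the DepthMap type, the intrinsics (fx, fy, cx, cy), and all values are illustrative, not from the patent.

```swift
// A minimal depth map: per-pixel distance (meters) from the headset camera.
struct DepthMap {
    let width: Int
    let height: Int
    let depths: [Float] // row-major, width * height entries

    func depth(atX x: Int, y: Int) -> Float {
        depths[y * width + x]
    }
}

// Unproject a pixel plus its depth into a 3D point in camera space,
// using a pinhole model (fx, fy are focal lengths; cx, cy the principal point).
func worldPoint(x: Int, y: Int, map: DepthMap,
                fx: Float, fy: Float, cx: Float, cy: Float) -> SIMD3<Float> {
    let z = map.depth(atX: x, y: y)
    return SIMD3((Float(x) - cx) / fx * z,
                 (Float(y) - cy) / fy * z,
                 z)
}

// Example: a 2×2 depth map where every pixel is 1.5 m away.
let map = DepthMap(width: 2, height: 2, depths: [1.5, 1.5, 1.5, 1.5])
print(worldPoint(x: 1, y: 0, map: map, fx: 500, fy: 500, cx: 1, cy: 1))
```

Once a point like this is known, the visualization can be drawn at that spot with the orientation, direction, and strength cues the abstract mentions.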