Kuo: Apple unlikely to integrate rear-facing 3D sensor in 2019 iPhone

Contrary to industry expectations, Apple analyst Ming-Chi Kuo believes the company will not integrate a rear-side time-of-flight (TOF) solution in its 2019 iPhone lineup, saying the technology is not yet ready for the augmented reality revolution.

In a note to investors seen by AppleInsider, Kuo says industry analysts expect Apple to incorporate rear-side TOF as it looks to develop next-generation augmented reality experiences. For example, Apple is thought to be developing an AR version of Apple Maps, potentially for use with a rumored AR headset.

According to Kuo, the distance and depth information provided by existing rear-side TOF hardware is insufficient for creating the “revolutionary AR experience” that Apple is presumably working toward.

The analyst believes a comprehensive AR ecosystem is one that integrates 5G connectivity, AR glass (a wearable, head-mounted device) and a “more powerful Apple Maps database” that includes appropriate distance and depth information. It appears Kuo, like others, assumes Apple Maps will be marketed as a “killer app” for Apple’s next-gen AR experience.

Additionally, TOF technology does not improve photo-taking functionality, a major consideration for a company that touts its handsets as the best portable cameras in the world.

As such, Kuo says Apple will likely forgo rear-side TOF in 2019, instead relying on the dual-camera system first introduced with iPhone 7 Plus in 2016.

“We believe that iPhone’s dual-camera can simulate and offer enough distance/depth information necessary for photo-taking; it is therefore unnecessary for the 2H19 new iPhone models to be equipped with a rear-side ToF,” Kuo says.

Rumors of a rear-facing TrueDepth-style camera date back to last July, when reports claimed Apple planned to debut a rear-facing VCSEL system for AR applications and faster camera autofocus. That solution was due to arrive in what would become iPhone X, but Apple’s flagship smartphone instead uses a single VCSEL module in its front-facing TrueDepth camera array.

Unlike TrueDepth, which measures distortion in a projected pattern of structured light, a TOF system calculates the time it takes pulses of light to travel to and from a target. Such systems allow for extremely accurate depth mapping and can therefore assist in AR applications.
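As a rough illustration of that principle (a sketch under the assumption of a single idealized pulse, not a description of Apple’s hardware), depth follows directly from the round-trip time: the speed of light multiplied by the measured time, divided by two because the pulse travels to the target and back. A minimal Swift example, with hypothetical names and an illustrative pulse duration:

```swift
import Foundation

// A minimal sketch of the time-of-flight principle described above; the names
// and example pulse duration are illustrative assumptions, not Apple's design.
let speedOfLight = 299_792_458.0  // metres per second

/// Converts the measured round-trip time of a light pulse into a distance.
/// The pulse travels to the target and back, so the one-way depth is half.
func depth(fromRoundTrip seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// A pulse returning after roughly 6.67 nanoseconds implies a target about
// one metre from the sensor.
let estimate = depth(fromRoundTrip: 6.67e-9)
print(String(format: "Estimated depth: %.2f m", estimate))
```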

The July rumor was followed by a second report in November claiming much the same, while analysts jumped on the bandwagon in February.
