The Future of Smartphones Is in the Camera



For the last decade, smartphones have gotten thinner and faster and thinner and faster and, well, you get the picture.
But it’s too soon to write off our smartphones as boring. The gadgets are still evolving with new technologies. And for a clue as to what the smartphone of the future might look like, turn your attention to the device’s cameras and the software and sensors that make them tick.

Here’s a peek into how the camera may come into play: As soon as you pick up your gadget, it will see you, recognize you as the owner and unlock the screen. Overseas, you will be able to point the camera at a restaurant menu to translate items into your native language. When shopping for furniture, you can point your phone’s camera at your living room floor, set down a virtual rendering of a coffee table to see how it looks, and then move around it and even peek underneath.
Some of this futurism is already starting to happen.
Next month, Apple plans to hold a special event to introduce a set of new iPhones, including a premium model that can scan 3-D objects — including your face. Samsung, the No. 1 phone maker, also recently introduced the Galaxy Note 8, highlighting its fast dual-lens camera as the signature feature. And rivals will soon work to catch up with Samsung and Apple.

"2018 will be the year where the smartphone camera takes a quantum leap in technology,” said Philip-James Jacobowitz, a product manager for Qualcomm, a chip maker that provides components to smartphone makers.
Mr. Jacobowitz added that emerging camera technologies would be the key to stronger security features and applications for so-called augmented reality, which uses data to digitally manipulate the physical world when people look through a smartphone lens.
Here’s a rundown on what this all means for how your next smartphone will work.

Face Scanning

For the last few years, we have become accustomed to unlocking our smartphones by scanning our fingerprints or entering a passcode. But when Apple shows its new iPhones next month, including a premium model with a starting price of $999, the company will introduce infrared facial recognition as a new method for unlocking the device.
How would the new iPhone do that exactly? Apple declined to comment. But Qualcomm’s Spectra, a so-called depth-sensing camera system, is one example of how face scanning works.
The Spectra system includes a module that sprays an object with infrared dots, gathering depth information from the size and distortion of the dots. If the dots are smaller, the object is farther away; if they are bigger, the object is closer. The imaging system can then stitch those dot patterns into a detailed 3-D image of your face to determine whether you are indeed the owner of your smartphone before unlocking it.
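That description boils down to simple geometry: a dot’s apparent size shrinks in proportion to its distance. Here is a minimal sketch of the idea in Swift, with made-up calibration numbers; it illustrates the relationship described above, not Qualcomm’s actual algorithm.

```swift
// Toy model of structured-light depth sensing: a projected infrared dot
// appears smaller on surfaces that are farther away, so its apparent size
// can be inverted into a distance estimate. Calibration values are invented.
struct DotDepthEstimator {
    let referenceDotSize: Double   // apparent dot size (pixels) at the calibration distance
    let referenceDistance: Double  // calibration distance (meters)

    // By similar triangles, apparent size scales inversely with distance:
    // observedSize / referenceDotSize == referenceDistance / distance.
    func distance(forDotSize observedSize: Double) -> Double {
        return referenceDistance * referenceDotSize / observedSize
    }
}

let estimator = DotDepthEstimator(referenceDotSize: 12, referenceDistance: 0.5)
print(estimator.distance(forDotSize: 12))  // 0.5 m: matches the calibration distance
print(estimator.distance(forDotSize: 6))   // 1.0 m: smaller dot, farther away
print(estimator.distance(forDotSize: 24))  // 0.25 m: bigger dot, closer
```

Repeating an estimate like this across thousands of dots yields the depth map that gets stitched into the 3-D model of the face.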

“You’re seeing the contours of the head — it’s not just the front of the face as you’re typically thinking about,” said Sy Choudhury, a senior director of product security for Qualcomm.

Because of the uniqueness of a person’s head shape, the likelihood of bypassing facial recognition with the wrong face is about 1 in a million, he added. That compares with a false acceptance rate of roughly 1 in 100 for earlier facial recognition systems, which offered far weaker security.
Older facial recognition systems worked by simply using the camera to take a photo of yourself and comparing that with an image that was stored on the device. All a thief would need to do to fool the system was hold a photo of your face in front of the camera — which some people already did with Samsung’s facial-recognition feature.
There are, however, limitations to infrared-scanning technologies. For example, objects that you wear, like a hat or a scarf, might throw off the camera, according to Qualcomm. In addition, experts said infrared light can get drowned out by bright sunlight outdoors, so face scanning might work less reliably on the beach.
It remains to be seen how exactly face scanning will work in the next iPhone. But Apple is well acquainted with depth-sensing camera technologies. In 2013, the iPhone maker acquired PrimeSense, a company that developed sensors for Microsoft’s Kinect, a depth-sensing camera system that let Xbox players control games using body movements. Analysts expect some rendition of PrimeSense’s technology to appear in future iPhones.

Augmented Reality

Depth-sensing cameras may be crucial to enhancing augmented reality, a jargony industry term that probably makes your eyes glaze over. But bear with me for one moment: Augmented reality will have major implications for future mobile apps.
It’s no secret that Apple is bullish about augmented reality. In a recent financial earnings call, Timothy D. Cook, Apple’s chief executive, called augmented reality “big and profound,” with major implications for gaming, entertainment and business products. This fall, Apple will release iOS 11, its next mobile operating system that includes support for applications made with ARKit, a tool kit for app developers to easily create augmented-reality applications.

ARKit uses a combination of the iPhone’s camera and motion sensors, including the accelerometer and gyroscope, to let people lay digital objects on top of the real world and interact with them with precise movements.
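To give a sense of how little code that involves, here is a minimal sketch of the basic ARKit flow on iOS 11: start a world-tracking session with horizontal plane detection, then anchor a virtual object wherever the user taps. The class and handler names beyond Apple’s ARKit APIs are illustrative, not from any shipping app.

```swift
import UIKit
import ARKit

// Minimal ARKit sketch: fuse camera frames with accelerometer and
// gyroscope readings, detect horizontal planes, and anchor a virtual
// object (say, a piece of furniture) where the user taps.
class ARDemoViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(handleTap(_:))))
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking combines the camera with the motion sensors;
        // ARKit on iOS 11 can detect horizontal planes only.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        // Cast a ray from the tapped pixel onto a detected plane and
        // pin an anchor at the point where it lands.
        let point = gesture.location(in: sceneView)
        guard let hit = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else { return }
        sceneView.session.add(anchor: ARAnchor(transform: hit.worldTransform))
    }
}
```

Rendering content for that anchor, the bed in Ikea’s case, is then handled by the view’s delegate.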
I got a demo of ARKit from Ikea, the furniture maker, through its forthcoming app, Ikea Place. I placed a virtual Ikea bed on the floor and was able to move around it and look underneath. This type of application would be useful for getting a sense of how an item looks and fits alongside other furniture in a space before placing an order.


“This is like a real application that real people can use to make real-life decisions,” said Michael Valdsgaard, the head of digital transformation at Ikea.
But the limitations of the Ikea Place app underscore what’s missing from ARKit. For placing virtual objects, the app can detect horizontal surfaces, like a tabletop or the ground, but it cannot yet detect walls.
Vertical planes like walls are trickier to detect because they are not as uniform as floors, with doors, windows and picture frames interrupting the surface. Depth-sensing cameras would make wall detection much easier for future iPhones, said Blair MacIntyre, a research scientist working on augmented reality for Mozilla, the organization that makes the Firefox web browser.
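That horizontal-only limitation is visible directly in the API: every surface ARKit finds is delivered to the app as an ARPlaneAnchor, and on iOS 11 each one is horizontally aligned. A small sketch, reusing the hypothetical sceneView from the example above:

```swift
import ARKit
import SceneKit

// Each surface ARKit detects arrives as an ARPlaneAnchor with an
// estimated extent. On iOS 11 the alignment is always horizontal;
// vertical planes such as walls are not yet reported.
class PlaneLogger: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Found a horizontal plane roughly \(plane.extent.x) m by \(plane.extent.z) m")
    }
}

// Usage: keep a strong reference to the logger, then
//   sceneView.delegate = planeLogger
```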
All the tech giants are betting big on augmented reality. For years, Microsoft has been developing HoloLens, an augmented-reality headset. In April, Facebook announced Camera Effects Platform, an environment for software developers to build augmented-reality apps for Facebook. This week, Google unveiled ARCore, an augmented-reality tool kit for Android devices, in response to Apple’s ARKit.

Mr. MacIntyre said augmented reality would have huge potential once the technology matured. He envisioned people taking a tour of a natural history museum, pointing their smartphone cameras at a fossil exhibit to bring a dinosaur back to life.
But he said that augmented reality on smartphones was a stopgap on the way to the inevitable: wearing data in front of your face at all times through some kind of headset.

“If you look at science fiction, a lot of it has this characteristic of being always on and serendipitous,” he said. “You get a lot closer to that when you get a head-mounted display.”
Until that happens, smartphones are about to become much smarter.
