People, People, People, People, People…


iPhone 12 Pro lets people who are blind ‘see’ others around them
It’s probably going to be a very long time before I get to try this out unless I happen to hit a really good sale on Steve needs a new phone day, but it sounds like it has the potential to be quite useful. I don’t know how often I would use it in regular life, but since this is 2020 and regular life isn’t really that much of a thing anymore, it could very well make some of this physically distant navigation a little simpler.

The technology makes use of the new lidar scanner built into the camera array of the iPhone 12 Pro and 12 Pro Max. It's also on the newest iPad Pro and is likely to come to other devices in the future. The scanner itself is a tiny black dot near the camera lens on the back of the new, highest-end iPhones.
People Detection won't work on older iPhones, the iPhone 12, the 12 mini, or even the new iPad Air. None of those devices has a lidar scanner, which is essential for the people-sensing technology.

People Detection uses Apple's ARKit People Occlusion feature to detect whether someone is in the camera's field of view and to estimate how far away that person is. The lidar scanner makes the estimate more accurate: it sends out a short burst of light and measures how long the light takes to bounce back to the scanner. The new feature doesn't work in the dark or in low-light environments.
All of the sensing happens in real time to give feedback on how far away a person is from the iPhone 12 Pro user. 
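
For the developer-curious, here's roughly what combining those two pieces looks like with Apple's public ARKit APIs. To be clear, this is just my sketch of the general technique; Apple's actual People Detection code lives in the Magnifier app and isn't public.

```swift
import ARKit

// A minimal sketch, not Apple's implementation: combines ARKit's
// People Occlusion segmentation with lidar scene depth to estimate
// the distance to the nearest person in view.
final class PeopleDistanceEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // People Occlusion: a per-pixel mask of which pixels are people.
        config.frameSemantics.insert(.personSegmentationWithDepth)
        // Lidar time-of-flight depth, only on lidar-equipped devices.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let mask = frame.segmentationBuffer,          // person mask
              let depth = frame.sceneDepth?.depthMap else { return }
        // Both buffers are low-resolution maps over the camera image.
        // A real implementation would walk the mask and, for every
        // "person" pixel, read the matching Float32 depth in meters,
        // keeping the minimum as the distance to the nearest person.
        _ = (mask, depth)
    }
}
```
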
The user gets feedback from People Detection in four possible ways, which can be used in any combination, and all of them can be customized in settings. One way to get information about a person's proximity is an audible readout: the phone says aloud "15, 14, 13" and so on as the distance in feet shrinks. For people who choose metric, it gives the distance in half meters.
iPhone 12 Pro users can also set a threshold distance with two distinctly different audio tones: one for when people are outside that distance, another for when people are closer to the user. The default threshold setting is 6 feet, or 2 meters.
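
The threshold logic itself is about as simple as it sounds. Here's a toy version using the 6 ft / 2 m default; the two "tone" functions are hypothetical stand-ins that just trigger built-in system sounds (the IDs are arbitrary picks for illustration, not what Apple actually plays):

```swift
import AudioToolbox

let thresholdMeters = 2.0  // the article's default: 6 feet, or 2 meters

// Hypothetical stand-ins for the real feature's audio cues; these play
// arbitrary built-in system sounds so the sketch is self-contained.
func playNearTone() { AudioServicesPlaySystemSound(1057) }
func playFarTone()  { AudioServicesPlaySystemSound(1054) }

// Pick a tone based on which side of the threshold the person is on.
func announce(distanceMeters: Double) {
    if distanceMeters <= thresholdMeters {
        playNearTone()  // person is within the threshold distance
    } else {
        playFarTone()   // person is beyond the threshold distance
    }
}
```
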

The third type of alert is through haptic feedback. The farther away a person is, the lower and slower the physical pulsing of the haptics. The closer the person gets, the faster the haptics buzz. Currently, the haptics are only through the phone, not through the Apple Watch. 
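A toy version of that pacing might map distance onto a pulse interval, something like this (again my own sketch, not Apple's code):

```swift
import UIKit

// Sketch of distance-paced haptics: the closer the person, the
// shorter the interval between pulses.
final class HapticPacer {
    private let generator = UIImpactFeedbackGenerator(style: .medium)
    private var timer: Timer?

    func update(distanceMeters: Double) {
        timer?.invalidate()
        // Map roughly 0-4 m onto a 0.1-1.0 s interval: farther = slower.
        let interval = min(1.0, max(0.1, distanceMeters / 4.0))
        timer = Timer.scheduledTimer(withTimeInterval: interval,
                                     repeats: true) { [weak self] _ in
            self?.generator.impactOccurred()
        }
    }
}
```
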
There’s also the option to get a visual readout on the screen itself. In that case, it will say how far away the person is, and a dotted line will point out where the person is on the screen.

If you have a phone current enough to take advantage of this, you’ll get your first chance to try it out when iOS 14.2 is released.
