
iPhone owners: Apple's iOS 13 fakes FaceTime eye contact using AR

Apple is using augmented reality to fix the eye-contact problem that afflicts video chat.
Written by Liam Tung, Contributing Writer

Soon, FaceTime on the iPhone won't make it look as though you're avoiding eye contact. The latest beta of Apple's recently unveiled iOS 13 looks set to fix a quirk of FaceTime, and of video calls in general, that makes it appear the user isn't looking at the person they're speaking with.

Apple's fix arrives in iOS 13 as a new setting called FaceTime Attention Correction, which can be toggled on to correct the user's gaze when they're looking at the screen rather than at the camera at the top of the device.


Most users probably wouldn't notice the eye-contact problem, but in FaceTime both parties' apparent gaze could be improved simply by each user looking at the camera at the top of the device rather than at the screen. The catch, of course, is that when you look into the camera, you're no longer looking at the other FaceTimer on your screen.

Per The Verge, the new fix for the gaze issue was spotted by developer Mike Rundle, who notes Apple may use some "dark magic to move my gaze to seem like I'm staring at the camera and not at the screen". 

The difference can be seen in a FaceTime snap posted by tech enthusiast Will Sigmon.   

Another iOS 13 beta tester, Dave Schukin, suggests the attention correction feature uses ARKit to generate a depth map and position of the user's face in order to adjust their gaze. ARKit is Apple's augmented-reality software-development kit for iOS.
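Apple hasn't published how attention correction works internally, but the face-tracking data Schukin refers to is exposed through public ARKit APIs on TrueDepth-equipped devices. The snippet below is only a minimal sketch of reading that data (the face mesh and estimated gaze point); the actual image warping FaceTime applies is not public API, and the class and method names here are illustrative.

```swift
import ARKit

// Hypothetical sketch: read the ARKit face-tracking data (face mesh + gaze)
// that a gaze-correction feature could build on. The warping itself is not shown.
final class GazeTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // lookAtPoint estimates where the user is looking, in face-anchor coordinates.
            let gaze = faceAnchor.lookAtPoint
            // The per-vertex face mesh supplies the depth/shape information a
            // correction pass could use to re-render the eye region.
            let vertexCount = faceAnchor.geometry.vertices.count
            print("gaze: \(gaze), mesh vertices: \(vertexCount)")
        }
    }
}
```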

For now, the setting appears to be available only on 2018 devices such as the iPhone XS and the 2018 iPad Pro.
