Apple solves lack of eye contact in video calls with Attention Correction feature
The forthcoming update to Apple's operating system, iOS 13, will include a feature called FaceTime Attention Correction, which adjusts the image during a video call to simulate eye-to-eye contact, something video conferencing systems have been unable to replicate until now. The update is currently in beta testing, and some users have posted videos of themselves using the new adjustment system.
In the current beta, the feature appears to be available only on the iPhone XS and iPhone XS Max. It will reach the general public when iOS 13 officially goes live, likely sometime this autumn.
The feature uses the TrueDepth camera system, the same hardware that powers Animoji, Face ID unlocking, and the augmented reality effects found in FaceTime.
The new eye contact functionality is powered by Apple's ARKit framework.
It creates a 3D face map and depth map of the user, determines where the eyes are, and adjusts the image so the eyes appear to look directly into the camera.
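Apple has not published how Attention Correction is implemented, but ARKit does expose the face and eye data the article describes. The sketch below, which assumes a TrueDepth-equipped device, shows how an app can obtain the 3D face mesh and per-eye transforms from ARKit's face tracking; the gaze-adjustment step itself is hypothetical and only described in a comment.

```swift
import ARKit

// Sketch: reading face and eye data from ARKit face tracking, the same
// framework Attention Correction reportedly builds on. Requires a device
// with a TrueDepth camera (e.g. iPhone XS / XS Max).
final class EyeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is supported only on TrueDepth devices.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // ARKit supplies a 3D face mesh plus a transform for each eye.
            let faceMesh = faceAnchor.geometry           // vertices of the face map
            let leftEye  = faceAnchor.leftEyeTransform   // position/orientation of left eye
            let rightEye = faceAnchor.rightEyeTransform
            // Attention Correction would presumably use data like this to
            // warp the eye regions of the outgoing video frame so the gaze
            // appears directed at the camera (hypothetical step, not shown).
            _ = (faceMesh, leftEye, rightEye)
        }
    }
}
```

Because ARKit face tracking needs the TrueDepth sensor, a dependency like this would also explain why the beta limits the feature to the iPhone XS and XS Max.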