Nvidia’s streaming software now has an option to make it look like you’re making eye contact with the camera, even if you’re looking elsewhere in real life. Using AI, the “Eye Contact” feature added to Nvidia Broadcast 1.4 will replace your eyes with “simulated” eyes aligned with your camera – an effect that worked really well when we tested it ourselves, except for all the times it didn’t.
In an announcement post, the company writes that the feature is intended for “content creators who want to record themselves reading their notes or a script” without having to look directly into the camera. By pitching it as something you’d use for public-facing performances, rather than something you’d use socially, Nvidia sidesteps the dilemmas associated with this kind of technology. Is it rude to use AI to trick my mom into thinking I’m paying attention on our video call when I’m actually looking at my phone? Or to trick my boss into thinking I’m not writing an article on my other monitor during a meeting? (I’m going to say yes, since getting caught in either scenario would land me in hot water.)
Nvidia suggests that Eye Contact will try to match your simulated eyes to the color of your real eyes, and there’s “even a disconnect function in case you’re looking too far away”.
Here’s a demo alongside a raw stream so you can compare how my eyes actually move to how Nvidia’s software renders them:
Looking at the results I’ve gotten, I’m not a huge fan of Eye Contact – I think it just makes things look a little off. Part of it is the animated eye movement. While it’s really cool that it’s even possible, sometimes it seems like my eyes are darting around at superhuman speeds. There are also the odd, very distracting pop-ins you can see toward the end of the video.
There were certainly a few times when the feature got it right, and when it did, it was very impressive. Still, the misses were too frequent and too noticeable for me to use it in my next meeting (although theoretically I could).
Nvidia is labeling the feature as a beta and is seeking feedback from community members to help improve it. “There are millions of eye colors and light combinations,” the company says. “If you test it and find any issues, or just want to help us further develop this AI effect, send us a short video here; we would really appreciate that!”
Nvidia has leaned heavily on this kind of AI generation in recent years – a major selling point of its graphics cards is DLSS, a feature that uses machine learning to intelligently upscale images, adding detail that isn’t there when a video game is rendered at a lower (but easier to run) resolution. The latest version, DLSS 3, generates all-new frames and inserts them into your gameplay, much like Broadcast generates and adds a new pair of eyes to your face.
Broadcast also has other AI-powered features such as background replacement that works like a virtual green screen and the ability to clean up background noise that your microphone picks up.
This isn’t the first eye contact feature we’ve seen. Apple started testing a similar feature called “Attention Correction” for FaceTime in 2018. In current versions of iOS, it’s labeled “Eye Contact” in Settings > FaceTime. Microsoft also has a version of the feature in Windows 11 for neural processing unit devices.
Eye contact isn’t the only feature Nvidia has added to Broadcast version 1.4. The latest update also brings a vignette effect that Nvidia says is similar to Instagram’s and improves the Blur, Replace, and Remove Virtual Background effects. The update is currently available for download for anyone with an RTX graphics card.