If you’ve had the chance to check out the Galaxy S9/S9+ and took the AR Emoji feature of the selfie camera for a spin, there’s a good chance you were left disappointed by its often jittery and sometimes super-stiff take on Apple‘s Animoji. And if you’re anything like us, you probably flinched when you saw the overly simplified virtual avatar your phone created for you. All in all, AR Emoji feels like a beta version at this point, though it clearly isn’t.
Turns out that Samsung didn’t develop AR Emoji on its own. The tech giant licensed the technology behind the feature from Loom.ai, a company co-founded by CGI veteran and Oscar winner Kiran Bhat, who developed Industrial Light & Magic’s facial expression-capture VFX technology used in ultra-popular flicks like The Avengers, Pirates of the Caribbean, Teenage Mutant Ninja Turtles, Warcraft, Star Wars: Episode VII, and Rogue One: A Star Wars Story. The near-lifelike CGI appearance of Hulk, Davy Jones, and Grand Moff Tarkin in the respective movies can be traced directly back to Mr. Bhat and his Oscar-winning labor. As you might imagine, such accurate CGI requires extensive computational resources; remember that for later.
After conquering the CGI summits of Hollywood, Kiran Bhat co-founded Loom.ai with Mahesh Ramasubramanian. The company’s goal is simple enough to summarize in a single sentence: helping regular Joes create a super-accurate virtual avatar from a simple selfie, without spending hours upon hours rendering a short video.
“We’ve simplified it into a single photograph,” said Bhat. However, “people’s attention span is low. People lose it after 5 seconds,” Bhat added.
Samsung, having been handed the framework, was now calling the shots, and it decided that people would prefer speed over better-looking avatars, a polarizing call that many of us wouldn’t necessarily agree with. AR Emoji “use a 2D tracker provided from Samsung on how the face moves, which is what gets fed into the SDK,” as Bhat points out. Sure, speed is crucial, but users could have been given the option to pick a basic or a more advanced avatar upon tapping the “Create My Emoji” button. And this is where the “gotcha” comes in: had the software been allowed to do its computational magic for longer, AR Emoji would almost certainly have turned out far better and more lifelike than what currently ships as a dedicated camera mode for your selfie snapper.
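To get an intuition for what “a 2D tracker feeding an SDK” means in practice, here is a minimal, purely illustrative Python sketch: 2D facial landmark positions from a camera frame are turned into a single blendshape weight that an avatar rig could consume. The landmark names, the neutral pose, and the weight formula are all our own assumptions for illustration; they are not Samsung’s or Loom.ai’s actual pipeline.

```python
# Hypothetical sketch: driving an avatar blendshape from 2D landmarks.
# Landmark names, neutral pose, and the scaling factor are illustrative
# assumptions, not the real Samsung/Loom.ai implementation.

def mouth_open_weight(landmarks, neutral, eps=1e-6):
    """Map the vertical lip gap, normalized by face height, to a 0..1
    weight for a hypothetical 'jaw_open' morph target."""
    def gap(pts):
        # Vertical distance between upper and lower lip landmarks.
        return abs(pts["lip_bottom"][1] - pts["lip_top"][1])

    def face_height(pts):
        # Brow-to-chin distance, used to normalize away camera distance.
        return abs(pts["chin"][1] - pts["brow"][1]) + eps

    current = gap(landmarks) / face_height(landmarks)
    rest = gap(neutral) / face_height(neutral)
    # Weight grows as the mouth opens past its resting gap; clamp to [0, 1].
    return max(0.0, min(1.0, (current - rest) * 4.0))

# Example: a resting face vs. a frame with an open mouth (x, y pixels).
neutral = {"lip_top": (50, 60), "lip_bottom": (50, 62),
           "chin": (50, 80), "brow": (50, 30)}
frame = {"lip_top": (50, 58), "lip_bottom": (50, 70),
         "chin": (50, 82), "brow": (50, 30)}

print(round(mouth_open_weight(frame, neutral), 3))
```

The key point this sketch makes is that a pure 2D tracker only ever sees flat pixel coordinates, so every expression cue must be inferred from distances and ratios in the image plane, which is exactly why it is cheaper and faster, and also less faithful, than a true depth-sensed 3D scan.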
In addition, unlike the iPhone X, which uses a plethora of sensors to scan the user’s face and build a detailed 3D map of it, the Galaxy S9 and S9+ are only capable of two-dimensional scans, which severely limits accuracy and is a big reason why AR Emoji looks so unimpressive.
Fortunately, there’s always room for improvement: since Samsung decides how Loom.ai’s framework gets used, future AR Emoji could be far superior, provided Samsung decides it’s worth it. Here’s hoping. Aside from adding depth sensors to the front of a future Galaxy device, the framework’s algorithms could be further optimized to produce better-looking avatars. So it’s likely that the second generation of AR Emoji will be much better than its precursor.
“The exact timing is up to Samsung,” said Mahesh Ramasubramanian, Loom.ai’s CEO, “but it is something you will see evolve.”