‘Evolving quickly’: Video selfies will enable beauty companies to offer more accurate smartphone-enabled skin analysis
By now, we are all familiar with skin analysis technology – upload a well-lit selfie from your phone and you can instantly find out your skin’s condition. The COVID-19 pandemic accelerated brands’ adoption of such tech.
However, Estonian beauty technology firm Haut.AI reckons that beauty consumers will soon move from analysing static pictures to analysing videos.
According to the firm’s own research, analysing a plain old selfie is subject to inaccuracies.
“What’s considered state of the art, or conventionally recognised as a method for analysing skin right now is a plain old picture. You’re usually asked to take a picture from the frontal position and some more advanced systems ask you to capture side pictures. However, we have actually been analysing data from these pictures and realised that there is a distortion effect,” said Anastasia Georgievskaya, co-founder and CEO of Haut.AI.
She reasons that because faces are not flat surfaces, features such as spots or acne lesions can end up distorted during analysis. While high-tech imaging hardware is widely used in clinical settings, the firm felt it was important to develop a more consumer-friendly way to analyse skin more accurately.
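To see the distortion Georgievskaya describes, consider simple projective foreshortening: a patch of skin on the curved side of the face is tilted away from the camera, so its apparent area in a flat photo shrinks with the cosine of the tilt angle. The sketch below is a hypothetical illustration of that geometry, not Haut.AI’s actual model.

```python
import math

def apparent_area(true_area_mm2: float, tilt_deg: float) -> float:
    """Projected area of a small, flat skin patch in a 2D photo.

    A patch whose surface normal is tilted `tilt_deg` degrees away from
    the camera axis is foreshortened by the cosine of that angle.
    """
    return true_area_mm2 * math.cos(math.radians(tilt_deg))

# A 10 mm^2 lesion seen head-on vs. on the curved side of the cheek:
print(apparent_area(10.0, 0))   # 10.0 mm^2, frontal view, no distortion
print(apparent_area(10.0, 60))  # ~5.0 mm^2, appears half its true size
```

The same lesion can thus look half its real size in a frontal selfie, which is why measurements taken from a single flat picture can mislead.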
“We want to develop a way where you can reproduce clinical evaluation using your phone. That’s our greatest aspiration, to have high accuracy using the sensors on your phone. Not forgetting, in the mobile world, everyone wants to do things very fast,” said Georgievskaya.
To meet these requirements, the company has developed technology to analyse a video selfie.
“The short video will help us make a 3D model and analyse it not as a plain picture, but as a 3D model. And it would be fully available on your mobile phone. When you have a short video, you can have more of the reference points, which makes the predictions more refined, because you have not only one image,” Georgievskaya explained.
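Her point about extra reference points can be seen in a toy example. The code below is a hypothetical sketch, assuming a generic noisy per-frame scoring model rather than Haut.AI’s 3D pipeline: averaging the scores from many video frames yields a steadier estimate than any single photo.

```python
import random
from statistics import mean, stdev

random.seed(0)

def analyse_frame(true_score: float, noise_sd: float = 5.0) -> float:
    """Stand-in for any single-image skin-scoring model (score 0-100).

    Real per-image estimates are noisy; modelled here as Gaussian noise.
    """
    return true_score + random.gauss(0, noise_sd)

true_score = 72.0
single_photo = analyse_frame(true_score)                 # one static selfie
frames = [analyse_frame(true_score) for _ in range(30)]  # ~1 s of video
video_estimate = mean(frames)

print(f"single photo estimate:   {single_photo:.1f}")
print(f"30-frame video estimate: {video_estimate:.1f} "
      f"(spread across frames: {stdev(frames):.1f})")
```

Averaging n independent estimates shrinks random error by roughly a factor of the square root of n; a real 3D reconstruction goes further by also correcting the projection geometry, but the intuition that more views yield a more refined result is the same.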
She continued that the firm is set to roll out this technology soon and is also working to apply video selfie analysis to hair. “The way hair moves and behaves [in movement] are also important characteristics to determine hair quality.”
Georgievskaya added: “We're really into holistic skincare. And our big idea is to enable video-based analysis for the whole body like scanning your nails and hands. And we want to make it as a video because of its high accuracy, and also because it's more engaging for the consumer.”
She emphasised it is becoming imperative for the technology to evolve quickly as new developments like the metaverse influence the beauty industry.
“For instance, you can think of the shop of the future, where you can enter the shop, scan your face, and the system will navigate you to where exactly to pick up products, which were recommended to you,” said Georgievskaya.
Such technology would move quickly in Asia, which Georgievskaya considers one of the trendsetters of the beauty industry, where “all the new technologies will first appear.”
Incidentally, omnichannel beauty retailer Sephora has recently unveiled its first Store of the Future in Singapore. It was developed as an experiential shopping journey that incorporates technology details and digital touchpoints.
Shoppers can get skincare recommendations from Skincredible, a dermatologist-grade skin analysis app, as well as use mobile checkouts and scan items to access over a million consumer reviews.
Sephora has said it plans to open the next similar concept store in Shanghai next year.
“We see the shopping experience actually starting to change, influenced of course by the trend of the metaverse. We are happy to hear Sephora is launching such interactive experiences [at its new store in Singapore],” said Georgievskaya.
To learn more about Sephora's latest ‘store of the future’, hear straight from Sephora Asia president Alia Gogi in our latest webinar on beauty 4.0. Register to watch our Beauty 4.0 – Tech, Tools and Future Trends webinar on-demand here.