Google unveiled a virtual try-on feature earlier this week, powered by generative AI. The new feature displays how clothing appears on a variety of body types, and Google says it will let shoppers refine products until they find the ideal one.
This is made possible by machine learning and new visual matching algorithms that let users fine-tune inputs such as color, pattern, and style. The main benefit is that users can browse selections from stores all across the internet.
Thanks to the virtual try-on tool, customers can now see how a piece of clothing drapes, stretches, clings, folds, and forms wrinkles and shadows on a variety of models. The tool will also help customers find complementary apparel items in different sizes, styles, or colors.
In its blog post, Google said, “We chose persons ranging in sizes XXS-4XL, representing varied skin tones (using the Monk Skin Tone Scale as a guide), body forms, ethnicities, and hair types.”
To build the virtual try-on feature, Google used multiple sets of photos of more than 80 models in a variety of poses, representing a range of sizes, skin tones, body shapes, and ethnicities. From these models, the AI-powered tool learned how to match the shape of a given garment to different postures, producing accurate images of the subject from all angles.
According to Google, the feature will initially be available only for women’s clothing from retailers such as H&M, Everlane, and Anthropologie. It will eventually expand to men’s clothing, mostly shirts. Over time, the tool is expected to become more precise and accurate.