As online shopping has soared because of the pandemic, retailers have introduced new ways to try on clothes virtually, and consumers are buying. But as virtual dressing room tech improves, customers might have more to worry about than the fit of a shirt.
Ph.D. students Kedan Li, Jeffrey Zhang, and Min Jin Chong created a virtual dressing room program that uses “deep learning” and artificial intelligence. It allows clothing to “lie” more accurately on a body and look more realistic on a computer screen than ever before.
Revery, as the program is called, analyzes and processes over a million different items of clothing each week, a rate that no other fashion app or program has ever been able to achieve, according to TechCrunch.
Seeing how an article of clothing will look on your body is not only practical; it also taps into a customer’s vanity – with successful results. Clothing companies that have started using Revery have reported sales increases of up to 380%.
Previous digital dressing room programs relied on 3D modeling or simply layered clothing images over a stranger’s picture, similar to children’s online dress-up games.
Revery is different in that it can look up the SKU for just about any item of clothing on the market, and shoppers can customize their avatar’s skin tone, hairstyle, and pose to better resemble themselves. One downside: Revery’s developers have yet to offer avatars that reflect different body types, something other companies have offered for a long time.
Some fashion brands have started encouraging customers to do full-body scans using tech they already have at home, such as an Xbox Kinect or even just a smartphone.
So before you do your next wardrobe reboot, consider looking at online brands that offer free returns – then buying in different sizes and sending back what doesn’t fit. It seems the safest route if you’re concerned about leaving scantily clad photos behind in an online dressing room.