Google Shopping has launched an AI try-on tool for fashion, allowing shoppers to see how clothes move on a model before they buy.
The virtual tool takes an image of a garment and generates a photorealistic image showing how it looks when worn by a person, including how it drapes, folds, stretches and moves.
To help users find the best match, the tool can show how the item will look on a range of models with different body sizes and skin tones, as well as in different poses.
The feature is currently available on Google Shopping for clothes from brands including Anthropologie, LOFT, H&M and Everlane. Google plans to roll it out to more brands and says it will become more precise over time.