AIM Media House

Can AI Solve the Retail Returns Problem?

Virtual try-on has failed for a decade. A new wave of AI startups thinks it knows why, and believes generative AI has finally made it work.

Retailers have a name for product returns: "silent killers." In 2025, US retailers are projected to process $849.9 billion in returned merchandise, representing 15.8% of annual retail sales, according to the National Retail Federation and Happy Returns.

For online sales, the figure is 19.3%. Most returned items never make it back to the shelves and often cost more to process than the value of the refund itself.

For apparel specifically, the problem is structural. Ill-fitting garments are consistently cited as the primary reason for returns. Online shoppers cannot touch fabric, check drape, or confirm fit before buying.

Gen Z shoppers between 18 and 30 averaged 7.7 online returns in the past 12 months, more than any other generation, according to the same NRF report.

Virtual try-on technology has been attempting to solve this problem for more than a decade. It has not succeeded. Now, a new wave of AI startups believes it finally can.

Why the Previous Wave Failed

Retailers and technology companies have been launching virtual try-on pilots since the early 2010s. Zara, ASOS, H&M, and others each deployed versions of the technology. Most of them were discontinued.

The failures had a common architecture. Early tools required 3D representations of each garment, which, at companies like Farfetch, cost hundreds of dollars per item to produce once hardware, labor, and post-production were included.

Scaling across thousands of Stock Keeping Units (SKUs) was financially impractical. The tools that did reach consumers were slow and the results looked artificial enough that shoppers could not trust what they were seeing.

There was also a behavioral problem that technology alone could not address. Using early virtual try-on tools required shoppers to open a camera, stand in front of it, and actively engage with a process that felt performative.

Most fashion browsing happens in fragmented moments: on commutes, between tasks. Shoppers were not in a position to perform. The result was a pattern that repeated across every major pilot: high initial engagement followed by abandonment.

The structural economics have shifted. Generative AI has eliminated the need for handcrafted 3D assets. Modern systems can interpret standard product photography and transform it into a virtual try-on experience, dramatically reducing the per-SKU cost of deployment, according to The Interline.

Modern tools can also render full outfits rather than single items, allowing shoppers to see how garments interact in proportion and layering, closer to how people actually think about getting dressed.

Ed Voyce, founder and CEO of AI startup Catches, told CNBC that the timing is now viable for a specific reason. "The reason it's solvable now in terms of timing is that you have to be able to run visuals for end users on bare metal in the cloud, cheaply enough to make a return on investment for brands," Voyce said.

Catches, backed by LVMH's Antoine Arnault and built on Nvidia's CUDA platform, has developed a virtual try-on platform it describes as offering "mirror-like realism." Unlike tools that Voyce says "just look pretty," the Catches platform incorporates the physics of fabric texture and how material interacts with a moving body.

The application went live last month on luxury brand Amiri's website. Catches projects a 10% increase in conversions and a 20 to 30 times return on investment for brand partners.

ASOS has partnered with deep-tech startup AIUTA, allowing customers to see garments on a range of body types, heights, and skin tones. ASOS recently reported a 160 basis point reduction in its returns rate, partly driven by the virtual try-on initiative, according to CNBC.

Shopify has integrated startup Genlook's AI virtual try-on tool into its commerce platform. Zara also introduced return fees for online orders and rolled out its own virtual try-on tool in December 2025.

Why Big Tech Has Not Cracked It

Amazon, Adobe, and Google have all built virtual try-on capabilities. Google's technology will even be accessible directly within product search results from April 30, 2026, according to Google Labs. Yet none of these large-scale deployments has produced the kind of return rate reductions the industry is looking for at scale.

Big tech builds for broad deployment: tools designed to work across millions of SKUs and thousands of brands simultaneously.

Solving the returns problem requires the opposite: deep, brand-specific calibration of fabric physics, body type representation, regional sizing differences, and a conversational tone.

Catches went live specifically on luxury brand Amiri because high-ticket apparel demands a different level of fabric simulation precision.

ASOS partnered with AIUTA and trained its tool specifically on diverse body types, heights, and skin tones. That depth of integration is not how Amazon or Google are built to operate.

The incentive structure also diverges. Google's virtual try-on in search drives clicks to retailer websites. Amazon's drives purchases within its own marketplace. Neither gets paid when a return does not happen.

The startups building for ASOS, Amiri, and Shopify merchants are commercially aligned with the outcome that matters: a reduction in return rates. That alignment is producing the brand-specific solutions that are beginning to move the needle.

Why Shoppers Still Do Not Use It

The adoption gap, according to behavioral research, is a trust problem. A survey analysis cited by Wearfits, a virtual try-on technology company, found that 77% of online shoppers say they want virtual try-on features.

But actual usage rates across deployed implementations remain far lower than that stated interest. The gap persists even when the technology is technically accurate.

According to the same analysis, 42% of virtual try-on users express hesitation about sharing personal measurement and body data with retailers.

A systematic review of 69 VTO research studies, cited by Wearfits, found that personalized scanned avatars, while effective at enhancing engagement, raise significant privacy concerns that directly reduce adoption.

Women express higher levels of privacy concern and body image anxiety when interacting with synthetic representations of themselves than men do, a pattern that directly affects how the feature converts across different shopper segments.

The technology can also produce what researchers describe as an uncanny valley effect, where an avatar looks close enough to the shopper to feel familiar but slightly off in movement or proportion, generating discomfort rather than confidence. Shoppers in this state focus on the flaws of the avatar rather than the fit of the garment.

Wearfits notes that retailers who disclose their use of AI-generated imagery have seen higher trust from shoppers rather than lower, the opposite of what most brands assume.

Regulatory pressure is accelerating this direction. Article 50 of the EU Artificial Intelligence Act requires AI-generated content to be marked in a machine-readable and detectable format, with full enforceability from August 2026.

Simeon Siegel, Senior Managing Director at Guggenheim, told CNBC that quantifying the benefits of virtual try-on remains difficult even where results are clear.

"There are certainly companies that have absolutely seen benefits, figuring out how to quantify them is more difficult," Siegel said. He cautioned that AI is not a fix-all, saying, "What you sell is always going to be more important than how you sell."