Fashion, at its core, is a social experience, with people sharing their favorite looks and trying to replicate the outfits worn by friends, celebrities and style icons. But often, it’s hard to find the right item to complete the look because consumers don’t know the right terms to use in their searches.
As artificial intelligence (AI) has improved, consumers no longer have to find the perfect way to describe that particular blazer they once saw – many retailers are now embracing visual search, using technology to identify relevant items that a buyer might be looking for.
Farfetch, for example, has been offering this feature since 2019 as part of a partnership with Syte, a visual AI software-as-a-service (SaaS) company. Using the luxury marketplace’s app, customers can take and upload photos through the search bar, with the software identifying the relevant items pictured. The app then offers several interactive tags, such as “jacket,” “dress” and “bow tie,” which can be used to search for similar merchandise on Farfetch.
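The flow described above – upload a photo, detect clothing categories, then use those tags to surface similar merchandise – can be sketched roughly as follows. This is a minimal illustration only: the tag names, catalog structure and function names are hypothetical and do not reflect Syte’s or Farfetch’s actual systems or APIs.

```python
# Hypothetical sketch of a tag-based visual search flow:
# a detector returns clothing-category tags for an uploaded photo,
# and each tag is used to look up similar items in a catalog.

CATALOG = [
    {"name": "Velvet Blazer", "category": "jacket", "price": 450},
    {"name": "Silk Bow Tie", "category": "bow tie", "price": 95},
    {"name": "Wrap Dress", "category": "dress", "price": 320},
    {"name": "Denim Jacket", "category": "jacket", "price": 180},
]

def detect_tags(photo_bytes: bytes) -> list[str]:
    """Stand-in for the visual AI step: a real system would run
    object detection on the image and return the categories it finds."""
    return ["jacket", "bow tie"]

def similar_items(tag: str, catalog: list[dict]) -> list[dict]:
    """Return catalog items matching the selected tag."""
    return [item for item in catalog if item["category"] == tag]

tags = detect_tags(b"...uploaded photo...")
for tag in tags:
    matches = [item["name"] for item in similar_items(tag, CATALOG)]
    print(tag, "->", matches)
```

In a production system the detection step would be a trained computer-vision model and the lookup would use visual similarity rather than exact category matching, but the tag-to-catalog mapping is the core of the experience the article describes.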
Lihi Pinto Fryman, co-founder and chief marketing officer of Syte, told PYMNTS in a 2019 interview that visual search makes the shopping experience smoother and more user-friendly by removing uncertainty and allowing brands to align results as closely as possible with a customer’s vision. “[Users do not] have to explain what they’re looking for, or what made them fall in love with this item,” Fryman said. “All they have to do is upload an image and find inspiration.”
Read more: AI-powered visual shopping experiences for millennials
Fryman noted that Farfetch and other retailers that use visual search may not be able to deliver the exact same product, “because it may be something an influencer is wearing and [it] costs $5,000,” but the technology gives the merchant the ability to display similar items. “It really connects the [product] inspiration from social with the retailer’s collection,” she added.
Startups like Syte now face search giant Google, which earlier this year paired its Google Lens image recognition technology with Google’s Shopping Graph database to deliver visual search capabilities. Using the Google Lens app, consumers can select items in photos to view details and purchase similar products. The company has also revamped its visual search results to create richer on-screen interactions, delivering more comprehensive results and a smoother shopping journey.
Also see: Google Shopping Overhaul Expands Opportunities for Merchants to Turn Search Into Sales
Matt Madrigal, vice president and general manager of merchant shopping at Google, told Karen Webster ahead of the feature’s September launch that the goal is to create “a visual product feed … interspersed with video content and style guides.”
“With this change … we’re making it easier to shop your favorite brands and discover new brands, and at the same time, helping more brands get discovered,” he said.
Google Cloud also recently rolled out a new retail search product based on decades of Google’s search experience with the goal of helping retailers improve the consumer experience with personalized results and relevant promotions.
Related news: Google Cloud Retail Search Aims to Solve $300 Billion Abandonment Problem