For years now, startups and retailers have had their sights set on making fashion items easily and reliably searchable by image recognition. The pitch is always the same: You see someone walking down the street carrying an excellent bag, you snap a picture of them in passing, and in seconds you’ll be able to find out where to buy that same purse — or, at the very least, something similar.
But though many tech companies have entered the fray, no single service has emerged as the winner. On Tuesday, Google threw its hat into the ring with the launch of a feature called “Style Match,” which builds on its Google Lens technology to allow users to take a photo of a home decor or fashion item and find other products like it. Google introduced Lens in May 2017 as a way for Android users to get information about the world around them via photo, identifying things like buildings and works of art.
Incidentally, Asos also has a visual search tool called Style Match, which it launched in August 2017. In March, the British e-tailer rolled out the feature to iOS and Android users around the world. (Rebecca Jennings, Racked’s resident Asos stan, gives her full review here.)
Screenshop, an app that analyzes screenshots and surfaces shoppable items similar to the subjects’ outfits, got off to a buzzy start in 2017 thanks to an ultra-famous adviser, Kim Kardashian. Earlier that year, Pinterest launched Pinterest Lens, a feature that takes photos of real-world objects, including clothing, and offers up similar images.
Not all image-search-for-fashion startups have made it, however. Asap54 got a lot of attention when it launched in 2013 before expanding into beauty three years later, but today its website offers up a 404 error.
These tools are certainly changing the way we shop, and they solve a very real problem: trying and failing to describe in words a jumpsuit you saw once so Google can find it. But even as they expand our aesthetic worlds, image recognition features can also contribute to what we at Racked like to call “the Great Flattening” — the notion that as we become more connected to one another’s style via apps like Instagram, more brand-aware, and more likely to take our style cues from technology, what we wear is becoming increasingly standardized. Brands, too, are guilty of this: What they offer, and how they market it, is one big blur.
But then, perhaps image recognition for fashion isn’t as dire as all that. Sometimes you really need to figure out where to buy that jumpsuit.