Shane Hickey 

Fashion retailers eye up image-recognition apps for smartphones

Software from firms such as Snap Fashion and Style-Eyes lets shoppers upload photos of clothing and link to relevant stores
  
  

Jenny Griffiths, Snap Fashion
Jenny Griffiths began developing Snap Fashion while at the University of Bristol Photograph: PR

When Cara Delevingne told Vogue that one of her favourite apps was the newly released Asap54 – which uses visual-recognition technology to identify clothes – it was a PR shot in the arm for the new player in an area where competition to become the definitive technology is rife.

Snap Fashion in the UK, Style-Eyes from Ireland and Slyce in Canada are just a few of the companies that are using elaborate software to allow shoppers to take a picture of clothing on their smartphone and then be linked to a retailer where they can buy that piece or something similar.

Image-recognition software, where algorithms are used to identify and match one image with another, has been used in security and marketing for a number of years, and the move into fashion is seen as one of the first steps in the widespread commercialisation of the technology.

Jenny Griffiths started the groundwork for what would become Snap Fashion during her degree at the University of Bristol, from which she graduated in 2009. Launched during London fashion week in September 2012, the app notched up 250,000 users in its first year.

When a picture is taken of a shoe or a piece of clothing, the software – on a desktop or mobile – analyses it by looking at the colour, pattern and shape, and tries to find a match on an existing database of products from 170 retailers ranging from New Look to Harrods. Another app called ColourPop matches products solely by colour.

"I think it is really fascinating but also someone's job to push technology into consumer spaces because we are wasting the tools [smartphones] we are carrying around every day," said Griffiths. Snap Fashion has created an app for Westfield shopping centres that lets shoppers upload a picture and then receive suggestions from the ranges in stock.

Mark Hughes, who has worked in image recognition for 10 years, is a co-founder of Dublin-based Style-Eyes, which has attracted about 65,000 users since it launched last year. Users take a picture of what they want to find and draw an outline round it; the image is then matched against a database of 1.5m pieces of clothing, shoes and handbags, and the app suggests where something similar can be bought across 600 shops.

"If someone matches a very expensive dress, the chances are they will not be able to afford it but what we intend to do is bring back similar ones that might be in the price range so you can filter all the results with what your intended range is – [for example] 'Find me something like that that is less than £200'," said Hughes.

Both firms make commission when items are bought via their sites – Snap Fashion makes 5-15% and Style-Eyes 5-12%. Style-Eyes says it receives up to 15 pence each time a user clicks through to a retailer. Both say just over one third of searches result in users going through to a retailer's website.

Although there are numerous players in the field, one industry commentator from a fashion retailer said no single company has managed to reach the tipping point that will make it the standard technology for consumers and brands to use. Technological glitches can happen when the various programs cannot recognise clothing because of a lack of distinguishing features, and no one yet claims a 100% success rate.

"Fashion, from a technical perspective, is very difficult for image recognition mainly because clothes change because of what way someone is sitting or what direction you are taking it [the picture]. With standard image-recognition technology now, it works really well for static objects – the front of a box of cornflakes or the front of a big building – it never really changes, the structure remains the same. That is almost a solved problem but with fashion it is a lot more difficult," said Hughes.

Griffiths does not believe a 100% success rate is possible. "Sometimes you have that experience when you Google something and you say 'actually that is not what I am looking for' and you have to try a different search term. So just the nature of search is you are not going to get it right 100% of the time. It is all about getting the user to manage it when they get the wrong result to getting them to the right one," she said.

Cortexica, a London-based firm, uses its FindSimilar software to build image-recognition facilities for retailers such as Germany's Zalando, where shoppers upload images and are directed to matching items in stock.

Iain McCready, chief executive of Cortexica, said the rapid development of technology will enable clothes to be identified from video within two generations of iPhones. Eight of Cortexica's 20 staff are computer scientists who constantly update the software, which was initially developed at Imperial College London.

So far the software has largely been aimed at women's fashion – the largest part of the market – but menswear is expected to feature soon. Style-Eyes hopes to expand to the US in the summer, and Cortexica is already setting up there.

Griffiths says there is not a lot of difference between various visual search companies' products and expects one to emerge as the dominant technology in the future.

"I like to think we are close to the tipping point as an industry and I like to think we are leading. I think it will be one company that does [succeed] because I don't think consumers will be bothered downloading a load of visual search apps and test them out for themselves," she said.

How does it work?

Style-Eyes uses a technique called "machine learning", in which a computer is trained in much the same way as a human: shown thousands of labelled images of clothes, shoes and handbags, it learns to distinguish between them.
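As a rough illustration only, here is a minimal Python sketch of what such a training step might look like, assuming colour-histogram features and an off-the-shelf scikit-learn classifier – Style-Eyes has not published its actual features or model, and the training data below is random stand-in data.

# A minimal sketch of the "machine learning" step described above, assuming
# colour-histogram features and a scikit-learn classifier; the real system's
# features and model are not public, so treat this as illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

def colour_histogram(image, bins=8):
    """Summarise an RGB image (H x W x 3, values 0-255) as a flat colour histogram."""
    hist, _ = np.histogramdd(image.reshape(-1, 3),
                             bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    hist = hist.flatten()
    return hist / hist.sum()  # normalise so images of different sizes are comparable

# Stand-in training data: random images labelled as clothes / shoes / handbags.
rng = np.random.default_rng(0)
labels = ["clothes", "shoes", "handbags"] * 100
images = [rng.integers(0, 256, size=(64, 64, 3)) for _ in labels]
features = np.array([colour_histogram(img) for img in images])

# "Training the computer in the same way as a human": show it labelled examples.
model = LogisticRegression(max_iter=1000).fit(features, labels)

# Classify a new photo against the categories it has learned.
new_photo = rng.integers(0, 256, size=(64, 64, 3))
print(model.predict([colour_histogram(new_photo)])[0])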

A fingerprinting system, where a digital code is created for each item using visually distinctive characteristics such as colour, shape and pattern, is then used to categorise each one for search and comparison across a database of products.
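Again purely for illustration, a minimal sketch of that fingerprint-and-search step, using a colour histogram as a stand-in "digital code" and a simple nearest-neighbour comparison over a toy catalogue; the real fingerprints combine colour, shape and pattern in ways the firms have not disclosed, and the names and figures below are invented.

# A minimal sketch of fingerprinting and search, with a colour histogram standing
# in for the real "digital code" and a toy catalogue standing in for the 1.5m-item
# database mentioned above; everything here is illustrative.
import numpy as np

def fingerprint(image, bins=8):
    """Reduce an RGB image (H x W x 3, values 0-255) to a normalised colour histogram."""
    hist, _ = np.histogramdd(image.reshape(-1, 3),
                             bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    hist = hist.flatten()
    return hist / hist.sum()

def find_similar(query_image, catalogue, top_n=5):
    """Rank catalogue items (dicts with 'name' and 'image') by fingerprint distance."""
    q = fingerprint(query_image)
    scored = sorted((np.linalg.norm(q - fingerprint(item["image"])), item["name"])
                    for item in catalogue)
    return [name for _, name in scored[:top_n]]

# Toy catalogue of random "product" images.
rng = np.random.default_rng(1)
catalogue = [{"name": f"product-{i}", "image": rng.integers(0, 256, size=(64, 64, 3))}
             for i in range(1000)]
snap = rng.integers(0, 256, size=(64, 64, 3))   # the shopper's photo
print(find_similar(snap, catalogue))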

Algorithms that Snap Fashion's Jenny Griffiths created at the University of Bristol to identify colour, pattern and shape are still used today, although they now run 100 times faster.

 
