Researchers develop powerful fashion-optimized image search tool

January 7, 2016 | August Schiess, CSL

ADSC researchers have commercialized FashionMatch, a technology that allows users to instantly search for visually similar apparel items from a large database of images.

There is a growing trend toward personalized consumer experiences, and this includes an emerging demand for personalized fashion shopping and purchasing. Imagine finding an image of someone wearing an item of clothing, putting that photo in an app, and being instantly presented with dozens of images of garments with similar colors, patterns, and shapes. Each item is linked to a website where customers can purchase it. This highly customizable shopping experience is within reach.

 

FashionMatch allows users to instantly search for visually similar apparel items from a large database of images.

Advanced Digital Sciences Center (ADSC) researchers Vassilios Vonikakis and Siddhanta Chakrabarty and their colleagues have commercialized FashionMatch, a fashion-optimized tool that analyzes the apparel in an image to instantly find visually similar items in a large database, along with where to purchase them.
 
The technology uses an algorithm that can identify humans, analyze posture, estimate the proportions of a garment relative to the body, and separate the garment from the background, all without user intervention.
 
“The algorithm has mastered many computer vision challenges. Even with shadows and cluttered backgrounds, the algorithm can identify which pixels of the image correspond to the person, to the garment, and to the background,” said Vonikakis, an ADSC research scientist. “It can then identify the skeletal structure of the person, which makes it possible to tell the fashion-related proportions of the garment, like how long the sleeves are, the proportion of the neckline to the waist, general shape, and more.”  
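
FashionMatch’s code is not public, but the idea Vonikakis describes can be sketched in a few lines. A minimal illustration, assuming a pose estimator has already produced 2D joint coordinates (every keypoint name and value below is hypothetical):

```python
# Illustrative sketch only; FashionMatch's actual implementation is not
# disclosed in this article. Given skeletal keypoints from a pose estimator,
# fashion-related proportions reduce to ratios against torso height.
from math import dist

def garment_proportions(kp):
    """Estimate garment proportions from body keypoints given as (x, y) pixels."""
    torso = dist(kp["neck"], kp["hip"])         # reference length: torso height
    sleeve = dist(kp["shoulder"], kp["wrist"])  # span a full sleeve would cover
    return {
        # How long the sleeve is, relative to torso height.
        "sleeve_ratio": sleeve / torso,
        # Vertical drop from neckline to waist, relative to torso height.
        "neckline_to_waist": (kp["hip"][1] - kp["neck"][1]) / torso,
    }

# Hypothetical keypoints (pixel coordinates) for one detected person:
person = {"neck": (210, 120), "hip": (212, 340),
          "shoulder": (160, 140), "wrist": (150, 330)}
print(garment_proportions(person))  # {'sleeve_ratio': ~0.86, 'neckline_to_waist': ~1.0}
```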
 
The selections draw from a large database of images, and users can customize their results based on their preference for certain similarity attributes. For example, users can rate how important it is to them that selections match the garment’s color, pattern, or shape, and receive different results based on those selections. 
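
That preference weighting maps naturally onto a weighted sum of per-attribute distances. A hedged sketch, assuming each image has precomputed color, pattern, and shape feature vectors (the field names are placeholders, not FashionMatch’s API):

```python
# Preference-weighted retrieval sketch: rank catalog items by a weighted
# combination of per-attribute feature distances. Feature vectors here are
# random stand-ins for real color/pattern/shape descriptors.
import numpy as np

ATTRS = ("color", "pattern", "shape")

def rank_items(query, catalog, weights):
    """Return catalog items sorted by weighted distance to the query (best first)."""
    total = sum(weights[a] for a in ATTRS)
    def score(item):
        return sum(weights[a] / total * np.linalg.norm(query[a] - item[a])
                   for a in ATTRS)
    return sorted(catalog, key=score)

# A shopper who cares most about matching the pattern, a little about color:
rng = np.random.default_rng(0)
query = {a: rng.random(8) for a in ATTRS}                         # query garment
catalog = [{a: rng.random(8) for a in ATTRS} for _ in range(100)] # indexed items
weights = {"color": 0.3, "pattern": 0.6, "shape": 0.1}
top_matches = rank_items(query, catalog, weights)[:10]
```

Raising one weight relative to the others shifts the ranking toward that attribute, which is the behavior the article attributes to the user-facing ratings.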
 
The research behind this technology began at CSL, where a team of graduate students led by Bernard Ghanem, an ECE Ph.D. student at the time, worked in Professor Narendra Ahuja’s Computer Vision and Robotics lab. Ghanem handed the project, then called FashionLatte, to ADSC for commercialization.
 
With a US$285,000 grant from A*STAR, a Singapore agency that encourages research commercialization, eight engineers were hired and split into two teams to develop two versions of FashionMatch for the marketplace in parallel.
 
 
Vonikakis led one team in evolving FashionLatte, while the other worked on a similar algorithm developed by former ADSC research scientist Wang Gang. Vonikakis’ team focused on making the algorithm robust to real-life conditions, while Wang’s group focused on the business development side.
 
After the grant ended, Chakrabarty, an ADSC software engineer, joined Vonikakis’ team to make the technology more user-friendly for administrators. The two teams have already licensed FashionMatch to five companies, including Jet Solutions, IQnet, and Image Science, and are looking to sign more.
 
“We started with a good algorithm, and then we upgraded specifications to meet the needs of clients. They wanted the app to function on mobiles and ‘in the wild,’ which meant identifying the garment accurately even with complications like shadows and cluttered backgrounds,” said Vonikakis. “Our two groups delivered two complementary solutions to the growing demand for personalized and customizable fashion search and retrieval.”

This story was published January 7, 2016.