Research into robotics at Carlos III University of Madrid has led to promising innovations in the fashion and apparel business.
Research into robotics and automation at the Robotics Lab at Carlos III University of Madrid (UC3M) has led to important advances in mechatronics, and it is now proving useful in the fashion retail sector as well. Two startups nurtured by the Business Incubator at UC3M have developed innovations for the fashion and apparel business, drawing on technology from robotics. Dresscovery is a mobile app that can identify a product – a handbag, for example – just from a photo taken with a smartphone. Meanwhile, the Proximus app can locate consumers visiting a shopping mall in order to provide them with a better service. These two apps, relying on techniques such as sensor processing, artificial intelligence and machine learning, are thus the fruit of research that was initially focused on building highly complex robots.
The Dresscovery app was developed by Samsamia Technologies, a startup created by Miguel González-Fierro, a PhD student working on humanoid robotics at the Robotics Lab at UC3M, and Miguel Ángel Maldonado, a former UC3M student. The innovation is based on computer vision and artificial intelligence algorithms of the kind used in robots. Nevertheless, the way it works is quite simple in practical terms, stresses Miguel González-Fierro: “You take a photo of a handbag with a smartphone, you frame it and when you hit ‘search’ the system finds the most similar bag currently on sale.” The app, available for iOS and Android, is linked to a database of more than 15,000 handbags from around 300 different brands. “We plan to add new functions and also other items, such as shoes, trousers, coats, etc,” promises Miguel González-Fierro, who also points out that the great novelty of this system is that it goes well beyond traditional Internet searches. Search engines are currently based on words; the latest challenge is to track down information using images. “We’re using a recognition algorithm that we’ve created to pull a set of characteristics – colour, shape and texture – from the image and then compare them with a huge database,” he explains.
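The extract-features-then-compare approach González-Fierro describes can be sketched as a minimal content-based image search. The example below is a simplified illustration, not Samsamia’s actual algorithm: it reduces an image (represented here as a plain list of RGB pixels) to a colour histogram – just one of the three characteristics he mentions – and retrieves the most similar catalogue item by cosine similarity. The function names and the toy catalogue are invented for illustration.

```python
from math import sqrt

def colour_histogram(pixels, bins=4):
    """Quantise each RGB channel into `bins` buckets and count pixels.

    Returns a normalised histogram of length bins**3 – a compact
    colour 'signature' of the image.
    """
    hist = [0.0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = len(pixels) or 1
    return [h / total for h in hist]

def cosine_similarity(a, b):
    """Similarity between two histograms, 0.0 (unrelated) to 1.0 (identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def most_similar(query_pixels, catalogue):
    """Return the name of the catalogue item whose colour signature
    best matches the query image.

    catalogue: {product_name: pixel_list}
    """
    query_sig = colour_histogram(query_pixels)
    return max(
        catalogue,
        key=lambda name: cosine_similarity(query_sig, colour_histogram(catalogue[name])),
    )
```

A production system would precompute the catalogue signatures, add shape and texture descriptors, and use an approximate-nearest-neighbour index rather than a linear scan over 15,000 items, but the search principle is the same.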
Drawing on Big Data
Big Data processing underpins both the Dresscovery app and the Proximus system, which was also developed at the UC3M Robotics Lab. Created by beMee Technology, the system uses techniques from the field of robotics to analyse in real time the location and behaviour of consumers inside shops and large spaces via their mobile phones. With its small wireless beacons installed inside a shopping mall, Proximus can analyse customers’ movements and behaviour in order to pinpoint what they are interested in and so meet their needs more closely. “The idea is to improve people’s experience when they are shopping in large complexes, and also to optimise advertising systems, so that it will be easier to target personalised offers and recommend styles, colours, brands and so on in line with users’ tastes,” explains Jorge García Bueno, one of the beMee co-founders. The team is now introducing Bluetooth 4.0 technology into the positioning process to improve the system’s precision and the quality of the statistics it generates on user behaviour.
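Beacon-based positioning of this kind typically estimates how far a phone is from each beacon using the received signal strength (RSSI). The sketch below is a generic illustration of that idea, not beMee’s implementation: it applies the standard log-distance path-loss model and assigns the shopper to the zone of the apparently nearest beacon. The calibration values, beacon IDs and zone names are hypothetical.

```python
def estimate_distance(rssi, tx_power=-59, path_loss_exponent=2.0):
    """Estimate distance in metres from one beacon's RSSI reading.

    Uses the log-distance path-loss model:
        distance = 10 ** ((tx_power - rssi) / (10 * n))
    where tx_power is the calibrated RSSI at 1 m and n is the
    path-loss exponent (~2.0 in open space, higher indoors).
    """
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exponent))

def nearest_zone(readings, beacon_zones):
    """Map a phone's beacon readings to the mall zone it is closest to.

    readings:     {beacon_id: rssi_dBm}
    beacon_zones: {beacon_id: zone_name}
    """
    closest_beacon = min(readings, key=lambda b: estimate_distance(readings[b]))
    return beacon_zones[closest_beacon]
```

A stronger (less negative) RSSI maps to a shorter estimated distance, so a reading of −50 dBm places the phone nearer than one of −80 dBm. Real deployments smooth the noisy RSSI stream over time and combine several beacons (trilateration) rather than trusting a single instantaneous reading.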