
Practical technologies for Smart Shops

Unmanned shopping markets are completely self-service and self-checkout.

[Image: smart shelf]

The registering of purchased products and their payment is handled by distributed sensors, artificial intelligence, machine-learning algorithms and dedicated mobile applications. From the moment customers enter a smart shop, their route and the products they browse, pick up, place in their baskets and return to the shelves are monitored. What customers are looking for is a just-walk-out experience. Meeting this preference requires smart technologies that are unobtrusive, intuitive and extremely accurate. Balancing customer needs with cost-effective, high-performance technologies, whilst at all times staying within GDPR regulations, is one of the great challenges facing the partners in the EU-funded project MIMEX. MIMEX is approaching this through the thorough evaluation, development and fusion of state-of-the-art solutions using complementary approaches. In smart shopping spaces, technologies must be robust to line-of-sight occlusions, invariant to illumination changes and resilient to everyday activities such as restocking and shelf redressing; otherwise auto-purchasing mistakes could occur, inevitably leading to a decrease in shopper trust. Of equal importance is shop-manager acceptance: managers demand very high levels of technological robustness and performance, combined with cost-effectiveness and practicality. Absolute trust in product-movement monitoring and auto-purchasing technologies is imperative for tasks such as stock-supply scheduling and the detection of potential thefts.


In this first of a two-part article on technologies for smart stores, we provide a brief overview of the three main approaches: weight sensors, radio-based solutions and computer vision, followed by a more detailed discussion of specific computer vision solutions.

Weight sensors are often placed inside smart shelves to detect the presence, or absence, of the products placed upon them. They rely on strain-gauge load cells, which exploit resistance variations in conductive wires, or on pressure sensors, which exploit resistance variations in elastic materials doped with conductive particles (e.g. carbon nanotubes).
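
As a rough illustration of how such readings can be turned into shelf events, the sketch below compares consecutive load-cell values against the known per-unit weight of the product assigned to a shelf slot. The product names, weights and noise tolerance are hypothetical and are not taken from the MIMEX design.

```python
# Minimal sketch: inferring pick-up/put-back events from a shelf load cell.
# Product weights, slot assignments and the noise tolerance below are
# illustrative assumptions, not MIMEX parameters.

PRODUCT_WEIGHT_G = {"espresso_beans_250g": 265.0, "oat_milk_1l": 1060.0}
TOLERANCE_G = 15.0  # allowed deviation due to sensor noise and drift


def classify_weight_change(slot_product: str, previous_g: float, current_g: float):
    """Return ('pick_up' | 'put_back' | None, item_count) for one reading pair."""
    delta = current_g - previous_g
    unit = PRODUCT_WEIGHT_G[slot_product]
    units = round(abs(delta) / unit)  # round the change to whole product units
    if units == 0 or abs(abs(delta) - units * unit) > TOLERANCE_G:
        return None, 0  # noise, or a foreign item placed on the shelf
    return ("put_back" if delta > 0 else "pick_up"), units


# Example: the shelf became ~265 g lighter -> one pack of beans was picked up.
event, count = classify_weight_change("espresso_beans_250g", 3180.0, 2914.0)
print(event, count)  # pick_up 1
```
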
Radio-based solutions (e.g. RFID, NFC, UWB and Bluetooth) are commonly exploited in many retail scenarios, each offering different detection ranges and unit costs. In all cases, markers/tags are attached to products, and one or more readers placed on or around the smart shelves detect their movements.
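
For the radio-based case, the following minimal sketch shows one simple way consecutive tag inventories reported by a shelf antenna could be compared to flag products leaving or returning to a shelf; the tag identifiers are placeholders rather than any specific reader's output.

```python
# Minimal sketch: turning periodic RFID reader scans into shelf events.
# Real readers typically report a "tags currently in range" inventory,
# which is what the two sets below stand in for.

def diff_inventories(previous: set[str], current: set[str]):
    """Compare two consecutive tag inventories from one shelf antenna."""
    picked_up = previous - current   # tags that left the antenna's range
    put_back = current - previous    # tags that (re)appeared on the shelf
    return picked_up, put_back


prev_scan = {"tag-0001", "tag-0002", "tag-0003"}
curr_scan = {"tag-0001", "tag-0003", "tag-0007"}

gone, new = diff_inventories(prev_scan, curr_scan)
print(gone)  # {'tag-0002'} -> product likely taken by a shopper
print(new)   # {'tag-0007'} -> product placed (back) on the shelf
```
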
Computer vision technologies feature heavily in many smart-market solutions because they are unobtrusive and do not require shoppers or products to carry additional markers or hardware. They work by spotting and tracking the unique visual features/signatures of shop products, using cues such as visual appearance (e.g. clothing/packaging colours, textures, etc.) and physical dimensions. However, tracking products in this way requires that all merchandise in a shop’s inventory is first visually digitised into a database. This can be a potential bottleneck, as characterising and adding new products is inherently non-trivial. MIMEX will approach this issue by exploiting recent advances in machine learning that enable faster training and addition of new products to a shop’s e-inventory, reducing the product-training burden on shop owners and increasing MIMEX’s attractiveness.
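
One common way such a visual e-inventory can be organised is as a set of feature embeddings matched by nearest-neighbour search, so that enrolling a new product amounts to storing its embedding rather than retraining a whole model. The sketch below illustrates this idea with a placeholder embedding function; it is not the MIMEX implementation.

```python
# Minimal sketch: matching a product crop against a visual e-inventory by
# nearest-neighbour search over feature embeddings. The embedding function
# is a stand-in; in practice a pre-trained network would supply it.
import numpy as np

e_inventory: dict[str, np.ndarray] = {}  # product id -> unit-length embedding


def embed(image: np.ndarray) -> np.ndarray:
    """Placeholder for a learned visual-feature extractor."""
    vec = np.resize(image.astype(np.float32).ravel(), 128)
    return vec / (np.linalg.norm(vec) + 1e-9)


def enroll(product_id: str, reference_image: np.ndarray) -> None:
    """Adding a new product is just storing its embedding: no full retraining."""
    e_inventory[product_id] = embed(reference_image)


def identify(crop: np.ndarray, min_similarity: float = 0.8):
    """Return the best-matching product id, or None if nothing is close enough."""
    query = embed(crop)
    best_id, best_sim = None, min_similarity
    for product_id, ref in e_inventory.items():
        sim = float(query @ ref)  # cosine similarity of unit vectors
        if sim > best_sim:
            best_id, best_sim = product_id, sim
    return best_id


rng = np.random.default_rng(0)
shelf_photo = rng.random((64, 64, 3))
enroll("cereal_box_500g", shelf_photo)
print(identify(shelf_photo))  # 'cereal_box_500g'
```
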
In addition to tracking products, all shoppers also need to be followed as they move through the store, so that the system understands who picks up which products and places them in their bags or baskets. To achieve this, MIMEX is exploiting patented solutions from project partners that have proved robust even in very crowded spaces.

For both product and person tracking, a further issue delaying the widespread deployment of computer vision in smart stores is its demand for huge processing power. In MIMEX, recent developments in embedded and edge hardware are being exploited, so that powerful computer vision algorithms can now run both on the cameras themselves and very close to them on shelf-level edge units. This means that complex neural-network models can be run in parallel on the same device, and the spare capacity allows resources to be shared across multiple shop shelves, reducing costs even further.
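
As a simplified sketch of this kind of resource sharing, the snippet below multiplexes frames from several shelf cameras through one shared detection model on a single edge unit; the camera and detector functions are toy stand-ins, not MIMEX components.

```python
# Minimal sketch: one edge unit serving several shelf cameras by running
# their latest frames through a single shared detection model.
from typing import Callable, Dict, List
import numpy as np


def run_edge_unit(
    cameras: Dict[str, Callable[[], np.ndarray]],
    detector: Callable[[np.ndarray], List[dict]],
) -> Dict[str, List[dict]]:
    """Round-robin inference: grab one frame per shelf camera, reuse one model."""
    results: Dict[str, List[dict]] = {}
    for shelf, grab in cameras.items():
        frame = grab()
        results[shelf] = detector(frame)  # detections for that shelf's view
    return results


# Toy stand-ins so the sketch runs end to end.
def fake_camera() -> np.ndarray:
    return np.zeros((240, 320, 3), dtype=np.uint8)


def fake_detector(frame: np.ndarray) -> List[dict]:
    return [{"label": "product", "score": 0.9}]


print(run_edge_unit({"shelf_A": fake_camera, "shelf_B": fake_camera}, fake_detector))
```
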
Depth cameras with embedded processing can also provide visual tracking that helps overcome line-of-sight occlusions and shadows. They deliver a high level of tracking accuracy from a single camera, potentially providing more accurate customer-engagement metrics. However, networks of overlapping, low-cost monocular cameras remain more cost-effective and, if calibrated correctly, can ensure better multi-viewpoint coverage at a lower price. Fisheye lenses can also be inserted into visual tracking networks; as they offer ultra-wide-angle panoramic coverage, they can cover confined spaces and corners. To non-invasively monitor body temperature, a considerable worry in COVID-19 times, thermal cameras can also be integrated into smart-camera systems, instructing automated turnstiles to deny access to people who pose a potential risk to customers already shopping. In MIMEX we are exploring all of these camera options in order to find an optimal balance between cost and performance.


To gain valuable insights into what is happening across a smart shop as a whole, no single type of product or shopper tracking technology is sufficient. MIMEX is therefore creating a machine-learning sensor-fusion module that will reduce ambiguities and add robustness to the shopping experience.
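
As a flavour of what fusion means in practice, the sketch below combines per-modality confidences for a single candidate event ("shopper picked product P from shelf X") using fixed weights; MIMEX's actual module is learned, so the weights and threshold here are purely illustrative.

```python
# Minimal sketch: fusing per-modality confidences for one candidate event.
# The fixed weights and threshold are illustrative only; MIMEX's fusion
# module is a learned machine-learning component.

MODALITY_WEIGHTS = {"weight_sensor": 0.4, "vision": 0.4, "rfid": 0.2}


def fuse_event(confidences: dict[str, float], threshold: float = 0.6) -> bool:
    """Accept the event if the weighted evidence across modalities is strong enough."""
    score = sum(MODALITY_WEIGHTS[m] * c for m, c in confidences.items()
                if m in MODALITY_WEIGHTS)
    total = sum(MODALITY_WEIGHTS[m] for m in confidences if m in MODALITY_WEIGHTS)
    return (score / total if total else 0.0) >= threshold


# Vision is unsure (occlusion), but the shelf lost the right weight and the tag left range.
print(fuse_event({"weight_sensor": 0.9, "vision": 0.4, "rfid": 0.8}))  # True
```
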
To validate our approach and choice of technologies, MIMEX will evaluate usability and practicality for a wide range of stakeholders, from the sensor installation process for shop fitters to the interface responsiveness for shoppers. These practical metrics will come from experience with our first prototype, installed in Trento, Italy (Autumn 2021), and then from two subsequent pilots in Spain and Turkey (Spring 2022).
