Walmart Doubles Down on Visual Commerce

Visual commerce continues to grow among retailers and eCommerce giants. Also known as camera-commerce, it involves AR-based experiences that let shoppers virtually try on products or visualize them in their space. Amazon recently boosted its efforts with a new shoe try-on feature (and several previous moves).

Now Walmart is advancing its visual commerce efforts, specifically in eyewear. Last week it acquired optical tech company Memomi to accelerate efforts around virtual eyewear try-ons. Deal terms weren't disclosed, but Memomi already works with Walmart on digital measurements for eyewear fittings.

Following Suit

Going deeper into Memomi's technology and Walmart's plans for it, the goal is to streamline the otherwise complex process of eyewear fittings by bringing them online. Walmart describes the deal as the "next step" in its efforts to make advanced eyewear fittings more affordable and accessible.

The move also takes cues from self-serve direct-to-consumer (DTC) disruptors like Warby Parker. And Memomi's technology is only part of it, as Walmart stresses the acqui-hire aspects of the deal. With technology this specialized, it always helps to bring core competency in-house for ongoing integrations.

This move also follows a pattern for Walmart in product visualization tech. Recent activity includes the acquisition of Zeekit, which allowed Walmart to launch an AI-powered virtual clothing try-on feature. And last week it launched a new feature that lets customers visualize furniture in their homes in AR.

Speaking of product categories beyond eyewear, it's worth noting that Memomi's technology is also used in footwear, fashion, and beauty. Walmart is staying focused for now on applying the newly acquired tech to eyewear try-ons, but the door appears open to expand into other categories in the future.

Dimensionally Accurate

Along with hats, cosmetics, and other head-worn products, eyewear is a logical category for AR try-ons. Not only do these products have massive addressable markets (everyone has a face), but face-worn try-ons are easier to pull off than "world-facing" AR. That's simply because faces are easier to spatially map.

As background, AR requires that devices ingest a 3D map of a given surface so that they can apply graphics in dimensionally accurate ways. Given the variability of surfaces out in the world, faces are a narrower and more predictable subject to master (one reason Snap face filters work so well).
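To make that concrete, here's a minimal sketch of face mapping using Google's open-source MediaPipe Face Mesh. This is purely illustrative and assumes a generic selfie frame; it is not Memomi's or Walmart's actual pipeline, and the landmark indices used for scale are illustrative anchors rather than any official "glasses" mapping.

```python
# Minimal sketch: turning a selfie frame into a dense 3D face landmark mesh
# with MediaPipe Face Mesh (illustrative only -- not Memomi's/Walmart's stack).
import cv2
import mediapipe as mp

mp_face_mesh = mp.solutions.face_mesh

# Hypothetical input: a frame captured from the shopper's camera.
image = cv2.imread("selfie.jpg")

with mp_face_mesh.FaceMesh(static_image_mode=True, refine_landmarks=True) as face_mesh:
    # MediaPipe expects RGB; OpenCV loads images as BGR.
    results = face_mesh.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    landmarks = results.multi_face_landmarks[0].landmark  # 478 points with refine_landmarks
    # Two temple-adjacent contour points (indices chosen for illustration)
    # give a rough pixel width for scaling a virtual frame overlay.
    left, right = landmarks[127], landmarks[356]
    h, w = image.shape[:2]
    face_width_px = abs(right.x - left.x) * w
    print(f"Approx. face width: {face_width_px:.0f}px across {len(landmarks)} landmarks")
```

Because the face's topology is fixed (eyes, nose bridge, and temples always sit in known relative positions), a renderer can anchor virtual frames to the same landmark indices every time. Arbitrary world surfaces offer no such guarantee, which is why world-facing AR is the harder problem.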

This is also why initial steps toward visual commerce, such as Walmart's move and Amazon's work in cosmetics, involve things that go on your face. Case in point: try Ray-Ban's sunglasses try-on feature on its website. It works remarkably well, and will only improve as LiDAR phases into ubiquity.

From there, more advanced AR steps will include outward-facing (non-selfie) AR activations. This is what Google is doing with Lens, including the ability to identify and contextualize local storefronts. In fact, it advanced this capability recently, and we’ll be back later this week to explain how. Stay tuned for that…
