By Bhagyeshwari Chauhan

How Large Retailers Are Engaging Customers Using Data

Updated: Feb 12, 2021

If Amazon and Walmart were to go head to head, whose side would you be on?


As the two giants of the retail industry, these two cross swords on a daily basis.

And that is exactly what happened recently.


The industry is currently buzzing with gossip about Walmart facing an embarrassing situation against Amazon. While Walmart was busy spying on Amazon with its shopping bots, which scraped Amazon’s website several million times a day for pricing data, Amazon was busy thwarting its attempts. Walmart woke up to an unpleasant surprise when its bots suddenly stopped working: Amazon had succeeded in blocking the bots from scraping its website, forcing Walmart to resort to secondary sources for data.


For Walmart, not being able to access Amazon’s data is no small issue. Several retailers, especially industry giants, rely on computer algorithms to see what pricing strategies their competitors are swearing by, so as to adjust their own pricing accordingly. A difference of as small as 50 cents can mean losing a sale and a customer.


Why Are Bots Deal Breakers for Retailers?

Today, bots are being referred to as the new apps. Recent studies show that half of users keep a variety of mobile apps installed but use them for only a handful of tasks. Bots that are smart enough to read, write, and respond to a user as if they were talking to a real human can help a retailer build consumer engagement and retention.


But bots can also slow down a website, which is one reason many retailers block them. However, a retailer that knows how to use bots efficiently can turn them into a huge advantage.


Take for instance Amazon, whose technological prowess is unmatched. It has repeatedly relied on bots to scrape competitor sites, both to see what rivals are up to and to keep them in the dark about its own marketing and pricing strategy.


Bot-driven pricing is bringing a massive change to the retail industry. Traditionally, brick-and-mortar stores don’t entertain the idea of changing prices too frequently; they do it at most once a week, because changing the labels manually is a very hectic task. That is not true in the world of e-commerce, where retailers can change prices multiple times a day based on data collected by various algorithms that track inventory, sales, and forecasting data. With such a high level of competition, companies like Walmart, Amazon, and the online wholesaler Boxed depend on bots to stay on top of the game.
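To make the idea of algorithmic repricing concrete, here is a minimal sketch of a repricing rule. The 50-cent undercut, the margin floor, and the numbers are illustrative assumptions for the example, not any retailer’s actual strategy.

```python
def reprice(competitor_price, cost, undercut=0.50, min_margin=0.10):
    """Suggest a new price given a freshly scraped competitor price.

    undercut   -- how far below the competitor we try to sit (assumed 50 cents)
    min_margin -- the minimum margin over cost we refuse to go below (assumed 10%)
    """
    floor = cost * (1 + min_margin)       # never sell below this
    target = competitor_price - undercut  # try to beat the competitor slightly
    return round(max(target, floor), 2)

# Example: a scraped competitor price of $19.99 on an item that costs us $15.00
print(reprice(competitor_price=19.99, cost=15.00))  # -> 19.49
```

A bot that refreshes competitor prices several times a day can feed a rule like this, which is how online prices end up moving far more often than shelf labels ever could.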


Shopping Bots and Web Scraping

The process of harvesting data from websites and other online sources is known as web scraping or crawling. There are five common tasks bots perform when scraping websites:

  1. Price scraping – Shopping bots scrape the pricing section of a website and send that information back to a database for further analysis (a minimal sketch of this step follows the list).

  2. Product matching – Shopping bots collect information from various data points on a site so that an exact match can be made against a competitor’s wide array of products.

  3. Product variation tracking – Shopping bots scrape data for every variation of a particular product, based on aspects like color, size, etc.

  4. Product availability tracking – Shopping bots scrape information about whether a product is in stock, which feeds into how products are positioned competitively in the market.

  5. Continuous data refresh – Bots visit the same site on a regular basis so as to keep track of all the changes made.
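As a rough idea of what the price-scraping step looks like in practice, here is a minimal sketch using Python’s requests and BeautifulSoup libraries. The URL, the CSS selector, and the SQLite table are hypothetical placeholders; a real target site needs its own selectors and far more care around rate limits and terms of service.

```python
import sqlite3
import requests
from bs4 import BeautifulSoup

# Hypothetical product page and price selector -- real sites will differ.
URL = "https://www.example-retailer.com/product/12345"
PRICE_SELECTOR = "span.price"

def scrape_price(url):
    """Fetch a product page and pull the displayed price out of the HTML."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    price_text = soup.select_one(PRICE_SELECTOR).get_text(strip=True)
    return float(price_text.replace("$", "").replace(",", ""))

def store_price(url, price):
    """Append the observation to a local database for later analysis."""
    with sqlite3.connect("prices.db") as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS prices "
            "(url TEXT, price REAL, ts DATETIME DEFAULT CURRENT_TIMESTAMP)"
        )
        conn.execute("INSERT INTO prices (url, price) VALUES (?, ?)", (url, price))

if __name__ == "__main__":
    store_price(URL, scrape_price(URL))
```

Running a script like this on a schedule is, in essence, the “continuous data refresh” task from the list above.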


To protect their data from competitors, some companies use CAPTCHAs: typically a distorted string of letters or numbers that a bot finds hard to decipher but a human can read easily. Giants like Amazon shy away from such practices because they tend to annoy customers.


Some companies program their bots to move the cursor slowly across the page to imitate the browsing pattern of a human. Another technique is to route requests through multiple IP addresses to bypass blocks. These techniques come in handy for merchants looking to evade a rival’s defenses.
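To give a flavour of this “look human” behaviour, the sketch below randomizes request timing and rotates User-Agent strings. The User-Agent list and the delay range are illustrative assumptions; routing through multiple IP addresses would additionally require a proxy pool, which is only hinted at in a comment.

```python
import random
import time
import requests

# A small, illustrative pool of browser User-Agent strings (assumed values).
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36",
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36",
]

def human_like_fetch(url):
    """Fetch a page with an irregular pause and a rotated User-Agent."""
    time.sleep(random.uniform(2.0, 6.0))  # irregular pacing, not machine-gun requests
    headers = {"User-Agent": random.choice(USER_AGENTS)}
    # A proxy pool could be plugged in here, e.g. proxies={"https": random.choice(PROXIES)}
    return requests.get(url, headers=headers, timeout=10)
```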


To stop Walmart from scraping its data, Amazon targeted a browser called PhantomJS, a special browser made for programmers. Amazon put up a digital wall to hide its listings from PhantomJS, successfully stopping some of the bots scraping its website.
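The exact “digital wall” Amazon used isn’t public, but one simple version of the idea is to refuse requests whose headers give away a programmatic browser. The Flask route below is a hypothetical illustration of filtering on the PhantomJS User-Agent signature; real bot detection is far more elaborate than a header check.

```python
from flask import Flask, request, abort

app = Flask(__name__)

# Signatures of programmatic browsers this example chooses to block (illustrative list).
BLOCKED_AGENTS = ("phantomjs", "headlesschrome")

@app.before_request
def block_known_bots():
    """Reject requests whose User-Agent reveals a scripted browser."""
    ua = request.headers.get("User-Agent", "").lower()
    if any(sig in ua for sig in BLOCKED_AGENTS):
        abort(403)  # hide listings from the bot instead of serving the page

@app.route("/listing/<item_id>")
def listing(item_id):
    return f"Product page for {item_id}"
```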


Recent research shows that of all the retail chains and giants, Amazon’s bot detection was by far the most sophisticated on its homepage and its most popular product pages. These tests were run by San Francisco-based Distil Networks, which sells anti-bot tools.


Despite Amazon’s cleverness, the number of bots crawling its site is astounding. At times, as much as 80% of all clicks on Amazon’s website are generated by bots. Apart from rivals’ bots seeking data, many bots come from other sources, such as universities collecting data for research purposes.


According to a US patent application, Amazon is working on technology that will force bots to solve computationally expensive problems to gain access to its websites, while leaving human traffic undisrupted.
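The patent’s details aren’t spelled out here, but one well-known way to make access computationally expensive for bots while staying invisible to humans is a proof-of-work challenge, where the client must find a value whose hash meets a condition before the server responds. The sketch below is a generic illustration of that idea, not Amazon’s actual mechanism.

```python
import hashlib
import secrets

def issue_challenge():
    """Server side: hand the client a random nonce and a difficulty level."""
    return secrets.token_hex(8), 4  # require 4 leading hex zeros (assumed difficulty)

def solve_challenge(nonce, difficulty):
    """Client side: brute-force a counter until the hash has enough leading zeros."""
    counter = 0
    while True:
        digest = hashlib.sha256(f"{nonce}:{counter}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return counter
        counter += 1

def verify(nonce, difficulty, counter):
    """Server side: checking a proposed answer is cheap."""
    digest = hashlib.sha256(f"{nonce}:{counter}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce, difficulty = issue_challenge()
answer = solve_challenge(nonce, difficulty)  # costly for a fleet making millions of requests
assert verify(nonce, difficulty, answer)     # trivial for the server to confirm
```

The cost is negligible for a single human page view but adds up quickly for a bot fleet hitting a site millions of times a day.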


While Amazon has the ability to use bots to its advantage and to detect and stop competitors’ bots, that’s not the case for most companies. If you are looking to use shopping bots or to stop one from scraping your website, contact us at Datahut, your big data experts.



Do you want to offload the dull, complex, and labour-intensive web scraping task to an expert?
