
Hack Price Monitoring with IFTTT and ScraperWiki

At WisePricer, we’re always trying to develop new ways to help online merchants stay ahead of their competition. Today, I want to share a way to keep track of Amazon’s prices for multiple products. I’ll be mashing up ScraperWiki and IFTTT to show you how to set up a daily “Price Monitoring” email delivered straight to your inbox.

Beyond this tactic, combining these two tools opens up endless possibilities for price monitoring. Later, I'll share some ideas you can implement yourself.

The Players
ScraperWiki is a platform that lets you make data do things. It is a platform for writing and scheduling screen scrapers, and for storing the data they generate. ScraperWiki is useful both for programmers who want to write screen scrapers with less fuss, and for journalists, activists, and the general public who want to discover and reuse interesting data.

IFTTT is a service that lets you create powerful connections with one simple statement:

If “this,” then “that.”

Using this principle, you can construct your own IFTTT Recipe using the following basic principles and ‘ingredients’:

“Channels” are the basic building blocks of IFTTT and include Twitter, Facebook, Email, Weather, etc. The “this” part is the trigger for the recipe. For example, “I post a tweet” or “I’m tagged in a photo on Facebook”. The “that” part of the recipe is the action element, with examples like “send me a text message” or “update my Facebook status”.

A full recipe looks like this:

Following this pattern requires some basic coding skills. I believe everyone should learn the basics, or at least give it a try. It’s a useful skill.

If Bloomberg is willing to give it a try, you certainly have no excuse…

Step 1: Setup

To start, create your own ScraperWiki account, then copy my code into your account.

You’ll be able to edit that code (i.e. changing the list of Amazon ASINs) and run the scraper for the first time.
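To give you a feel for what the scraper does, here is a minimal sketch of its core logic in Python: fetch each product page by ASIN, pull out a price, and build a row for the datastore. The ASIN list, the regex-based price extraction, and the field names are illustrative assumptions, not the exact code in my scraper.

```python
import datetime
import re

# Illustrative ASINs -- swap in your own products.
ASINS = ["B00EXAMPLE1", "B00EXAMPLE2"]

def extract_price(html):
    """Pull the first dollar amount out of a product page's HTML.

    Returns the price as a float, or None if no price is found.
    """
    match = re.search(r"\$(\d+(?:\.\d{2})?)", html)
    return float(match.group(1)) if match else None

def build_record(asin, html):
    """Build one row ready to be saved to the scraper's datastore."""
    return {
        "asin": asin,
        "price": extract_price(html),
        # pubDate is what the RSS feed will sort on later.
        "pubDate": datetime.datetime.utcnow().isoformat(),
    }
```

In the real scraper, each page would be fetched over HTTP and each record saved to ScraperWiki’s SQLite-backed datastore, which is what the API query in the next steps reads from.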

After running the scraper (be patient, it can take a few moments), go back to the scraper dashboard. Under ‘Schedule’ click the Edit button to set up a daily schedule, and then click “Explore with API”.

Step 2: Edit Schedule

Step 3: Configure Settings

Set the feed type to “rss2”, change the SQL query to “select * from `swdata` order by pubDate desc limit 10”, and click Copy. The resulting link can be used in any RSS feed reader, or as a trigger in IFTTT.
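To see what that feed looks like to a consumer, here is a hedged sketch in Python of parsing an RSS 2.0 payload like the one the API returns. The sample feed content and item titles are made-up assumptions; IFTTT and any feed reader do essentially the same parsing for you.

```python
import xml.etree.ElementTree as ET

# A made-up sample of the kind of RSS 2.0 payload the API returns.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Price Monitoring</title>
  <item><title>ASIN B00EXAMPLE1: $19.99</title>
    <pubDate>Mon, 01 Jul 2013 00:00:00 GMT</pubDate></item>
</channel></rss>"""

def latest_items(feed_xml, limit=10):
    """Return the titles of the newest items in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title") for item in root.iter("item")][:limit]
```

The “limit 10” in the SQL query is what keeps the feed (and this function’s default) to the ten most recent price records.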

That’s it! You’re done with ScraperWiki and can move on to IFTTT. If you don’t already have an IFTTT account, go ahead and open one. You’ll be amazed by their user experience. Once you’re set up with an account, create a recipe that sends your feed updates to your email account (or Facebook, Twitter, etc.). Try it for yourself.

Here’s a screenshot of the daily email taken from my Gmail account:


Don’t be afraid to experiment! Here are some more ways to try my “Scrape ‘n Notify” technique:

  • Find out if any of your competitors are running a promotion or sale by scraping their website’s landing page.
  • Set an email alert for current Amazon best sellers in categories within your industry.
  • Check if your main competitor is out of stock, to seize retail opportunities.

Are you ready to scale up?

Get a free trial account with WisePricer to monitor your competitors on thousands of products, and seamlessly sync from your store. It’s a cinch.

Min-Jee Hwang

Min-Jee is the former Director of Marketing at Wiser. She has extensive experience working with SaaS companies and holds a BA from Carnegie Mellon University and an MBA from NYU Stern.

Need better data to inform your decisions?

Schedule a Consultation