Automatically get prices from Souq to your Google Sheet

If you are a regular seller on Souq, then you know there is nothing more important than keeping tabs on competitors' prices and adjusting yours accordingly. While the seller console alerts you whenever a product is available for a lower price, it does so in a limited manner. It doesn't show you the seller name, for example.

What if I could show you a way to keep all the data you want in a spreadsheet? Say your cost, the lowest price you are willing to sell at, and the current lowest price available on Souq. Something like this sheet:

If this looks interesting, read on.

We are going to use Google Sheets' IMPORTXML() function to automatically fetch the prices for you. Here is the step-by-step:

1. Get the URL that shows the top offers for your product.
This is the top-offers page for the Apple iPhone X, which shows the seller with the lowest price first. This is the URL I have put in column F of my Google Sheet.

2. Get the XPath for the seller and the price.
Souq displays the price in a div with the class 'field price-field'. You want the text contained in this div, and only the first record.

The XPath for this would be
(//div[@class="field price-field"]/text())[1]

Similarly, the XPath for the seller name would be
(//div[@class="field seller-name"])[1]

These are the XPaths you see in columns G and H.
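If you want to sanity-check the two XPaths before wiring them into the sheet, you can run them against a snippet of the page's HTML in Python. The snippet below is a simplified mock-up of the markup described above, not Souq's actual page, and Python's built-in ElementTree only supports a subset of XPath, so the expressions are approximated with findall() plus indexing (a full XPath engine such as lxml would accept the exact strings):

```python
# Sanity-check the two XPaths against a simplified, mocked-up HTML snippet.
import xml.etree.ElementTree as ET

sample = """
<html><body>
  <div class='field price-field'> AED 3,899 </div>
  <div class='field seller-name'>Best Seller LLC</div>
  <div class='field price-field'> AED 3,999 </div>
  <div class='field seller-name'>Other Seller</div>
</body></html>
"""

root = ET.fromstring(sample)

# Equivalent of (//div[@class="field price-field"]/text())[1]:
# text of the first matching div.
price = root.findall(".//div[@class='field price-field']")[0].text.strip()

# Equivalent of (//div[@class="field seller-name"])[1]
seller = root.findall(".//div[@class='field seller-name']")[0].text.strip()

print(price)   # -> AED 3,899
print(seller)  # -> Best Seller LLC
```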

3. Use Google Sheets' IMPORTXML() function to fetch this information.

Finally, use the IMPORTXML() function in the columns where you want to fetch the price and the corresponding seller. IMPORTXML() takes the URL as its first argument and the XPath as its second, so with the URL in F2 and the price XPath in G2, the formula would be, for example, =IMPORTXML(F2, G2).

There you have it: a single sheet that captures all the information you need to successfully keep tabs on your competition.


Consider the following when you use this approach:

1. Google Sheets only allows around 50-70 IMPORTXML calls per sheet, so you can't put 500 products there and fetch the data for all of them. If you wish to do that, you could use Python for scraping. I use Python with Selenium to scrape Souq whenever I need to scrape more than 100 items.

2. If you can live with a slower solution, you can use LibreOffice's WEBSERVICE() and FILTERXML() functions, for example =FILTERXML(WEBSERVICE(F2), G2). There is no limit on the number of URLs they fetch.

3. The XPaths I have used will break whenever Souq changes their website structure. Any time you get an error, inspect the document and see if the structure of the page has changed.
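The Python route mentioned in point 1 can be sketched with the standard library alone, assuming the pages serve their prices in plain HTML (Selenium, which I mentioned above, is what you would reach for when the content is rendered by JavaScript). The parse_offer() helper and URL handling here are hypothetical, and real pages are rarely well-formed XML, so in practice you would swap the strict XML parser for a forgiving HTML parser such as lxml.html:

```python
# Batch-scraping sketch using only the standard library. parse_offer() is a
# hypothetical helper mirroring the two XPaths used in the sheet; real Souq
# pages are unlikely to be well-formed XML, so in practice you would use an
# HTML parser (e.g. lxml.html) or Selenium instead of ET.fromstring.
import urllib.request
import xml.etree.ElementTree as ET

def parse_offer(html):
    """Return (seller, price) of the first offer on a product page."""
    root = ET.fromstring(html)
    price = root.findall(".//div[@class='field price-field']")[0].text.strip()
    seller = root.findall(".//div[@class='field seller-name']")[0].text.strip()
    return seller, price

def scrape_all(urls):
    """Fetch each top-offers URL and collect (seller, price) per product."""
    results = {}
    for url in urls:
        with urllib.request.urlopen(url) as resp:  # network call
            html = resp.read().decode("utf-8")
        results[url] = parse_offer(html)
    return results
```

scrape_all() would take the same top-offers URLs you keep in column F; unlike IMPORTXML there is no per-sheet call limit, only politeness and rate-limiting to keep in mind.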
