I want to scrape some data from a website.
Basically, the website displays a table of around 50 records. To see more, the user has to click a button, which triggers an AJAX call to fetch and show the next 50 records.
I have previous experience with Selenium WebDriver (Python) and could do this very quickly with it. But Selenium is primarily a browser automation testing tool, and it is very slow.
I did some research and found that I could also do the same thing with Scrapy or Mechanize.
Should I go with Scrapy, Mechanize, or Selenium for this?
I would recommend going with a combination of Mechanize and ExecJS (https://github.com/sstephenson/execjs) to execute any JavaScript requests you come across. I have used those two gems together for quite some time now and they do a great job.
You should choose this over Selenium, because it will be a lot faster than rendering the entire page in a headless browser.
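For what it's worth, once you find the URL the button's AJAX call actually hits (check the browser's network tab), the whole scrape reduces to a simple pagination loop, with no page rendering at all. Here is a minimal Python sketch of that loop; the HTTP call is stubbed out with a fake fetcher because the real endpoint, parameters, and response format are unknown here — you would replace it with a `requests.get` (or Mechanize) call against the discovered URL:

```python
def fetch_all_records(fetch_page, page_size=50):
    """Keep requesting batches until a short (final) batch arrives."""
    records = []
    offset = 0
    while True:
        batch = fetch_page(offset, page_size)
        records.extend(batch)
        if len(batch) < page_size:  # fewer than page_size => last page
            break
        offset += page_size
    return records

# Stand-in for the real AJAX endpoint: a fake dataset of 120 records.
DATA = [{"id": i} for i in range(120)]

def fake_fetch(offset, limit):
    return DATA[offset:offset + limit]

all_records = fetch_all_records(fake_fetch)
print(len(all_records))  # 120
```

The key point is that the loop's stopping condition (a batch shorter than `page_size`) works regardless of how many records the site holds, so you never need to click the button or know the total count in advance.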