Making multiple HTTP requests asynchronously

NVI · Jan 24, 2010 · Viewed 15.3k times
require 'net/http'

urls = [
  {'link' => 'http://www.google.com/'},
  {'link' => 'http://www.yandex.ru/'},
  {'link' => 'http://www.baidu.com/'}
]

urls.each do |u|
  u['content'] = Net::HTTP.get( URI.parse(u['link']) )
end

print urls

This code works in a synchronous style: first request, then the second, then the third. I would like to send all the requests asynchronously and print urls after all of them are done.

What's the best way to do it? Is Fiber suited for that?
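For comparison, a plain-Thread version of the same loop, using only the standard library, would look something like this; I'm not sure whether that's the right approach either:

require 'net/http'

urls = [
  {'link' => 'http://www.google.com/'},
  {'link' => 'http://www.yandex.ru/'},
  {'link' => 'http://www.baidu.com/'}
]

# One thread per URL; each Net::HTTP.get blocks only its own thread.
threads = urls.map do |u|
  Thread.new { u['content'] = Net::HTTP.get(URI.parse(u['link'])) }
end

threads.each(&:join)  # wait until every request has finished

print urls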

Answer

Joshua Penman · Jan 5, 2015

I just saw this, a year and a bit later, but hopefully not too late for some googler...

Typhoeus is by far the best solution for this. It wraps libcurl in a really elegant fashion. You can set max_concurrency up to about 200 without it choking.
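For reference, the concurrency cap is set when the hydra is constructed; 20 below is just an illustrative value:

require 'typhoeus'

# max_concurrency limits how many requests are in flight at once.
hydra = Typhoeus::Hydra.new(max_concurrency: 20)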

With respect to timeouts, if you pass Typhoeus a :timeout option, it will just register the timeout as the response... and then you can even put the request back into another hydra to try again if you like. A minimal sketch of that retry pattern follows.
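In this sketch the URL and timeout values are illustrative; response.timed_out? is Typhoeus' check for a request that libcurl gave up on:

require 'typhoeus'

first_pass  = Typhoeus::Hydra.new
second_pass = Typhoeus::Hydra.new

request = Typhoeus::Request.new('http://www.example.com/', timeout: 15)
request.on_complete do |response|
  if response.timed_out?
    # Re-queue a fresh request into a second hydra with a longer timeout.
    second_pass.queue(Typhoeus::Request.new('http://www.example.com/', timeout: 30))
  end
end

first_pass.queue(request)
first_pass.run   # runs the original requests
second_pass.run  # retries anything that timed out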

Here's your program rewritten with Typhoeus. Hopefully this helps anybody who comes across this page later!

require 'typhoeus'

urls = [
  'http://www.google.com/',
  'http://www.yandex.ru/',
  'http://www.baidu.com/'
]

hydra = Typhoeus::Hydra.new

successes = 0

urls.each do |url|
  # Typhoeus' :timeout is in seconds (there is a separate :timeout_ms option),
  # so 15 here means 15 seconds, not 15000.
  request = Typhoeus::Request.new(url, timeout: 15)
  request.on_complete do |response|
    if response.success?
      puts "Successfully requested #{url}"
      successes += 1
    else
      puts "Failed to get #{url}"
    end
  end
  # Queuing only registers the request; nothing runs until hydra.run.
  hydra.queue(request)
end

# Runs all queued requests concurrently and blocks until every
# on_complete callback has fired.
hydra.run

puts "Fetched all urls!" if successes == urls.length