
I am opening a file which has 100,000 URLs. I need to send an HTTP request to each URL and print the status code. I am using Python 2.6, and so far I have looked at the many confusing ways Python implements threading/concurrency. I have even looked at the Python concurrency library, but cannot figure out how to write this program correctly. Has anyone come across a similar problem? I guess generally I need to know how to perform thousands of tasks in Python as fast as possible - I suppose that means 'concurrently'.

1 Answer


One of the fastest ways to send 100,000 HTTP requests in Python is to use the Tornado asynchronous networking library. See the code below:

from tornado import ioloop, httpclient

i = 0  # number of requests still outstanding

def handle_request(response):
    global i
    print(response.code)
    i -= 1
    if i == 0:  # all responses received, stop the event loop
        ioloop.IOLoop.instance().stop()

http_client = httpclient.AsyncHTTPClient()
for url in open('urls.txt'):
    i += 1
    http_client.fetch(url.strip(), handle_request, method='HEAD')

ioloop.IOLoop.instance().start()
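Note that Tornado's AsyncHTTPClient limits how many requests are in flight at once (the max_clients setting, historically 10 by default), so the 100,000 fetches are queued internally and sent a few at a time; check your Tornado version's documentation for how to raise that limit.

If you would rather stick to the standard library (the question mentions threading on Python 2.6), a fixed pool of worker threads pulling URLs from a queue is a common alternative. The sketch below is only a minimal illustration, not the Tornado answer above; it assumes urls.txt contains one absolute http:// URL per line and uses the Python 2 module names (Queue, httplib, urlparse):

import threading
import Queue
import urlparse
import httplib

NUM_WORKERS = 20          # number of concurrent connections; tune as needed
task_queue = Queue.Queue()

def check_url(url):
    # Send a HEAD request and print the status code (http:// only;
    # https:// would need httplib.HTTPSConnection instead).
    parts = urlparse.urlparse(url)
    conn = httplib.HTTPConnection(parts.netloc, timeout=10)
    try:
        conn.request("HEAD", parts.path or "/")
        print("%s %s" % (url, conn.getresponse().status))
    except Exception as e:
        print("%s error: %s" % (url, e))
    finally:
        conn.close()

def worker():
    # Each worker pulls URLs from the queue until the program exits.
    while True:
        url = task_queue.get()
        try:
            check_url(url)
        finally:
            task_queue.task_done()

for _ in range(NUM_WORKERS):
    t = threading.Thread(target=worker)
    t.daemon = True
    t.start()

for line in open('urls.txt'):
    task_queue.put(line.strip())

task_queue.join()          # wait until every URL has been processed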
