Processing Simultaneous/Asynchronous Requests With Python BaseHTTPServer
I've set up a threaded (with Python threads) HTTP server by creating a class that inherits from HTTPServer and ThreadingMixIn:

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    pass
Solution 1:
class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    pass
is enough. Your client probably doesn't make concurrent requests. If you make the requests in parallel, the threaded server works as expected. Here's the client:
#!/usr/bin/env python
import sys
import urllib2
from threading import Thread

def make_request(url):
    print urllib2.urlopen(url).read()

def main():
    port = int(sys.argv[1]) if len(sys.argv) > 1 else 8000
    # Fire 10 GET requests in parallel, each in its own thread.
    for _ in range(10):
        Thread(target=make_request, args=("http://localhost:%d" % port,)).start()

main()
And the corresponding server:
import time
from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer, test as _test
from SocketServer import ThreadingMixIn

class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
    pass

class SlowHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-type", "text/plain")
        self.end_headers()
        self.wfile.write("Entered GET request handler")
        time.sleep(1)  # simulate slow work so the effect of threading is visible
        self.wfile.write("Sending response!")

def test(HandlerClass=SlowHandler, ServerClass=ThreadedHTTPServer):
    _test(HandlerClass, ServerClass)

if __name__ == '__main__':
    test()
All 10 requests finish in about 1 second. If you remove ThreadingMixIn from the server definition, all 10 requests take about 10 seconds to complete, because they are handled one after another.
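For reference, here's a minimal sketch of the same server on Python 3, assuming Python 3.7+ where http.server provides ThreadingHTTPServer as the stdlib equivalent of the ThreadingMixIn/HTTPServer combination (port 8000 is just an example):

import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class SlowHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Entered GET request handler")  # Python 3 wants bytes
        time.sleep(1)
        self.wfile.write(b"Sending response!")

if __name__ == '__main__':
    # Each incoming request is handled in its own thread.
    ThreadingHTTPServer(('', 8000), SlowHandler).serve_forever()

Running the same 10-thread client against it (with urllib2 swapped for urllib.request) should show the same roughly 1-second total.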