Requests - How To Stream Upload - Partial File
My goal is to do a PUT of part of a file using requests, streaming the file (i.e., not loading it into memory and then doing the PUT). The requests documentation explains how to do that for an entire file, but not for just a slice of one.
Solution 1:
Based on Greg's answers to my questions, I think the following will work best:
First you'll need something to wrap your open file so that it limits how much data can be read:
class FileLimiter(object):
    def __init__(self, file_obj, read_limit):
        self.read_limit = read_limit
        self.amount_seen = 0
        self.file_obj = file_obj

        # So that requests doesn't try to chunk the upload but will instead
        # stream it (requests uses a len attribute to set Content-Length):
        self.len = read_limit

    def read(self, amount=-1):
        if self.amount_seen >= self.read_limit:
            return b''
        remaining_amount = self.read_limit - self.amount_seen
        # A negative amount means "read everything", so cap it at the
        # remaining budget instead of passing -1 through to min():
        if amount < 0:
            amount = remaining_amount
        data = self.file_obj.read(min(amount, remaining_amount))
        self.amount_seen += len(data)
        return data
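To sanity-check the limiter locally (a minimal sketch that isn't part of the original answer; io.BytesIO stands in for a real file), you can confirm that reads stop at read_limit:

import io

buf = io.BytesIO(b'0123456789abcdef')
buf.seek(4)                    # stand-in for my_offset
limiter = FileLimiter(buf, 6)  # stand-in for my_chunk_limit

print(limiter.read())  # b'456789' -- exactly 6 bytes, starting at offset 4
print(limiter.read())  # b''       -- limit reached, nothing more is read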
This should roughly work as a good wrapper object. Then you would use it like so:
import requests

with open('my_large_file', 'rb') as file_obj:
    file_obj.seek(my_offset)
    upload = FileLimiter(file_obj, my_chunk_limit)
    r = requests.put(url, data=upload,
                     headers={'Content-Type': 'application/octet-stream'})
The headers are obviously optional, but when streaming data to a server, it's a good idea to be a considerate user and tell the server the type of the content you're sending.
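If the server supports partial uploads, the same wrapper can drive a loop of sequential PUTs that covers the whole file one slice at a time. This is a rough sketch, not part of the original answer: the endpoint URL is hypothetical, and the Content-Range header only helps if your server actually understands it.

import os
import requests

url = 'http://example.com/upload'  # hypothetical endpoint
chunk_size = 8 * 1024 * 1024       # 8 MiB per PUT; an arbitrary choice
total = os.path.getsize('my_large_file')

with open('my_large_file', 'rb') as file_obj:
    offset = 0
    while offset < total:
        limit = min(chunk_size, total - offset)
        file_obj.seek(offset)
        # Tell the server which byte range this PUT carries; drop or
        # adjust the header to match whatever your API expects.
        headers = {
            'Content-Type': 'application/octet-stream',
            'Content-Range': 'bytes %d-%d/%d' % (offset, offset + limit - 1, total),
        }
        r = requests.put(url, data=FileLimiter(file_obj, limit), headers=headers)
        r.raise_for_status()
        offset += limit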
Solution 2:
I'm just throwing two other answers together, so bear with me if it doesn't work out of the box; I have no means of testing this:
Lazy Method for Reading Big File in Python?
http://docs.python-requests.org/en/latest/user/advanced/#chunk-encoded-requests
import requests

def read_in_chunks(file_object, blocksize=1024, chunks=-1):
    """Lazy function (generator) to read a file piece by piece.
    Default chunk size: 1k."""
    while chunks:
        data = file_object.read(blocksize)
        if not data:
            break
        yield data
        chunks -= 1

with open('my_large_file', 'rb') as f:
    requests.post('http://some.url/chunked', data=read_in_chunks(f))
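To hit the original goal (uploading only a slice of the file), you can seek to the starting offset and cap the number of chunks. A rough sketch, reusing my_offset and my_chunk_limit from Solution 1, and assuming my_chunk_limit divides evenly by the block size (otherwise the tail of the slice is dropped):

with open('my_large_file', 'rb') as f:
    f.seek(my_offset)
    # Send my_chunk_limit bytes as 1 KiB chunks; if my_chunk_limit is not
    # a multiple of 1024, this undershoots by the remainder.
    n_chunks = my_chunk_limit // 1024
    requests.put('http://some.url/chunked',
                 data=read_in_chunks(f, blocksize=1024, chunks=n_chunks))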