Python Threading: Is It Okay To Read/Write Multiple Mutually Exclusive Parts Of A File Concurrently?
Solution 1:
In general, no.
Concurrent reading and writing behavior is heavily dependent on both the underlying operating system and filesystem.
You may be able to get something working by reading and writing chunks that are both a multiple of the underlying block size and block-aligned. But even then you are likely in "undefined behavior" territory.
See also, related question: How do filesystems handle concurrent read/write?
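One way to sidestep the shared file position entirely is positioned I/O. A minimal sketch, assuming a POSIX-style OS where `os.pread`/`os.pwrite` are available (they are not on all platforms), and using an illustrative file name:

```python
import os

# Create a scratch file with known contents (the name is illustrative).
path = "scratch.bin"
with open(path, "wb") as f:
    f.write(b"\x00" * 400)

# Positioned I/O: each call carries its own offset, so no shared
# file position is mutated between concurrent readers/writers.
fd = os.open(path, os.O_RDWR)
try:
    os.pwrite(fd, b"A" * 100, 0)    # write bytes 0..99
    os.pwrite(fd, b"B" * 100, 200)  # write bytes 200..299
    chunk = os.pread(fd, 100, 200)  # read back bytes 200..299
finally:
    os.close(fd)

print(chunk[:4])  # b'BBBB'
```

Because each `pread`/`pwrite` call names its offset explicitly, threads working on disjoint byte ranges never race on a seek.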
Solution 2:
The OP wants multithreaded access to a file from within a single program, not across multiple programs or over a network.
Therefore I say: yes, you can do that.
For instance:
import multiprocessing as mp
import queue
import time

def job_handler(id, job_queue):
    fh = open('test', 'r+b')
    while True:
        time.sleep(0.1)
        try:
            offset, size = job_queue.get_nowait()
            # Do the job
            print('{}: read offset={}'.format(id, offset))
            fh.seek(offset)
            data = fh.read(size)
            # Work with data
            print('{}: write offset={}'.format(id, offset))
            fh.seek(offset)
            fh.write(data)
        except queue.Empty:
            fh.close()
            print('exit(0) job_handler {}'.format(id))
            return

if __name__ == '__main__':
    job_queue = mp.Queue()
    for job in [(0, 100), (200, 100), (200, 100), (100, 100), (300, 100),
                (300, 100), (400, 100), (500, 100), (400, 100), (600, 100)]:
        job_queue.put(job)
    processes = []
    for p in range(1, 4):
        processes.append(mp.Process(target=job_handler, args=(p, job_queue)))
    for p in processes:
        p.start()
        time.sleep(0.1)
    for p in processes:
        p.join()
To demonstrate the risk I mean, I have put duplicate jobs into the job_queue. Watch for the [CLASH] line: without any coordination, process 3 performs a read/write inside the read/write of process 2 on the same region.
Output:
Start Job handler 1
Start Job handler 2
1: read offset=0
2: read offset=200
Start Job handler 3
3: read offset=200
[CLASH] offset:200 read by process:{2}
1: write offset=0
1: read offset=100
3: write offset=200
2: write offset=200
...
exit(0) job_handler 3
exit(0) job_handler 2
exit(0) job_handler 1
Conclusion: if you don't have such overlapping parts, you can do it without locking.
I would suggest using one file handle per process/thread.
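The one-handle-per-thread idea can be sketched with the `threading` module as well. Each handle keeps its own file position, so a `seek` in one thread cannot disturb another; the ranges below are deliberately disjoint (the file name and the uppercasing transform are illustrative):

```python
import threading

path = "data.bin"  # illustrative file name
with open(path, "wb") as f:
    f.write(b"abcdef" * 100)  # 600 bytes

def worker(offset, size):
    # One file handle per thread: each handle has its own file position,
    # so concurrent seek/read/write on disjoint ranges cannot collide.
    with open(path, "r+b") as fh:
        fh.seek(offset)
        chunk = fh.read(size)
        fh.seek(offset)
        fh.write(chunk.upper())

threads = [threading.Thread(target=worker, args=(off, 100))
           for off in (0, 100, 200, 300, 400, 500)]
for t in threads:
    t.start()
for t in threads:
    t.join()

with open(path, "rb") as f:
    print(f.read(6))  # b'ABCDEF'
```

If the ranges could overlap, you would additionally need a `threading.Lock` (or a lock per region) around the seek-read-write sequence, since that sequence is not atomic.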