multithreading - Multiple Python threads writing to different records in the same list simultaneously - is this OK?


I am trying to fix a bug where multiple threads are writing to a list in memory. Right now I have a thread lock, but I am running into problems related to the work being done in the threads.
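Roughly what my current locking code looks like (a simplified sketch only; the folder names and the per-thread work are placeholders, not my real code):

import os
import threading
import time

shared_list = []              # single list shared by all threads
list_lock = threading.Lock()  # every thread funnels results through this lock

def job(root_folder):
    for current, dirs, files in os.walk(root_folder):
        with list_lock:
            shared_list.extend(files)
        time.sleep(1)

threads = [threading.Thread(target=job, args=(f,))
           for f in ["c:\\windows", "c:\\users", "c:\\temp"]]
for t in threads:
    t.start()
for t in threads:
    t.join()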

I was hoping to make a hash of lists, one for each thread, and remove the thread lock. It seems like each thread could then write to its own record without worrying about the others, but perhaps the fact that they all use the same owning hash is itself a problem.

Does anyone happen to know if this will work or not? If not, could I, for example, dynamically add a list to each thread? Would that amount to the same thing? A sketch of what I have in mind is below.
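Something like this is what I was imagining for the hash-of-lists idea (sketch only; the keys and folders are made up):

import os
import threading
import time

# one list per thread, keyed by the folder that thread is responsible for
results = {"c:\\windows": [], "c:\\users": [], "c:\\temp": []}

def job(root_folder):
    my_list = results[root_folder]  # this thread only ever touches its own list
    for current, dirs, files in os.walk(root_folder):
        my_list.extend(files)
        time.sleep(1)

threads = [threading.Thread(target=job, args=(f,)) for f in results]
for t in threads:
    t.start()
for t in threads:
    t.join()

all_files = [f for lst in results.values() for f in lst]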

I am far from a threading expert, so any advice is welcome.

Thanks.

import os
import threading
import time

def job(root_folder, my_list):
    # each thread walks its own folder and appends file names to its own list
    for current, dirs, files in os.walk(root_folder):
        my_list.extend(files)
        time.sleep(1)

my_lists = [[], [], []]
my_folders = ["c:\\windows", "c:\\users", "c:\\temp"]
my_threads = []
for folder, a_list in zip(my_folders, my_lists):
    my_threads.append(threading.Thread(target=job, args=(folder, a_list)))
for thread in my_threads:
    thread.start()
for thread in my_threads:
    thread.join()

my_full_list = my_lists[0] + my_lists[1] + my_lists[2]

This way each thread modifies only its own list, and at the end you combine the individual lists.

As was also pointed out, this gives zero performance gain (it is actually slower than not threading it at all), but you may see performance gains by using multiprocessing instead.
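A rough sketch of a multiprocessing version using multiprocessing.Pool (the folder list here is just an example) might look like this:

import os
from multiprocessing import Pool

def collect_files(root_folder):
    # each worker process builds and returns its own list of file names
    found = []
    for current, dirs, files in os.walk(root_folder):
        found.extend(files)
    return found

if __name__ == "__main__":
    folders = ["c:\\windows", "c:\\users", "c:\\temp"]
    with Pool(processes=len(folders)) as pool:
        per_folder = pool.map(collect_files, folders)
    my_full_list = [f for sub in per_folder for f in sub]

Whether this is actually faster depends on where the time goes: os.walk is mostly disk-bound rather than CPU-bound, so the gain from extra processes may be small.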

