Hacker News

The reddit comments[1] on this article are very interesting as well. In particular, /u/driftingdev suggested it would be better to use concurrent.futures. To quote him/her:

In the last example:

    pool = Pool()    
    pool.map(create_thumbnail, images)
    pool.close()
    pool.join()
would be 2 fewer lines with concurrent.futures:

    with ThreadPoolExecutor(max_workers=4) as executor:
        executor.map(create_thumbnail, images)
[1] https://pay.reddit.com/r/Python/comments/1tyzw3/parallelism_...
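For reference, here is a minimal runnable sketch of the concurrent.futures version. `create_thumbnail` and `images` are stand-ins (the function just squares a number) since the originals from the article aren't shown; the point is that leaving the `with` block waits for all submitted work, playing the role of `close()` + `join()`:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for the article's create_thumbnail.
def create_thumbnail(x):
    return x * x

images = [1, 2, 3, 4]

# Exiting the with-block shuts the executor down and waits for
# all pending tasks, much like pool.close() followed by pool.join().
with ThreadPoolExecutor(max_workers=4) as executor:
    results = list(executor.map(create_thumbnail, images))

print(results)  # -> [1, 4, 9, 16]
```

Note that `executor.map` returns a lazy iterator rather than a list, so it is wrapped in `list()` here to force the results.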


I see a large difference in execution time between the two. Also, you can get the same context-manager behavior with multiprocessing by importing contextlib.closing:

   from multiprocessing import Pool
   from contextlib import closing

   with closing(Pool(15)) as p:
       p.map(f, myiterable)
takes <30 sec vs

   from concurrent.futures import ThreadPoolExecutor as Pool

   with Pool(15) as p:
       p.map(f, myiterable)
takes > 1 min.

whereas ProcessPoolExecutor hangs, and I've no clue why.



