from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor
from urllib.request import urlopen
from time import perf_counter

def work(n):
    with urlopen(f"https://www.google.com/#{n}") as f:
        contents = f.read(32)
    return contents

def run_pool(pool_type):
    with pool_type() as pool:
        start = perf_counter()
        results = pool.map(work, numbers)
        print("Time:", perf_counter() - start)
        print([_ for _ in results])

if __name__ == '__main__':
    numbers = [x for x in range(1, 16)]
    # Run the task using a thread pool
    run_pool(ThreadPoolExecutor)
    # Run the task using a process pool
    run_pool(ProcessPoolExecutor)
How Python multiprocessing works
In the above example, the concurrent.futures module provides high-level pool objects for running work in threads (ThreadPoolExecutor) and in processes (ProcessPoolExecutor). Both pool types have the same API, so you can create functions that work interchangeably with either one, as the example shows.
We use run_pool to submit instances of the work function to the different types of pools. By default, each pool instance uses a single thread or process per available CPU core. There's a certain amount of overhead associated with creating pools, so don't overdo it. If you're going to be processing many jobs over a long period of time, create the pool first and don't dispose of it until you're done. With the Executor objects, you can use a context manager to create and dispose of pools (with/as).
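As an illustration, here is a minimal sketch of that pattern, reusing one process pool across several batches of jobs instead of creating a fresh pool for each batch. The square function, the batches, and the max_workers value are hypothetical and shown only for demonstration:

from concurrent.futures import ProcessPoolExecutor

def square(x):
    return x * x

if __name__ == '__main__':
    # Create the pool once via a context manager and keep reusing it;
    # max_workers is optional and overrides the default worker count.
    with ProcessPoolExecutor(max_workers=4) as pool:
        first_batch = list(pool.map(square, range(10)))
        second_batch = list(pool.map(square, range(10, 20)))
    # The pool is shut down automatically when the with block exits.
    print(first_batch, second_batch)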
pool.map() is the function we use to subdivide the work. The pool.map() function takes a function along with a list of arguments to apply to each invocation of that function, splits the work into chunks (you can specify the chunk size, but the default is generally fine), and feeds each chunk to a worker thread or process.
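For example, here is a minimal sketch of passing an explicit chunksize to pool.map(); the cube function and the chunk size of 8 are hypothetical values chosen only to illustrate the parameter:

from concurrent.futures import ProcessPoolExecutor

def cube(x):
    return x ** 3

if __name__ == '__main__':
    with ProcessPoolExecutor() as pool:
        # Arguments are sent to worker processes in batches of 8;
        # the default chunksize of 1 is usually fine for small jobs.
        results = pool.map(cube, range(100), chunksize=8)
        print(list(results))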