The multiprocessing module is one of the most advanced and powerful modules in the Python standard library. This article walks through some general techniques for using multiprocessing.
The processes it creates are managed by the operating system itself.
1. The most basic usage

from multiprocessing import Pool

def f(x):
    return x*x

if __name__ == '__main__':
    p = Pool(5)                        # a pool of 5 worker processes
    print(p.map(f, [1, 2, 3]))         # map f over the list in parallel
[1, 4, 9]
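As a side note, on Python 3.3+ the pool can also be used as a context manager so the worker processes are cleaned up automatically; a minimal sketch (not part of the original example):

from multiprocessing import Pool

def f(x):
    return x*x

if __name__ == '__main__':
    # the with-block terminates the pool on exit, so no workers linger
    with Pool(5) as p:
        print(p.map(f, [1, 2, 3]))     # -> [1, 4, 9]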
2. Processes are actually created via os.fork
On Unix, every process is created with fork().
from multiprocessing import Process
import os

def info(title):
    print(title)
    print('module name:', __name__)
    if hasattr(os, 'getppid'):         # only available on Unix
        print('parent process:', os.getppid())
    print('process id:', os.getpid())

def f(name):
    info('function f')
    print('hello', name)

if __name__ == '__main__':
    info('main line')
    p = Process(target=f, args=('bob',))
    p.start()
    p.join()
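Since the point here is that multiprocessing builds on os.fork, here is a minimal sketch of the raw fork() call itself; this is my illustration rather than the article's code, and it is Unix-only:

import os

pid = os.fork()                        # duplicates the current process (Unix only)
if pid == 0:
    # this branch runs in the child
    print('child pid:', os.getpid(), 'parent:', os.getppid())
    os._exit(0)                        # leave the child immediately
else:
    # this branch runs in the parent
    os.waitpid(pid, 0)                 # wait for the child to finish
    print('parent pid:', os.getpid())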
3. Threads share memory

import threading

def run(info_list, n):
    info_list.append(n)
    print(info_list)

if __name__ == '__main__':
    info = []
    for i in range(10):
        p = threading.Thread(target=run, args=[info, i])
        p.start()
[0]
[0, 1]
[0, 1, 2]
[0, 1, 2, 3]
[0, 1, 2, 3, 4]
[0, 1, 2, 3, 4, 5]
[0, 1, 2, 3, 4, 5, 6]
[0, 1, 2, 3, 4, 5, 6, 7]
[0, 1, 2, 3, 4, 5, 6, 7, 8]
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
Processes, by contrast, do not share memory:
from multiprocessing import Process

def run(info_list, n):
    info_list.append(n)
    print(info_list)

if __name__ == '__main__':
    info = []
    for i in range(10):
        p = Process(target=run, args=[info, i])
        p.start()
[1]
[2]
[3]
[0]
[4]
[5]
[6]
[7]
[8]
[9]
To share data between processes, use the Queue class from the multiprocessing module:
from multiprocessing import Process, Queue

def f(q, n):
    q.put([n, 'hello'])

if __name__ == '__main__':
    q = Queue()
    for i in range(5):
        p = Process(target=f, args=(q, i))
        p.start()
    while True:                        # blocks once the queue is empty
        print(q.get())
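Note that the while True loop above blocks forever once the queue is empty. A variant of my own (not the article's code) that reads exactly one result per worker and then exits cleanly:

from multiprocessing import Process, Queue

def f(q, n):
    q.put([n, 'hello'])

if __name__ == '__main__':
    q = Queue()
    workers = [Process(target=f, args=(q, i)) for i in range(5)]
    for p in workers:
        p.start()
    for _ in workers:                  # one get() per worker, so the loop terminates
        print(q.get())
    for p in workers:
        p.join()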
4. Locks: in this example the lock only serializes access to the screen (stdout); because the processes are otherwise independent, it has nothing else to protect.
from multiprocessing import Process, Lock

def f(l, i):
    l.acquire()                        # only one process prints at a time
    print('hello world', i)
    l.release()

if __name__ == '__main__':
    lock = Lock()
    for num in range(10):
        Process(target=f, args=(lock, num)).start()
hello world 0
hello world 1
hello world 2
hello world 3
hello world 4
hello world 5
hello world 6
hello world 7
hello world 8
hello world 9
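Because these processes share nothing but stdout, the lock above only serializes printing. Once memory really is shared (see Value in the next section), the same Lock protects real data; a minimal sketch of my own, assuming a shared integer counter:

from multiprocessing import Process, Value, Lock

def add_one(counter, lock):
    for _ in range(1000):
        with lock:                     # without the lock, concurrent += can lose updates
            counter.value += 1

if __name__ == '__main__':
    lock = Lock()
    counter = Value('i', 0)            # shared signed int, initial value 0
    ps = [Process(target=add_one, args=(counter, lock)) for _ in range(4)]
    for p in ps:
        p.start()
    for p in ps:
        p.join()
    print(counter.value)               # 4000, every increment kept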
5. Sharing memory between processes: Value and Array
from multiprocessing import Process, Value, Array

def f(n, a):
    n.value = 3.1415927
    for i in range(len(a)):
        a[i] = -a[i]

if __name__ == '__main__':
    num = Value('d', 0.0)              # 'd': double-precision float
    arr = Array('i', range(10))        # 'i': signed int
    print(num.value)
    print(arr[:])
    p = Process(target=f, args=(num, arr))
    p.start()
    p.join()
    print(num.value)
    print(arr[:])
0.0
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
3.1415927
[0, -1, -2, -3, -4, -5, -6, -7, -8, -9]
# Manager is another way to share data between processes, but it is slower
from multiprocessing import Process, Manager

def f(d, l):
    d[1] = '1'
    d['2'] = 2
    d[0.25] = None
    l.reverse()

if __name__ == '__main__':
    manager = Manager()
    d = manager.dict()
    l = manager.list(range(10))
    p = Process(target=f, args=(d, l))
    p.start()
    p.join()
    print(d)
    print(l)
# print('-------')                   # just another way of writing it
# print(pool.map(f, range(10)))
{0.25: None, 1: '1', '2': 2}
[9, 8, 7, 6, 5, 4, 3, 2, 1, 0]
# Asynchronous calls: this style is not used very often
from multiprocessing import Pool
import time

def f(x):
    print(x*x)
    time.sleep(2)
    return x*x

if __name__ == '__main__':
    pool = Pool(processes=4)
    res_list = []
    for i in range(10):
        res = pool.apply_async(f, [i])  # submit the call without blocking
        res_list.append(res)
    for r in res_list:
        print(r.get(timeout=10))        # timeout in seconds
The synchronous counterpart is apply.
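A minimal sketch of the synchronous form, assuming the same kind of f as above; pool.apply blocks until each call returns, so the calls run one at a time:

from multiprocessing import Pool
import time

def f(x):
    time.sleep(1)
    return x*x

if __name__ == '__main__':
    pool = Pool(processes=4)
    for i in range(3):
        print(pool.apply(f, [i]))      # blocks until this single call finishes
    pool.close()
    pool.join()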