I've been banging my head against the wall with multiprocessing in Python, but I've had no luck wrapping my head around the topic. Basically, I have a procedure that is time-consuming to run. I need to run it over a range of 1 to 100, but I want to abort all processes once the condition I'm looking for has been met. The condition is a return value == 90.
Here is a non-multiprocessing block of code. Can anyone give me an example of how to convert it to a multiprocessing function where the code exits all processes once the condition of "90" has been met?
def Addsomething(i):
    SumOfSomething = i + 1
    return SumOfSomething

def RunMyProcess():
    for i in range(100):
        Something = Addsomething(i)
        print Something
    return

if __name__ == "__main__":
    RunMyProcess()
Edit:
I got this error when testing the third version. Any idea what is causing it?
Exception in thread Thread-3:
Traceback (most recent call last):
  File "C:\Python27\lib\threading.py", line 554, in __bootstrap_inner
    self.run()
  File "C:\Python27\lib\threading.py", line 507, in run
    self.__target(*self.__args, **self.__kwargs)
  File "C:\Python27\lib\multiprocessing\pool.py", line 379, in _handle_results
    cache[job]._set(i, obj)
  File "C:\Python27\lib\multiprocessing\pool.py", line 527, in _set
    self._callback(self._value)
  File "N:\PV\_Proposals\2013\ESS - Clear Sky\01-CODE\MultiTest3.py", line 20, in check_result
    pool.terminate()
  File "C:\Python27\lib\multiprocessing\pool.py", line 423, in terminate
    self._terminate()
  File "C:\Python27\lib\multiprocessing\util.py", line 200, in __call__
    res = self._callback(*self._args, **self._kwargs)
  File "C:\Python27\lib\multiprocessing\pool.py", line 476, in _terminate_pool
    result_handler.join(1e100)
  File "C:\Python27\lib\threading.py", line 657, in join
    raise RuntimeError("cannot join current thread")
RuntimeError: cannot join current thread
Maybe something like this is what you're looking for? Keep in mind I'm writing for Python 3. Your print statement above is Python 2, in which case a side note is that you would want to use xrange instead of range.
from argparse import ArgumentParser
from random import random
from subprocess import Popen
from sys import exit
from time import sleep

def add_something(i):
    # Sleep to simulate the long calculation
    sleep(random() * 30)
    return i + 1

def run_my_process():
    # Start up all of the processes, pass i as command line argument
    # since you have your function in the same file, we'll have to handle that
    # inside 'main' below
    processes = []
    for i in range(100):
        processes.append(Popen(['python', 'thisfile.py', str(i)]))

    # Wait for your desired process result
    # Might want to add a short sleep to the loop
    done = False
    while not done:
        for proc in processes:
            returncode = proc.poll()
            if returncode == 90:
                done = True
                break

    # Kill any processes that are still running
    for proc in processes:
        if proc.returncode is None:
            # Might run into a race condition here,
            # so might want to wrap with try block
            proc.kill()

if __name__ == '__main__':
    # Look for optional i argument here
    parser = ArgumentParser()
    parser.add_argument('i', type=int, nargs='?')
    i = parser.parse_args().i

    # If there isn't an i, then run the whole thing
    if i is None:
        run_my_process()
    else:
        # Otherwise, run your expensive calculation and return the result
        returncode = add_something(i)
        print(returncode)
        exit(returncode)
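If the exit-code plumbing above looks opaque, here's a minimal, self-contained sketch of just that idea (the -c child is only a stand-in for re-invoking this file with i): the child reports its result through its exit status, and the parent polls for it, with the short sleep the comment above mentions.

from subprocess import Popen
from sys import executable
from time import sleep

# Stand-in child that simply exits with the "result" 90
child = Popen([executable, '-c', 'import sys; sys.exit(90)'])

# poll() returns None while the child is still running, then its exit code
while child.poll() is None:
    sleep(0.1)  # short sleep so the parent doesn't busy-wait

print(child.returncode)  # prints 90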
Here's a cleaner version that uses the multiprocessing module instead of subprocess:
from random import random
from multiprocessing import Process
from sys import exit
from time import sleep

def add_something(i):
    # Sleep to simulate the long calculation
    sleep(random() * 30)
    exitcode = i + 1
    print(exitcode)
    exit(exitcode)

def run_my_process():
    # Start up all of the processes
    processes = []
    for i in range(100):
        proc = Process(target=add_something, args=[i])
        processes.append(proc)
        proc.start()

    # Wait for the desired process result
    done = False
    while not done:
        for proc in processes:
            if proc.exitcode == 90:
                done = True
                break

    # Kill any processes that are still running
    for proc in processes:
        if proc.is_alive():
            proc.terminate()

if __name__ == '__main__':
    run_my_process()
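The trick this version leans on is that sys.exit(n) inside the child shows up as Process.exitcode == n in the parent, which is what the wait loop checks. A minimal sketch of just that mechanism, using only the standard library:

from multiprocessing import Process
from sys import exit

def worker():
    # SystemExit raised by exit(n) in the child sets Process.exitcode to n
    exit(90)

if __name__ == '__main__':
    p = Process(target=worker)
    p.start()
    p.join()
    print(p.exitcode)  # prints 90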
Edit 2:
Here's one last version, which I think is much better than the other two:
from random import random
from multiprocessing import Pool
from time import sleep

def add_something(i):
    # Sleep to simulate the long calculation
    sleep(random() * 30)
    return i + 1

def run_my_process():
    # Create a process pool
    pool = Pool(100)

    # Callback function that checks results and kills the pool
    def check_result(result):
        print(result)
        if result == 90:
            pool.terminate()

    # Start up all of the processes
    for i in range(100):
        pool.apply_async(add_something, args=[i], callback=check_result)

    pool.close()
    pool.join()

if __name__ == '__main__':
    run_my_process()
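About the "cannot join current thread" error from your edit: the callback runs on the pool's result-handler thread, so calling pool.terminate() inside it makes that thread try to join itself on Python 2.7. One possible workaround (just a sketch, not a drop-in guarantee; the found event and how it's used are my own addition, not part of the version above) is to have the callback only set a threading.Event and do the terminate from the main thread:

from random import random
from multiprocessing import Pool
from threading import Event
from time import sleep

def add_something(i):
    # Sleep to simulate the long calculation
    sleep(random() * 30)
    return i + 1

def run_my_process():
    pool = Pool(100)
    found = Event()

    # The callback only records that 90 was seen; it never touches
    # the pool from the result-handler thread
    def check_result(result):
        print(result)
        if result == 90:
            found.set()

    for i in range(100):
        pool.apply_async(add_something, args=[i], callback=check_result)
    pool.close()

    # Terminate from the main thread once the flag has been set
    found.wait()
    pool.terminate()
    pool.join()

if __name__ == '__main__':
    run_my_process()

Since i + 1 reaches 90 for i = 89, found.wait() is guaranteed to return here; for a condition that might never be met you'd want found.wait(timeout) in a loop instead.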