NB. I have seen Log output of multiprocessing.Process - unfortunately, it doesn't answer this question.
I am creating a child process (on Windows) via multiprocessing. I want all of the child process's stdout and stderr output to be redirected to a log file, rather than appearing at the console. The only suggestion I have seen is for the child process to set sys.stdout to a file. However, this does not effectively redirect all stdout output, due to the behaviour of stdout redirection on Windows.
To illustrate the problem, build a Windows DLL with the following code:
#include <iostream>

extern "C"
{
    __declspec(dllexport) void writeToStdOut()
    {
        std::cout << "Writing to STDOUT from test DLL" << std::endl;
    }
}
Then create and run a Python script like the following, which imports this DLL and calls the function:
from ctypes import *
import sys
print
print "Writing to STDOUT from python, before redirect"
print
sys.stdout = open("stdout_redirect_log.txt", "w")
print "Writing to STDOUT from python, after redirect"
testdll = CDLL("Release/stdout_test.dll")
testdll.writeToStdOut()
In order to see the same behaviour as me, it is probably necessary for the DLL to be built against a different C runtime than the one Python uses. In my case, Python is built with Visual Studio 2010, but my DLL is built with VS 2005.
The behaviour I see is that the console shows:
> stdout_test.py
Writing to STDOUT from python, before redirect
Writing to STDOUT from test DLL
While the file stdout_redirect_log.txt ends up containing:
Writing to STDOUT from python, after redirect
In other words, setting sys.stdout failed to redirect the stdout output generated by the DLL. This is unsurprising given the nature of the underlying APIs for stdout redirection in Windows. I have encountered this problem at the native/C++ level before and never found a way to reliably redirect stdout from within a process. It has to be done externally.
This is actually the very reason I am launching a child process - it's so that I can connect externally to its pipes and thus guarantee that I am intercepting all of its output. I can definitely do this by launching the process manually with pywin32, but I would very much like to be able to use the facilities of multiprocessing, in particular the ability to communicate with the child process via a multiprocessing Pipe object, in order to get progress updates. The question is whether there is any way to both use multiprocessing for its IPC facilities and to reliably redirect all of the child's stdout and stderr output to a file.
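For reference, the manual-launch baseline I have in mind looks something like this with plain subprocess (the script and log file names are just placeholders; pywin32 would allow finer control over the handles, but the idea is the same):

## Minimal sketch of external redirection, with hypothetical names: the parent
## owns the log file handle and installs it as the child's OS-level stdout and
## stderr, so even output written by a different C runtime inside the child
## ends up in the file.
import subprocess
import sys

with open("child_output.log", "w") as log:
    subprocess.call([sys.executable, "child_script.py"],
                    stdout=log, stderr=subprocess.STDOUT)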
UPDATE: Looking at the source code for multiprocessing.Process, it has a static member, _Popen, which looks like it can be used to override the class used to create the process. If it's set to None (the default), it uses multiprocessing.forking._Popen, but it looks like by saying
multiprocessing.Process._Popen = MyPopenClass
I could override the process creation. However, although I could derive this from multiprocessing.forking._Popen, it looks like I would have to copy a bunch of internal stuff into my implementation, which sounds flaky and not very future-proof. If that's the only choice I think I'd probably plump for doing the whole thing manually with pywin32 instead.
The solution you suggest is a good one: create your processes manually such that you have explicit access to their stdout/stderr file handles. You can then create a socket to communicate with the sub-process and use multiprocessing.connection over that socket (multiprocessing.Pipe creates the same type of connection object, so this should give you all the same IPC functionality).
Here's a two-file example.
master.py:
import multiprocessing.connection
import subprocess
import socket
import errno
import sys, os

## Listen for a connection from the remote process (and find a free port number)
port = 10000
while True:
    try:
        l = multiprocessing.connection.Listener(('localhost', port), authkey="secret")
        break
    except socket.error as ex:
        if ex.errno != errno.EADDRINUSE:  ## any error other than "address in use" is fatal
            raise
        port += 1  ## port is not available; try the next one

## Launch the child with its stdout/stderr captured by the parent
proc = subprocess.Popen((sys.executable, "subproc.py", str(port)),
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)

## Accept the connection from the remote process
conn = l.accept()
conn.send([1, "asd", None])
print(proc.stdout.readline())
subproc.py:
import multiprocessing.connection
import subprocess
import sys, os, time

port = int(sys.argv[1])
conn = multiprocessing.connection.Client(('localhost', port), authkey="secret")

while True:
    try:
        obj = conn.recv()
        print("received: %s\n" % str(obj))
        sys.stdout.flush()
    except EOFError:  ## connection closed
        break
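If you want the child's output to go straight into a log file rather than reading it from PIPE yourself, you could hand Popen a real file handle instead. A possible variant of the Popen call in master.py (the log file name is arbitrary, and this is a sketch rather than part of the example above):

## Variant of the Popen call in master.py (an assumption, not the original
## example): give the child a real file handle instead of PIPE, so everything
## it writes to its OS-level stdout/stderr lands in the file, while the
## multiprocessing.connection link still carries the progress updates.
import subprocess
import sys

port = 10000  ## whichever port the Listener ended up bound to
log = open("subproc_output.log", "w")
proc = subprocess.Popen((sys.executable, "subproc.py", str(port)),
                        stdout=log, stderr=subprocess.STDOUT)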
You may also want to see the first answer to this question to get non-blocking reads from the subprocess.
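One common way to get non-blocking reads is to drain the pipe on a helper thread and poll a queue from the main loop; a minimal sketch, with an illustrative child script name and timeout (not taken verbatim from that answer):

## Minimal sketch of non-blocking reads: a helper thread drains proc.stdout
## line by line into a queue, and the main thread polls the queue without
## blocking. The child script name and timeout below are illustrative.
import subprocess
import sys
import threading
try:
    from queue import Queue, Empty   ## Python 3
except ImportError:
    from Queue import Queue, Empty   ## Python 2

def drain(pipe, q):
    for line in iter(pipe.readline, b''):
        q.put(line)
    pipe.close()

proc = subprocess.Popen([sys.executable, "subproc.py", "10000"],
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
q = Queue()
t = threading.Thread(target=drain, args=(proc.stdout, q))
t.daemon = True   ## don't let the reader thread keep the parent alive
t.start()

while proc.poll() is None:
    try:
        line = q.get(timeout=0.1)   ## wait briefly for output
    except Empty:
        pass                        ## no output yet; do other work here
    else:
        sys.stdout.write(line.decode("utf-8", "replace"))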