Re: python logging multiple processes to one file (via socket server)

On Oct 27, 6:09 am, Gelonida N <gelon...@xxxxxxxxx> wrote:

I have a rather 'simple' problem:
logging from multiple processes to the same file AND being sure that no
log message is lost.

1.) Log multiple processes to one file:

I have a Python program which I want to log, but which forks several times.

Due to the forking, logging to files with the default logging.FileHandler
seems out of the question.

It seems that I could use a SocketHandler, which collects data from all
the different processes and then logs to one file.

Does anybody have a working example?
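A minimal end-to-end sketch of what I mean (the names and the throwaway in-process receiver are my own, just to illustrate SocketHandler's wire format; in real use the receiver would run in its own process):

```python
import logging
import logging.handlers
import pickle
import socket
import struct
import threading

received = []  # what the receiver side decoded

def receive(server):
    """Accept one sender and decode SocketHandler's wire format:
    a 4-byte big-endian length prefix, then a pickled LogRecord dict."""
    conn, _ = server.accept()
    with conn:
        while True:
            header = conn.recv(4)
            if len(header) < 4:
                break  # sender closed the connection
            (length,) = struct.unpack(">L", header)
            data = b""
            while len(data) < length:
                data += conn.recv(length - len(data))
            record = logging.makeLogRecord(pickle.loads(data))
            received.append(record.getMessage())

# Throwaway in-process receiver thread, only for this demo.
server = socket.socket()
server.bind(("localhost", 0))  # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=receive, args=(server,))
t.start()

# Sender side: this is all a worker process needs.
logger = logging.getLogger("worker")
logger.setLevel(logging.DEBUG)
handler = logging.handlers.SocketHandler("localhost", port)
logger.addHandler(handler)
logger.info("hello from the worker")
handler.close()

t.join()
print(received)  # -> ['hello from the worker']
```

Note that SocketHandler only connects on the first emit, so if each forked child creates its own handler after the fork, every process gets its own TCP connection and the receiver sees each record whole rather than interleaved.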

2.) Ensure that no log message is lost.
If I understood the SocketHandler documentation correctly,
it will drop messages if the log server is not available.

However, for my current use case I would prefer that it abort if it
cannot connect to the socket, and that it block if the log server
doesn't handle the sent data fast enough.

Is this possible?
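The abort-on-failure half seems doable with a small subclass, since the stock SocketHandler silently drops a record when it cannot (re)connect. A sketch (the class name and behaviour are my own, not part of the stdlib):

```python
import logging
import logging.handlers

class StrictSocketHandler(logging.handlers.SocketHandler):
    """Sketch: raise instead of silently dropping log records.

    The stock SocketHandler's send() leaves self.sock as None when it
    cannot (re)connect and simply discards the record; we detect that
    and abort. Exceptions also propagate instead of going through
    handleError() as in the stock emit().
    """
    def emit(self, record):
        self.send(self.makePickle(record))
        if self.sock is None:
            # send() could not (re)connect and dropped the record
            raise ConnectionError(
                "log server unreachable; refusing to drop %r"
                % record.getMessage())
```

The blocking half you may get for free: SocketHandler sends with sendall() over TCP, so once the kernel buffers fill up because the log server falls behind, the sender blocks rather than losing data (as long as the connection stays up).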

Thanks a lot in advance.

What I found so far:

The Python logging documentation states:
"The following section documents this approach in more detail and
includes a working socket receiver which can be used as a starting point
for you to adapt in your own applications."

Somehow I have a mental block, though, and fail to see the 'following section'.

I also found a receiver example and ran some
first tests.

However, it seems that this server stops logging my application after
about 128 log entries (the number varies and is not necessarily exactly
128), whereas the console logger continues logging.

I'm not really sure why and would prefer a simpler example first.
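One guess about the "stops after ~128 entries" symptom: a single-threaded receiver serves one connection at a time, which can look like logging stops once buffers fill. A condensed receiver along the lines of the documented approach, using a threading server so each sender gets its own handler thread (the names and the "combined.log" filename are my own):

```python
import logging
import logging.handlers
import pickle
import socketserver
import struct

class LogRecordStreamHandler(socketserver.StreamRequestHandler):
    """Receive length-prefixed, pickled LogRecord dicts from one sender."""
    def handle(self):
        while True:
            header = self.connection.recv(4)
            if len(header) < 4:
                break  # sender closed the connection
            (length,) = struct.unpack(">L", header)
            data = self.connection.recv(length)
            while len(data) < length:
                data += self.connection.recv(length - len(data))
            record = logging.makeLogRecord(pickle.loads(data))
            # Dispatch into the local logging tree -> one shared log file.
            logging.getLogger(record.name).handle(record)

class LogServer(socketserver.ThreadingTCPServer):
    # One handler thread per connected sender, so a slow or chatty
    # process cannot starve the others (a plain TCPServer would serve
    # one connection at a time).
    allow_reuse_address = True

def run_receiver(host="localhost",
                 port=logging.handlers.DEFAULT_TCP_LOGGING_PORT):
    """Run the receiver; in real use this lives in its own process.
    'combined.log' is an assumed filename, adjust as needed."""
    logging.basicConfig(
        filename="combined.log",
        format="%(asctime)s %(process)d %(name)s %(levelname)s %(message)s")
    LogServer((host, port), LogRecordStreamHandler).serve_forever()
```

Since every record arrives over its own TCP stream and is handled by the receiver's logging machinery, all processes end up serialized into one file, and well past 128 entries.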

Thanks in advance for any code example, idea, link, comment.

You might want to check out the SIMPL toolkit (http://...). A SIMPL receiver
will "naturally" queue and serialize messages from multiple senders.