[Israel.pm] print to 2 places at the same time
migo at homemail.com
Wed Jun 2 05:26:35 PDT 2004
On 02 Jun 2004 10:43:11 +0200, Offer Kaye wrote:
> > Well after the open line you have 2 concurrent processes writing
> concurrent? How? First I print to both STDOUT and OUT, later I print to
> STDOUT alone, finally I print to both STDOUT and OUT. Nothing concurrent
I hope you understand that if you have 2 processes (which is what you
have here) and both write to STDOUT (the tee process writes to its
standard output, which happens to be the same as perl's STDOUT), and you
only have one-way synchronisation, then you are likely to get randomly
interleaved output from the two processes.
The solutions are: 1) add 2-way synchronisation, or 2) keep the one-way
synchronisation but make only one process actually print (this is what
my solution does).
> > One trick is to add this line after your open line:
> > open STDOUT, ">&", \*OUT;
> > select OUT; $| = 1; select STDOUT; $| = 1;
> > and this line before your close line:
> > close(STDOUT);
> > There are other solutions too, but maybe this one is good enough.
> Okay, now I have the code:
> open(OUT,"| tee foo.txt") or die "tee foo no good: $!\n";
> open STDOUT, ">&", \*OUT;
> select OUT; $| = 1; select STDOUT; $| = 1;
> print OUT "foobar\n";
> print "AAAAAAAAARRRRRRRRRGGGGGGGGG\n";
> print OUT "barfoo\n";
> close(OUT) or die "wtf?\n";
> close(STDOUT) or die "wtf2?\n";
> But while it does the job, it hangs at the end, so that I have to do "^c" to
> stop it. This is not good.
Didn't I say you should close(STDOUT) _before_ close(OUT)? :)
In short, you can't close the original stream (OUT) before first closing
all of its copies (here, STDOUT); see "man dup".
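For reference, here is your script with the two close calls swapped into
the correct order (copy first, original pipe last). This is only a sketch
of the fix described above; it assumes a Unix-like system with tee in the
PATH, and it writes "foo.txt" in the current directory:

```perl
#!/usr/bin/perl
use strict;
use warnings;

open(OUT, "| tee foo.txt") or die "tee foo no good: $!\n";
open(STDOUT, ">&", \*OUT)  or die "dup failed: $!\n";  # STDOUT is now a copy of OUT
select OUT;    $| = 1;   # autoflush both handles so prints
select STDOUT; $| = 1;   # reach tee in program order

print OUT "foobar\n";
print "AAAAAAAAARRRRRRRRRGGGGGGGGG\n";  # goes through the dup'ed STDOUT into tee
print OUT "barfoo\n";

close(STDOUT) or die "wtf2?\n";  # close the copy first...
close(OUT)    or die "wtf?\n";   # ...then the original pipe; no hang, tee exits
```

Because close(OUT) is now the last close on the pipe, perl waits for tee
to finish and the script exits cleanly instead of hanging.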
> BTW, I would appreciate an explanation of what the second open does...
Everything that you write to STDOUT (print without a filehandle argument)
is actually redirected to OUT; there are examples in "perldoc -f open".
This was meant as a quick hack to fix your existing code.
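To see the ">&" (dup) mode of open in isolation, here is a minimal
sketch that dups a plain file handle instead of a pipe; the file name
"dup_demo.txt" is just an example, not from the original code:

```perl
use strict;
use warnings;
use IO::Handle;  # for autoflush() on lexical filehandles

open(my $log,  '>',  'dup_demo.txt') or die "open: $!";
open(my $copy, '>&', $log)           or die "dup: $!";  # $copy shares $log's descriptor

$log->autoflush(1);   # flush immediately so writes land
$copy->autoflush(1);  # in the file in program order

print {$log}  "via original\n";
print {$copy} "via copy\n";   # ends up in the same file, at the shared offset

# Same ordering rule as STDOUT/OUT above: close the copy first.
close($copy) or die "close copy: $!";
close($log)  or die "close log: $!";
```

Both handles write to the same underlying file descriptor, which is why
a print to the dup'ed STDOUT in the original script ends up in the tee
pipe.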
perl -e 'print+chr(64+hex)for+split//,d9b815c07f9b8d1e'