How to send huge amounts of data from a child process to the parent process in a non-blocking way in Node.js?


I'm trying to send a huge JSON string from a child process to the parent process. My initial approach was the following:

child: process.stdout.write(myHugeJsonString);

parent: child.stdout.on('data', function(data) { ...
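Spelled out, that first attempt looks roughly like the sketch below (a minimal sketch, assuming the child is spawned with the default stdio 'pipe' setting; the file names parent.js/child.js are placeholders I'm making up). The parent collects the chunks and parses the JSON once the stream ends:

  // parent.js
  var spawn = require('child_process').spawn;

  var child = spawn('node', ['child.js']);        // stdio defaults to 'pipe'

  var chunks = [];
  child.stdout.on('data', function (data) {
    chunks.push(data);                            // collect Buffer chunks as they arrive
  });
  child.stdout.on('end', function () {
    var myHugeJsonString = Buffer.concat(chunks).toString('utf8');
    var obj = JSON.parse(myHugeJsonString);       // parse only once the stream is complete
    // use obj ...
  });

  // child.js
  var myHugeJsonString = JSON.stringify({ /* ... lots of data ... */ });
  process.stdout.write(myHugeJsonString);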

But then I read that process.stdout is blocking:

process.stderr and process.stdout are unlike other streams in Node in that writes to them are usually blocking.

  • They are blocking in the case that they refer to regular files or TTY file descriptors.
  • In the case they refer to pipes:
    • They are blocking in Linux/Unix.
    • They are non-blocking like other streams in Windows.

The documentation for child_process.spawn says I can create a pipe between the child process and the parent process using the pipe option (which is what the sketch above relies on). But isn't piping via stdout blocking in Linux/Unix (according to the docs cited above)?

Ok, what about the Stream object option then? Hmmm, it seems I can share a readable or writable stream that refers to a socket with the child process. Would that be non-blocking? How would I implement that?
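For what it's worth, here is a minimal sketch of one way to share an extra stream with the child via the stdio option, by opening an additional pipe on file descriptor 3 (the fd number and the file names are my own assumptions, not something from the docs quoted above). The payload then travels over that extra pipe instead of stdout, which stays free for logging:

  // parent.js
  var spawn = require('child_process').spawn;

  var child = spawn('node', ['child.js'], {
    stdio: ['inherit', 'inherit', 'inherit', 'pipe']  // fd 3 is an extra pipe to the child
  });

  var chunks = [];
  child.stdio[3].on('data', function (data) {
    chunks.push(data);
  });
  child.stdio[3].on('end', function () {
    var obj = JSON.parse(Buffer.concat(chunks).toString('utf8'));
    // use obj ...
  });

  // child.js
  var fs = require('fs');
  var out = fs.createWriteStream(null, { fd: 3 });    // fd 3 was set up by the parent
  out.end(JSON.stringify({ some: 'huge object' }));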

So the question stands: how do you send huge amounts of data from a child process to the parent process in a non-blocking way in Node.js? A cross-platform solution would be really neat; examples with explanation are very much appreciated.

One neat trick I used on *nix for this is FIFO pipes (http://linux.about.com/library/cmd/blcmdl4_fifo.htm). They allow the child to write to a file-like thing and the parent to read from the same. The file is not really on the fs, so you don't get any IO problems; all access is handled by the kernel itself. But... if you want it to be cross-platform, that won't work. There's no such thing on Windows (as far as I know).
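A rough sketch of that FIFO idea, assuming a *nix system (the /tmp path and the mkfifo call via execSync are my own choices; Node has no built-in API for creating FIFOs):

  // parent.js
  var childProcess = require('child_process');
  var fs = require('fs');

  var fifoPath = '/tmp/myapp.fifo';                        // hypothetical path
  childProcess.execSync('mkfifo ' + fifoPath);             // create the named pipe

  var child = childProcess.spawn('node', ['child.js', fifoPath], { stdio: 'inherit' });

  var chunks = [];
  var reader = fs.createReadStream(fifoPath);              // data flows through the kernel, not the disk
  reader.on('data', function (data) { chunks.push(data); });
  reader.on('end', function () {
    var obj = JSON.parse(Buffer.concat(chunks).toString('utf8'));
    fs.unlinkSync(fifoPath);                               // remove the FIFO node when done
    // use obj ...
  });

  // child.js
  var fs = require('fs');
  var out = fs.createWriteStream(process.argv[2]);         // the FIFO path passed by the parent
  out.end(JSON.stringify({ some: 'huge object' }));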

Just note that the pipe has a defined size, and if what you write to it (from the child) is not read by something else (from the parent), the child will block once the pipe is full. This does not block the node processes, though; they see the pipe as a normal file stream.
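To illustrate that last point, here is a child-side sketch (my own, not from the answer above) that writes a huge string in pieces and waits for the 'drain' event whenever the stream's buffer is full, so the Node process itself never blocks; the chunk size is arbitrary:

  // child-side sketch: write a huge string in pieces, respecting backpressure
  function writeInChunks(stream, str, chunkSize) {
    chunkSize = chunkSize || 64 * 1024;
    var offset = 0;
    (function writeNext() {
      while (offset < str.length) {
        var chunk = str.slice(offset, offset + chunkSize);
        offset += chunkSize;
        if (!stream.write(chunk)) {            // internal buffer (and the pipe) is full
          stream.once('drain', writeNext);     // resume once the parent has read some data
          return;
        }
      }
      stream.end();                            // everything has been handed to the stream
    })();
  }

  var myHugeJsonString = JSON.stringify({ /* ... lots of data ... */ });
  writeInChunks(process.stdout, myHugeJsonString);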

