Perl: execute 10 system processes async

This question is about running Perl on Windows Server 2012.

So I have a folder called Commands_To_Run, and under it are 100 batch files, e.g.

 - run_1.bat
 - run_2.bat
 - run_3.bat
 - ...
 - run_100.bat

Each of these run_*.bat files takes about 30 minutes to complete. If I run them serially in a FOR loop, the whole set takes 100 × 30 minutes. (Too long!)

What I want is to write a Perl script that executes 10 batch files at a time. As soon as any one of them completes, the next batch file should start.

For example, I would like to start run_1.bat through run_10.bat. Say run_7.bat finishes first; then run_11.bat should start, and so on, so that 10 files are running at any given time.

I thought about using the Perl script below to run the batch files, but it starts all 100 at the same time, which would swamp my Windows machine's CPU.

foreach my $file (@files) {
    chomp $file;
    my $cmd = "start $file";
    print "Running command is: $cmd\n";
    system($cmd);
}

I looked at the suggestion given, but there is no working example of how to use Forks::Super.
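For the record, a minimal Forks::Super sketch is short; the following is an untested sketch against that module's documented interface (MAX_PROC, ON_BUSY => 'queue' and waitall are real Forks::Super features; the file layout is taken from the question):

```perl
use strict;
use warnings;
use Forks::Super MAX_PROC => 10, ON_BUSY => 'queue';  # queue jobs beyond the limit

my @batch_files = map { "Commands_To_Run/run_$_.bat" } 1 .. 100;

for my $batch_file (@batch_files) {
    fork { cmd => $batch_file };   # Forks::Super's fork(), not the builtin
}
waitall();                         # block until every queued job has finished
```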

A simple way to run processes in parallel, with a queue, is Parallel::ForkManager:

use warnings;
use strict;
use feature 'say';

use Parallel::ForkManager;

my $pm = Parallel::ForkManager->new(10);   # at most 10 jobs at a time

# Prepare the list of your batch files (better to get names from disk)
my @batch_files = map { "Commands_To_Run/run_$_.bat" } 1..100;

foreach my $batch_file (@batch_files) {
    $pm->start and next;          # forks; the parent moves on to the next file
    # Run batch job in the child
    say "Running: $batch_file";
    #system($batch_file);         # uncomment to actually run the jobs
    $pm->finish;                  # the child exits
}
$pm->wait_all_children;

This is a minimal but working script. See, for example, this post and this post for more on how the jobs are run, and in particular on how to return data from jobs.

Note: this is not a core module, so you'd likely need to install it.
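On the "return data from jobs" point: Parallel::ForkManager's run_on_finish callback runs in the parent each time a child is reaped, which is enough to collect exit statuses. A sketch, using the same hypothetical file layout as above:

```perl
use warnings;
use strict;
use Parallel::ForkManager;

my $pm = Parallel::ForkManager->new(10);

my %exit_of;   # filled in the parent as children are reaped
$pm->run_on_finish(sub {
    my ($pid, $exit_code, $ident) = @_;
    $exit_of{$ident} = $exit_code;
});

my @batch_files = map { "Commands_To_Run/run_$_.bat" } 1 .. 100;

for my $batch_file (@batch_files) {
    $pm->start($batch_file) and next;   # the file name doubles as the job's identifier
    my $rc = system($batch_file);
    $pm->finish($rc >> 8);              # exit status travels back via run_on_finish
}
$pm->wait_all_children;

print "$_ => $exit_of{$_}\n" for sort keys %exit_of;
```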


The fmap_scalar function from Future::Utils can handle all of the logic of keeping a certain number of processes running, and IO::Async::Process can run and manage each process asynchronously (given it's Windows, I'm not sure all of this will work sensibly):

use strict;
use warnings;
use IO::Async::Loop;
use Future::Utils 'fmap_scalar';

my @commands = ...;

my $loop = IO::Async::Loop->new;

my $f = fmap_scalar {
  my $cmd = shift;
  my $f = $loop->new_future;
  $loop->open_process(command => $cmd, on_finish => sub { $f->done($_[1]) });
  return $f;
} foreach => \@commands, concurrent => 10;

my @exit_codes = $f->get; # starts the loop until all processes are done
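The on_finish callback receives the raw wait status (the $?-style value), so shift it right by 8 for the conventional exit code. A self-contained variant of the above, using perl one-liners as stand-in commands so it can run without the batch files present:

```perl
use strict;
use warnings;
use IO::Async::Loop;
use Future::Utils 'fmap_scalar';

# Stand-in commands (perl itself); substitute the run_*.bat paths in the real case.
my @commands = map { [ $^X, '-e', "exit $_" ] } 0 .. 3;

my $loop = IO::Async::Loop->new;

my $f = fmap_scalar {
    my $cmd = shift;
    my $f = $loop->new_future;
    $loop->open_process(command => $cmd, on_finish => sub { $f->done($_[1]) });
    return $f;
} foreach => \@commands, concurrent => 2;

my @statuses = $f->get;                 # raw wait statuses, in input order
printf "job %d: exit %d\n", $_, $statuses[$_] >> 8 for 0 .. $#statuses;
```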


Parallel::ForkManager relies on fork, a Unix feature that Perl only emulates (using threads) on Windows systems. I would recommend using threads directly instead; less can go wrong that way.

use strict;
use warnings;

use threads;
use Thread::Queue 3.01 qw( );

my @commands = map { "Commands_To_Run/run_$_.bat" } 1..100;

my $q = Thread::Queue->new();

# Ten workers, each pulling jobs off the queue until it is drained.
for (1..10) {
   async {
      while (defined( my $job = $q->dequeue() )) {
         system($job);
      }
   };
}

$q->enqueue($_) for @commands;
$q->end();                    # dequeue() returns undef once the queue is empty
$_->join() for threads->list();
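If you also want results back from the workers, a second Thread::Queue works in the other direction. A self-contained sketch, with perl one-liners standing in for the batch files so it runs anywhere with a threaded perl:

```perl
use strict;
use warnings;
use threads;
use Thread::Queue 3.01;

# Stand-in jobs; substitute the run_*.bat paths in the real case.
my @commands = map { qq{$^X -e "exit $_"} } 0 .. 4;

my $jobs    = Thread::Queue->new();
my $results = Thread::Queue->new();   # workers report back through this

for (1 .. 3) {
    async {
        while (defined( my $job = $jobs->dequeue() )) {
            my $rc = system($job);
            $results->enqueue($job . " => " . ($rc >> 8));
        }
    };
}

$jobs->enqueue($_) for @commands;
$jobs->end();                         # dequeue() returns undef once drained
$_->join() for threads->list();

$results->end();
while (defined( my $line = $results->dequeue_nb() )) {
    print "$line\n";
}
```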


Just putting it out there, but it should also be possible to do this with a batch file. The script below loops through all .bat files, checks the count of running processes, and only kicks off a new one while fewer than 10 are already running:

@echo off
setlocal enabledelayedexpansion
set cnt=1
for %%i in (*.bat) do (
    set id=%%i
    call :check
)
goto :eof

:check
for /f "tokens=1,*" %%a in ('tasklist /FI "WINDOWTITLE eq _process*" ^| find /I /C "cmd.exe"') do set procs=%%a
if !procs! leq 9 (
    if not "!id!"=="%0" start "_process!cnt!" !id!
    set /a cnt+=1
) else (
    goto :check
)
goto :eof


  • Possible duplicate of perl process queue
  • You need GNU Parallel.
  • @pilcrow I looked at the suggestion given but there is no working example of how to use Forks::Super. How would it work in my case? I don't get it
  • As an unrelated note, your for loop example can be written more perlishly as: foreach my $file (@files) {...} (note: foreach and for are in practice synonyms)
  • good working script - for completeness add system($batch_file); after the "Running" line, as that will do the actual execution of my run_*.bat files
  • @SamB Thank you for edits, I've tweaked them a little for what I think you meant. I commented out the line to actually run the jobs, as I always do; I like to leave it to people to check things before they enable their actual bulk runs! Another fluid part is where the folder is in relation to the script, etc, but you can adjust that.
  • One thing needs to be mentioned here: it waits for all children, i.e. it will process the next 10 only after every file in the first batch of 10 is processed. Sorry, I accepted the answer too soon.
  • @SamB Um, no it doesn't -- wait_all_children only tells it to "reap" all child processes after it's all done (as all code that forks must; on Windows it's probably going to join threads). While it's working it starts a new process as soon as one exits; it always keeps it at 10. (Can't see that with just printing because it's so fast!)
  • let me test it out more thoroughly. thank you for the great answer though.
  • I have strawberry installed and this is what I get -- Can't locate IO/Async/ in @INC (you may need to install the IO::Async::Loop module)
  • IO::Async and Future::Utils are both non-core modules, so need to be installed.