Running multiple exec commands at once (but waiting for the last one to finish)

Problem description:

I've looked around for this and I can't seem to find anyone who is trying to do exactly what I am.

I have information that is passed in to my function via a _POST request. Based on that data, I run an exec command to run a TCL script a certain number of times (with different parameters, based on the post variable). Right now, I have the exec in a foreach so this takes forever to run (the TCL script takes 15 or so seconds to come back, so if I need to run it 100 times, I have a bit of an issue). Here is my code:

    public function executeAction(){
        //code to parse the _POST variable into an array called devices

        foreach($devices as $devID => $device){
            exec("../path/to/script.tcl -parameter1 ".$device['param1']." -parameter2 ".$device['param2'], $execout[$devID]);
        }
        print_r($execout);
    }

Obviously this code is just an excerpt with big chunks removed, but hopefully it's enough to demonstrate what I'm trying to do.

I need to run all of the execs at once and I need to wait for them all to complete before returning. I also need the output of all of the scripts stored in the array called $execout.

Any ideas?

Thanks!!!


If you put your exec() call in a separate script, you can call that external script multiple times in parallel using curl_multi_exec(). That way, you'd make all the calls in separate requests, so they could execute simultaneously. Poll &$still_running to see when all requests have finished, after which you can collect the results from each.

Update: Here's a blog post detailing exactly what I'm describing.


Example

Based on the blog post linked above, I put together the following example.

Script being run in parallel:

// waitAndDate.php

<?php
sleep((int)$_GET['time']);
printf('%d secs; %s', $_GET['time'], shell_exec('date'));

Script making calls in parallel:

// multiExec.php

<?php
$start = microtime(true);

$mh = curl_multi_init();
$handles = array();

// create several requests
for ($i = 0; $i < 5; $i++) {
    $ch = curl_init();

    $rand = rand(5,25); // just making up data to pass to script
    curl_setopt($ch, CURLOPT_URL, "http://domain/waitAndDate.php?time=$rand");
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);

    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// execute requests and poll periodically until all have completed
$isRunning = null;
do {
    curl_multi_exec($mh, $isRunning);
    usleep(250000);
} while ($isRunning > 0);

// fetch output of each request
$outputs = array();
for ($i = 0; $i < count($handles); $i++) {
    $outputs[$i] = trim(curl_multi_getcontent($handles[$i]));
    curl_multi_remove_handle($mh, $handles[$i]);
}

curl_multi_close($mh);

print_r($outputs);
printf("Elapsed time: %.2f seconds\n", microtime(true) - $start);

Here is some output I received when running it a few times:

Array
(
    [0] => 8 secs; Mon Apr  2 19:01:33 UTC 2012
    [1] => 8 secs; Mon Apr  2 19:01:33 UTC 2012
    [2] => 18 secs; Mon Apr  2 19:01:43 UTC 2012
    [3] => 11 secs; Mon Apr  2 19:01:36 UTC 2012
    [4] => 8 secs; Mon Apr  2 19:01:33 UTC 2012
)
Elapsed time: 18.36 seconds

Array
(
    [0] => 22 secs; Mon Apr  2 19:02:33 UTC 2012
    [1] => 9 secs; Mon Apr  2 19:02:20 UTC 2012
    [2] => 8 secs; Mon Apr  2 19:02:19 UTC 2012
    [3] => 11 secs; Mon Apr  2 19:02:22 UTC 2012
    [4] => 7 secs; Mon Apr  2 19:02:18 UTC 2012
)
Elapsed time: 22.37 seconds

Array
(
    [0] => 5 secs; Mon Apr  2 19:02:40 UTC 2012
    [1] => 18 secs; Mon Apr  2 19:02:53 UTC 2012
    [2] => 7 secs; Mon Apr  2 19:02:42 UTC 2012
    [3] => 9 secs; Mon Apr  2 19:02:44 UTC 2012
    [4] => 9 secs; Mon Apr  2 19:02:44 UTC 2012
)
Elapsed time: 18.35 seconds

Hope that helps!

One side note: make sure your web server can process this many parallel requests. If it serves them sequentially or can only serve very few simultaneously, this approach gains you little or nothing. :-)

Quoting the PHP documentation:

Note:

If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.

So, you can exec in the background if you redirect the output to a file and append `&`:

exec("../path/to/script.tcl -parameter1 ".$device['param1']." -parameter2 ".$device['param2']." > outputfile.txt 2>&1 &", $execout[$devID]);

But if you want to wait that ALL execs are finished before continuing, you have to make a call back from the external script. Maybe like this :

exec("../path/to/script.tcl -parameter1 ".$device['param1']." -parameter2 ".$device['param2']." > ../path/to/outputfile.txt; ".PHP_BINDIR.DIRECTORY_SEPARATOR."php ../path/to/callback.php", $execout[$devID]);

Like this, your callback.php script will be called after every exec of script.tcl.
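Instead of a callback script, the parent request can also poll for per-job "done" marker files, which is a minimal sketch of the same wait-for-all idea. Here `sleep 1; echo ...` stands in for the real script.tcl invocation, and the /tmp paths and marker naming are illustrative assumptions, not part of the original answer:

```php
<?php
// Each backgrounded command writes its output, then drops a "done" marker.
// The parent polls until every marker exists, then collects the outputs.
$devices = array('dev1' => 'ok-1', 'dev2' => 'ok-2'); // stand-in data

$markers = array();
foreach ($devices as $devID => $param) {
    $out    = "/tmp/tcl_out_{$devID}.txt";  // hypothetical output path
    $marker = "/tmp/tcl_done_{$devID}";     // hypothetical marker path
    @unlink($marker);
    // Parentheses group the command so the marker is only touched after it
    // finishes; the redirection plus trailing "&" lets exec() return at once.
    exec("(sleep 1; echo " . escapeshellarg($param) . " > $out; touch $marker) > /dev/null 2>&1 &");
    $markers[$devID] = $marker;
}

// Wait until every background job has dropped its marker.
do {
    usleep(250000); // poll every quarter second
    $pending = array_filter($markers, function ($m) { return !file_exists($m); });
} while (count($pending) > 0);

$execout = array();
foreach ($markers as $devID => $marker) {
    $execout[$devID] = trim(file_get_contents("/tmp/tcl_out_{$devID}.txt"));
    unlink($marker);
}
print_r($execout);
```

All jobs run concurrently, so the wall time is roughly the slowest job plus one polling interval rather than the sum.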

Maybe you can do something with these tricks.

PHP's exec function will always wait for a response from your execution. However, you can send the stdout and stderr of the process to /dev/null (on Unix) and have all the scripts start almost instantly. This can be done by adding..

 '> /dev/null 2>&1 &'

To the end of your execution string.

But! That means they'll fork off and finish processing independently, so it may be worth having them write a response back somewhere, with a listener that picks the responses up and processes them.
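A way to get the "write a response back" part without temp files is proc_open: start every process first, then read each one's stdout, so the total wall time is roughly the slowest command rather than the sum. This is a sketch under the assumption that plain shell commands are fine; `sleep 1; echo ...` stands in for script.tcl:

```php
<?php
// Start all processes before reading any output, so they run in parallel.
$commands = array(
    'a' => 'sleep 1; echo done-a',  // stand-in for script.tcl invocations
    'b' => 'sleep 1; echo done-b',
);

$procs = array();
$pipes = array();
foreach ($commands as $id => $cmd) {
    $spec = array(1 => array('pipe', 'w'), 2 => array('pipe', 'w'));
    $procs[$id] = proc_open($cmd, $spec, $p);
    $pipes[$id] = $p;
}

// Reading a pipe blocks until that process closes it, so this loop
// effectively waits for all processes while collecting their output.
$execout = array();
foreach ($procs as $id => $proc) {
    $execout[$id] = trim(stream_get_contents($pipes[$id][1]));
    fclose($pipes[$id][1]);
    fclose($pipes[$id][2]);
    proc_close($proc);
}
print_r($execout);
```

Unlike the /dev/null trick, this keeps the output in $execout directly, which is what the question asked for.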

You need to modify your script a bit:

  1. Save the POST data to the session
  2. Run the exec and save its result to the session
  3. Redirect by using JavaScript
  4. After the exec returns, echo a redirect to the same URL but with an incremented index, such as ?index=99
  5. When the index reaches the end, show the whole result
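The per-request step in that loop can be factored into a small function; the following is only an illustrative sketch, where runOne() and the URL format are assumptions, not part of the original answer. The controller would save $_POST devices to $_SESSION on index 0, call step(), and echo a JavaScript redirect to the returned URL until it is null:

```php
<?php
// Stand-in for the real script.tcl invocation: echo the device parameter.
function runOne(array $device): array {
    $out = array();
    exec('echo ' . escapeshellarg($device['param1']), $out);
    return $out;
}

// Runs one device's exec, stores its result, and returns the next redirect
// URL, or null once every device has been processed (step 5: show $execout).
function step(array $devices, array &$execout, int $index): ?string {
    if ($index >= count($devices)) {
        return null;
    }
    $execout[$index] = runOne($devices[$index]); // step 2: save the result
    return '?index=' . ($index + 1);             // steps 3-4: redirect target
}
```

Note this still runs the scripts one per request, so it avoids the request timeout rather than making the execs parallel.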

Look at ExecFuture and FutureIterator in the libphutil library:

https://secure.phabricator.com/book/libphutil/class/ExecFuture/

It does exactly what you need with a pretty nice syntax:

$futures = array();
foreach ($files as $file) {
  $futures[$file] = new ExecFuture("gzip %s", $file);
}
foreach (new FutureIterator($futures) as $file => $future) {
  list($err, $stdout, $stderr) = $future->resolve();
  if (!$err) {
    echo "Compressed {$file}...\n";
  } else {
    echo "Failed to compress {$file}!\n";
  }
}