Cron job script fails, but works through the browser (ZipArchive, PHP)

Problem description:

I have a PHP script that generates large zip files. When I execute it via SSH it works, but the same script fails intermittently when executed via cron.

Things I tried:

  • Changing paths inside the script, though since the script works when run via SSH, paths shouldn't be the issue
  • Changing the path to the PHP executable in the cron command (I ran "whereis php" and tried both available PHP executable locations)
  • Modifying file permissions
  • Raising the time limit with ini_set('max_execution_time', 990000); and set_time_limit(990000);
  • Opening half a dozen support tickets with the hosting company, but they couldn't help

The last thing I'm trying right now, which I doubt will help, is manually setting a time limit after which the cron job is killed by prefixing the command with /bin/timeout -s 2 990000. I think it's useless, since normally there is no time limit, unless I'm missing something.

The log file shows that the script fails after I instantiate a ZipArchive object and then call its addFile method.
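Not the asker's actual script, but a minimal sketch of how checking every ZipArchive return code can make the cron log show *why* addFile fails instead of the script dying silently. The file paths here are illustrative temp paths, not the asker's real ones:

```php
<?php
// Hedged sketch: stand-in source file and archive path in the temp directory.
$src = tempnam(sys_get_temp_dir(), 'src_');
file_put_contents($src, "sample data\n");
$zipPath = sys_get_temp_dir() . '/cron_archive.zip';

$zip = new ZipArchive();
$code = $zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
if ($code !== true) {
    // open() returns an int error code (e.g. ZipArchive::ER_OPEN), not false
    fwrite(STDERR, "ZipArchive::open failed with code $code\n");
    exit(1);
}

// Use absolute paths: under cron the working directory is usually not the
// script's directory, so relative paths are a common cause of failure.
if (!$zip->addFile($src, 'report.txt')) {
    fwrite(STDERR, "addFile failed for $src (cwd: " . getcwd() . ")\n");
    exit(1);
}

if (!$zip->close()) {
    // close() is where the archive is actually written, and where disk-quota
    // or temp-space problems typically surface
    fwrite(STDERR, "close failed: " . $zip->getStatusString() . "\n");
    exit(1);
}
echo "archive written to $zipPath\n";
```

Redirecting stderr in the cron line as well (rather than only stdout) would then capture these messages in the log.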

This is my current cron command:

30 4 * * * /bin/timeout -s 2 990000 /usr/bin/php /home/script.php > /tmp/script.log

Appreciate your help.


Unfortunately the only way I found to make this work is an ugly workaround, which works well: a simple cURL script that requests, over HTTP, the script crontab cannot execute directly. I then put the cURL script in the crontab instead.

So crontab executes the cURL script, which in turn requests the script that cannot be run via cron. It's ugly, but it works.

# Open a PHP/cURL session
$s = curl_init();

# Configure the cURL request
curl_setopt($s, CURLOPT_USERAGENT, 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)');
curl_setopt($s, CURLOPT_URL, "HTTP://YOUR_DOMAIN.COM/SCRIPT.PHP"); // Target URL
curl_setopt($s, CURLOPT_RETURNTRANSFER, TRUE);  // Return the response as a string
curl_setopt($s, CURLOPT_BINARYTRANSFER, false); // Plain (non-binary) transfer
curl_setopt($s, CURLOPT_REFERER, "https://google.ca");     // Referer header
curl_setopt($s, CURLOPT_SSL_VERIFYPEER, FALSE); // Skip certificate verification
curl_setopt($s, CURLOPT_FOLLOWLOCATION, TRUE);  // Follow redirects
curl_setopt($s, CURLOPT_MAXREDIRS, 4);          // Limit redirections to four

# Execute the request (fetch the target page into a string)
if ($run_the_script = curl_exec($s))
    echo "cron executed!";

curl_close($s);

The cron script has the following in its header; it's loose protection against normal users executing the script:

if (isset($_SERVER['REMOTE_ADDR']) AND $_SERVER['SERVER_ADDR'] != $_SERVER['REMOTE_ADDR']) die('Permission denied.');

^ This checks that the request comes from the same IP address as the server itself, i.e. from the cURL script running on the same machine.

Usually when a PHP script works through the web server but not through the command-line interface, it's because of a difference in configuration: the CLI and CGI SAPIs typically load different php.ini files. You could back up your current CLI php.ini and copy your current CGI php.ini over it. If the script then executes correctly, you know the configuration was the cause.
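One way to confirm such a configuration difference is a small diagnostic script run once via the browser and once via cron, then diffing the two outputs. This is a hedged sketch; the list of settings is just a guess at the ones that most often differ between SAPIs:

```php
<?php
// Hypothetical diagnostic: dump the settings that commonly differ between
// the web-server PHP and the CLI PHP used by cron.
echo 'SAPI: ' . php_sapi_name() . "\n";
echo 'Loaded php.ini: ' . (php_ini_loaded_file() ?: '(none)') . "\n";

$keys = [
    'memory_limit',
    'max_execution_time',
    'open_basedir',
    'disable_functions',
    'extension_dir',
];
foreach ($keys as $key) {
    // ini_get() returns the current value as a string, or false if unknown
    echo $key . ' = ' . var_export(ini_get($key), true) . "\n";
}

// ZipArchive lives in the zip extension; it may be enabled for one SAPI only
echo 'zip extension loaded: ' . var_export(extension_loaded('zip'), true) . "\n";
```

If the cron-side output shows a much lower memory_limit, a restrictive open_basedir, or a missing zip extension, that would explain the intermittent ZipArchive failures.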