PHP - Using Cron Jobs and GD Together


Problem description:

I have a customer-facing website that requires customers to upload an image; on doing so, my script saves about 25-30 variations of the image onto the server using the GD library. Because of the number of images, there is currently a very long wait while all the images are created and saved. Until then the customer cannot proceed, so we get a high rate of customers leaving the site.

Is it possible, after upload, to instead store the image URL in a database table record, and then have a PHP script that pulls each record from the database and creates the 25-30 images, run every 5 minutes of the day by a cron job? This way the customer can continue through the website and have the images created automatically 'in the background'.
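For illustration, the upload handler would then do something like this (a rough sketch; the image_queue table, its columns, and the connection details are just placeholders):

    <?php
    // Save the original upload and queue it for later processing,
    // instead of generating all 25-30 variants inline.
    $pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');

    $target = 'images/originals/' . uniqid('img_', true) . '.jpg';
    move_uploaded_file($_FILES['image']['tmp_name'], $target);

    $stmt = $pdo->prepare(
        'INSERT INTO image_queue (path, status, created_at)
         VALUES (?, "pending", NOW())'
    );
    $stmt->execute(array($target));
    // The customer continues immediately; cron handles the variants later.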

Will all this going on in the background cause any issues for the speed of my site? Will it slow the site down for people browsing, especially if tens or hundreds of customers are using it at the same time?


Having a PHP script process the images in a cron job causes no more load than having the customer upload the image and wait for the processing to complete in real time. So in short: no, there's no additional impact from this approach.

The catch is, you need to make sure your cron job is self-aware and does not create overlaps. For example, if a run takes more than 5 minutes to complete its current task, what happens when a second cron spins up and begins processing another image (or the same one, if you didn't implement the queue properly)? Now you have two crons running and fighting for resources, which means the second one will likely take over 5 minutes as well. Eventually you end up with 3, 4, etc. crons all running at once. So make sure your cron only "boots up" if there isn't one running already.
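One simple way to enforce that, sketched below with an arbitrary lock path, is a non-blocking flock() at the top of the cron script:

    <?php
    // Refuse to start if a previous instance of this cron is still running.
    $lock = fopen('/tmp/image-worker.lock', 'c');
    if (!flock($lock, LOCK_EX | LOCK_NB)) {
        exit(0); // previous run still busy; cron will try again in 5 minutes
    }

    // ... process the queued images here ...

    flock($lock, LOCK_UN);
    fclose($lock);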

All this being said, you'd probably be best off having another server handle the image processing, depending on the size of your client's site and how heavy its traffic is. You could have a cloud server in a cluster with your production site server, connecting over the local network to fetch an image, process it, and return the 25-30 copies to the appropriate location on the server. This way your processing queue occupies zero resources of the public-facing web server and has no impact on the speed of the site itself.

I suggest you start looking at queues, specifically Gearman.

This will decrease the load time for your customers as you can offload the generation of the images to a separate server. And it scales easily across multiple servers if you need more processing power.
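A minimal sketch using the pecl/gearman extension (the function name, port, and payload format are all up to you):

    <?php
    // Web side: fire-and-forget a background job for the uploaded image.
    $client = new GearmanClient();
    $client->addServer('127.0.0.1', 4730); // gearmand's default port
    $client->doBackground('resize_image', 'images/originals/example.jpg');

    // Worker side: a separate long-running process, possibly on another server.
    $worker = new GearmanWorker();
    $worker->addServer('127.0.0.1', 4730);
    $worker->addFunction('resize_image', function (GearmanJob $job) {
        $src = $job->workload();
        // generate the 25-30 GD variants of $src here
    });
    while ($worker->work());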

Sure, you can store the PATH of an image on your server and process it later. Create a PHP script that, when run, creates a LOCK file, e.g. "/tmp/imgprocessor.lock", and deletes it at the end; when cron starts a new process, it first checks that the file doesn't exist. I would store uploaded images in e.g. pathtoimages/toprocess/ and delete each one after processing (or move it elsewhere), putting the new images in e.g. pathtoimages/processed/.
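In code, the cron script could look roughly like this (a sketch; the GD work is reduced to a single resize for brevity, and imagescale() needs PHP 5.5+):

    <?php
    // Bail out if a previous run is still going (or died; see the NOTE below).
    $lockFile = '/tmp/imgprocessor.lock';
    if (file_exists($lockFile)) {
        exit(0);
    }
    touch($lockFile);

    foreach (glob('pathtoimages/toprocess/*.jpg') as $src) {
        $img = imagecreatefromjpeg($src);
        // create the 25-30 variations here; one shown as an example
        $thumb = imagescale($img, 200);
        imagejpeg($thumb, 'pathtoimages/processed/thumb_' . basename($src));
        imagedestroy($thumb);
        imagedestroy($img);
        unlink($src); // or move it elsewhere once processed
    }

    unlink($lockFile);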

This way you don't need to query the DB for the paths of the images, just process whatever is in the 'toprocess' folder, and you can keep just a UNIQ_NAME_OF_IMAGE column in the table. In your web script, before loading the page, check whether UNIQ_NAME_OF_IMAGE exists in the 'processed' folder, and if so display it.
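The page-side check is then just a file_exists() on the processed folder (the placeholder image is hypothetical):

    <?php
    // $name = the UNIQ_NAME_OF_IMAGE value fetched from your table
    $path = 'pathtoimages/processed/' . $name;
    if (file_exists($path)) {
        echo '<img src="' . htmlspecialchars($path) . '">';
    } else {
        echo '<img src="images/processing.png">'; // shown until the cron catches up
    }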

On server load, it depends how many images you originally have and what their sizes are. Image processing can be heavy on a server, but processing 1000 users * 30 images won't be a heavy-duty task; as I said, it depends on the size of the images.

NOTE: if you go this way, you need to make sure that when the cron starts, the ERROR log is also output to some log file. The cron script must be bulletproof: if it fails for some reason, the LOCK file will remain, so no more processing will happen and you will need to delete it manually (or create a custom error handler that deletes it and maybe sends some mails). You should also check the log file periodically so you know what's going on.
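A rough sketch of such a handler, using a shutdown function (the notification address is a placeholder):

    <?php
    // Remove the stale lock and send a mail if the worker dies mid-run.
    $lockFile = '/tmp/imgprocessor.lock';
    register_shutdown_function(function () use ($lockFile) {
        $err = error_get_last();
        if ($err !== null && ($err['type'] & (E_ERROR | E_PARSE | E_CORE_ERROR))
                && file_exists($lockFile)) {
            unlink($lockFile);
            // message type 1 = send by email
            error_log('Image worker crashed: ' . $err['message'], 1, 'admin@example.com');
        }
    });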