400 Bad Request when fetching a page with PHP curl
I keep getting a 400 bad request code from a hotfile.com page when I try to get it with curl.
- I can get the pages fine in the browser
- The first (login) (post) request in the script works
- The last (get) request in the for loop works
- I have tried setting the curl headers to the same headers sent by my browser
- I have tried sleeping up to 5 seconds between requests, to no difference
The problem is that every curl GET request in the for loop returns a 400 Bad Request except the last one, which is freaking weird to me.
Here's the link to the script: http://pastie.org/627436 I am using Sean Huber's curl wrapper: http://github.com/shuber/curl and also SimpleHTMLDOM: http://simplehtmldom.sourceforge.net/
It might be difficult for people to try unless you have a hotfile account as the script won't work on a non-registered account.
Cheers in advance :)
My first guess would be changing

$urls = explode("\n", $_POST['urls']);

to

$urls = explode("\r\n", $_POST['urls']);

("\n" => "\r\n")
Since you said the last one is working, I would imagine the URLs prior to the last one end up with a stray trailing character (a leftover "\r") as a result of this explode. Basically, make sure your URL list after the explode contains no extra characters; perhaps even call trim() on each entry. Just a guess though, since I can't test it without an account :)
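To illustrate, here's a minimal sketch of that normalization step. The input string is made up here (the real one comes from your textarea via $_POST['urls']); the point is that splitting on "\n" alone leaves a "\r" on every line except the last, which is enough for a server to answer 400.

```php
<?php
// Hypothetical newline-separated URL list, as a browser would submit it
// from a textarea (lines joined with CRLF, i.e. "\r\n").
$raw = "http://example.com/dl/1\r\nhttp://example.com/dl/2\r\nhttp://example.com/dl/3";

// Splitting on "\n" leaves "\r" on all but the last entry:
$broken = explode("\n", $raw);
// $broken[0] is "http://example.com/dl/1\r" -- the 400 culprit.

// Defensive version: split, then trim each entry and drop empties,
// so it works whether the client sent "\n" or "\r\n".
$urls = array_values(array_filter(array_map('trim', explode("\n", $raw))));
// Every $urls entry is now free of stray \r and surrounding whitespace.
```

This way you don't have to guess which line ending the browser used.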