Doing server-side validation then posting to remote script


I had thought I would use curl for this, but it looks like I was mistaken. What I need to do is catch a POST, do some database lookups for validation purposes, and then post the validated data to a remote URL.

I have done this by populating an HTML form and submitting it with JavaScript, but that has obvious flaws. I want to construct a POST and send the browser along exactly as if the user had posted a form to the remote URL.

Am I missing something in the curl docs? What is a Good™ way to do this?


You can use curl for this.

But you might have other issues to think about. Cookies, sessions, etc. are set when a browser physically posts to the other URL; these might not be set when you post from the server. You should also check out the screen-scraping questions on SO for more on this.
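If the remote URL does turn out to depend on the browser's cookies, one possible workaround is to relay them on the outgoing request. A minimal sketch, assuming PHP and assuming the raw Cookie header can simply be passed through unchanged (the URL is a placeholder):

<?php
// sketch: repost from the server while relaying the browser's cookies
$ch = curl_init('http://remote.example/'); // placeholder remote URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
if (isset($_SERVER['HTTP_COOKIE'])) {
    // forward the incoming Cookie header verbatim
    curl_setopt($ch, CURLOPT_COOKIE, $_SERVER['HTTP_COOKIE']);
}
$response = curl_exec($ch);
curl_close($ch);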

Assuming that is not an issue, you should be able to receive the POST, validate the fields, and repost using curl. There are many, many examples of doing this.

edit

  • post the form to your server.php
  • process/validate the fields in server.php
  • post the validated parameters to remote.service using curl (see the sketch below)
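A minimal PHP sketch of those three steps. The field names, the validation check, and the exact remote.service URL are placeholders, not from the original answer:

<?php
// server.php: receive the form post (hypothetical field names)
$name  = isset($_POST['name'])  ? $_POST['name']  : '';
$email = isset($_POST['email']) ? $_POST['email'] : '';

// step 2: validate; a real script would do its database lookups here
if (filter_var($email, FILTER_VALIDATE_EMAIL) === false) {
    die('Validation failed');
}

// step 3: repost the validated fields to the remote URL with curl
$ch = curl_init('http://remote.service/'); // scheme and path are assumed
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'name'  => $name,
    'email' => $email,
)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the response body
$response = curl_exec($ch);
curl_close($ch);

// hand the remote response back to the browser
echo $response;

Because CURLOPT_POSTFIELDS is given a URL-encoded string, curl sends it as an ordinary application/x-www-form-urlencoded request, so remote.service sees the same kind of post a browser form would have produced.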


I haven't tried it, but what I would do is submit the form to its final destination and add a JavaScript onSubmit() function that makes an AJAX request to your server and returns true or false.

That is, if you can rely on JavaScript...
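For the server-side half of that idea, a sketch of the endpoint the onSubmit() handler would call. The file name, field name, and the check itself are all hypothetical:

<?php
// validate.php: hypothetical AJAX endpoint; answers "true" or "false" as JSON
header('Content-Type: application/json');
$email = isset($_POST['email']) ? $_POST['email'] : '';
// a real script would do its database lookups here; this stands in for them
$valid = (filter_var($email, FILTER_VALIDATE_EMAIL) !== false);
echo json_encode($valid);

The onSubmit() handler would then cancel the submission (return false) whenever the endpoint answers false; that is exactly where the "if you can rely on JavaScript" caveat bites, since a user with scripting disabled posts straight to the final destination unvalidated.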


Just have the request/response go like this:

+---------+   request    +--------+   curl request   +--------+
|         | -----------> |        | ---------------> |        |
| browser |              | url #1 |                  | url #2 |
|         | <----------- |        | <--------------- |        |
+---------+   response   +--------+   curl response  +--------+

The user sitting behind the browser has no way of knowing what the final URL (url #2 above) is, since it appears nowhere in the HTML source, so they can't bypass the middleman URL (url #1) and post to url #2 directly.
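If url #1 relays the response body back as in the diagram, it helps to pass the remote Content-Type header along too, so HTML, images, and so on render correctly. A sketch, assuming PHP with curl; the URL and the header forwarding are assumptions, not part of the original answer:

<?php
// fetch url #2 and relay both the body and its content type to the browser
$ch = curl_init('http://url2.example/'); // placeholder for url #2
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$body = curl_exec($ch);
$type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE); // e.g. "text/html; charset=UTF-8"
curl_close($ch);
if ($type) {
    header('Content-Type: ' . $type);
}
echo $body;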


I find that issuing a wget tends to be easier to manage than curl.

$remoteContent = `wget -q -O - http://someremoteurl`;

(Note the capital -O -: it writes the fetched document to stdout so the backticks can capture it; lowercase -o would send wget's log there instead, and -q silences the progress chatter.)

Aside from that, Matt's response is correct. However, if the response from the remote site you're screen-scraping contains links, you'll have to search and replace them (if you want to handle them yourself, as sketched below), at which point you're creating a proxy server...
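A rough sketch of that search-and-replace, assuming (hypothetically) that absolute links back to the remote site should be routed back through a local script; both URLs are placeholders:

<?php
// fetch the remote page, then rewrite its absolute links so they
// point back through this script
$remoteContent = `wget -q -O - http://someremoteurl`;
$remoteContent = str_replace(
    'http://someremoteurl/',
    'http://your.server/proxy.php?page=',
    $remoteContent
);
echo $remoteContent;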

-CF
