I am going to create a very basic web app that fetches (scrapes) all of a site's search results. Here is an example from me: click here. You can see that I've sent a query to this site from the left menu (Rincian Pencarian Anda, "Your Search Details") and it returns a list of results.
Here is my big problem: I want to fetch all the search results from this site into my site (don't worry about copyright, because I have read the rules, which allow fetching the site's results).
I am going to use PHP with the cURL extension.
So... what's your specific question?
I want to fetch the result content (HTML) from that site and then show it on my own site.
Okay...and what exactly is stopping you from doing that?
By the way, questions end with a "?" question mark; none of your posts have one.
OK, now I have built a form so the visitor can choose the same options as on the ticket.com site. So I will need to send these option values to a next page that will query ticket.com and show the result on my site (I hope I can get the search results only, not the whole website content).
my site (form with some option lists) -> send the values to ticket.com -> get the results and display them on my site.
Does this need a cURL function? Thank you, sir.
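The "search results only" part of the flow above could be handled by parsing the fetched HTML and keeping just the results container. A minimal sketch, assuming (hypothetically) the results sit in a container like `<div id="results">` — the real ticket.com markup will differ, so inspect it and adjust the XPath query:

```php
<?php
// Sketch: pull only the results list out of a fetched page.
// The container id "results" is a placeholder assumption, not the real markup.
function extract_results(string $html, string $query = '//div[@id="results"]')
{
    $doc = new DOMDocument();
    // Real-world HTML is rarely well-formed; suppress parser warnings.
    libxml_use_internal_errors(true);
    $doc->loadHTML($html);
    libxml_clear_errors();

    $xpath = new DOMXPath($doc);
    $nodes = $xpath->query($query);
    if ($nodes === false || $nodes->length === 0) {
        return ''; // container not found
    }
    // Return the HTML of the first matching container only.
    return $doc->saveHTML($nodes->item(0));
}
```

You would then `echo extract_results($html);` instead of echoing the whole page.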
You can use cURL, or file_get_contents() if your server allows remote URL references (allow_url_fopen enabled), and if the site in question accepts all its parameters via GET.
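A minimal sketch of the cURL approach, assuming the target search accepts its parameters via GET. The base URL and the parameter names (`from`, `to`) are placeholders — use whatever names the real form actually submits:

```php
<?php
// Build the search URL from the form values (placeholder parameter names).
function build_search_url(string $base_url, array $params)
{
    return $base_url . '?' . http_build_query($params);
}

// Fetch the page body with cURL; returns the HTML string, or false on failure.
function fetch_search_results(string $base_url, array $params)
{
    $ch = curl_init(build_search_url($base_url, $params));
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,   // return the body instead of printing it
        CURLOPT_FOLLOWLOCATION => true,   // follow redirects
        CURLOPT_TIMEOUT        => 15,     // don't hang forever on a slow site
    ]);
    $html = curl_exec($ch);
    curl_close($ch);
    return $html;
}

// Usage (placeholder URL and parameters):
// $html = fetch_search_results('https://example.com/search',
//                              ['from' => 'CGK', 'to' => 'DPS']);
```

If the site requires POST instead of GET, you would set `CURLOPT_POST` and `CURLOPT_POSTFIELDS` rather than appending a query string.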
Incidentally, I read the rules that you claim to have read, and they state very specifically that you MAY NOT scrape their site without the written permission of the company. I would suggest you go back and read them again.