WinInet or WinHTTP: which is preferred for POST requests (C++)?

I was going through the MSDN page comparing WinInet and WinHTTP. It seems as though WinInet has more functionality than WinHTTP. The MSDN page is here. Under what circumstances would one choose WinHTTP over WinInet?
Considering that WinInet has HttpSendRequest, which can be used to POST data, and WinHTTP has WinHttpSendRequest, what would be the advantages of choosing WinHTTP over WinInet? Is there a difference in how the data is posted using WinHTTP and WinInet?
In addition, some small code samples showing how to POST a request with both WinHTTP and WinInet would help.
EDIT 3: Wireshark screenshot.
EDIT 2: I finally managed to get an HTTP status code from the app, and it's 200 OK, but the problem comes when sending the POST data: the request is sent, but the parameters aren't set. I tried this code on the PHP end.
<?php
$fp = fopen("data.txt", "a");
fwrite($fp, "ID = " . $_POST['id']);
fclose($fp);
?>
The file is created as soon as the app runs, but the ID parameter is not set: it writes "ID = " to the file and nothing more. The value 10 is not being passed through, and I have no idea why.
Thank You.
EDIT: Link to the example I tried to use.
HttpSendRequest to POST form data

Actually MSDN has a good page on WinHTTP vs. WinINet, stating:
With a few exceptions, WinINet is a superset of WinHTTP. When selecting between the two, you should use WinINet, unless you plan to run within a service or service-like process that requires impersonation and session isolation.

This example should help you out, and if you are interested in a WinInet example, you can look here.
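Roughly, a WinHTTP POST of URL-encoded form data looks like the sketch below (the host example.com and path /post.php are placeholders, and most error handling is omitted):

#include <windows.h>
#include <winhttp.h>
#include <string>
#pragma comment(lib, "winhttp.lib")

int main() {
    // Open a session, connect to the server, and create a POST request.
    HINTERNET hSession = WinHttpOpen(L"PostSample/1.0",
        WINHTTP_ACCESS_TYPE_DEFAULT_PROXY,
        WINHTTP_NO_PROXY_NAME, WINHTTP_NO_PROXY_BYPASS, 0);
    HINTERNET hConnect = WinHttpConnect(hSession, L"example.com",
        INTERNET_DEFAULT_HTTP_PORT, 0);
    HINTERNET hRequest = WinHttpOpenRequest(hConnect, L"POST", L"/post.php",
        NULL, WINHTTP_NO_REFERER, WINHTTP_DEFAULT_ACCEPT_TYPES, 0);

    // URL-encoded form body; the Content-Type header tells the server how to parse it.
    std::string body = "id=10";
    BOOL ok = WinHttpSendRequest(hRequest,
        L"Content-Type: application/x-www-form-urlencoded\r\n", (DWORD)-1L,
        (LPVOID)body.data(), (DWORD)body.size(), (DWORD)body.size(), 0);
    if (ok) ok = WinHttpReceiveResponse(hRequest, NULL);

    // Check the status code the server returned.
    DWORD status = 0, size = sizeof(status);
    if (ok)
        WinHttpQueryHeaders(hRequest,
            WINHTTP_QUERY_STATUS_CODE | WINHTTP_QUERY_FLAG_NUMBER,
            WINHTTP_HEADER_NAME_BY_INDEX, &status, &size, WINHTTP_NO_HEADER_INDEX);

    WinHttpCloseHandle(hRequest);
    WinHttpCloseHandle(hConnect);
    WinHttpCloseHandle(hSession);
    return ok ? 0 : 1;
}

If you also need the response body, WinHttpQueryDataAvailable and WinHttpReadData can be used after WinHttpReceiveResponse.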
Furthermore, a quote regarding WinInet from MSDN:
Note WinINet does not support server implementations. In addition, it should not be used from a service. For server implementations or services use Microsoft Windows HTTP Services (WinHTTP).
Which is good advice.
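For comparison, here is a minimal WinInet sketch of the same POST, again against a placeholder example.com/post.php endpoint and with error handling omitted. Note the Content-Type header: PHP only fills $_POST for application/x-www-form-urlencoded (or multipart/form-data) bodies, so omitting that header is one common cause of the empty $_POST['id'] symptom described in the question.

#include <windows.h>
#include <wininet.h>
#include <string>
#pragma comment(lib, "wininet.lib")

int main() {
    HINTERNET hInet = InternetOpenA("PostSample/1.0",
        INTERNET_OPEN_TYPE_PRECONFIG, NULL, NULL, 0);
    HINTERNET hConnect = InternetConnectA(hInet, "example.com",
        INTERNET_DEFAULT_HTTP_PORT, NULL, NULL, INTERNET_SERVICE_HTTP, 0, 0);
    HINTERNET hRequest = HttpOpenRequestA(hConnect, "POST", "/post.php",
        NULL, NULL, NULL, INTERNET_FLAG_RELOAD, 0);

    // Without this header, PHP will not parse the body into $_POST.
    std::string headers = "Content-Type: application/x-www-form-urlencoded";
    std::string body = "id=10";
    BOOL ok = HttpSendRequestA(hRequest,
        headers.c_str(), (DWORD)headers.size(),
        (LPVOID)body.data(), (DWORD)body.size());

    InternetCloseHandle(hRequest);
    InternetCloseHandle(hConnect);
    InternetCloseHandle(hInet);
    return ok ? 0 : 1;
}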

Related

How to open the default browser in background and get the source code of a web page?

I'm using Dev-C++ and I'm looking for a way to open the default browser (for example, Internet Explorer) in the background (or, better, to load a browser instance in the background) and send a request to get the source code of the page I requested.
Can I do something like this in C++?
Thank you!
P.S. I need this for Windows
You seem to have imagined the wrong solution for your problem. If you want to get the HTML source for a web page, you don't need to somehow do it through the browser. You need to do whatever the browser does to get it.
When you enter an address into a browser, the browser sends an HTTP GET request to the server that hosts the resource you're requesting (often a web page), and the server sends back an HTTP response containing the resource content (often HTML).
You want to do the same in your application. You need to send an HTTP request to the server and read the response. A popular library for doing this is libcurl.
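For example, a small libcurl sketch along these lines (the URL is just a placeholder) fetches a page into a string:

#include <curl/curl.h>
#include <iostream>
#include <string>

// libcurl hands the response to us in chunks; append each chunk to a std::string.
static size_t write_cb(char* ptr, size_t size, size_t nmemb, void* userdata) {
    static_cast<std::string*>(userdata)->append(ptr, size * nmemb);
    return size * nmemb;
}

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL* curl = curl_easy_init();
    std::string html;
    curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/");
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &html);
    CURLcode res = curl_easy_perform(curl);   // sends the GET and fills 'html'
    if (res == CURLE_OK)
        std::cout << html;                    // the raw HTML source of the page
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return res == CURLE_OK ? 0 : 1;
}

Link against libcurl (e.g. -lcurl) to build it.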
If you don't have to send a POST request (i.e. just a simple web request, possibly with parameters passed on the URL as a GET), then you could just use URLDownloadToFile().
If you don't want to use callbacks etc. and just want to download the file, you can call it rather simply:
URLDownloadToFile(0, "http://myserver/myfile", "C:\\mytempfile", 0, 0);
There are also a few other functions provided that will automatically push the downloaded data to a stream rather than a file.
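For instance, URLOpenBlockingStream (also from urlmon, like URLDownloadToFile) gives you the downloaded data as a COM stream instead of writing it to a file; a rough sketch, reusing the same placeholder URL:

#include <windows.h>
#include <urlmon.h>
#include <string>
#pragma comment(lib, "urlmon.lib")

int main() {
    IStream* stream = NULL;
    // Downloads the resource and exposes it as a stream rather than a file.
    HRESULT hr = URLOpenBlockingStreamA(NULL, "http://myserver/myfile", &stream, 0, NULL);
    if (FAILED(hr))
        return 1;

    std::string data;
    char buf[4096];
    ULONG read = 0;
    // Read until the stream is exhausted.
    while (SUCCEEDED(stream->Read(buf, sizeof(buf), &read)) && read > 0)
        data.append(buf, read);

    stream->Release();
    // 'data' now holds the downloaded bytes.
    return 0;
}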
It can't be done in pure standard C++. You should use a native Windows library or another framework (like Qt) and use its capabilities for fetching and parsing websites. In Qt, you'd use QtWebKit.
Edit: Also, if you want only the source code of a page, you can do this without using a browser or its engine; you can use Winsock.
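A bare-bones Winsock sketch of such a request (placeholder host, minimal error handling) could look like this:

#include <winsock2.h>
#include <ws2tcpip.h>
#include <cstring>
#include <iostream>
#include <string>
#pragma comment(lib, "ws2_32.lib")

int main() {
    WSADATA wsa;
    WSAStartup(MAKEWORD(2, 2), &wsa);

    // Resolve the host and open a TCP connection to port 80.
    addrinfo hints = {}, *res = NULL;
    hints.ai_family = AF_INET;
    hints.ai_socktype = SOCK_STREAM;
    getaddrinfo("www.example.com", "80", &hints, &res);
    SOCKET s = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    connect(s, res->ai_addr, (int)res->ai_addrlen);

    // A minimal HTTP/1.1 GET request; Connection: close so the server ends the stream.
    const char* request =
        "GET / HTTP/1.1\r\n"
        "Host: www.example.com\r\n"
        "Connection: close\r\n\r\n";
    send(s, request, (int)strlen(request), 0);

    // Read the response (headers plus HTML body) until the connection closes.
    char buf[4096];
    int n;
    std::string page;
    while ((n = recv(s, buf, sizeof(buf), 0)) > 0)
        page.append(buf, n);
    std::cout << page;

    freeaddrinfo(res);
    closesocket(s);
    WSACleanup();
    return 0;
}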

Is there a web service that spits out the entire request that was sent?

I apologize if this isn't "programming" worthy. I'm wondering if a service exists that, when pinged over HTTP, echoes back the exact request you made as the response.
The reason I want this is that I want to unit-test a class I made to build requests and send them over a socket. I realize I could just do a mock object of some sort, but I think that involves more complexity than just making sure the request being sent was properly built.
Ideally, the web service would send the content back as proper HTTP 1.1 with the request info I sent in the body of the response.
Thanks!
Kyle
-- edit --
Just a quick reference to the solution. Point your browser to: http://scooterlabs.com/echo.json or http://scooterlabs.com/echo.xml
This guy seemed to have the same problem as you: web service echo test.
It refers to some links you might be interested in.
I guess there are some uses for a simple echo, but in any kind of a realistic interaction it's going to be pretty hard to isolate just the piece you are trying to test.
A more general approach would be to use a local proxy server, which stands as the man in the middle between you and all remote sites and can log URLs, responses, content, and so on.
If you're developing the server side as well as the client, you definitely ought to run a local mirror of the server site.

Consume REST service that returns a single value

I am used to consuming web services via an XMLHttpRequest to retrieve XML or JSON.
Recently, I have been working with SharePoint REST services, which can return a single value (for example 5532, or "Jeff"). I am wondering if there is a more efficient way than XMLHttpRequest to retrieve this single value. For example, would it work if I loaded the REST url via an iframe, then retrieved the iframe content? Or is there any other well established method?
[Edit] By single value, I really mean that the service just returns these characters. This is not even presented in a JSON or xml response.
Any inefficiency in XMLHttpRequest is largely due to the overhead of HTTP, which the iframe approach is going to incur, as well. Furthermore, if the Sharepoint service expects to speak HTTP, you're going to need to speak HTTP. However, an API does not have to run over HTTP to be RESTful, per Roy Fielding, so if the service provided an API over a raw socket -- or if you simply wanted to craft your own slimmer HTTP request -- you could use a Flash socket via a library like: http://code.google.com/p/javascript-as3-socket/. You could cut the request message size down to under 100 bytes, and could pull out the response data trivially.
The jQuery library is a well-established framework which you can use. There is also an article that answers your concrete question on Stack Overflow.

Asp Mvc 3 - Restful web service for consuming on multiple platforms

I want to expose a RESTful web service for posting and retrieving data; it may be consumed by mobile devices or a web site.
Now, the actual creation of the service isn't a problem; what does seem to be a problem is communicating from a different domain.
I have made a simple example service deployed on the ASP.NET development server, which just exposes a simple POST action for sending a request with JSON content. Then I created a simple web page using jQuery AJAX to send some dummy data over, yet I believe I am getting stung with the same origin policy.
Is this a common thing, and how do you get around it? Some places have mentioned having a proxy on the domain that you always send GET requests to, but then you cannot use it in a RESTful manner...
So is this a common issue with a simple fix? There seem to be plenty of RESTful services out there that allow third parties to use them...
How exactly are you "getting stung with the same origin policy"? From your description, I don't see how it could be relevant. If yourdomain.com/some-path/defined-request.json returns a certain JSON response, then it will return that response regardless of what is requesting the file, unless you have specifically defined required credentials that are not satisfied.
Here is an example of such a web service. It will return the same JSON object regardless of from where the request is made: http://maps.googleapis.com/maps/api/geocode/json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA&sensor=true
Unless I am misunderstanding you (in which case you should clarify your actual problem), the same origin policy doesn't really seem to apply here.
Update Re: Comment
"I make a simple HTML page and load it as file://myhtmlfilelocation/myhtmlfile.html and try to make an ajax request"
The cause of your problem is that you are using the file:// URL scheme, instead of the http:// protocol scheme. You can find information about this scheme in Section 3.10 of RFC 1738. Here is an excerpt:
The file URL scheme is used to designate files accessible on a particular host computer. This scheme, unlike most other URL schemes, does not designate a resource that is universally accessible over the Internet.
You should be able to resolve your issue by using the http:// scheme instead of the file:// scheme when you make your asynchronous HTTP request.

Uploading files through a HTTP POST in C++

I'm trying to send a file and other POST variables to a PHP script on my server. There are no good resources on Google, and the code samples I've found don't work. Preferably, I'd like to do this without using cURL.
If you're going to roll your own, you'll need the relevant RFC for HTTP file uploading (googling "rfc http file upload" will turn it up). That RFC also shows how to handle a mix of files and other form data (or POST variables). The problem, of course, is that you'll probably want to read the MIME RFC as well...
Just a couple of resources that make it pretty easy to roll your own:
Here is an example of a GET request via ASIO (the C++ networking library in Boost)
Here is the HTTP protocol made really easy
The GET request is how you can view any page on your site. With that code you can download any page and get it as raw text. As you can see, it sends a GET header to the server. As explained in that HTTP protocol page, a POST request looks like this:
POST /path/script.cgi HTTP/1.0
From: frog@jmarshall.com
User-Agent: HTTPTool/1.0
Content-Type: application/x-www-form-urlencoded
Content-Length: 32

home=Cosby&favorite+flavor=flies
To send a file:
Put your URL after POST.
Change the Content-Type to the type of file you are trying to upload.
Set Content-Length to the number of bytes in that file.
Append the file after a carriage return (replacing "home=Cosby&favorite+flavor=flies").
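Note that if the server expects a browser-style form upload (e.g. a PHP script reading $_FILES and $_POST), the body has to be multipart/form-data as described in the file-upload RFC mentioned above, rather than the raw file bytes. A rough sketch of building such a body in C++ (the boundary string and field names are arbitrary placeholders):

#include <fstream>
#include <sstream>
#include <string>

// Builds a multipart/form-data body containing one text field and one file.
std::string build_multipart_body(const std::string& file_path) {
    const std::string boundary = "----MyBoundary";   // arbitrary, must match the header

    std::ifstream file(file_path.c_str(), std::ios::binary);
    std::ostringstream file_data;
    file_data << file.rdbuf();                       // read the whole file

    std::ostringstream body;
    // An ordinary form field named "id".
    body << "--" << boundary << "\r\n"
         << "Content-Disposition: form-data; name=\"id\"\r\n\r\n"
         << "10\r\n";
    // The file part, which PHP would expose via $_FILES['upload'].
    body << "--" << boundary << "\r\n"
         << "Content-Disposition: form-data; name=\"upload\"; filename=\"upload.bin\"\r\n"
         << "Content-Type: application/octet-stream\r\n\r\n"
         << file_data.str() << "\r\n";
    // Closing boundary.
    body << "--" << boundary << "--\r\n";
    return body.str();
}

The resulting string is then sent as the request body, with the Content-Type header set to multipart/form-data; boundary=----MyBoundary and Content-Length set to its size.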
Another (more quick-n-dirty) solution is to use a utility, via a system() or similar call.
For example, the wget utility has a --post-file option.
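For example, something along these lines (the file name and URL are placeholders):

#include <cstdlib>

int main() {
    // Let wget do the HTTP POST; the file's contents become the request body.
    return std::system("wget --post-file=payload.bin "
                       "http://example.com/upload.php -O response.html");
}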
I'd say roll your own. It's not too complicated.
Capture an HTTP POST sent from a browser in Wireshark and reverse engineer it as necessary, using the spec as your guide. (See Andreas Magnusson's answer for perhaps more relevant specs.)
I would recommend this approach personally for learning the protocol, rather than just going by the pure spec. It's pretty difficult to learn things from the spec alone; I would rather explore the different behaviors of known HTTP clients and try to figure out how things work, using the spec as my guide.
Format and send the data accordingly over a socket once you're comfortable with HTTP.
Also, if you are not familiar with socket programming, check out Beej's guide to socket programming.
This worked great for me on Debian (HTTP GET, HTTP POST):
http://cpp-netlib.github.com
I use v0.9.3, which requires Boost 1.49.