I am making a Winamp plugin with the single function of sending the details of the song being played over HTTP to a webpage/webserver. I couldn't find anything like this that actually works, so I decided to dive into C++ for the first time and just do it myself.
The plugin is a C++ DLL file. I have got to the point where I can output the song title to a Windows message box every time a new song is played (not bad for a C++ first-timer! ;) )
Where I'm stuck:
I couldn't find anything AT ALL to point me in the direction of doing push notifications from C++.
I tried, without any success, embedding/including HTTP libraries into my DLL so I could make POST requests instead: libcurl, this library here, and this library too - but I couldn't get any of them to work! I just kept getting linking errors and then some. After a few hours I just gave up.
I am a very skilled JavaScript programmer, so I thought that perhaps using JS to connect to a push notification service could work out, but running JS inside C++ looks too complicated to me. Perhaps I'm wrong?
Bottom line, I don't know which solution is better (read: easier) to implement - push notifications or POST requests?
I appreciate any solutions/input/directions/information or whatever you got. I can post code but the only code I have is the part getting the song information from Winamp and that part works.
Thanks in advance.
EDIT: I guess it's worth noting I'm using the new VS2012?
It's OK to use these kinds of libraries, but when speaking of C++ I mainly think of "pure code", so I like doing things the native way, without the help of external libraries.
With that in mind, I can provide you an example of how to send an HTTP POST request using Microsoft's Winsock2 library.
First, I'd like to point out that I used this link as a base for how to use winsock2.h.
Ok, now going on to the code, you need these includes:
#include <string>
#include <sstream>
#include <winsock2.h>
You'll also need to link the Winsock2 library (or specify it in your project settings, since you're using VS2012):
#pragma comment(lib, "ws2_32.lib")
Now, the function I edited from the link (I've just edited it to make it look simpler, and also to do some error checking and proper cleanup):
int http_post(char *hostname, char *api, char *parameters, std::string& message)
{
int result;
WSADATA wsaData;
result = WSAStartup(MAKEWORD(1, 1), &wsaData);
if(result != NO_ERROR)
{
//printf("WSAStartup failed: %d\n", result);
return 0;
}
sockaddr_in sin;
int sock = socket(AF_INET, SOCK_STREAM, 0);
if(sock == INVALID_SOCKET)
{
//printf("Error at socket(): %ld\n", WSAGetLastError());
WSACleanup();
return 0;
}
sin.sin_family = AF_INET;
sin.sin_port = htons(80);
struct hostent *host_addr = gethostbyname(hostname);
if(host_addr == NULL)
{
//printf("Unable to locate host\n");
closesocket(sock);
WSACleanup();
return 0;
}
sin.sin_addr.s_addr = *((int *)*host_addr->h_addr_list);
if(connect(sock, (const struct sockaddr *)&sin, sizeof(sockaddr_in)) == SOCKET_ERROR)
{
//printf("Unable to connect to server: %ld\n", WSAGetLastError());
closesocket(sock);
WSACleanup();
return 0;
}
std::stringstream stream;
stream << "POST " << api << " HTTP/1.0\r\n"
<< "User-Agent: Mozilla/5.0\r\n"
<< "Host: " << hostname << "\r\n"
<< "Content-Type: application/x-www-form-urlencoded;charset=utf-8\r\n"
<< "Content-Length: " << strlen(parameters) << "\r\n"
<< "Accept-Language: en-US;q=0.5\r\n"
<< "Accept-Encoding: gzip, deflate\r\n"
<< "Accept: */*\r\n"
<< "\r\n" << parameters
;
if(send(sock, stream.str().c_str(), stream.str().length(), 0) == SOCKET_ERROR)
{
//printf("send failed: %d\n", WSAGetLastError());
closesocket(sock);
WSACleanup();
return 0;
}
if(shutdown(sock, SD_SEND) == SOCKET_ERROR)
{
//printf("shutdown failed: %d\n", WSAGetLastError());
closesocket(sock);
WSACleanup();
return 0;
}
char buffer[1];
do
{
result = recv(sock, buffer, 1, 0);
if(result > 0)
{
//printf("Bytes received: %d\n", result);
message += buffer[0];
}
else if(result == 0)
{
//printf("Connection closed\n");
}
else
{
//printf("recv failed: %d\n", WSAGetLastError());
}
}
while(result > 0);
closesocket(sock);
WSACleanup();
return 1;
}
As you are writing a DLL, I commented out the printf calls, so you can substitute your own output function according to the Winamp plugin API.
Now, using the function is self-explanatory:
std::string post;
http_post("www.htmlcodetutorial.com", "/cgi-bin/mycgi.pl", "user=example123&artist=The Beatles&music=Eleanor Rigby", post);
By doing this, and checking the function's return value (1 for success, 0 for failure), you can parse the result string returned from the site and check whether everything went OK.
About your last question: using POST requests is OK, but you, as a webmaster, know that it's possible to "abuse" this option, with request flooding, etc. But it really is the simplest way.
If, on the server side, you parse the data correctly and check for any improper use of the request, then you can use this method without any problems.
And last, as you are a C++ beginner, you'll need to know how to manipulate std::string in order to build the POST message string from the song name and artist safely; if you don't know how yet, I recommend this link.
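For example, here is a rough sketch of building the parameters string safely before passing it to http_post above. The url_encode helper, the on_song_change function, and the host/path are my own illustrations, not part of Winamp or Winsock:
#include <cctype>
#include <iomanip>
#include <sstream>
#include <string>

// Illustrative helper: percent-encodes one value for an
// application/x-www-form-urlencoded body (not a library function).
std::string url_encode(const std::string& value)
{
    std::ostringstream escaped;
    escaped << std::hex << std::uppercase << std::setfill('0');
    for(std::string::size_type i = 0; i < value.size(); ++i)
    {
        unsigned char c = value[i];
        if(std::isalnum(c) || c == '-' || c == '_' || c == '.' || c == '~')
            escaped << c;
        else
            escaped << '%' << std::setw(2) << int(c);
    }
    return escaped.str();
}

// ... then, inside the plugin's song-change handler (hypothetical name):
void on_song_change(const std::string& artist, const std::string& title)
{
    std::string parameters = "artist=" + url_encode(artist) + "&music=" + url_encode(title);
    std::string post;
    // http_post only reads the buffer, so the const_cast is safe here.
    http_post("www.example.com", "/nowplaying", const_cast<char*>(parameters.c_str()), post);
}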
Is there any way to easily make an HTTP request with C++? Specifically, I want to download the contents of a page (an API) and check the contents to see if it contains a 1 or a 0. Is it also possible to download the contents into a string?
I had the same problem. libcurl is really complete. There is a C++ wrapper, curlpp, that might interest you, as you ask for a C++ library. neon is another interesting C library that also supports WebDAV.
curlpp seems natural if you use C++. There are many examples provided in the source distribution.
To get the content of a URL, you do something like this (extracted from the examples):
// Edit : rewritten for cURLpp 0.7.3
// Note : namespace changed, was cURLpp in 0.7.2 ...
#include <curlpp/cURLpp.hpp>
#include <curlpp/Options.hpp>
// RAII cleanup
curlpp::Cleanup myCleanup;
// Send request and get a result.
// Here I use a shortcut to get it in a string stream ...
std::ostringstream os;
os << curlpp::options::Url(std::string("http://example.com"));
std::string asAskedInQuestion = os.str();
See the examples directory in the curlpp source distribution; there are a lot of more complex cases, as well as a simple, complete, minimal one using curlpp.
my 2 cents ...
Windows code:
#include <string>
#include <string.h>
#include <winsock2.h>
#include <windows.h>
#include <iostream>
#include <vector>
#include <locale>
#include <sstream>
using namespace std;
#pragma comment(lib,"ws2_32.lib")
int main( void ){
WSADATA wsaData;
SOCKET Socket;
SOCKADDR_IN SockAddr;
int lineCount=0;
int rowCount=0;
struct hostent *host;
locale local;
char buffer[10000];
int i = 0 ;
int nDataLength;
string website_HTML;
// website url
string url = "www.google.com";
//HTTP GET
string get_http = "GET / HTTP/1.1\r\nHost: " + url + "\r\nConnection: close\r\n\r\n";
if (WSAStartup(MAKEWORD(2,2), &wsaData) != 0){
cout << "WSAStartup failed.\n";
system("pause");
return 1;
}
Socket=socket(AF_INET,SOCK_STREAM,IPPROTO_TCP);
host = gethostbyname(url.c_str());
SockAddr.sin_port=htons(80);
SockAddr.sin_family=AF_INET;
SockAddr.sin_addr.s_addr = *((unsigned long*)host->h_addr);
if(connect(Socket,(SOCKADDR*)(&SockAddr),sizeof(SockAddr)) != 0){
cout << "Could not connect";
system("pause");
return 1;
}
// send GET / HTTP
send(Socket,get_http.c_str(), strlen(get_http.c_str()),0 );
// receive html
while ((nDataLength = recv(Socket,buffer,10000,0)) > 0){
int i = 0;
while (buffer[i] >= 32 || buffer[i] == '\n' || buffer[i] == '\r'){
website_HTML+=buffer[i];
i += 1;
}
}
closesocket(Socket);
WSACleanup();
// Display HTML source
cout<<website_HTML;
// pause
cout<<"\n\nPress ANY key to close.\n\n";
cin.ignore(); cin.get();
return 0;
}
Here is a much better implementation:
#include <windows.h>
#include <string>
#include <stdio.h>
using std::string;
#pragma comment(lib,"ws2_32.lib")
HINSTANCE hInst;
WSADATA wsaData;
void mParseUrl(char *mUrl, string &serverName, string &filepath, string &filename);
SOCKET connectToServer(char *szServerName, WORD portNum);
int getHeaderLength(char *content);
char *readUrl2(char *szUrl, long &bytesReturnedOut, char **headerOut);
int main()
{
const int bufLen = 1024;
char *szUrl = "http://stackoverflow.com";
long fileSize;
char *memBuffer, *headerBuffer;
FILE *fp;
memBuffer = headerBuffer = NULL;
if ( WSAStartup(0x101, &wsaData) != 0)
return -1;
memBuffer = readUrl2(szUrl, fileSize, &headerBuffer);
printf("returned from readUrl\n");
printf("data returned:\n%s", memBuffer);
if (fileSize != 0)
{
printf("Got some data\n");
fp = fopen("downloaded.file", "wb");
fwrite(memBuffer, 1, fileSize, fp);
fclose(fp);
delete[] memBuffer;
delete[] headerBuffer;
}
WSACleanup();
return 0;
}
void mParseUrl(char *mUrl, string &serverName, string &filepath, string &filename)
{
string::size_type n;
string url = mUrl;
if (url.substr(0,7) == "http://")
url.erase(0,7);
if (url.substr(0,8) == "https://")
url.erase(0,8);
n = url.find('/');
if (n != string::npos)
{
serverName = url.substr(0,n);
filepath = url.substr(n);
n = filepath.rfind('/');
filename = filepath.substr(n+1);
}
else
{
serverName = url;
filepath = "/";
filename = "";
}
}
SOCKET connectToServer(char *szServerName, WORD portNum)
{
struct hostent *hp;
unsigned int addr;
struct sockaddr_in server;
SOCKET conn;
conn = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);
if (conn == INVALID_SOCKET)
return NULL;
if(inet_addr(szServerName)==INADDR_NONE)
{
hp=gethostbyname(szServerName);
}
else
{
addr=inet_addr(szServerName);
hp=gethostbyaddr((char*)&addr,sizeof(addr),AF_INET);
}
if(hp==NULL)
{
closesocket(conn);
return NULL;
}
server.sin_addr.s_addr=*((unsigned long*)hp->h_addr);
server.sin_family=AF_INET;
server.sin_port=htons(portNum);
if(connect(conn,(struct sockaddr*)&server,sizeof(server)))
{
closesocket(conn);
return NULL;
}
return conn;
}
int getHeaderLength(char *content)
{
const char *srchStr1 = "\r\n\r\n", *srchStr2 = "\n\r\n\r";
char *findPos;
int ofset = -1;
findPos = strstr(content, srchStr1);
if (findPos != NULL)
{
ofset = findPos - content;
ofset += strlen(srchStr1);
}
else
{
findPos = strstr(content, srchStr2);
if (findPos != NULL)
{
ofset = findPos - content;
ofset += strlen(srchStr2);
}
}
return ofset;
}
char *readUrl2(char *szUrl, long &bytesReturnedOut, char **headerOut)
{
const int bufSize = 512;
char readBuffer[bufSize], sendBuffer[bufSize], tmpBuffer[bufSize];
char *tmpResult=NULL, *result;
SOCKET conn;
string server, filepath, filename;
long totalBytesRead, thisReadSize, headerLen;
mParseUrl(szUrl, server, filepath, filename);
///////////// step 1, connect //////////////////////
conn = connectToServer((char*)server.c_str(), 80);
///////////// step 2, send GET request /////////////
sprintf(tmpBuffer, "GET %s HTTP/1.0", filepath.c_str());
strcpy(sendBuffer, tmpBuffer);
strcat(sendBuffer, "\r\n");
sprintf(tmpBuffer, "Host: %s", server.c_str());
strcat(sendBuffer, tmpBuffer);
strcat(sendBuffer, "\r\n");
strcat(sendBuffer, "\r\n");
send(conn, sendBuffer, strlen(sendBuffer), 0);
// SetWindowText(edit3Hwnd, sendBuffer);
printf("Buffer being sent:\n%s", sendBuffer);
///////////// step 3 - get received bytes ////////////////
// Receive until the peer closes the connection
totalBytesRead = 0;
while(1)
{
memset(readBuffer, 0, bufSize);
thisReadSize = recv (conn, readBuffer, bufSize, 0);
if ( thisReadSize <= 0 )
break;
tmpResult = (char*)realloc(tmpResult, thisReadSize+totalBytesRead);
memcpy(tmpResult+totalBytesRead, readBuffer, thisReadSize);
totalBytesRead += thisReadSize;
}
headerLen = getHeaderLength(tmpResult);
long contenLen = totalBytesRead-headerLen;
result = new char[contenLen+1];
memcpy(result, tmpResult+headerLen, contenLen);
result[contenLen] = 0x0;
char *myTmp;
myTmp = new char[headerLen+1];
strncpy(myTmp, tmpResult, headerLen);
myTmp[headerLen] = '\0';
free(tmpResult); // tmpResult was allocated with realloc
*headerOut = myTmp;
bytesReturnedOut = contenLen;
closesocket(conn);
return(result);
}
Update 2020: I have a new answer that replaces this now-8-year-old one: https://stackoverflow.com/a/61177330/278976
On Linux, I tried cpp-netlib, libcurl, curlpp, urdl, boost::asio and considered Qt (but turned it down based on the license). All of these were either incomplete for this use, had sloppy interfaces, had poor documentation, were unmaintained or didn't support https.
Then, at the suggestion of https://stackoverflow.com/a/1012577/278976, I tried POCO. Wow, I wish I had seen this years ago. Here's an example of making an HTTP GET request with POCO:
https://stackoverflow.com/a/26026828/2817595
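To give a flavor of it, here is a minimal GET sketch based on my reading of the POCO API (host and path are placeholders, plain HTTP only; HTTPS needs HTTPSClientSession from the NetSSL library, and you link against PocoNet and PocoFoundation):
#include <iostream>
#include <string>
#include <Poco/Net/HTTPClientSession.h>
#include <Poco/Net/HTTPRequest.h>
#include <Poco/Net/HTTPResponse.h>
#include <Poco/StreamCopier.h>

int main()
{
    // Open a session to the server and send a GET for /get.
    Poco::Net::HTTPClientSession session("httpbin.org", 80);
    Poco::Net::HTTPRequest request(Poco::Net::HTTPRequest::HTTP_GET, "/get",
                                   Poco::Net::HTTPMessage::HTTP_1_1);
    session.sendRequest(request);

    // Read the status line and copy the body into a string.
    Poco::Net::HTTPResponse response;
    std::istream& body = session.receiveResponse(response);
    std::cout << response.getStatus() << " " << response.getReason() << std::endl;

    std::string content;
    Poco::StreamCopier::copyToString(body, content);
    std::cout << content << std::endl;
}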
POCO is free, open source (boost license). And no, I don't have any affiliation with the company; I just really like their interfaces. Great job guys (and gals).
https://pocoproject.org/download.html
Hope this helps someone... it took me three days to try all of these libraries out.
There is a newer, less mature curl wrapper being developed called C++ Requests. Here's a simple GET request:
#include <iostream>
#include <cpr.h>
int main(int argc, char** argv) {
auto response = cpr::Get(cpr::Url{"http://httpbin.org/get"});
std::cout << response.text << std::endl;
}
It supports a wide variety of HTTP verbs and curl options. There's more usage documentation here.
Disclaimer: I'm the maintainer of this library.
Updated answer for April, 2020:
I've had a lot of success recently with cpp-httplib (both as a client and a server). It's mature, and its approximate single-threaded RPS is around 6k.
On more of the bleeding edge, there's a really promising framework, cpv-framework, that can get around 180k RPS on two cores (and it will scale well with the number of cores because it's based on the Seastar framework, which powers the fastest DBs on the planet, ScyllaDB).
However, cpv-framework is still relatively immature; so, for most uses, I highly recommend cpp-httplib.
This recommendation replaces my previous answer (8 years ago).
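For illustration, a minimal cpp-httplib GET looks roughly like this (a sketch, assuming the single-header httplib.h is on your include path; host and path are placeholders):
#include <iostream>
#include "httplib.h"  // single header from the cpp-httplib repository

int main()
{
    httplib::Client client("httpbin.org", 80);

    // The result is empty on connection failure, otherwise it carries status and body.
    auto res = client.Get("/get");
    if (res && res->status == 200)
        std::cout << res->body << std::endl;
    else
        std::cerr << "request failed" << std::endl;
}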
Here is my minimal wrapper around cURL, just to be able to fetch a webpage as a string. This is useful, for example, for unit testing. It is basically a RAII wrapper around the C code.
Install "libcurl" on your machine yum install libcurl libcurl-devel or equivalent.
Usage example:
CURLplusplus client;
string x = client.Get("http://google.com");
string y = client.Get("http://yahoo.com");
Class implementation:
#include <curl/curl.h>
#include <sstream>
#include <stdexcept>
#include <string>

using std::string;
using std::stringstream;
class CURLplusplus
{
private:
CURL* curl;
stringstream ss;
long http_code;
public:
CURLplusplus()
: curl(curl_easy_init())
, http_code(0)
{
}
~CURLplusplus()
{
if (curl) curl_easy_cleanup(curl);
}
std::string Get(const std::string& url)
{
CURLcode res;
curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_data);
curl_easy_setopt(curl, CURLOPT_WRITEDATA, this);
ss.str("");
http_code = 0;
res = curl_easy_perform(curl);
if (res != CURLE_OK)
{
throw std::runtime_error(curl_easy_strerror(res));
}
curl_easy_getinfo(curl, CURLINFO_RESPONSE_CODE, &http_code);
return ss.str();
}
long GetHttpCode()
{
return http_code;
}
private:
static size_t write_data(void *buffer, size_t size, size_t nmemb, void *userp)
{
return static_cast<CURLplusplus*>(userp)->Write(buffer,size,nmemb);
}
size_t Write(void *buffer, size_t size, size_t nmemb)
{
ss.write((const char*)buffer,size*nmemb);
return size*nmemb;
}
};
As you want a C++ solution, you could use Qt. It has a QHttp class you can use (note that QHttp was removed in Qt 5; newer code uses QNetworkAccessManager, shown in another answer below).
You can check the docs:
http->setHost("qt.nokia.com");
http->get(QUrl::toPercentEncoding("/index.html"));
Qt also has a lot more to it that you could use in a common C++ app.
You may want to check C++ REST SDK (codename "Casablanca"). http://msdn.microsoft.com/en-us/library/jj950081.aspx
With the C++ REST SDK, you can more easily connect to HTTP servers from your C++ app.
Usage example:
#include <iostream>
#include <cpprest/http_client.h>
using namespace web::http; // Common HTTP functionality
using namespace web::http::client; // HTTP client features
int main(int argc, char** argv) {
    http_client client("http://httpbin.org/");
    http_response response;

    // ordinary `get` request
    response = client.request(methods::GET, "/get").get();
    std::cout << response.extract_string().get() << "\n";

    // working with json
    response = client.request(methods::GET, "/get").get();
    std::cout << "url: " << response.extract_json().get()[U("url")] << "\n";
}
The C++ REST SDK is a Microsoft project for cloud-based client-server communication in native code using a modern asynchronous C++ API design.
libCURL is a pretty good option for you. Depending on what you need to do, the tutorial should tell you what you want, specifically for the easy handle. But, basically, you could do this just to see the source of a page:
CURL* c;
c = curl_easy_init();
curl_easy_setopt( c, CURLOPT_URL, "www.google.com" );
curl_easy_perform( c );
curl_easy_cleanup( c );
I believe this will cause the result to be printed to stdout. If you want to handle it instead -- which, I assume, you do -- you need to set CURLOPT_WRITEFUNCTION. All of that is covered in the curl tutorial linked above.
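As a rough illustration (not verbatim from the tutorial; the function and variable names are my own), a write callback that collects the response body into a std::string could look like this:
#include <string>
#include <curl/curl.h>

// Appends each chunk libcurl receives to the std::string passed via CURLOPT_WRITEDATA.
static size_t write_to_string(void* contents, size_t size, size_t nmemb, void* userp)
{
    static_cast<std::string*>(userp)->append(static_cast<char*>(contents), size * nmemb);
    return size * nmemb;
}

std::string fetch(const char* url)
{
    std::string body;
    CURL* c = curl_easy_init();
    if (c)
    {
        curl_easy_setopt(c, CURLOPT_URL, url);
        curl_easy_setopt(c, CURLOPT_WRITEFUNCTION, write_to_string);
        curl_easy_setopt(c, CURLOPT_WRITEDATA, &body);
        curl_easy_perform(c);
        curl_easy_cleanup(c);
    }
    return body;  // for the API in the question, this would contain "1" or "0"
}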
This answer refers to the answer from Software_Developer. By rebuilding the code, I found that some parts are deprecated (gethostbyname()) or provide no error handling (socket creation, sending) for an operation.
The following Windows code is tested with Visual Studio 2013 and Windows 8.1 64-bit, as well as Windows 7 64-bit. It targets an IPv4 TCP connection to the web server of www.google.com.
#include <winsock2.h>
#include <WS2tcpip.h>
#include <windows.h>
#include <iostream>
#pragma comment(lib,"ws2_32.lib")
using namespace std;
int main (){
// Initialize Dependencies to the Windows Socket.
WSADATA wsaData;
if (WSAStartup(MAKEWORD(2,2), &wsaData) != 0) {
cout << "WSAStartup failed.\n";
system("pause");
return -1;
}
// We first prepare some "hints" for the "getaddrinfo" function
// to tell it, that we are looking for a IPv4 TCP Connection.
struct addrinfo hints;
ZeroMemory(&hints, sizeof(hints));
hints.ai_family = AF_INET; // We are targeting IPv4
hints.ai_protocol = IPPROTO_TCP; // We are targeting TCP
hints.ai_socktype = SOCK_STREAM; // We are targeting TCP so its SOCK_STREAM
// Acquiring of the IPv4 address of a host using the newer
// "getaddrinfo" function, which supersedes the outdated "gethostbyname".
// It will search for IPv4 addresses using the TCP-Protocol.
struct addrinfo* targetAdressInfo = NULL;
DWORD getAddrRes = getaddrinfo("www.google.com", NULL, &hints, &targetAdressInfo);
if (getAddrRes != 0 || targetAdressInfo == NULL)
{
cout << "Could not resolve the Host Name" << endl;
system("pause");
WSACleanup();
return -1;
}
// Create the Socket Address Informations, using IPv4
// We don't have to take care of sin_zero, it is only used to extend the length of SOCKADDR_IN to the size of SOCKADDR
SOCKADDR_IN sockAddr;
sockAddr.sin_addr = ((struct sockaddr_in*) targetAdressInfo->ai_addr)->sin_addr; // The IPv4 Address from the Address Resolution Result
sockAddr.sin_family = AF_INET; // IPv4
sockAddr.sin_port = htons(80); // HTTP Port: 80
// We have to free the Address-Information from getaddrinfo again
freeaddrinfo(targetAdressInfo);
// Creation of a socket for the communication with the Web Server,
// using IPv4 and the TCP-Protocol
SOCKET webSocket = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);
if (webSocket == INVALID_SOCKET)
{
cout << "Creation of the Socket Failed" << endl;
system("pause");
WSACleanup();
return -1;
}
// Establishing a connection to the web Socket
cout << "Connecting...\n";
if(connect(webSocket, (SOCKADDR*)&sockAddr, sizeof(sockAddr)) != 0)
{
cout << "Could not connect";
system("pause");
closesocket(webSocket);
WSACleanup();
return -1;
}
cout << "Connected.\n";
// Sending a HTTP-GET-Request to the Web Server
const char* httpRequest = "GET / HTTP/1.1\r\nHost: www.google.com\r\nConnection: close\r\n\r\n";
int sentBytes = send(webSocket, httpRequest, strlen(httpRequest),0);
if (sentBytes < strlen(httpRequest) || sentBytes == SOCKET_ERROR)
{
cout << "Could not send the request to the Server" << endl;
system("pause");
closesocket(webSocket);
WSACleanup();
return -1;
}
// Receiving and Displaying an answer from the Web Server
char buffer[10000];
ZeroMemory(buffer, sizeof(buffer));
int dataLen;
while ((dataLen = recv(webSocket, buffer, sizeof(buffer), 0)) > 0)
{
int i = 0;
while (buffer[i] >= 32 || buffer[i] == '\n' || buffer[i] == '\r') {
cout << buffer[i];
i += 1;
}
}
// Cleaning up Windows Socket Dependencies
closesocket(webSocket);
WSACleanup();
system("pause");
return 0;
}
References:
Deprecation of gethostbyname
Return Value of socket()
Return Value of send()
C++ does not provide any way to do it directly. It entirely depends on what platform and libraries you have.
In the worst case, you can use the boost::asio library to establish a TCP connection, send the HTTP headers (RFC 2616), and parse the responses directly. Looking at your application's needs, this is simple enough to do.
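As a rough illustration of that approach (a sketch assuming a reasonably recent Boost, 1.66 or newer for io_context and this resolver interface, plain HTTP on port 80, and example.com as a placeholder host):
#include <iostream>
#include <string>
#include <boost/asio.hpp>

int main()
{
    using boost::asio::ip::tcp;

    boost::asio::io_context io;

    // Resolve the host name and connect to the web server on port 80.
    tcp::resolver resolver(io);
    tcp::socket socket(io);
    boost::asio::connect(socket, resolver.resolve("example.com", "80"));

    // Send a minimal HTTP/1.1 GET request by hand (RFC 2616 style).
    std::string request =
        "GET / HTTP/1.1\r\n"
        "Host: example.com\r\n"
        "Connection: close\r\n\r\n";
    boost::asio::write(socket, boost::asio::buffer(request));

    // Read until the server closes the connection, then dump headers + body.
    boost::system::error_code ec;
    boost::asio::streambuf response;
    boost::asio::read(socket, response, ec);  // ec ends up as eof, which is expected here
    std::cout << &response << std::endl;
}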
Note that this does not require libcurl, Windows.h, or WinSock! No compilation of libraries, no project configuration, etc. I have this code working in Visual Studio 2017 C++ on Windows 10:
#pragma comment(lib, "urlmon.lib")
#include <urlmon.h>
#include <sstream>
using namespace std;
...
IStream* stream;
//Also works with https URL's - unsure about the extent of SSL support though.
HRESULT result = URLOpenBlockingStream(0, "http://google.com", &stream, 0, 0);
if (result != 0)
{
    return 1;
}
char buffer[100];
unsigned long bytesRead;
stringstream ss;
stream->Read(buffer, 100, &bytesRead);
while (bytesRead > 0U)
{
    ss.write(buffer, (long long)bytesRead);
    stream->Read(buffer, 100, &bytesRead);
}
stream->Release();
string resultString = ss.str();
I just figured out how to do this, as I wanted a simple API access script, libraries like libcurl were causing me all kinds of problems (even when I followed the directions...), and WinSock is just too low-level and complicated.
I'm not quite sure about all of the IStream reading code (particularly the while condition - feel free to correct/improve), but hey, it works, hassle free! (It makes sense to me that, since I used a blocking (synchronous) call, this is fine, that bytesRead would always be > 0U until the stream (ISequentialStream?) is finished being read, but who knows.)
See also: URL Monikers and Asynchronous Pluggable Protocol Reference
Here is some code that will work without the need for any third-party library:
First define your gateway, user, password and any other parameters you need to send to this specific server.
#define USERNAME "user"
#define PASSWORD "your password"
#define GATEWAY "your gateway"
Here is the code itself:
HINTERNET hOpenHandle, hResourceHandle, hConnectHandle;
const TCHAR* szHeaders = _T("Content-Type:application/json; charset=utf-8\r\n");
hOpenHandle = InternetOpen(_T("HTTPS"), INTERNET_OPEN_TYPE_DIRECT, NULL, NULL, 0);
if (hOpenHandle == NULL)
{
return false;
}
hConnectHandle = InternetConnect(hOpenHandle,
GATEWAY,
INTERNET_DEFAULT_HTTPS_PORT,
NULL, NULL, INTERNET_SERVICE_HTTP,
0, 1);
if (hConnectHandle == NULL)
{
InternetCloseHandle(hOpenHandle);
return false;
}
hResourceHandle = HttpOpenRequest(hConnectHandle,
_T("POST"),
GATEWAY,
NULL, NULL, NULL, INTERNET_FLAG_SECURE | INTERNET_FLAG_KEEP_CONNECTION,
1);
if (hResourceHandle == NULL)
{
InternetCloseHandle(hOpenHandle);
InternetCloseHandle(hConnectHandle);
return false;
}
InternetSetOption(hResourceHandle, INTERNET_OPTION_USERNAME, (LPVOID)USERNAME, _tcslen(USERNAME));
InternetSetOption(hResourceHandle, INTERNET_OPTION_PASSWORD, (LPVOID)PASSWORD, _tcslen(PASSWORD));
std::string buf;
if (HttpSendRequest(hResourceHandle, szHeaders, 0, NULL, 0))
{
while (true)
{
std::string part;
DWORD size;
if (!InternetQueryDataAvailable(hResourceHandle, &size, 0, 0))break;
if (size == 0)break;
part.resize(size);
if (!InternetReadFile(hResourceHandle, &part[0], part.size(), &size))break;
if (size == 0)break;
part.resize(size);
buf.append(part);
}
}
if (!buf.empty())
{
// Get data back
}
InternetCloseHandle(hResourceHandle);
InternetCloseHandle(hConnectHandle);
InternetCloseHandle(hOpenHandle);
That should work on a Win32 API environment.
Here is an example.
C and C++ don't have a standard library for HTTP or even for socket connections. Over the years some portable libraries have been developed. The most widely used, as others have said, is libcurl.
Here is a list of alternatives to libcurl (coming from the libcurl's web site).
Also, for Linux, this is a simple HTTP client. You could implement your own simple HTTP GET client, but this won't work if there are authentication or redirects involved or if you need to work behind a proxy. For these cases you need a full-blown library like libcurl.
For source code with libcurl, this is the closest to what you want (libcurl has many examples). Look at the main function. The HTML content will be copied to the buffer after a successful connection. Just replace parseHtml with your own function.
The HTTP protocol is very simple, so it is very simple to write an HTTP client.
Here is one
https://github.com/pedro-vicente/lib_netsockets
It uses HTTP GET to retrieve a file from a web server; both server and file are command-line parameters. The remote file is saved to a local copy.
Disclaimer: I am the author
check http.cc
https://github.com/pedro-vicente/lib_netsockets/blob/master/src/http.cc
int http_client_t::get(const char *path_remote_file)
{
    char buf_request[1024];

    //construct request message using class input parameters
    sprintf(buf_request, "GET %s HTTP/1.1\r\nHost: %s\r\nConnection: close\r\n\r\n", path_remote_file, m_server_ip.c_str());

    //send request, using built in tcp_client_t socket
    if (this->write_all(buf_request, (int)strlen(buf_request)) < 0)
    {
        return -1;
    }
    // ... (the rest of the function, which reads and stores the response, is in http.cc in the repository)
}
EDIT: edited URL
You can use the embeddedRest library. It is a lightweight, header-only library, so it is easy to include in your project, and it does not require compilation because there are no .cpp files in it.
Request example from the readme.md in the repo:
#include "UrlRequest.hpp"
//...
UrlRequest request;
request.host("api.vk.com");
const auto countryId = 1;
const auto count = 1000;
request.uri("/method/database.getCities",{
{ "lang", "ru" },
{ "country_id", countryId },
{ "count", count },
{ "need_all", "1" },
});
request.addHeader("Content-Type: application/json");
auto response = std::move(request.perform());
if (response.statusCode() == 200) {
cout << "status code = " << response.statusCode() << ", body = *" << response.body() << "*" << endl;
}else{
cout << "status code = " << response.statusCode() << ", description = " << response.statusDescription() << endl;
}
Here is some (relatively) simple C++11 code that uses libCURL to download a URL's content into a std::vector<char>:
http_download.hh
# pragma once
#include <string>
#include <vector>
std::vector<char> download(std::string url, long* responseCode = nullptr);
http_download.cc
#include "http_download.hh"
#include <curl/curl.h>
#include <algorithm>
#include <sstream>
#include <stdexcept>
using namespace std;
size_t callback(void* contents, size_t size, size_t nmemb, void* user)
{
    auto chunk = reinterpret_cast<char*>(contents);
    auto buffer = reinterpret_cast<vector<char>*>(user);

    size_t priorSize = buffer->size();
    size_t sizeIncrease = size * nmemb;

    buffer->resize(priorSize + sizeIncrease);
    std::copy(chunk, chunk + sizeIncrease, buffer->data() + priorSize);

    return sizeIncrease;
}

vector<char> download(string url, long* responseCode)
{
    vector<char> data;

    curl_global_init(CURL_GLOBAL_ALL);
    CURL* handle = curl_easy_init();
    curl_easy_setopt(handle, CURLOPT_URL, url.c_str());
    curl_easy_setopt(handle, CURLOPT_WRITEFUNCTION, callback);
    curl_easy_setopt(handle, CURLOPT_WRITEDATA, &data);
    curl_easy_setopt(handle, CURLOPT_USERAGENT, "libcurl-agent/1.0");
    CURLcode result = curl_easy_perform(handle);
    if (responseCode != nullptr)
        curl_easy_getinfo(handle, CURLINFO_RESPONSE_CODE, responseCode);
    curl_easy_cleanup(handle);
    curl_global_cleanup();

    if (result != CURLE_OK)
    {
        stringstream err;
        err << "Error downloading from URL \"" << url << "\": " << curl_easy_strerror(result);
        throw runtime_error(err.str());
    }

    return data;
}
If you are looking for an HTTP client library in C++ that is supported on multiple platforms (Linux, Windows, and Mac) for consuming RESTful web services, you have the options below.
QT Network Library - Allows the application to send network requests and receive replies
C++ REST SDK - An emerging third-party HTTP library with PPL support
Libcurl - It is probably one of the most widely used HTTP libraries in the native world.
Is there any way to easily make an HTTP request with C++? Specifically, I want to download the contents of a page (an API) and check the contents to see if it contains a 1 or a 0. Is it also possible to download the contents into a string?
First off... I know this question is 12 years old. However, none of the answers provided gave an example that is "simple" without the need to build some external library.
Below is the simplest solution I could come up with to retrieve and print the contents of a webpage.
Some documentation on the functions utilized in the example below:
// wininet lib :
https://learn.microsoft.com/en-us/windows/win32/api/wininet/
// wininet->internetopena();
https://learn.microsoft.com/en-us/windows/win32/api/wininet/nf-wininet-internetopena
// wininet->intenetopenurla();
https://learn.microsoft.com/en-us/windows/win32/api/wininet/nf-wininet-internetopenurla
// wininet->internetreadfile();
https://learn.microsoft.com/en-us/windows/win32/api/wininet/nf-wininet-internetreadfile
// wininet->internetclosehandle();
https://learn.microsoft.com/en-us/windows/win32/api/wininet/nf-wininet-internetclosehandle
#include <iostream>
#include <WinSock2.h>
#include <wininet.h>
#pragma comment(lib, "wininet.lib")
int main()
{
    // ESTABLISH SOME LOOSE VARIABLES
    const int size = 4096;
    char buf[size];
    DWORD length;

    // ESTABLISH CONNECTION TO THE INTERNET
    HINTERNET internet = InternetOpenA("Mozilla/5.0", INTERNET_OPEN_TYPE_DIRECT, NULL, NULL, NULL);
    if (!internet)
        ExitProcess(EXIT_FAILURE); // Failed to establish connection to internet, Exit

    // ATTEMPT TO CONNECT TO WEBSITE "google.com"
    HINTERNET response = InternetOpenUrlA(internet, "http://www.google.com", NULL, NULL, NULL, NULL);
    if (!response) {
        // CONNECTION TO "google.com" FAILED
        InternetCloseHandle(internet); // Close handle to internet
        ExitProcess(EXIT_FAILURE);
    }

    // READ THE FIRST CHUNK OF THE WEBPAGE IN HTML FORMAT
    // (a full client would call InternetReadFile in a loop until length is 0)
    if (!InternetReadFile(response, buf, size - 1, &length)) {
        // FAILED TO READ CONTENTS OF WEBPAGE
        // Close handles and Exit
        InternetCloseHandle(response); // Close handle to response
        InternetCloseHandle(internet); // Close handle to internet
        ExitProcess(EXIT_FAILURE);
    }
    buf[length] = '\0'; // NUL-terminate so the buffer can be printed as a C string

    // CLOSE HANDLES AND OUTPUT CONTENTS OF WEBPAGE
    InternetCloseHandle(response); // Close handle to response
    InternetCloseHandle(internet); // Close handle to internet
    std::cout << buf << std::endl;

    return 0;
}
Generally I'd recommend something cross-platform like cURL, POCO, or Qt. However, here is a Windows example:
#include <atlbase.h>
#include <msxml6.h>
#include <comutil.h> // _bstr_t
HRESULT hr;
CComPtr<IXMLHTTPRequest> request;

hr = request.CoCreateInstance(CLSID_XMLHTTP60);
hr = request->open(
    _bstr_t("GET"),
    _bstr_t("https://www.google.com/images/srpr/logo11w.png"),
    _variant_t(VARIANT_FALSE),
    _variant_t(),
    _variant_t());
hr = request->send(_variant_t());

// get status - 200 if success
long status;
hr = request->get_status(&status);

// load image data (if url points to an image)
VARIANT responseVariant;
hr = request->get_responseStream(&responseVariant);
IStream* stream = (IStream*)responseVariant.punkVal;
CImage *image = new CImage();
image->Load(stream);
stream->Release();
Although a little bit late, you may prefer https://github.com/Taymindis/backcurl.
It allows you to make HTTP calls in mobile C++ development, and it is suitable for mobile game development.
bcl::init(); // init when using

bcl::execute<std::string>([&](bcl::Request *req) {
    bcl::setOpts(req, CURLOPT_URL, "http://www.google.com",
                 CURLOPT_FOLLOWLOCATION, 1L,
                 CURLOPT_WRITEFUNCTION, &bcl::writeContentCallback,
                 CURLOPT_WRITEDATA, req->dataPtr,
                 CURLOPT_USERAGENT, "libcurl-agent/1.0",
                 CURLOPT_RANGE, "0-200000"
    );
}, [&](bcl::Response * resp) {
    std::string ret = std::string(resp->getBody<std::string>()->c_str());
    printf("Sync === %s\n", ret.c_str());
});

bcl::cleanUp(); // clean up when no more using
All the answers above are helpful. My answer just adds a few points:
Use Boost.Beast: sync example, async example, SSL example (a rough synchronous sketch follows after this list)
Use nghttp2 (example); it supports SSL and HTTP/2
Use Facebook's proxygen; this project comprises the core C++ HTTP abstractions used at Facebook. It's aimed at high performance and concurrency. I recommend installing it with vcpkg, or you will struggle with dependency management. It supports SSL, and it also supports some advanced protocols: HTTP/1.1, SPDY/3, SPDY/3.1, HTTP/2, and HTTP/3
Both nghttp2 and proxygen are stable and can be considered for use in production.
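As mentioned in the first bullet, a rough synchronous Boost.Beast GET looks roughly like this (a sketch assuming Boost 1.70 or newer; host and target are placeholders, and the linked examples are the authoritative versions):
#include <iostream>
#include <boost/asio/io_context.hpp>
#include <boost/asio/ip/tcp.hpp>
#include <boost/beast/core.hpp>
#include <boost/beast/http.hpp>

namespace beast = boost::beast;
namespace http  = beast::http;
using tcp = boost::asio::ip::tcp;

int main()
{
    boost::asio::io_context ioc;

    // Resolve and connect (plain HTTP on port 80; see the SSL example for https).
    tcp::resolver resolver(ioc);
    beast::tcp_stream stream(ioc);
    stream.connect(resolver.resolve("example.com", "80"));

    // Build and send the GET request.
    http::request<http::string_body> req{http::verb::get, "/", 11};
    req.set(http::field::host, "example.com");
    http::write(stream, req);

    // Read the response and print it (status line, headers, and body).
    beast::flat_buffer buffer;
    http::response<http::string_body> res;
    http::read(stream, buffer, res);
    std::cout << res << std::endl;

    // Shut the socket down gracefully.
    beast::error_code ec;
    stream.socket().shutdown(tcp::socket::shutdown_both, ec);
}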
You can use ACE in order to do so:
#include "ace/SOCK_Connector.h"
int main(int argc, ACE_TCHAR* argv[])
{
    //HTTP Request Header
    const char* szRequest = "GET /video/nice.mp4 HTTP/1.1\r\nHost: example.com\r\n\r\n";
    int ilen = strlen(szRequest);

    //our buffer
    char output[16*1024];

    ACE_INET_Addr server(80, "example.com");
    ACE_SOCK_Stream peer;
    ACE_SOCK_Connector connector;
    int ires = connector.connect(peer, server);
    int sum = 0;

    peer.send(szRequest, ilen);
    while (true)
    {
        ACE_Time_Value timeout = ACE_Time_Value(15);
        int rc = peer.recv_n(output, 16*1024, &timeout);
        if (rc == -1)
        {
            break;
        }
        sum += rc;
    }
    peer.close();
    printf("Bytes transferred: %d", sum);

    return 0;
}
The CppRest SDK by MS is what I just found, and after about half an hour I had my first simple web service call working. Compared to the others mentioned here, where I was not able to get anything even installed after hours of trying, I'd say it is pretty impressive.
https://github.com/microsoft/cpprestsdk
Scroll down and click on Documentation, then click on Getting Started Tutorial and you will have a simple app running in no time.
For the record, Cesanta's mongoose library seems to also support this: https://github.com/cesanta/mongoose/blob/6.17/examples/http_client/http_client.c
I want to write a program in C/C++ that will dynamically read a web page and extract information from it. As an example imagine if you wanted to write an application to follow and log an ebay auction. Is there an easy way to grab the web page? A library which provides this functionality? And is there an easy way to parse the page to get the specific data?
Have a look at the cURL library:
#include <stdio.h>
#include <curl/curl.h>
int main(void)
{
  CURL *curl;
  CURLcode res;

  curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "curl.haxx.se");
    res = curl_easy_perform(curl);

    /* always cleanup */
    curl_easy_cleanup(curl);
  }
  return 0;
}
BTW, if C++ is not strictly required, I encourage you to try C# or Java. It is much easier, and there is a built-in way.
Windows code:
#include <winsock2.h>
#include <windows.h>
#include <iostream>
#pragma comment(lib,"ws2_32.lib")
using namespace std;
int main (){
WSADATA wsaData;
if (WSAStartup(MAKEWORD(2,2), &wsaData) != 0) {
cout << "WSAStartup failed.\n";
system("pause");
return 1;
}
SOCKET Socket=socket(AF_INET,SOCK_STREAM,IPPROTO_TCP);
struct hostent *host;
host = gethostbyname("www.google.com");
SOCKADDR_IN SockAddr;
SockAddr.sin_port=htons(80);
SockAddr.sin_family=AF_INET;
SockAddr.sin_addr.s_addr = *((unsigned long*)host->h_addr);
cout << "Connecting...\n";
if(connect(Socket,(SOCKADDR*)(&SockAddr),sizeof(SockAddr)) != 0){
cout << "Could not connect";
system("pause");
return 1;
}
cout << "Connected.\n";
send(Socket,"GET / HTTP/1.1\r\nHost: www.google.com\r\nConnection: close\r\n\r\n", strlen("GET / HTTP/1.1\r\nHost: www.google.com\r\nConnection: close\r\n\r\n"),0);
char buffer[10000];
int nDataLength;
while ((nDataLength = recv(Socket,buffer,10000,0)) > 0){
int i = 0;
while (buffer[i] >= 32 || buffer[i] == '\n' || buffer[i] == '\r') {
cout << buffer[i];
i += 1;
}
}
closesocket(Socket);
WSACleanup();
system("pause");
return 0;
}
There is a free TCP/IP library available for Windows that supports HTTP and HTTPS - using it is very straightforward.
Ultimate TCP/IP
CUT_HTTPClient http;
http.GET("http://folder/file.htm", "c:/tmp/process_me.htm");
You can also GET files and store them in a memory buffer (via CUT_DataSource derived classes). All the usual HTTP support is there - PUT, HEAD, etc. Support for proxy servers is a breeze, as are secure sockets.
You can do it with socket programming, but it's tricky to implement the parts of the protocol needed to reliably fetch a page. Better to use a library, like neon. This is likely to be installed in most Linux distributions. Under FreeBSD use the fetch library.
For parsing the data, because many pages don't use valid XML, you need to implement heuristics, not a real yacc-based parser. You can implement these using regular expressions or a state-transition machine. As what you're trying to do involves a lot of trial and error, you're better off using a scripting language like Perl. Due to the high network latency, you will not see any difference in performance.
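If you do stay in C++, here is a small sketch of the regular-expression route using std::regex (C++11; the pattern is purely illustrative and will not cope with arbitrary real-world HTML):
#include <iostream>
#include <regex>
#include <string>

int main()
{
    // Stand-in for a downloaded page; real input would come from your HTTP client.
    std::string html = "<html><head><title>Auction 123</title></head></html>";

    // Extract the contents of the <title> tag as a simple heuristic example.
    std::smatch match;
    std::regex title_re("<title>(.*?)</title>", std::regex::icase);
    if (std::regex_search(html, match, title_re))
        std::cout << match[1].str() << std::endl;
}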
You don't mention any platform, so I'll give you an answer for Win32.
One simple way to download anything from the Internet is URLDownloadToFile with the IBindStatusCallback parameter set to NULL. To make the function more useful, the callback interface needs to be implemented.
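As a minimal sketch of the simple case (the URL and output path are placeholders; pass a real IBindStatusCallback implementation instead of NULL if you want progress notifications):
#include <windows.h>
#include <urlmon.h>
#pragma comment(lib, "urlmon.lib")

int main()
{
    // Downloads the page to a local file; with NULL there are no progress callbacks.
    HRESULT hr = URLDownloadToFileA(NULL, "http://www.example.com/",
                                    "C:\\temp\\page.html", 0, NULL);
    return SUCCEEDED(hr) ? 0 : 1;
}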
Try using a library, like Qt, which can read data from across a network and get data out of an XML document. This is an example of how to read an XML feed. You could use the eBay feed, for example.
It can be done with the multiplatform Qt library:
QByteArray WebpageDownloader::downloadFromUrl(const std::string& url)
{
    QNetworkAccessManager manager;
    QNetworkReply *response = manager.get(QNetworkRequest(QUrl(url.c_str())));
    QEventLoop event;
    QObject::connect(response, &QNetworkReply::finished, &event, &QEventLoop::quit);
    event.exec();
    return response->readAll();
}
That data can be, e.g., saved to a file or converted to a std::string:
const string webpageText = downloadFromUrl(url).toStdString();
Remember that you need to add
QT += network
to the Qt project configuration (.pro file) to compile the code.