"CURLE_OUT_OF_MEMORY" error when posting via https - c++

I am attempting to write an application that uses libCurl to post SOAP requests to a secure web service. This Windows application is built against libCurl version 7.19.0 which, in turn, is built against openssl-0.9.8i. The pertinent curl-related code follows:
FILE *input_file = fopen(current->post_file_name.c_str(), "rb");
FILE *output_file = fopen(current->results_file_name.c_str(), "wb");
if(input_file && output_file)
{
    struct curl_slist *header_opts = 0;
    CURLcode rcd;
    header_opts = curl_slist_append(header_opts, "Content-Type: application/soap+xml; charset=utf8");
    curl_easy_reset(curl_handle);
    curl_easy_setopt(curl_handle, CURLOPT_NOPROGRESS, 1);
    curl_easy_setopt(curl_handle, CURLOPT_WRITEDATA, output_file);
    curl_easy_setopt(curl_handle, CURLOPT_READDATA, input_file);
    curl_easy_setopt(curl_handle, CURLOPT_URL, fs_service_url);
    curl_easy_setopt(curl_handle, CURLOPT_POST, 1);
    curl_easy_setopt(curl_handle, CURLOPT_HTTPHEADER, header_opts);
    rcd = curl_easy_perform(curl_handle);
    if(rcd != 0)
    {
        current->curl_result = rcd;
        current->curl_error = curl_easy_strerror(rcd);
    }
    curl_slist_free_all(header_opts);
}
When I attempt to execute the URL, curl returns a CURLE_OUT_OF_MEMORY error, which appears to be related to a failure to allocate an SSL context. Has anyone else encountered this problem before?

I had the same problem. Just thought I'd add that, rather than calling the OpenSSL function SSL_library_init() directly, it can be fixed by passing the CURL_GLOBAL_SSL flag to curl_global_init().

After further investigation, I found that this error was due to a failure to initialise the OpenSSL library by calling SSL_library_init().
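A minimal sketch of that initialization using the CURL_GLOBAL_SSL approach mentioned above, assuming libcurl is built against OpenSSL (call it once at program startup, before any other threads exist):
#include <curl/curl.h>

int main()
{
    // Initialize libcurl and its SSL backend once, up front.
    // CURL_GLOBAL_SSL makes libcurl do the SSL_library_init()-style
    // setup itself; CURL_GLOBAL_ALL is the usual safe default.
    curl_global_init(CURL_GLOBAL_SSL);

    // ... create easy handles and perform requests here ...

    curl_global_cleanup();
    return 0;
}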

I encountered the same symptom after upgrading to Ubuntu 16.04, as described in this answer. The solution was to use TLS, like so:
curl_easy_setopt(curl_, CURLOPT_SSLVERSION, CURL_SSLVERSION_TLSv1_2);
Apparently SSLv3 was disabled on Ubuntu 16.04.

Related

What could be reason of getting SSL connect error while posting json request to https url from Windows PC?

I am using libcurl.dll version 7.72.0.0 in Visual Studio for a C++ application.
When I am posting a JSON request to an HTTPS URL on one Windows 7 PC with Service Pack 1, the request is posted successfully. But on another PC with the same OS (Windows 7 Service Pack 1), I am getting error code 35 from curl_easy_perform(): SSL connect error.
What could be the reason for the same C++ code working on one Windows 7 PC and failing on another Windows 7 PC? Both PCs are on the same network.
Following is the C++ code:
curl_easy_setopt(curl, CURLOPT_URL, Url.c_str());
curl_easy_setopt(curl, CURLOPT_HTTPHEADER, header);
// send all data to this function
curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, WriteMemoryCallback);
// we pass our 'chunk' struct to the callback function
curl_easy_setopt(curl, CURLOPT_WRITEDATA, (void *)&sResponse);
// some servers don't like requests that are made without a user-agent
// field, so we provide one
curl_easy_setopt(curl, CURLOPT_USERAGENT, "libcurl-agent/1.0");
curl_easy_setopt(curl, CURLOPT_POSTFIELDS, postthis);
printf("\npostthisData=%s\n", postthis);
if (itsHTTPSRequest)
    curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 0);
// if we don't provide POSTFIELDSIZE, libcurl will strlen() by itself
curl_easy_setopt(curl, CURLOPT_POSTFIELDSIZE, (long)strlen(postthis));
curl_easy_setopt(curl, CURLOPT_CONNECTTIMEOUT, m_RWTimeOut);
curl_easy_setopt(curl, CURLOPT_LOW_SPEED_TIME, m_RWTimeOut);
curl_easy_setopt(curl, CURLOPT_LOW_SPEED_LIMIT, 30L);
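A sketch of how more detail could be gathered from libcurl on the failing PC, assuming the same curl handle as above: CURLOPT_VERBOSE logs the TLS handshake to stderr, and CURLOPT_ERRORBUFFER captures the specific reason behind error code 35.
// Enable verbose output and an error buffer before performing the request,
// so the exact handshake step that fails is visible on the problem PC.
char errbuf[CURL_ERROR_SIZE] = {0};
curl_easy_setopt(curl, CURLOPT_VERBOSE, 1L);
curl_easy_setopt(curl, CURLOPT_ERRORBUFFER, errbuf);
CURLcode res = curl_easy_perform(curl);
if (res != CURLE_OK)
    printf("curl error %d: %s\n", (int)res, errbuf[0] ? errbuf : curl_easy_strerror(res));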

Synchronized curl requests

I'm trying to do HTTP requests to multiple targets, and I need them to run (almost) exactly at the same moment.
I'm trying to create a thread for each request, but I don't know why curl is crashing when doing the perform. I'm using an easy handle per thread, so in theory everything should be OK...
Has anybody had a similar problem? Or does anyone know if the multi interface allows you to choose when to perform all the requests?
Thanks a lot.
EDIT:
Here is an example of the code:
void Clazz::function(std::vector<std::string> urls, const std::string& data)
{
    for (auto it : urls)
    {
        std::thread thread(&Clazz::DoRequest, this, it, data);
        thread.detach();
    }
}

int Clazz::DoRequest(const std::string& url, const std::string& data)
{
    CURL* curl = curl_easy_init();
    curl_slist *headers = NULL;
    headers = curl_slist_append(headers, "Expect:");
    headers = curl_slist_append(headers, "Content-Type: application/json");
    curl_easy_setopt(curl, CURLOPT_POST, 1L);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, data.c_str());
    curl_easy_setopt(curl, CURLOPT_CONNECTTIMEOUT, 15L);
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
    curl_easy_setopt(curl, CURLOPT_FAILONERROR, 1L);
    //curlMutex.lock();
    curl_easy_perform(curl);
    //curlMutex.unlock();
    long responseCode = 404;
    curl_easy_getinfo(curl, CURLINFO_RESPONSE_CODE, &responseCode);
    curl_easy_cleanup(curl);
    curl_slist_free_all(headers);
    return responseCode;
}
I hope this can help, thanks!
Are you calling curl_global_init anywhere? Perhaps rather early in your main() method?
Quoting from http://curl.haxx.se/libcurl/c/curl_global_init.html:
This function is not thread safe. You must not call it when any other thread in the program (i.e. a thread sharing the same memory) is running. This doesn't just mean no other thread that is using libcurl. Because curl_global_init calls functions of other libraries that are similarly thread unsafe, it could conflict with any other thread that uses these other libraries.
Quoting from http://curl.haxx.se/libcurl/c/curl_easy_init.html:
If you did not already call curl_global_init, curl_easy_init does it automatically. This may be lethal in multi-threaded cases, since curl_global_init is not thread-safe, and it may result in resource problems because there is no corresponding cleanup.
It sounds like you're not calling curl_global_init, and letting curl_easy_init take care of it for you. Since you're doing it on two threads simultaneously, you're hitting the thread unsafe scenario, with the lethal result that was mentioned.
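A minimal sketch of that fix, assuming the Clazz::function shown above is what kicks off the requests: do the global initialisation once in main(), before any threads exist.
#include <curl/curl.h>

int main(int argc, char* argv[])
{
    // One-time, non-thread-safe setup, done while only the main thread exists,
    // so curl_easy_init() never has to do it lazily on a worker thread.
    curl_global_init(CURL_GLOBAL_ALL);

    // ... spawn the request threads (e.g. via Clazz::function) and wait for them ...

    // Only call this once every thread that uses libcurl has finished.
    curl_global_cleanup();
    return 0;
}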
After being able to debug properly on the device, I have found that the problem is an old known issue with curl.
http://curl.haxx.se/mail/lib-2010-11/0181.html
After setting CURLOPT_NOSIGNAL on every curl handle, the crash has disappeared. :)
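For reference, that option is set per easy handle, along the lines of:
// Stop libcurl from using signals (e.g. SIGALRM for DNS timeouts),
// which is not safe in multi-threaded programs.
curl_easy_setopt(curl, CURLOPT_NOSIGNAL, 1L);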

Curl returns all ok when bad ip supplied to CURLOPT_DNS_SERVERS

I've made a multithreaded (pthread) C++ program that is configured to use a list of custom DNS servers.
In my tests I've used Google's 8.8.8.8 as the known-good server, and some random IPs like 113.65.123.138 and 13.23.123.87 to test the failure case. But both cases behave the same: the request succeeds.
Curl was built with C-ares support, and I've tested just to be sure:
curl_version_info_data *data = curl_version_info(CURLVERSION_NOW);
cout << endl << "Curl version: " << data->version << endl
     << "AsyncDNS: " << ((data->features & CURL_VERSION_ASYNCHDNS) ? "YES" : "NO") << endl;
// output: Curl version: 7.30.0 \n AsyncDNS: YES
The rest of the code:
curl_easy_setopt(curl, CURLOPT_DNS_SERVERS, thisThreadData->current_dns->dns_str.c_str());
curl_easy_setopt(curl, CURLOPT_DNS_USE_GLOBAL_CACHE, false); // thread safety
curl_easy_setopt(curl, CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
curl_easy_setopt(curl, CURLOPT_CONNECTTIMEOUT, CONNECT_TIMEOUT);
curl_easy_setopt(curl, CURLOPT_TIMEOUT, CONNECTION_TIMEOUT);
curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, true);
curl_easy_setopt(curl, CURLOPT_FAILONERROR, 1);
curl_easy_setopt(curl, CURLOPT_MAXREDIRS, 5);
curl_easy_setopt(curl, CURLOPT_NOSIGNAL, 1);
curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, curl_to_string);
curl_easy_setopt(curl, CURLOPT_WRITEHEADER, &getUrlOutput->header);
curl_easy_setopt(curl, CURLOPT_WRITEDATA, &getUrlOutput->html);
curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, false);
curl_easy_setopt(curl, CURLOPT_SSL_VERIFYHOST, false);
curl_easy_setopt(curl, CURLOPT_ACCEPT_ENCODING, "gzip,deflate");
status = curl_easy_perform(curl);
I've tested the random IPs separately (just in case I'd stumbled on some valid DNS server):
$ host google.com 113.65.123.138
;; connection timed out; no servers could be reached
$ host google.com 13.23.123.87
;; connection timed out; no servers could be reached
What am I missing ?
Update
I've tried libcurl's latest version (7.33.0) and c-ares (1.10.0), with the same outcome.
Also, if I supply a wrong domain in the URL, it returns CURLE_HTTP_RETURNED_ERROR (22) as opposed to CURLE_COULDNT_RESOLVE_HOST (6).
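For reference, a sketch of telling those two outcomes apart from the status variable used in the snippet above (with CURLOPT_FAILONERROR set, an HTTP error response comes back as code 22):
status = curl_easy_perform(curl);
if (status == CURLE_COULDNT_RESOLVE_HOST)       // (6) the name lookup itself failed
    cout << "DNS failure" << endl;
else if (status == CURLE_HTTP_RETURNED_ERROR)   // (22) something answered with an HTTP error
    cout << "HTTP error - e.g. a proxy's error page" << endl;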
Update 2
Forgot to mention that I used an HTTP proxy for the connection, and it turns out that was the important aspect; see the answer.
According to the curl forums, this is expected behaviour, as the DNS lookup is handled by the proxy.
When I am using a proxy, how do I (or can I) control
whether the target url is resolved locally or by the proxy? I have
conditions where I need both choices.
With an HTTP proxy, the client (curl) hands over the full URL to the proxy, and the proxy will resolve the hostname.
If you really want to do it client-side, then you need to
first resolve the name and then "re-arrange" the URL to use IP
numericals only and set the Host: header to contain the host name you
resolved.
In my case, when the proxy's DNS lookup fails, it returns an HTML-formatted error page with the message "host not found" and HTTP status 503. That's why curl passed the DNS checks and said the domain was OK, but failed with CURLE_HTTP_RETURNED_ERROR.
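A minimal sketch of the client-side resolution the quoted reply describes. This is only an illustration: the helper names are made up, it assumes POSIX getaddrinfo()/inet_ntop() and IPv4 only, and it leaves out error handling for the transfer itself.
#include <curl/curl.h>
#include <netdb.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <string>

// Resolve the host locally, then hand the proxy a numerical URL plus an
// explicit Host: header, so the proxy never has to do the DNS lookup.
std::string resolve_ipv4(const std::string& host)
{
    addrinfo hints = {};
    addrinfo *res = nullptr;
    hints.ai_family = AF_INET;
    if (getaddrinfo(host.c_str(), nullptr, &hints, &res) != 0 || res == nullptr)
        return "";
    char buf[INET_ADDRSTRLEN] = {0};
    inet_ntop(AF_INET, &((sockaddr_in *)res->ai_addr)->sin_addr, buf, sizeof(buf));
    freeaddrinfo(res);
    return buf;
}

void fetch_via_proxy(CURL *curl, const std::string& host, const std::string& path)
{
    std::string ip = resolve_ipv4(host);        // resolved by this machine, not the proxy
    std::string url = "http://" + ip + path;    // numerical-only URL handed to the proxy
    curl_slist *headers = curl_slist_append(nullptr, ("Host: " + host).c_str());
    curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_perform(curl);
    curl_slist_free_all(headers);
}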

Determine extension when downloading a file with curl?

I've created a program that downloads subtitles from the Internet using curl.
How can I know the extension of the file I downloaded (.zip or .rar)?
Here is my code (it is part of a function):
FILE* download = fopen("download.zip", "wb"); // I assume it's a zip
curl = curl_easy_init();
curl_easy_setopt(curl, CURLOPT_POST, 1);
curl_easy_setopt(curl, CURLOPT_POSTFIELDS, post_data.c_str());
curl_easy_setopt(curl, CURLOPT_USERAGENT, "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322)");
curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write2file);
curl_easy_setopt(curl, CURLOPT_WRITEDATA, download);
res = curl_easy_perform(curl);
if(res != CURLE_OK)
    std::cout << "Download Failed" << std::endl;
curl_easy_cleanup(curl);
fclose(download);
You can use the curl_easy_getinfo() function, it has the content-type info:
CURLINFO_CONTENT_TYPE
Pass a pointer to a char pointer to receive the content-type of the
downloaded object. This is the value read from the Content-Type:
field. If you get NULL, it means that the server didn't send a valid
Content-Type header or that the protocol used doesn't support this.
Using the info you can give the right extension to the downloaded file by renaming it on disk.
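A sketch of how that lookup could slot into the code above; it must run after curl_easy_perform() and before curl_easy_cleanup(). The content-type string checked for, the rename step, and the extra <cstring>/<string> includes are assumptions for illustration.
// The returned pointer is owned by libcurl; query it before curl_easy_cleanup().
char *content_type = NULL;
curl_easy_getinfo(curl, CURLINFO_CONTENT_TYPE, &content_type);
std::string ext = ".zip";                               // default guess
if (content_type && strstr(content_type, "rar"))        // e.g. "application/x-rar-compressed"
    ext = ".rar";
// Later, after fclose(download), rename the file accordingly, e.g.
// rename("download.zip", ("download" + ext).c_str());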

How (i.e. what tool to use) to monitor headers sent by Curl (Cookie problem)

I am using Curl (libcurl) in a C++ application, and am unable to send cookies (I think).
I have Fiddler, TamperData and LiveHTTP Headers installed, but they are only useful for viewing browser traffic, and are (it would seem) incapable of monitoring general network traffic on a machine, so when I run my app, I can't see the header information being sent. However, when I view the page in a browser, when successfully logged on, I can see that cookie information is being sent.
When running my app, I successfully log onto the page; when I subsequently try to fetch another page, the (page) data suggests that I am not logged on - i.e. "state" has somehow been lost.
My C++ code looks alright, so I don't know what is going wrong - this is why I need to:
First, be able to view my machine's network traffic (not just browser traffic) - which (free) tool?
Assuming I am using Curl incorrectly, what's wrong with my code? (The cookies are being retrieved and stored OK; it seems they are just not being sent with requests for some reason.)
Here is the section of my class that deals with the cookie side of Http requests:
curl_easy_setopt(curl, CURLOPT_TIMEOUT, long(m_timeout));
curl_easy_setopt(curl, CURLOPT_USERAGENT,
"Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; WOW64; SV1; .NET CLR 2.0.50727)");
curl_easy_setopt(curl, CURLOPT_COOKIEFILE, "cookies.txt");
curl_easy_setopt(curl, CURLOPT_COOKIEJAR, "cookies.txt");
curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
curl_easy_setopt(curl, CURLOPT_AUTOREFERER, 1L);
curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, CurlCallback);
curl_easy_setopt(curl, CURLOPT_WRITEDATA, this);
Is there anything wrong with the above code?
You can use Wireshark (the former Ethereal) to view all the network traffic a machine is sending and receiving.
As Sean Carpenter said, Wireshark is the right tool to view network traffic. Start a capture and use http as a filter to see only HTTP traffic. If you just want to see the HTTP requests/responses sent/received by Curl, set the CURLOPT_VERBOSE option and look at stderr: curl_easy_setopt(curl, CURLOPT_VERBOSE, 1L).
I believe you are using Curl correctly. Compile and run the following (complete) example; you will see that, the second time you run it (when cookies.txt exists), cookies are sent to the server.
Example code:
#include <stdio.h>
#include <curl/curl.h>

int main()
{
    CURL *curl;
    CURLcode success;
    char errbuf[CURL_ERROR_SIZE];
    int m_timeout = 15;

    if ((curl = curl_easy_init()) == NULL) {
        perror("curl_easy_init");
        return 1;
    }
    curl_easy_setopt(curl, CURLOPT_ERRORBUFFER, errbuf);
    curl_easy_setopt(curl, CURLOPT_VERBOSE, 1L);
    curl_easy_setopt(curl, CURLOPT_TIMEOUT, long(m_timeout));
    curl_easy_setopt(curl, CURLOPT_URL, "http://www.google.com/");
    curl_easy_setopt(curl, CURLOPT_USERAGENT,
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; WOW64; SV1; .NET CLR 2.0.50727)");
    curl_easy_setopt(curl, CURLOPT_COOKIEFILE, "cookies.txt");
    curl_easy_setopt(curl, CURLOPT_COOKIEJAR, "cookies.txt");
    curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
    curl_easy_setopt(curl, CURLOPT_AUTOREFERER, 1L);

    if ((success = curl_easy_perform(curl)) != 0) {
        fprintf(stderr, "%s: %s\n", "curl_easy_perform", errbuf);
        return 1;
    }
    curl_easy_cleanup(curl);
    return 0;
}