I'm developing an application in C++ (cross-platform: Windows, Mac and Linux) that needs to communicate securely with servers over HTTPS using libcurl (built with winssl/darwinssl/openssl on Windows/Mac/Linux respectively). I've changed a curl option, CURLOPT_SSL_VERIFYPEER, from 0 to 1, which should help prevent MitM attacks.
This has caused issues; an initial search points to turning that option back off, but after digging deeper I found:
Get a CA certificate that can verify the remote server and use the proper option to point out this CA cert for verification when connecting. For libcurl hackers: curl_easy_setopt(curl, CURLOPT_CAPATH, capath); from curl docs
and
Get a better/different/newer CA cert bundle! One option is to extract the one a recent Firefox browser uses by running 'make ca-bundle' in the curl build tree root, or possibly download a version that was generated this way for you.
from curl docs
I actually point CURLOPT_CAINFO at the bundle, as I had seen mention of issues using CURLOPT_CAPATH on Windows; curl docs. I have downloaded and installed this bundle along with the application on Windows and Mac, and I'd like to know whether this is the correct way to do it or whether there is a better practice.
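For reference, the relevant part of the setup is roughly this (a minimal sketch; the URL is a placeholder and the bundle path is simply wherever the installer puts curl-ca-bundle.crt):

#include <curl/curl.h>

CURL *curl = curl_easy_init();
if (curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/api");   // placeholder URL
    // Verify the server certificate and hostname.
    curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 1L);
    curl_easy_setopt(curl, CURLOPT_SSL_VERIFYHOST, 2L);
    // Point at the CA bundle installed alongside the application.
    curl_easy_setopt(curl, CURLOPT_CAINFO, "../share/my_application/curl-ca-bundle.crt");
    CURLcode res = curl_easy_perform(curl);   // handle res as usual
    curl_easy_cleanup(curl);
}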
Initially this caused issues for users of the application running behind certain corporate networks or proxies, which seemed to get fixed by building libcurl against winssl instead of openssl on Windows. It was potentially disguising itself as a firewall issue; that's still unclear, although it seems likely.
Sorry for the length.
Is anything silly about installing the ca-cert-bundle.crt along with the application, and is there anything that should be done differently to communicate securely with the server from this installed application?
A slightly separate, but still very related, issue I have is CURLOPT_CAINFO on Linux giving the error:
error setting certificate verify locations:
CAfile: ../share/my_application/curl-ca-bundle.crt
CApath: none
Attempting to open the file for reading from within the application does, however, work successfully. Edit: I solved this issue by NOT setting CURLOPT_CAINFO on Linux (leaving it unset) and adding the ca-certificates package as a dependency of the application package. The default path is correctly /etc/ssl/certs/ca-certificates.crt and it seems to be working. To me this feels a bit better than installing the bundle with the application.
Edit2: Although this is solved, it appears the ca-certificates package sometimes installs ca-bundle.crt instead of ca-certificates.crt, and the locations vary across distros, as this source, happyassassin.net, shows: different Linux systems store the CA bundles in different locations. It did not seem to have a clear answer as to HOW to handle this. Should I be using a value in the configuration file that the user can then modify, or does anyone have other thoughts on the subject?
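One idea I'm toying with (just a sketch, not something from the answers here) is to probe a short list of well-known bundle locations at startup and only fall back to a config-file override or the shipped bundle if none of them exists; the paths below are the usual Debian/Ubuntu, RHEL/Fedora and openSUSE locations:

#include <curl/curl.h>
#include <fstream>

// Well-known CA bundle locations on common distros.
static const char *kCaBundlePaths[] = {
    "/etc/ssl/certs/ca-certificates.crt",                 // Debian/Ubuntu
    "/etc/pki/tls/certs/ca-bundle.crt",                   // RHEL/CentOS/Fedora
    "/etc/ssl/ca-bundle.pem",                             // openSUSE
    "/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem",  // newer Fedora/RHEL
};

// Set CURLOPT_CAINFO to the first bundle that exists; otherwise leave
// libcurl's compiled-in default alone.
void configureCaBundle(CURL *curl)
{
    for (const char *path : kCaBundlePaths) {
        if (std::ifstream(path).good()) {
            curl_easy_setopt(curl, CURLOPT_CAINFO, path);
            return;
        }
    }
    // No known bundle found: rely on the build-time default or a config override.
}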
Edit3: Some users have pointed out that my name exists in one of the paths curl looks in. I'm not entirely sure how that is possible, as the only thing I've specified for curl is where I built the openssl/cares libraries...
I realize this is a loaded/multipart question, but it is all on the same subject, as the title states. I'd appreciate any help.
Thanks.
In my opinion, it is better to use the system certificates than to package certificates with the application (unless you are using some special certs). For Linux it should be easy, according to https://serverfault.com/questions/394815/how-to-update-curl-ca-bundle-on-redhat, and for Windows you can either use winssl or create the file from the system store: https://superuser.com/questions/442793/why-cant-curl-properly-verify-a-certificate-on-windows
Configure cURL to use the default system cert store
A default libcurl build is set up to attempt to use the "right" CA bundle.
Linux
A libcurl built on Linux will scan and check where the CA store is located on your system and use that. If you install libcurl on a regular Linux distro, it should've been built to use the distro's "typical" CA store.
macOS
If you build libcurl for mac and tell it to use the Secure Transport backend, it will automatically use the macOS CA store. So will the default-installed curl and libcurls that come shipped bundled with macOS from Apple.
Windows
If you build libcurl for Windows to use Schannel (the Windows TLS system), it will by default use the Windows CA store.
Other setups
If you deviate from these setups, you basically opt to not use the CA store that comes bundled in the operating system you're using. Then you need to handle and update the CA store yourself.
Related
I have an application which is statically linked with all dependent libraries, so in the end I have a single binary file for Windows and Linux.
Is there a way to set the CA certificate at compile time, so its content will be included in the binary and it will not be necessary to ship it alongside the application binary?
If you use the native SSL library on Windows (sometimes referred to as winssl), there's no need to ship any CA cert at all, since curl will then use the CA store that Windows provides internally.
If you built libcurl to use OpenSSL, you can set a callback that makes verification use a fixed, built-in CA store. This is shown in the cacertinmem example on the curl web site, using the CURLOPT_SSL_CTX_FUNCTION option.
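A condensed version of that approach looks roughly like this (a sketch assuming an OpenSSL-backed libcurl; the PEM string is whatever bundle you compile into the binary):

#include <curl/curl.h>
#include <openssl/bio.h>
#include <openssl/pem.h>
#include <openssl/ssl.h>
#include <openssl/x509.h>

// Add every certificate from the in-memory PEM bundle to the X509 store
// of the SSL_CTX that libcurl is about to use for this connection.
static CURLcode add_builtin_ca(CURL *curl, void *sslctx, void *userptr)
{
    (void)curl;
    const char *pem = (const char *)userptr;   // compiled-in CA bundle
    BIO *bio = BIO_new_mem_buf(pem, -1);
    X509_STORE *store = SSL_CTX_get_cert_store((SSL_CTX *)sslctx);
    X509 *cert = NULL;
    while ((cert = PEM_read_bio_X509(bio, NULL, NULL, NULL)) != NULL) {
        X509_STORE_add_cert(store, cert);
        X509_free(cert);
    }
    BIO_free(bio);
    return CURLE_OK;
}

// Usage:
//   curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 1L);
//   curl_easy_setopt(curl, CURLOPT_SSL_CTX_FUNCTION, add_builtin_ca);
//   curl_easy_setopt(curl, CURLOPT_SSL_CTX_DATA, (void *)pem_bundle);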
Shipping an app with a fixed internal CA cert store might be troublesome when the services your app is using update or change their certs along the way, so maybe using an external file that you can update occasionally is still a better idea?
I'm a novice to Qt, developing a cross-platform app which requires SSL authentication on both the server and client sides. The .pem-based encryption is working on Linux, Android and Windows; however, there are problems with Mac OS X. Our code looks like below:
QFile privateKeyFile(":/Certificate.pem"); // --> has certificate + key
privateKeyFile.open(QIODevice::ReadOnly | QIODevice::Text);
setLocalCertificateChain(QSslCertificate::fromPath(":/Certificate.pem", QSsl::Pem));
setPrivateKey(QSslKey(privateKeyFile.readAll(), QSsl::Rsa));
In the above code, privateKey().isNull() returns true on Mac. When we referred to this post, it said that Mac doesn't support .pem-based encryption:
The Secure Transport back-end to curl only supports client IDs that are in PKCS#12 (P12) format; it does not support client IDs in PEM format because Apple does not allow us to create a security identity from an identity file in PEM format without using a private API. And we can't use the private API, because apps that use private API are not allowed in any of Apple's app stores.
With my limited understanding, I took this to mean that .pem is not a good idea for SSL communication with the server. Please stop me if that's wrong!
Hence, we decided to move to .pfx for all the platforms. We already had a .pfx file with a passphrase. We converted the above code to be compatible with .pfx (i.e. "Certificate.pfx"; we had this old file along with "Certificate.pem"). Instead of QSsl::Pem, we tried QSsl::Der. But as expected, it didn't work. However, there was no encryption error either, but we are sure that we are doing something wrong. :-)
We referred to this post and tried to regenerate a .pfx from the .pem, but that also didn't help.
QSslCertificate::importPkcs12 fails to parse PFX file
In the above case, QSslCertificate::importPkcs12() returns false for the original .pfx file. Even if we generate a new .pfx from the command line, it also fails in that function.
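For reference, the call pattern is roughly the usual one (a sketch, not our exact code; the file name and passphrase are placeholders, and whether this succeeds can depend on the TLS backend Qt was built with):

#include <QFile>
#include <QList>
#include <QSslCertificate>
#include <QSslKey>

// Load a PKCS#12 (.pfx/.p12) identity: private key, certificate and any
// CA certificates bundled in the file.
bool loadIdentity(QSslKey &key, QSslCertificate &cert, QList<QSslCertificate> &caCerts)
{
    QFile pfx("Certificate.pfx");            // placeholder file name
    if (!pfx.open(QIODevice::ReadOnly))
        return false;
    return QSslCertificate::importPkcs12(&pfx, &key, &cert, &caCerts,
                                         "passphrase");   // placeholder passphrase
}

// On success the pieces go to the socket/configuration as usual, e.g.
// setLocalCertificateChain(QList<QSslCertificate>() << cert) and setPrivateKey(key).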
Question: Can someone help with the exact way of performing the .pfx encryption with the server?
.pem authentication is also fine.
Note:
Server supports both .pfx & .pem. That we confirmed with regular C OpenSSL libraries. But we want to achieve it using Qt.
We are open to formats other than .pfx, provided they work on all the platforms.
DISCLAIMER: I am writing this off the top of my head, since I don't personally own a Mac and cannot verify it anymore.
We had this exact problem about a year or two ago at my last job.
It all boils down to Apple dropping support for OpenSSL.
Because of that, Qt switched from the OpenSSL backend to the Secure Transport backend on Mac with Qt 5.6. The Secure Transport implementation is lacking some features; for example, we were not able to load private-key .pem files. I think switching from PKCS#8 to PKCS#1 helped; both can be stored in .pem files and look almost identical, so that took a while to figure out.
We also noticed that a successfully loaded private key will be stored inside the Mac's key store and could be viewed and exported from there by the user, which we also did not want.
We finally went with recompiling the QtNetwork module to use OpenSSL instead of Secure Transport. You will need to provide OpenSSL for that, since OS X does not include the headers anymore. A Homebrew installation was sufficient, I think. Other than that, the compilation was surprisingly painless and fast, since you just have to compile one small module, not the whole of Qt.
The easiest way to do this is:
download the source distribution of the Qt version you are running
./configure it to use OpenSSL (the -openssl switch I believe)
cd into the network folder
make
copy the generated QtNetwork.framework into your Qt installation, replacing the existing one.
With that everything worked as expected.
I have an application using the Java QuickFIX library. I am trying to port it to C++. The problem is that the Java version of the library seems to send the data over an SSL connection, while the C++ library sends the data unencrypted. In Java, SSL is enabled internally in the library when I pass the config file to it. The exact same file is passed to the C++ version, but SSL is not turned on in that case.
Please help me: how can I use QuickFIX with SSL in C++?
Config file (sorry for the ??; there's too much sensitive information):
[default]
# QuickFixJ specific parameters (please do not modify)
FileStorePath=logs/session/
MessageProcessingDelay=6000
# Default parameter settings for your client (modify/add as needed), to be applied to all sessions.
ConnectionType=initiator
StartTime=00:00:00
EndTime=00:00:00
HeartBtInt=30
ReconnectInterval=5
BeginString=FIX.4.4
SocketConnectHost=???.???.???.???
SocketConnectPort=??????
TargetCompID=??????
Username=??????
Password=??????
UseDataDictionary=N
#The following three lines are needed for Apache Mina SSL support only.
SocketUseSSL=Y
SocketKeyStore=config/ssl/ApacheMina/keystore.jks
SocketKeyStorePassword=?????????
#Declare and configure quote and trade sessions
# beginning with a '[session]' designator for each session.
[session]
SenderCompID=????????
Account=???????????
[session]
SenderCompID=???????
Account=??????
Thanks for the config file.
I have never used the SocketUseSSL=Y switch.
Instead, in our production environment, in order to encrypt data over SSL we are using a free piece of software: stunnel.
It's quite straightforward to install and configure, and it runs on many different platforms.
Hope this can help.
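For illustration, a client-side stunnel configuration for this kind of setup looks roughly like this (a sketch; the service name, ports and remote host are placeholders). The FIX engine then connects in plain text to the local accept address, and stunnel carries the traffic to the counterparty over TLS:

; Client mode: accept plain-text connections locally, forward them over TLS.
client = yes

; Point QuickFIX's SocketConnectHost/SocketConnectPort at the accept address.
[fix-session]
accept = 127.0.0.1:9876
connect = fix.broker.example:443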
QuickFIX has added SSL support in the recent C++ versions.
I'm in the process of creating a utility to back up users' media files. The media isn't being shared, etc.; it's only a backup utility.
I'm trying to think of the best way to protect users from ISPs accusing them of downloading illegal media files by using some sort of secure connection.
The utility is written in C++ using the Qt lib and so far I've only been able to find the QtSslSocket component for secure connections. The domain already has a valid SSL certificate for the next few years.
Can anyone suggest the best way to go about implementing this from both the server and client side? i.e. what does the server need to have in place, and is there anything in particular the backup utility needs to implement from the client side to ensure secure transactions?
Are there any known, stable sftp or ftps servers available etc?
As far as I know, Qt doesn't have support for secure FTP transfers.
Not sure what other info would be useful to make the question any clearer, but any advice or help pointing me in the right direction will be most welcome.
EDIT: I'm also Java-competent, so a Java solution will work just as well...
As Martin wrote, you can wrap a command-line client. But if you don't want to do that, you can use libssh.
I searched for some sort of solution to this for a couple of days and then forgot about the problem. Then today I stumbled across this little gem in the Qt Creator source: Utils::ssh, which includes support for SFTP, plain-old SSH, and all sorts of goodies.
Disentangling stuff from Qt Creator can be a pain, but having gone through this process it amounts to grabbing Botan (one of the other libs in Qt Creator) + Utils.
When it rains, it pours: I found two solutions to this problem in an hour. One is http://nullget.sourceforge.net/ (requires Chinese translation), but from their summary:
NullGet is written with Qt, runs on multiple platforms, and is multi-threaded, multi-protocol HTTP download software with a GUI. With NullGet you can easily download a variety of network protocol data streams with faster download speeds; the protocols currently supported are HTTP, HTTPS, FTP, MMS and RTSP. It can run on most currently popular operating systems, including Windows, Linux, FreeBSD and so on.
The easiest way would be to just wrap a command-line sftp client with a Qt front end.
On the server side, any SSH server should do SFTP pretty much out of the box.
As Synthesizerpatel says, Qt Creator implements SFTP. So I have isolated the library that contains SSH and SFTP and created a new project named QSsh on GitHub (https://github.com/lvklabs/QSsh). The aim of the project is to provide SSH and SFTP support for any Qt application.
I have written an example on how to upload a file using SFTP in examples/SecureUploader/
I hope it might be helpful
I need to implement email notifications for a C++ project. Basically a user provides all the relevant information for their email account, and on certain events this component would fire off an email. Ideally I would like to find a small, cross-platform, open-source command-line project that I can exec from within my project and parse the output. Something like Blat, but one that also supports SSL connections and can be used on both Windows (XP and 2003) and Linux (Ubuntu 6.06 and 8.04).
I could also use a library if it were simple enough and licensed under a commercial friendly license, but would be open to hearing all suggestions.
Thank you very much in advance for any recommendations
(A) One option is to use XMail:
http://www.xmailserver.org/
The readme file has instructions on how to build it on Linux and Windows:
http://www.xmailserver.org/Readme.html
If you look at the forums:
http://xmailforum.homelinux.net/
or do some Google searches you should be able to find more information on how to use it.
(B) Another, possibly easier, option would be to just make your application connect to and use an external SMTP server to send your notifications.
To compose the email, libmime (http://www.mozilla.org/mailnews/arch/libmime-description.html) can be helpful.
To send the mail, libsmtp (http://libsmtp.berlios.de/) can be used.
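Another route, not mentioned above but already in use elsewhere in this thread, is libcurl's SMTP support (assuming a libcurl new enough to have it, 7.20+, built with TLS); it handles SSL on both Windows and Linux. A rough sketch, where the server, credentials and addresses are placeholders:

#include <curl/curl.h>
#include <cstring>

// Message body; libcurl pulls it through the read callback below.
static const char *kMessage =
    "To: user@example.com\r\n"
    "From: notifier@example.com\r\n"
    "Subject: event notification\r\n"
    "\r\n"
    "Something happened.\r\n";

static size_t read_body(char *buffer, size_t size, size_t nitems, void *userp)
{
    const char **pos = (const char **)userp;
    size_t room = size * nitems;
    size_t len = strlen(*pos);
    if (len > room) len = room;
    memcpy(buffer, *pos, len);
    *pos += len;
    return len;
}

bool sendNotification()
{
    CURL *curl = curl_easy_init();
    if (!curl) return false;

    const char *cursor = kMessage;
    struct curl_slist *recipients = NULL;
    recipients = curl_slist_append(recipients, "<user@example.com>");

    curl_easy_setopt(curl, CURLOPT_URL, "smtps://smtp.example.com:465");  // placeholder server
    curl_easy_setopt(curl, CURLOPT_USERNAME, "notifier@example.com");
    curl_easy_setopt(curl, CURLOPT_PASSWORD, "secret");
    curl_easy_setopt(curl, CURLOPT_MAIL_FROM, "<notifier@example.com>");
    curl_easy_setopt(curl, CURLOPT_MAIL_RCPT, recipients);
    curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_body);
    curl_easy_setopt(curl, CURLOPT_READDATA, &cursor);
    curl_easy_setopt(curl, CURLOPT_UPLOAD, 1L);

    CURLcode res = curl_easy_perform(curl);

    curl_slist_free_all(recipients);
    curl_easy_cleanup(curl);
    return res == CURLE_OK;
}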
All the protocol and SSL code for my email client is available in Lgi:
http://www.memecode.com/lgi.php
It's LGPL, so you could use it as a DLL/SO. However, it's not packaged as ready-to-use binaries; you'd have to build it yourself and write some glue using the SMTP and MIME code. The SSL sockets stuff uses OpenSSL and works on both Linux and Windows.
I ended up using the Perl script sendEmail. A Windows binary was available, and building a new binary after modifying the Perl script was not hard to do at all. The script also had no issues running in the LTS Ubuntu environments after the required Debian packages were installed.