Client Server C++ Windows Application

Suppose I have 3 computers each collecting data and storing that data in files on the hard drive. I would like those computers to send those files to a 4th computer. What is the simplest way to accomplish this?

The simplest way I can think of would be to have the 4th computer advertise a shared network drive, and then have the 3 computers mount that drive as a pseudo-local drive (N:\ or whatever). Then all the 3 computers have to do is write or copy the files onto N:\whatever_folder. No network programming required.
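A minimal sketch of that copy step with the Win32 API, assuming the share is already mounted as N:\ and using hypothetical file names:

    #include <windows.h>
    #include <iostream>

    int main()
    {
        // Source file produced by this collector (hypothetical name).
        const wchar_t* localFile  = L"C:\\data\\collector.dat";

        // Destination on the 4th computer's share; a UNC path such as
        // L"\\\\server\\share\\collector.dat" works just as well.
        const wchar_t* remoteFile = L"N:\\whatever_folder\\collector.dat";

        // FALSE = overwrite the destination if it already exists.
        if (!CopyFileW(localFile, remoteFile, FALSE))
        {
            std::wcerr << L"CopyFile failed, error " << GetLastError() << L"\n";
            return 1;
        }
        return 0;
    }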

Create an FTP server on the 4th computer, and have each of the 3 data-collectors upload their files there.
How to establish an FTP server is beyond the scope of this question, or even SO. Ask on serverfault (I think).
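If you go the FTP route, the upload side on each collector can be a few WinInet calls. A rough sketch, assuming a hypothetical server address, credentials, and paths:

    #include <windows.h>
    #include <wininet.h>
    #include <iostream>

    #pragma comment(lib, "wininet.lib")

    int main()
    {
        // Open a WinInet session and connect to the FTP server on the 4th machine.
        HINTERNET hInet = InternetOpenW(L"uploader", INTERNET_OPEN_TYPE_DIRECT,
                                        nullptr, nullptr, 0);
        if (!hInet) return 1;

        HINTERNET hFtp = InternetConnectW(hInet, L"192.168.1.40",
                                          INTERNET_DEFAULT_FTP_PORT,
                                          L"user", L"password",
                                          INTERNET_SERVICE_FTP, 0, 0);
        if (!hFtp) { InternetCloseHandle(hInet); return 1; }

        // Upload the local file to the server as a binary transfer.
        BOOL ok = FtpPutFileW(hFtp, L"C:\\data\\collector.dat",
                              L"/incoming/collector.dat",
                              FTP_TRANSFER_TYPE_BINARY, 0);
        if (!ok)
            std::cerr << "upload failed, error " << GetLastError() << '\n';

        InternetCloseHandle(hFtp);
        InternetCloseHandle(hInet);
        return ok ? 0 : 1;
    }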

Related

Connecting SAS and FTP server (or network drive)

I'm completely new to SAS (and all these technical things), but I need to connect a network drive (or FTP?) to SAS so I can copy or send files to it. I guess it should look like another server in a workspace or something. I only know its address (maybe it will give you some clue):
\\10.48.42.166\exportsexternal1\oaDiez0Naf
The problem is that I don't even know how to google it correctly - all the links are mainly about FILENAME in FTP.

Automated file transfer between two Macs using an ethernet cable?

Quick background, I am an intern at a company assigned to a project that I have no experience with, and I need some help trying to figure out where to start.
The goal of the project is to transfer very large chunks of data from a database to a PC, and then to a Mac. I am trying to code the communication between the PC and the Mac (this has to be done in C++; I've heard Python is easier, but I have to use C++). Some requirements are that the PC and Mac be directly connected via an ethernet cable, and neither computer will have access to the internet. The data transfer needs to be automated, so whenever the PC detects that it has received a full dataset from the database, it transfers the data to the Mac. I cannot use any third-party software to do this.
So far, through the research I've done, I think I need to set up a TCP server-client network. I've been using the code here (http://cs.ecs.baylor.edu/~donahoo/practical/CSockets/practical/) as a guideline for socket coding. I am first trying to test this by sending files between two Macs (I don't have access to a PC at the moment). Any guidelines as to where I go from here would be helpful. I have looked into setting up static IP addresses and such, but I get stuck from there.
I don't expect anyone to code this for me, I am just new to socket coding and this sort of project, so just looking for a nudge in the right direction. Thanks!
Before you start coding, keep in mind that to connect PC to Mac you may need a crossover cable.
Then do some reading on wired ad-hoc networks. The last post in this discussion may help.
Finally, configure and mount shared volumes (using the stock software, no 3rd parties involved), and don't use the low-level socket interface.
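If you do end up testing with raw sockets anyway, the sending side can be quite small. A sketch assuming a hypothetical static IP (192.168.2.2), port (5000), and file name, with the receiver simply writing everything it reads until the connection closes:

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstdio>

    int main()
    {
        int sock = socket(AF_INET, SOCK_STREAM, 0);
        if (sock < 0) { perror("socket"); return 1; }

        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_port   = htons(5000);                     // hypothetical port
        inet_pton(AF_INET, "192.168.2.2", &addr.sin_addr); // hypothetical peer

        if (connect(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
            perror("connect");
            return 1;
        }

        FILE* f = fopen("dataset.bin", "rb");              // hypothetical file
        if (!f) { perror("fopen"); return 1; }

        // Stream the file in fixed-size chunks, handling partial sends.
        char buf[4096];
        size_t n;
        while ((n = fread(buf, 1, sizeof(buf), f)) > 0) {
            size_t off = 0;
            while (off < n) {
                ssize_t sent = send(sock, buf + off, n - off, 0);
                if (sent < 0) { perror("send"); return 1; }
                off += static_cast<size_t>(sent);
            }
        }

        fclose(f);
        close(sock);   // the receiver treats EOF as end-of-file
        return 0;
    }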

iPod synchronization with PC (moving files to and from an iPod Touch)

I'm building a PC application in C++, and I need to synchronize some TXT data from my PC to an iPod Touch and vice versa.
How to do it? I have no idea.
Can I synchronize with USB port, or network?
Without paying a (very expensive) license fee to Apple, your program will not be able to talk to the iPod. However, if the iPod is jailbroken there are a number of ways you can do it.
If you only want to talk to jailbroken iPods we may still be able to help, but if you want stock devices it is likely out of the price range a single person could afford. Please update your question if you are willing to support only jailbroken iPods.
Correction: there are some things you can do. If a custom app from the App Store is installed on the iPod, iTunes does allow syncing with that app for programs like ebook readers, so you may be able to use this to sync. You could also connect to a server while your app is running and get the latest TXT file that way.
But there is no way to just grab a file from, or put a file on, the iPod; the app running on the device is trapped in its own sandbox and cannot see or touch any other files on the iPod.

Secure file upload with Qt

I'm in the process of creating a utility to back up a user's media files. The media isn't being shared or anything; it's only a backup utility.
I'm trying to think of the best way to protect users from ISPs accusing them of downloading illegal media files by using some sort of secure connection.
The utility is written in C++ using the Qt lib and so far I've only been able to find the QtSslSocket component for secure connections. The domain already has a valid SSL certificate for the next few years.
Can anyone suggest the best way to go about implementing this from both the server and client side? I.e., what does the server need to have in place, and is there anything in particular the backup utility needs to implement on the client side to ensure secure transactions?
Are there any known, stable sftp or ftps servers available etc?
As far as I know, Qt doesn't have support for secure FTP transfers.
Not sure what other info would be useful to make the question any clearer, but any advice or help pointing me in the right direction will be most welcome.
EDIT: I'm also Java-competent, so a Java solution will work just as well...
As Martin wrote, you can wrap a command-line client. But if you don't want to do that, you can use libssh.
I searched for some sort of solution to this for a couple of days and then forgot about the problem. Then today I stumbled across this little gem in the Qt Creator source: Utils::ssh, which includes support for SFTP, plain-old SSH, and all sorts of goodies.
Disentangling stuff from Qt Creator can be a pain, but having gone through the process, it amounts to grabbing Botan (one of the other libs in Qt Creator) plus Utils.
When it rains, it pours - I found two solutions to this problem in an hour. One is http://nullget.sourceforge.net/ (requires Chinese translation), but from their summary:
NullGet is written with Qt and runs on multiple platforms. It is multi-threaded, multi-protocol GUI download software. With NullGet you can easily download a variety of network protocol data streams at fast download speeds; the protocols currently supported are HTTP, HTTPS, FTP, MMS, and RTSP. It can run on most currently popular operating systems, including Windows, Linux, FreeBSD and so on.
Easiest way would be to just wrap a command-line sftp client with a Qt front end.
On the server side, any SSH server (OpenSSH, for example) should do SFTP pretty much out of the box.
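For the wrapping approach, QProcess can drive scp (or a command-line sftp) directly. A sketch assuming scp is on the PATH, key-based authentication is already set up, and the host, user, and paths are hypothetical:

    #include <QProcess>
    #include <QStringList>
    #include <QDebug>

    bool secureUpload(const QString& localFile)
    {
        QProcess scp;
        QStringList args;
        args << "-B"                                  // batch mode: never prompt
             << localFile
             << "backup@backup.example.com:/backups/";

        scp.start("scp", args);
        if (!scp.waitForFinished(60 * 1000)) {        // 60 s timeout
            qWarning() << "scp timed out";
            return false;
        }
        return scp.exitStatus() == QProcess::NormalExit && scp.exitCode() == 0;
    }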
As Synthesizerpatel says, Qt Creator implements SFTP. So I have isolated the library that contains SSH and SFTP and created a new project named QSsh on GitHub (https://github.com/lvklabs/QSsh). The aim of the project is to provide SSH and SFTP support for any Qt application.
I have written an example on how to upload a file using SFTP in examples/SecureUploader/
I hope it might be helpful

What would you use to implement a fast and lightweight file server?

I need to have, as part of a desktop application, a file server which should respond as fast as possible to file transfer requests (from remote clients, usually located on the same LAN). There will be many requests for small files. The server should be able to provide both upload and download services.
I am not tied to any particular technology, so I am open to any programming language, toolkit, or library, as long as it can run on Windows.
My initial take is to go with a C/C++ implementation using Windows Sockets, or to use the services provided by libraries such as Boost (Asio or the like). I have also thought of Erlang, but I would have to learn it, so the performance benefits would have to justify the increased development time.
LATER EDIT: I appreciate the answers that say use FTP or HTTP or basically anything that already exists, but supposing you still wanted to write one from scratch, what would you do?
Why not just go with FTP? You should be able to find an adequate server implementation in any language, and client access libraries too.
It sounds like a lot of wheel-reinvention. Granted, FTP is not ideal, and has a few odd spots, but ... it's there, it's standard, well-known, and already very widely implemented.
For frequent uploads of small files, the fastest way would be to implement your own proprietary protocol, but that would require a considerable amount of work - and it would also be non-standard, meaning future integration would be difficult unless you can implement your protocol in every client you'll need to support. If you choose to do it anyway, this is my suggestion for a simple protocol (see the packing sketch after the list):
Command: 1 byte to identify what'll be done (0x01 for upload request, 0x02 for download request, 0x11 for upload response, 0x12 for download response, etc.)
File name: can be fixed-size, or prefixed with a length byte (assuming the name is at most 255 bytes)
Checksum: MD5, for instance (if upload request or download response)
File size (if upload request or download response)
Payload (if upload request or download response)
This could be implemented on top of a simple TCP socket. You could also use UDP, avoiding the cost of establishing a connection, but then you have to deal with retransmission yourself.
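To make the framing concrete, here is a rough sketch of packing the upload-request header described above; the field widths and byte order are illustrative choices, not a fixed specification:

    #include <cstdint>
    #include <string>
    #include <vector>

    std::vector<uint8_t> buildUploadRequest(const std::string& name,
                                            const uint8_t md5[16],
                                            uint64_t fileSize)
    {
        std::vector<uint8_t> buf;
        buf.push_back(0x01);                               // command: upload request
        buf.push_back(static_cast<uint8_t>(name.size()));  // name length (assumes < 256)
        buf.insert(buf.end(), name.begin(), name.end());   // file name
        buf.insert(buf.end(), md5, md5 + 16);              // MD5 checksum
        for (int i = 0; i < 8; ++i)                        // file size, big-endian
            buf.push_back(static_cast<uint8_t>(fileSize >> (56 - 8 * i)));
        // ...followed by fileSize bytes of payload on the same connection.
        return buf;
    }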
Before deciding to implement your own protocol, take a look at HTTP libraries like libcurl; you could make your server use standard HTTP commands like GET for download and POST for upload. This would save a lot of work, and you'd be able to test the downloads with any web browser.
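For the libcurl route, the download side is only a handful of calls. A sketch with a hypothetical URL and file name (uploads can be done similarly with libcurl's POST options):

    #include <curl/curl.h>
    #include <cstdio>

    int main()
    {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        CURL* curl = curl_easy_init();
        if (!curl) return 1;

        // Download: libcurl's default write callback fwrite()s the body
        // into whatever FILE* is set as CURLOPT_WRITEDATA.
        FILE* out = fopen("file.bin", "wb");
        if (!out) return 1;
        curl_easy_setopt(curl, CURLOPT_URL, "http://server.local/files/file.bin");
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);
        CURLcode rc = curl_easy_perform(curl);
        fclose(out);

        curl_easy_cleanup(curl);
        curl_global_cleanup();
        return rc == CURLE_OK ? 0 : 1;
    }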
Another suggestion to improve performance: use as the file repository not the filesystem but something like SQLite. You can create a single table containing a text column for the file name and a blob column for the file contents. Since SQLite is lightweight and does efficient caching, you'll avoid the disk-access overhead most of the time.
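A sketch of that idea with the SQLite C API, assuming a table created as CREATE TABLE files(name TEXT PRIMARY KEY, data BLOB); the table and function names are illustrative:

    #include <sqlite3.h>
    #include <string>
    #include <vector>

    bool storeFile(sqlite3* db, const std::string& name,
                   const std::vector<char>& contents)
    {
        sqlite3_stmt* stmt = nullptr;
        const char* sql =
            "INSERT OR REPLACE INTO files(name, data) VALUES(?, ?)";
        if (sqlite3_prepare_v2(db, sql, -1, &stmt, nullptr) != SQLITE_OK)
            return false;

        // Bind the file name and the raw contents as a blob.
        sqlite3_bind_text(stmt, 1, name.c_str(), -1, SQLITE_TRANSIENT);
        sqlite3_bind_blob(stmt, 2, contents.data(),
                          static_cast<int>(contents.size()), SQLITE_TRANSIENT);

        bool ok = sqlite3_step(stmt) == SQLITE_DONE;
        sqlite3_finalize(stmt);
        return ok;
    }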
I'm assuming you don't need client authentication.
Finally: although C++ is your preference for raw native-code speed, that is rarely the major bottleneck in this kind of application. Most probably it will be disk access and network bandwidth. I mention this because in Java you could probably write a servlet that does exactly the same thing (HTTP GET for download and POST for upload) in less than 100 lines of code. Use Derby instead of SQLite in that case, drop the servlet into any container (Tomcat, Glassfish, etc.), and you're done.
If all the machines are running on Windows on the same LAN, why do you need a server at all? Why not simply use Windows file sharing?
I would suggest not using FTP, SFTP, or any other technique that sets up and tears down a connection per transfer. Instead, go for a protocol or technique that avoids that per-file overhead.
The reason is that if you need lots of small files uploaded or downloaded with the fastest possible response, you want to avoid the cost of establishing and destroying connections for every file.
I would suggest looking at either using an existing implementation or implementing your own HTTP or HTTPS server/service; with persistent (keep-alive) connections, one connection can carry many small transfers.
Your bottlenecks are likely to come from one of the following sources:
Hard disk I/O - the WD VelociRaptor is supposed to have a random access speed of about 100 MB/s. It also matters whether you set the disks up as RAID 0, 1, 5, or the like; some configurations read fast but write slowly. Trade-offs.
Network I/O - assuming you have the fastest hard disks in a fast RAID setup, unless you use gigabit networking, the network will be the slow link. Even if your pipes are big, you still need to keep them supplied with data.
Memory cache - the in-memory filesystem cache needs to be big enough to buffer all the network I/O so that it does not slow you down. That will require a large amount of memory for the kind of workload you're looking at.
Filesystem structure - assuming you have gigabytes' worth of memory, the bottleneck will most likely be the data structure you use for the filesystem. If the filesystem structure is cumbersome, it will slow you down.
Only once all the other problems are solved do you need to worry about the application itself. Notice that most of the bottlenecks are outside your software's control, so whether you code it in C/C++ or use specific libraries, you will still be at the mercy of the OS and the hardware.
Sounds like you should use an SFTP (SSH) server; it's firewall/NAT-safe, secure, and already does what you want and more. You could also use Samba or Windows file sharing for an even simpler implementation.
Why not use something existing? A normal web server, for example, handles lots of small files (images) very well and very fast.
And lots of people have already spent time optimizing that code.
The second benefit is that the transfer is done over HTTP, which is an established protocol, and it is easily switched to SSL if you need more security.
Uploads are also no problem with a script or a custom module, and with the same method you can add authorization.
As long as you don't need to seek within the files dynamically, I'd guess this would be one of the best solutions.
Is it a new part of an existing desktop application? What's the goal of the server? Is it protecting the files that are uploaded/downloaded and providing authentication and/or authorisation? Does it provide some kind of structure for the uploads to be stored in?
One option may be to install Apache HTTP Server on the machine and serve the files via that. Use POST to upload and GET to download.
If the clients are within a LAN could you not just share a drive?