Sending Files With Sockets - C++

I'm trying to create a basic HTTP server to learn more about how it works. I'm having difficulties with sending binary files to the client. My code is as below:
char * buffer = (char *)malloc(sizeof(char) * 512);
fseek(content_file, 0, SEEK_SET);
while (!feof(content_file)) {
    size_t read = fread(buffer, sizeof(char), sizeof(buffer), content_file);
    if (read > 0) {
        client->send((const void *)buffer);
    }
}
fclose(content_file);
free(buffer);
Now I know it can send some unnecessary data after the last block read, but before trying to fix that, I want to know what's wrong with it. It was working fine for text files when I was using fgets, but after switching to fread to support binary files, text files get corrupted and become something like this: ThisÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍÍ ("This" is the only correct part of the sent data)
Obviously I'm missing something, but can you please help me do this correctly?
Edit:
Using a buffer_size value instead of sizeof(buffer) fixed the missing/corrupted data problem.

Your problem is that sizeof(buffer) gives you the size of the pointer, not of what it points to.
Add a buffer_size variable and use it both for malloc and fread.
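For reference, here is a minimal sketch of the corrected loop (untested). It assumes the client wrapper has a send() overload that also takes a byte count; the one-argument call in the question has no way of knowing how many bytes to write:
const size_t buffer_size = 512;
char *buffer = (char *)malloc(buffer_size);
fseek(content_file, 0, SEEK_SET);
size_t bytes_read;
// fread() returns the number of items read; with item size 1 that is the byte
// count, and a zero result covers both EOF and errors, so no feof() test is needed
while ((bytes_read = fread(buffer, 1, buffer_size, content_file)) > 0) {
    client->send((const void *)buffer, bytes_read); // assumed two-argument overload
}
fclose(content_file);
free(buffer);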

Related

recv() returns strange buffer C++

I've known C++ for quite a long time, but only started using it for my own purposes about a year and a half ago.
I started learning network programming in C++, and my first networking project is "File Transferring between hosts over TCP/IP", which sounds kind of easy, but I am stuck with sending data.
I am trying to send a small buffer of less than 4KB, so buffer[4096] works fine for me, but I am planning to expand this. The WSAStartup(), socket(), bind(), listen() and accept() functions work fine and their values are initialised for both Server and Client, but I am dealing with other problems, maybe in recv(), send(), etc.
I still couldn't find the source of the problem.
Also, it would be a ton of help if somebody gave me an example of transferring files over TCP/IP, not in one packet; I want the file to be chunked and sent in parts, or as it's called, a "ring model", but I couldn't find a working model.
P.S. This is the first time I am asking here; please give feedback about how well all of this is written, so that I can write more informative questions for the community. Thanks :)
Server
char* buffer = new char[4096];
ZeroMemory(buffer, sizeof(buffer));
ofstream file("a.txt", ios::binary);
int err = recv(conn, buffer, sizeof(buffer), 0);
file << buffer;
file.close();
if (err == 0)
{
    printf("Client disconnected...\n");
}
printf("Quitting...\n");
delete[] buffer;
Client
ifstream file("a.txt", ios::binary);
file.seekg(0, ios::end);
int size = file.tellg();
file.seekg(0, ios::beg);
char* buffer = new char[size];
file.read(buffer, size);
file.close();
int err = send(client, buffer, size, 0);
if (err == 0)
{
    printf("Disconnecting...\n");
}
printf("Quitting...\n");
delete[] buffer;
"a.txt" file on Client side is 45 bytes in here are 45 * 'a'
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
And this is what I get on Server side, file size is 14 bytes
aaaaaaaa ’pÈ/
In C++, sizeof(buffer) is the size of the pointer type.
You may want to read up on more modern (as in after 1998) C++. We have std::vector nowadays, and that has a convenient size method. It would return 4096 for your buffer. Also, vector handles new[] and delete[] for you.
The fact that you get 8 "a"'s suggests that you built for x64. The remaining bytes are garbage; you should check how many bytes recv actually wrote to buffer. You cannot assume that you got all the bytes you asked for (whether that's 8 or 4096).
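For illustration, a minimal sketch of that suggestion (untested), reusing conn and the output file from the question's server code:
std::vector<char> buffer(4096);           // sized, zero-initialized, self-freeing
int n = recv(conn, buffer.data(), (int)buffer.size(), 0);
if (n > 0)
    file.write(buffer.data(), n);         // write only the bytes recv() actually produced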
I believe that sizeof(buffer) in this line:
int err = recv(conn, buffer, sizeof(buffer), 0);
will return sizeof(char*), which is 4 bytes in a 32-bit program or 8 bytes in a 64-bit program, instead of 4096, because buffer is not a static array (you did not declare it as char buffer[4096]). So either declare it as char buffer[4096] or change the call to
int err = recv(conn, buffer, 4096, 0);
Two additional points:
TCP is a streaming protocol (not "message based"), so there's no guarantee that a single recv() will get everything sent in a single send().
The server line file << buffer; assumes buffer is zero-terminated.
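A receive loop that respects both points might look like this (a sketch, untested, assuming conn is set up as in the question):
std::ofstream file("a.txt", std::ios::binary);
char buffer[4096];                        // a real array, so sizeof(buffer) == 4096
for (;;) {
    int n = recv(conn, buffer, sizeof(buffer), 0);
    if (n == 0) {                         // graceful close: the whole stream arrived
        printf("Client disconnected...\n");
        break;
    }
    if (n == SOCKET_ERROR) {              // inspect WSAGetLastError() for the cause
        printf("recv failed: %d\n", WSAGetLastError());
        break;
    }
    file.write(buffer, n);                // n may be anything from 1 to 4096
}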
MSDN states:
"If no error occurs, recv returns the number of bytes received and the buffer pointed to by the buf parameter will contain this data received. If the connection has been gracefully closed, the return value is zero. Otherwise, a value of SOCKET_ERROR is returned, and a specific error code can be retrieved by calling WSAGetLastError."
https://learn.microsoft.com/en-us/windows/win32/api/winsock/nf-winsock-recv
Test whether you actually read 45 bytes, and check if there's an error (the WSAGetLastError function).

Is readsome() appropriate to read binary data on Windows?

Context: I am trying to read the content of a PNG picture in C++ to send it later to my Android app. To do so, I open the file in binary mode, read its content in chunks of 512 bytes, then send the data to the app. I'm on Windows.
Issue: I use an ifstream instance and the readsome() function as shown below, and it returns 512, which is what I expected since I asked it to read 512 bytes. However, it seems that I am far from really having 512 bytes in my buffer, which confuses me. While I debug my program step by step, the number of chars in the buffer seems random, but is never 512 as expected.
Code:
int currentByteRead = 0;
std::ifstream fl(imgPath.toStdString().c_str(), ios_base::binary);
fl.seekg( 0, std::ios::end );
int length = fl.tellg();
char *imgBytes = new char[512];
fl.seekg(0, std::ios::beg);
// Send the img content by blocks of 512 bytes
while(currentByteRead + 512 < length) {
    int nbRead = fl.readsome(imgBytes, 512); // nbRead is always set to 512 here
    if(fl.fail()) {
        qDebug() << "Error when reading file content";
    }
    sendMessage(...);
    currentByteRead += 512;
    imgBytes = new char[512];
}
// Send the remaining data
int nbRemainingBytes = length - currentByteRead;
fl.readsome(imgBytes, nbRemainingBytes);
sendMessage(...);
fl.close();
currentByteRead += nbRemainingBytes;
The length I get at the beginning is the correct one, and it seems there is no error. But it is as if not all the data was copied into the buffer during the readsome() call.
Questions: Did I misunderstand something about the readsome() function? Is there something related to Windows causing this behaviour? Is there a more appropriate way to proceed?
I finally found a way to do what I wanted, and as suggested by David Herring I will put my answer here.
My thoughts about the issue: if I use a std::ifstream::pos_type variable instead of an int, the correct number of bytes is read and put in the buffer. This was not the case when using an int, as if the chars were only written to the buffer up to a given (random?) point. I am not sure I understand why this behaviour occurred. My guess was that I had issues with '\n' characters, but the randomness of the final content of the buffer is still unclear to me.
Correction: this is the working code I finally arrived at. Starting from this, I was able to do what I had in mind.
std::ifstream ifs(imgPath.toStdString().c_str(), std::ios::binary|std::ios::ate);
std::ifstream::pos_type pos = ifs.tellg();
int length = ifs.tellg();
std::vector<char> result(pos);
ifs.seekg(0, std::ios::beg);
ifs.read(result.data(), pos);
ifs.close();
I hope this will help others. Thank you David for your suggestions.
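If you still want to send the file in 512-byte chunks rather than all at once, plain read() plus gcount() is the usual tool; readsome() only extracts characters that are already in the stream's internal buffer, which is rarely what you want when draining a whole file. A sketch along the lines of the original loop (untested; sendMessage stands in for the sending code elided in the question):
std::ifstream fl(imgPath.toStdString().c_str(), std::ios::binary);
char imgBytes[512];
while (fl.read(imgBytes, sizeof(imgBytes)) || fl.gcount() > 0) {
    std::streamsize nbRead = fl.gcount(); // bytes actually placed in imgBytes
    sendMessage(imgBytes, nbRead);        // hypothetical two-argument form
}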

Copying binary jpg data into a buffer

void jpgToBuff(const char* srcfilename)
{
    FILE* file = fopen(srcfilename, "rb");
    fseek(file, 0, SEEK_END);
    unsigned long fileLen = ftell(file);
    fseek(file, 0, SEEK_SET);
    char* file_data;
    file_data = (char *)malloc((fileLen + 1) * sizeof(char));
    fread(file_data, fileLen, 1, file);
    fclose(file);
}
Am I doing this correctly? I want to eventually send this information through a socket and decode it on the other side. Any suggestions would be super helpful. Is it theoretically possible to send this through a socket and decode it into an image on the other side?
Well, you're not doing enough error checking. fopen, fseek, ftell, malloc, fread and fclose can all fail. Failures would likely result in a crash or other unexpected results.
fread might return fewer characters than you attempted to read, so you should probably check for that too.
You've allocated +1 byte, presumably so you could add a terminating '\0'? But you left that final byte uninitialized. A jpeg file could reasonably contain an embedded '\0', so adding a terminating '\0' is probably not going to get you the results you're hoping for.
Finally, sizeof(char) is defined by the standards to be 1, always. So, you might as well multiply by 1, or better yet, don't.
Other than that it looks basically right.
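For illustration, a sketch with those checks filled in (untested; the function name, the out-parameter, and returning the buffer to the caller are illustrative additions, since the original version discards what it reads):
/* Read a whole binary file into a malloc'd buffer.
   Returns NULL on any failure; on success *out_len receives the length. */
char* fileToBuff(const char* srcfilename, unsigned long* out_len)
{
    FILE* file = fopen(srcfilename, "rb");
    if (!file) return NULL;
    if (fseek(file, 0, SEEK_END) != 0) { fclose(file); return NULL; }
    long fileLen = ftell(file);
    if (fileLen < 0) { fclose(file); return NULL; }
    if (fseek(file, 0, SEEK_SET) != 0) { fclose(file); return NULL; }
    char* file_data = (char *)malloc(fileLen);
    if (!file_data) { fclose(file); return NULL; }
    /* check that fread delivered every byte we asked for */
    if (fread(file_data, 1, fileLen, file) != (size_t)fileLen) {
        free(file_data);
        fclose(file);
        return NULL;
    }
    fclose(file);
    *out_len = (unsigned long)fileLen;
    return file_data;
}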

Reading a file located in memory with libavformat

I'm currently trying to read small video files sent from a server.
In order to read a file using libavformat, you are supposed to call
av_open_input_file(&avFormatContext, "C:\\path\\to\\video.avi", 0, 0, 0);
The problem is that in this case the file is not on the disk, but in memory.
What I'm doing for the moment is downloading the file, writing it on the disk using a temporary name, and then calling av_open_input_file with the temporary file name, which is not a very clean solution.
In fact what I want is a function like av_open_custom(&avFormatContext, &myReadFunction, &mySeekFunction); but I didn't find any in the documentation.
I guess it is technically possible, since the name of the file is not something that helps the library determine which format it is using.
So is there a function like this, or an alternative to av_open_input_file?
It's funny how I always find the solution by myself right after I post the question on this site, even though I've been working on this problem for hours.
In fact you have to initialize avFormatContext->pb before calling avformat_open_input, and pass it a fake filename.
This is not written in the documentation, but in a comment directly in the library's source code.
Example code, if you want to load from an istream (untested, just so somebody who has the same problem can get the idea):
static int readFunction(void* opaque, uint8_t* buf, int buf_size) {
    auto& me = *reinterpret_cast<std::istream*>(opaque);
    me.read(reinterpret_cast<char*>(buf), buf_size);
    return me.gcount();
}
std::ifstream stream("file.avi", std::ios::binary);
const std::shared_ptr<unsigned char> buffer(reinterpret_cast<unsigned char*>(av_malloc(8192)), &av_free);
const std::shared_ptr<AVIOContext> avioContext(avio_alloc_context(buffer.get(), 8192, 0, reinterpret_cast<void*>(static_cast<std::istream*>(&stream)), &readFunction, nullptr, nullptr), &av_free);
const auto avFormat = std::shared_ptr<AVFormatContext>(avformat_alloc_context(), &avformat_free_context);
auto avFormatPtr = avFormat.get();
avFormat->pb = avioContext.get();
avformat_open_input(&avFormatPtr, "dummyFilename", nullptr, nullptr);
This is great information and helped me out quite a bit, but there are a couple of issues people should be aware of: libavformat can and will mess with the buffer you gave to avio_alloc_context. This leads to really annoying double-free errors or possibly memory leaks. When I started searching for the problem, I found https://lists.ffmpeg.org/pipermail/libav-user/2012-December/003257.html which nailed it perfectly.
My workaround when cleaning up after this is to go ahead and call
av_free(avioContext->buffer)
and then set my own buffer pointer (the one allocated for the avio_alloc_context call) to NULL, if I care.
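In code, that workaround looks roughly like this (a sketch; myIoBuffer stands for whatever pointer you originally passed to avio_alloc_context):
// Free whatever buffer the context owns at teardown; libavformat may have
// replaced the original block, so this is the only pointer safe to free.
av_free(avioContext->buffer);
avioContext->buffer = NULL;
myIoBuffer = NULL; // our own copy may be stale now; never av_free() it separately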
Tomaka17's excellent answer gave me a good start toward solving an analogous problem using a Qt QIODevice rather than a std::istream. I found I needed to blend aspects of Tomaka17's solution with aspects of the related experience at http://cdry.wordpress.com/2009/09/09/using-custom-io-callbacks-with-ffmpeg/
My custom Read function looks like this:
int readFunction(void* opaque, uint8_t* buf, int buf_size)
{
    QIODevice* stream = (QIODevice*)opaque;
    int numBytes = stream->read((char*)buf, buf_size);
    return numBytes;
}
...but I also needed to create a custom Seek function:
int64_t seekFunction(void* opaque, int64_t offset, int whence)
{
    if (whence == AVSEEK_SIZE)
        return -1; // I don't know "size of my handle in bytes"
    QIODevice* stream = (QIODevice*)opaque;
    if (stream->isSequential())
        return -1; // cannot seek a sequential stream
    if (!stream->seek(offset))
        return -1;
    return stream->pos();
}
...and I tied it together like this:
...
const int ioBufferSize = 32768;
unsigned char * ioBuffer = (unsigned char *)av_malloc(ioBufferSize + FF_INPUT_BUFFER_PADDING_SIZE); // can get av_free()ed by libav
AVIOContext * avioContext = avio_alloc_context(ioBuffer, ioBufferSize, 0, (void*)(&fileStream), &readFunction, NULL, &seekFunction);
AVFormatContext * container = avformat_alloc_context();
container->pb = avioContext;
avformat_open_input(&container, "dummyFileName", NULL, NULL);
...
Note I have not yet worked out the memory management issues.

C++ gsoap mime/dime for binary files in windows

I'm pretty close to losing my head here ;)
I'm developing a service that uses gsoap. I would like to return a mime response.
I have everything working, but when reading binary files, all kinds of files like jpeg, pdf, etc. contain the \0 char several times throughout the data (if opened with Notepad you can see a lot of NUL).
So any code that reads the raw file as a string fails miserably once it finds the first \0 char. I have tried to replace the \0, but then the file becomes incorrect to display.
I have also tried several methods including the example that comes with gsoap.
So, to sum up:
Generic fstream code doesn't work.
for (i = 0; i < MAX_FILE_SIZE; i++)
{
    if ((c = fgetc(fd)) == EOF)
        break;
    image.__ptr[i] = c;
}
This doesn't work either.
QFile::ReadAll works, but when converting the QString to char* the array is trimmed at the first NUL.
So, what is the best approach for reading an entire binary file? It's crazy how C++ can trip you up on the basics sometimes.
Thanks in advance.
I have tried this, as retnick suggested below:
UrlToPdf urlToPdf;
urlToPdf.getUrl(&input, &result);
QByteArray raw = urlToPdf.getPdf(QString(result.data.c_str()));
int size = raw.toBase64().size();
char* arraydata = new char[size];
strcpy(arraydata, raw.toBase64().data());
soap_set_mime(this, "MIME_boundary", NULL);
if(soap_set_mime_attachment(this, arraydata, size, SOAP_MIME_BASE64, "application/pdf", NULL, NULL, NULL))
{
    soap_clr_mime(this);
    soapMessage = this->error;
}
but no luck... the mime response is bigger than the actual file...
David G Ortega
To read binary files, use fread().
Once you have read the data, treat it as an array of bytes, not as a string. No string functions allowed.
EDIT: The gSOAP documentation section 14.1 explains how to send MIME attachments. I only refer to the relevant function (please read it all).
int soap_set_mime_attachment(struct soap *soap, char *buf_ptr, size_t buf_size,
enum soap_mime_encoding encoding,
const char *type, const char *id,
const char *location, const char *description);
char *buf_ptr is your buffer.
size_t buf_size is the length of your buffer.
So just do your QFile::ReadAll.
This gives you back a QByteArray. The QByteArray has the method
QByteArray QByteArray::toBase64 () const
which returns a base64-encoded copy of the array:
QByteArray base64image = rawImage.toBase64();
So now just do
soap_set_mime(soap, "MIME_boundary", "<boundary.xml#just-testing.com>");
/* add a base64 encoded image (base64image points to base64 data) */
soap_set_mime_attachment(soap,
base64image.data(), base64image.size(),
SOAP_MIME_BASE64, "image/jpeg",
"<boundary.jpeg#just-testing.com>", NULL, NULL);
I have not tested this but should be close to finished.
QFile::ReadAll works, but when converting the QString to char* the array is trimmed at the first NUL.
Are you sure it's actually trimmed, or can you just not print/view the array in the debugger (since C-style strings are 0-terminated)?
If the QString itself is not enough for your needs, you may want to convert it to a std::vector or similar using the range constructor or range assign; you'll have a lot less grief regarding how much data the container holds.
EDIT:
Here's some sample code for fstream reading from a binary file:
std::ifstream image( <image_file_name>, std::ios_base::in | std::ios_base::binary );
// istreambuf_iterator, not istream_iterator: istream_iterator skips whitespace,
// which would silently corrupt binary data
std::istreambuf_iterator< char > image_begin( image ), image_end;
std::vector< char > vctImage( image_begin, image_end );
The std::ios_base::binary is the most important part of the thing (similar to fopen/fread ["rb"] & probably QFile has something similar)
Also posting some sample code usually helps in getting the right answer.
HIH
I have the solution for this... As retnick suggested, I tried his idea, but it failed because I didn't understand it well enough... From a logical point of view retnick was right... but the truth is that any kind of string manipulation, whether with Qt's QByteArray, std or mem functions, is going to stop when it finds the first \0 char. Qt's QString can hold the data without problems, but when converting it to a C string (char*) the data will again be trimmed at the first \0.
I found that QDataStream::readRawData reads the file into a char*, given the size to read. So that's how I got the job done...
QFile file("test.pdf");
file.open(QIODevice::ReadOnly);
int size = file.size();
char* buffer = new char[size];
QDataStream stream(&file);
stream.readRawData(buffer, size);
soap_set_mime(this, "MIME_boundary", NULL);
if(soap_set_mime_attachment(this, buffer, size, SOAP_MIME_BINARY, "application/pdf", NULL, NULL, NULL))
{
soap_clr_mime(this);
soapMessage = this->error;
}
Note that in the line
if(soap_set_mime_attachment(this, buffer, size, SOAP_MIME_BINARY, "application/pdf", NULL, NULL, NULL))
I'm still using the size variable instead of sizeof(buffer) or any other approach, since anything that scans the buffer's contents would again trim the data at the first \0 (and sizeof(buffer) would only give the size of the pointer)...
Hope this helps...
David G Ortega