I want to send an image through RabbitMQ-C, but the image file is too big and the receiver cannot retrieve it. So I converted the image to base64 and put it in JSON:
const char *msg;
FILE *image1;
if (image1 = fopen(path, "rb")) {
    fseek(image1, 0, SEEK_END);  // move the file pointer to the end of the file (returns 0 on success)
    length = ftell(image1);      // ftell() gives the total size of the file once the pointer is at the end
    sprintf(tmp, "size of file: %d bytes", length);

    // convert image to base64
    std::string line;
    std::ifstream myfile;
    myfile.open(path, std::ifstream::binary);
    std::vector<char> data((std::istreambuf_iterator<char>(myfile)), std::istreambuf_iterator<char>());

    std::string base64_encode(unsigned char const* bytes_to_encode, unsigned int in_len); // declaration of the encoder used below
    std::string code = base64_encode((unsigned char*)&data[0], (unsigned int)data.size());

    // convert std::string to const char*
    const char* base64_Image = code.c_str();

    json j;
    j.push_back("Title");
    j.push_back("content");
    j.push_back(base64_Image);
    std::string sa = j.dump();
    msg = sa.c_str(); // convert std::string to const char*
}
else {
    return;
}
I then use RabbitMQ-C to send the message (msg) to the receiver, but it fails at this line:

respo = amqp_basic_publish(conn, 1, amqp_cstring_bytes(exchange), amqp_cstring_bytes(routing_key), 0, 0, NULL, amqp_cstring_bytes(msg));

Is it that a const char* cannot be converted to amqp_bytes_t with amqp_cstring_bytes(msg)? I get this error:

If there is a handler for this exception, the program may be safely continued.
Does anyone know how to send an image as JSON using RabbitMQ-C and C++?
amqp_cstring_bytes expects a C string, which is normally terminated by a NUL byte. Your PNG file is almost guaranteed to contain a NUL byte, so that explains why your message got cut off midway through.
As for the code in your paste: the pointer returned by sa.c_str() is only valid while sa is alive and unmodified. Once you exit the block containing sa's definition, the variable is dead and buried.
Instead, get a buffer of the appropriate size with amqp_bytes_malloc and use that:
amqp_bytes_t bytes = amqp_bytes_malloc(sa.length());
memcpy(bytes.bytes, sa.data(), sa.length()); // memcpy, not strncpy: embedded NUL bytes survive the copy
then pass the bytes object to amqp_basic_publish. Don't forget to amqp_bytes_free it when you're done.
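For concreteness, here is a minimal sketch of that publish call (my addition, assuming conn, exchange and routing_key are set up as in the question):

respo = amqp_basic_publish(conn, 1,
                           amqp_cstring_bytes(exchange),    // exchange/routing key are real C strings,
                           amqp_cstring_bytes(routing_key), // so amqp_cstring_bytes is fine for them
                           0, 0, NULL,
                           bytes);   // the length-aware body built above
amqp_bytes_free(bytes);              // release the buffer once published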
Related
I have been trying to encode the binary data of an application as base64 (specifically Boost's base64), but I have run into an issue where the carriage return after the DOS header is not being encoded correctly.
It should look like this:
This program cannot be run in DOS mode.[CR]
[CR][LF]
but instead it outputs this:
This program cannot be run in DOS mode.[CR][LF]
It seems the first carriage return is being skipped, which then causes the DOS header to be invalid when attempting to run the program.
The code for the base64 algorithm I am using can be found at: https://www.boost.org/doc/libs/1_66_0/boost/beast/core/detail/base64.hpp
Thanks so much!
void load_file(const char* filename, char** file_out, size_t& size_out)
{
    FILE* file;
    fopen_s(&file, filename, "r");
    if (!file)
        return;

    fseek(file, 0, SEEK_END);
    size_out = ftell(file);
    rewind(file);

    *file_out = new char[size_out];
    fread(*file_out, size_out, 1, file);
    fclose(file);
}
void some_func()
{
    char* file_in;
    size_t file_in_size;
    load_file("filename.bin", &file_in, file_in_size);

    auto encoded_size = base64::encoded_size(file_in_size);
    auto file_encoded = new char[encoded_size];
    memset(file_encoded, 0, encoded_size);
    base64::encode(file_encoded, file_in, file_in_size);

    std::ofstream orig("orig.bin", std::ios_base::binary);
    for (size_t i = 0; i < file_in_size; i++)
    {
        auto c = file_in[i];
        orig << c; // DOS header contains a NUL as the 3rd char; write char by char so nothing terminates early
    }
    orig.close();

    std::ofstream encoded("encoded.txt"); // pass this output through a base64-to-file website
    encoded << file_encoded; // no loop required: the encoded text contains no NULs (besides trailing encoded nulls)
    encoded.close();

    auto decoded_size = base64::decoded_size(encoded_size);
    auto file_decoded = new char[decoded_size];
    memset(file_decoded, 0, decoded_size); // again trailing NULs, but that doesn't matter for binary files, just wasted disk space
    base64::decode(file_decoded, file_encoded, encoded_size);

    std::ofstream decoded("decoded.bin", std::ios_base::binary);
    for (size_t i = 0; i < decoded_size; i++)
    {
        auto c = file_decoded[i];
        decoded << c;
    }
    decoded.close();

    delete[] file_in;      // new[] must be paired with delete[], not free()
    delete[] file_encoded;
    delete[] file_decoded;
}
The above code shows that reading the file does not remove the carriage return, while encoding the file into base64 does.
Okay thanks for adding the code!
I tried it, and indeed there was "strangeness", even after I simplified the code (mostly to make it C++ instead of C).
So what do you do? You look at the documentation for the functions. That seems complicated since, after all, detail::base64 is, by definition, not part of the public API and is "undocumented".
However, you can still read the comments at the functions involved, and they are pretty clear:
/** Encode a series of octets as a padded, base64 string.

    The resulting string will not be null terminated.

    @par Requires

    The memory pointed to by `out` points to valid memory
    of at least `encoded_size(len)` bytes.

    @return The number of characters written to `out`. This
    will exclude any null termination.
*/
std::size_t
encode(void* dest, void const* src, std::size_t len)
And
/** Decode a padded base64 string into a series of octets.

    @par Requires

    The memory pointed to by `out` points to valid memory
    of at least `decoded_size(len)` bytes.

    @return The number of octets written to `out`, and
    the number of characters read from the input string,
    expressed as a pair.
*/
std::pair<std::size_t, std::size_t>
decode(void* dest, char const* src, std::size_t len)
Conclusion: What Is Wrong?
Nothing about "dos headers" or "carriage returns". Perhaps maybe something about "rb" in fopen (what's the differences between r and rb in fopen), but why even use that:
template <typename Out> Out load_file(std::string const& filename, Out out) {
    std::ifstream ifs(filename, std::ios::binary); // or "rb" on your fopen
    ifs.exceptions(std::ios::failbit | std::ios::badbit); // we prefer exceptions
    return std::copy(std::istreambuf_iterator<char>(ifs), {}, out);
}
The real issue is: your code ignored all return values from encode/decode.
The encoded_size and decoded_size values are estimates that give you enough space to store the result, but you have to correct them to the actual size after performing the encoding/decoding.
Here's my fixed and simplified example. Notice how the md5sums check out:
Live On Coliru
#include <boost/beast/core/detail/base64.hpp>
#include <fstream>
#include <iostream>
#include <vector>
namespace base64 = boost::beast::detail::base64;
template <typename Out> Out load_file(std::string const& filename, Out out) {
    std::ifstream ifs(filename, std::ios::binary); // or "rb" on your fopen
    ifs.exceptions(std::ios::failbit | std::ios::badbit); // we prefer exceptions
    return std::copy(std::istreambuf_iterator<char>(ifs), {}, out);
}
int main() {
    std::vector<char> input;
    load_file("filename.bin", back_inserter(input));

    // allocate "enough" space, using an upper-bound prediction:
    std::string encoded(base64::encoded_size(input.size()), '\0');

    // encode returns the **actual** encoded_size:
    auto encoded_size = base64::encode(encoded.data(), input.data(), input.size());
    encoded.resize(encoded_size); // so adjust the size

    std::ofstream("orig.bin", std::ios::binary).write(input.data(), input.size());
    std::ofstream("encoded.txt") << encoded;

    // allocate "enough" space, using an upper-bound prediction:
    std::vector<char> decoded(base64::decoded_size(encoded_size), 0);

    auto [decoded_size, // decode returns the **actual** decoded_size
          processed]    // (as well as the number of encoded bytes processed)
        = base64::decode(decoded.data(), encoded.data(), encoded.size());
    decoded.resize(decoded_size); // so adjust the size

    std::ofstream("decoded.bin", std::ios::binary).write(decoded.data(), decoded.size());
}
When run on "itself" using
g++ -std=c++20 -O2 -Wall -pedantic -pthread main.cpp -o filename.bin && ./filename.bin
md5sum filename.bin orig.bin decoded.bin
base64 -d < encoded.txt | md5sum
It prints
d4c96726eb621374fa1b7f0fa92025bf filename.bin
d4c96726eb621374fa1b7f0fa92025bf orig.bin
d4c96726eb621374fa1b7f0fa92025bf decoded.bin
d4c96726eb621374fa1b7f0fa92025bf -
I serialize the file via the code beneath and send it over Winsock. This works fine with text files, but when I tried to send a JPG, the string contains \0 as some of the character elements, so the sockets only send part of the string, treating \0 as the end. I was considering replacing \0 with something else, but say I replace it with 'xx' and then replace it back on the other end: what if the file had natural occurrences of 'xx' that get lost? Sure, I could use a large, unlikely sequence, but that bloats the file.
Any help appreciated.
char* read_file(string path, int& len)
{
    std::ifstream infile(path, std::ios::binary); // binary mode, so nothing gets newline-translated
    infile.seekg(0, infile.end);
    size_t length = infile.tellg();
    infile.seekg(0, infile.beg);
    len = length;
    char* buffer = new char[len]();
    infile.read(buffer, length);
    return buffer;
}

string load_to_buffer(string file)
{
    char* img;
    int ln;
    img = read_file(file, ln);
    string s = "";
    for (int i = 0; i < ln; i++){ // index from 0: starting at 1 skipped the first byte and read one past the end
        char c = *(img + i);
        s += c;
    }
    delete[] img; // the raw buffer is no longer needed
    return s;
}
Probably somewhere in your code (that isn't seen in what you posted) you use strlen() or std::string::length() to send the data, and/or you use std::string::c_str() to get the buffer. This results in truncated data because these functions stop at the first \0.
std::string is not a good fit for binary data. Use std::vector<char> instead, and drop the new[] stuff.
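To illustrate that suggestion, a minimal sketch (my addition; the connected socket sock and the Winsock setup are assumed from the surrounding code):

#include <fstream>
#include <iterator>
#include <string>
#include <vector>

std::vector<char> read_file(const std::string& path)
{
    std::ifstream in(path, std::ios::binary);
    return std::vector<char>(std::istreambuf_iterator<char>(in),
                             std::istreambuf_iterator<char>());
}

// later, with the connected socket:
std::vector<char> data = read_file(file);
send(sock, data.data(), static_cast<int>(data.size()), 0); // pass the real size; embedded \0 bytes are fine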
I need to create a custom read callback function that can read the contents of a file, held in a std::string, into a uint8_t* buf. I tried multiple different methods found around the internet and on Stack Overflow, but sometimes it works and other times the program loops forever or stops execution halfway through.
I have no problems with AMR/3GP files, but all WAV/PCM files are causing problems for some reason. All I know is that it's something to do with the read function I have so far.
Ideally I would like to be able to give the program any type of file and have it convert it.
This is how I am calling the readCallback function from the code:
// create the buffer
uint8_t* avio_ctx_buffer = NULL;
// allocate space for the buffer using FFmpeg's allocator
avio_ctx_buffer = (uint8_t*)av_malloc(avio_ctx_buffer_size);
// allocate and initialize an AVIOContext for buffered I/O
// (the audio variable contains the contents of the audio file)
avio_ctx = avio_alloc_context(avio_ctx_buffer, avio_ctx_buffer_size, 0, &audio, &readCallback, NULL, NULL);
Here is the callback function that works on some types of files:
static int readCallback(void* opaque, uint8_t* buf, int buf_size){
    std::string* file = static_cast<std::string*>(opaque);
    if (file->length() == 0){
        // we reached the end of the string: report end of file
        return AVERROR_EOF;
    }
    // create a vector of the string's size
    std::vector<uint8_t> array(file->length());
    // copy the contents of the string into the vector
    std::copy(file->begin(), file->end(), array.begin());
    // copy the vector into buf
    // (note: this copies the whole string regardless of buf_size, and the
    // string is never consumed, so every call returns the same data)
    std::copy(array.begin(), array.end(), buf);
    return file->length();
}
After trying some stuff for a while, I got a solution using std::stringstream, and it works well with the several formats I have tested so far: 3GP/AMR, WAV/PCM, MP3.
Here is a snippet of the code:
// create a string stream that contains the audio
std::stringstream audio_stream(audio);
// create the buffer
uint8_t* avio_ctx_buffer = NULL;
// allocate space for the buffer using FFmpeg's allocator
avio_ctx_buffer = (uint8_t*)av_malloc(avio_ctx_buffer_size);
// allocate and initialize an AVIOContext for buffered I/O,
// passing the stringstream audio_stream as the opaque pointer
avio_ctx = avio_alloc_context(avio_ctx_buffer, avio_ctx_buffer_size, 0, &audio_stream, &readCallback, NULL, NULL);
The callback function:
static int readCallback(void* opaque, uint8_t* buf, int buf_size){
    // cast the opaque pointer back to std::stringstream
    std::stringstream* me = static_cast<std::stringstream*>(opaque);
    // read up to buf_size bytes of the stream into buf
    me->read((char*)buf, buf_size);
    // gcount() reports how many bytes were actually extracted
    std::streamsize bytes = me->gcount();
    if (bytes == 0){
        // nothing left in the stream: return FFmpeg's EOF
        return AVERROR_EOF;
    }
    // return how many bytes were read
    return (int)bytes;
}
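One caveat worth adding (my note, not part of the original answer): FFmpeg may reallocate the context's internal buffer, so clean up through avio_ctx->buffer rather than the original avio_ctx_buffer pointer, as the avio_reading.c example in the FFmpeg source does:

if (avio_ctx) {
    av_freep(&avio_ctx->buffer);  // free whatever buffer the context currently owns
    avio_context_free(&avio_ctx); // then free the context itself
}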
I'm trying to get an image that is stored in a BLOB and then save it as a JPG.
Here I retrieve the binary data and save it in str:
string str;
SQLCHAR buf[500] = {0};
while ((SQL_SUCCEEDED(SQLGetData(StmtHandle, colnum, SQL_C_BINARY, buf, sizeof(buf), NULL))))
{
string data(reinterpret_cast< const char* >(buf), reinterpret_cast< const char* >(buf) + sizeof(buf));
str = str + data;
}
Then I write it to a file:
ofstream file;
file.open("C:\\Users\\tom\\Desktop\\img.jpeg");
file << str;
file.close();
and I get an incorrect image.
What's wrong with this method of data extraction (I used this)?
I'm not familiar with ODBC programming, but at first sight one issue I can see is that you assume your data length is a multiple of your buffer size. The last read is not guaranteed to return exactly 500 bytes of data.
You should write something like this:
string str;
SQLCHAR buf[500];
SQLLEN cbLeft; // length/indicator: bytes remaining, or SQL_NO_TOTAL
while (SQL_SUCCEEDED(SQLGetData(StmtHandle,
                                colnum,
                                SQL_C_BINARY,
                                buf,
                                sizeof(buf),
                                &cbLeft)))
//                              ^^^^^^^
{
    // the indicator reports how much data remained before this call,
    // so the chunk actually placed in buf is at most sizeof(buf) bytes
    size_t chunk = (cbLeft == SQL_NO_TOTAL || cbLeft > (SQLLEN)sizeof(buf))
                       ? sizeof(buf)
                       : (size_t)cbLeft;
    string data(reinterpret_cast<const char*>(buf),
                reinterpret_cast<const char*>(buf) + chunk);
    str = str + data;
}
Please take a few minutes to review Using Length/Indicator Values to see how the length/indicator value is used.
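One more thing worth checking (my addition, not part of the original answer): open the output file in binary mode, otherwise the stream may translate bytes that happen to look like line endings and corrupt the JPEG:

ofstream file("C:\\Users\\tom\\Desktop\\img.jpeg", ios::out | ios::binary);
file.write(str.data(), str.size()); // write the raw bytes with an explicit length
file.close();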
I have a C++ Linux server with basic server sockets. Here's what I'm using to serve PNG images:
string ms = "HTTP/1.0 200 OK\r\nContent-type: image/png\r\n\r\n";
ifstream myfile("xampl.png", ios::in|ios::binary|ios::ate);
string line;
char* memblock;
streampos size;
if(myfile.is_open()){
    size = myfile.tellg();
    memblock = new char [size];
    myfile.seekg (0, ios::beg);
    myfile.read (memblock, size);
    myfile.close();
}
ms.append(string(memblock));
cout << "\"" << ms << "\"" << endl;
char* msg = new char[ms.size()+1];
copy(ms.begin(), ms.end(), msg);
msg[ms.size()] = '\0';
int len;
ssize_t bytes_sent;
len = strlen(msg);
bytes_sent = send(new_sd, msg, len, 0);
I know I'm trying to read the PNG file as a binary file, but I have no idea what else to do. When I telnet to this server, I get a response with weird characters, which makes me believe that I have served the file, but when I check it out in my browser I get the image-not-found icon, in all browsers. Please help...
OK, let's start with the equivalent PHP code, to get rid of the mess it is to do the same thing in C++:
// send required headers as plain text
header("Content-type: image/png");
// read the image as a binary block
$img_data = file_get_contents("xample.png");
// send it
echo $img_data;
C++ equivalent:
string headers = "HTTP/1.0 200 OK\r\nContent-type: image/png\r\n\r\n";
send (new_sd, headers.data(), headers.length(), 0);
ifstream f("xampl.png", ios::in|ios::binary|ios::ate);
if(!f.is_open()) error ("bloody file is nowhere to be found. Call the cops");
streampos size = f.tellg();
char* image = new char [size];
f.seekg (0, ios::beg);
f.read (image, size);
f.close();
send (new_sd, image, size, 0);
Converting memblock to a string is not going to work if it has embedded null characters, which it almost certainly does, and you don't pass the length. I'm not sure why you're doing all that faffing about with char*s and std::strings, but once you have your memblock and size, use them for send. If you want to prefix the response string, just send it first.
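A two-line sketch of that suggestion (my addition; it assumes ms holds only the headers, and memblock/size come from the question's file read):

send(new_sd, ms.data(), ms.size(), 0); // the header string first...
send(new_sd, memblock, size, 0);       // ...then the raw image bytes with their known size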