Do I need to encode using Base 64 in my web service? - web-services

I am transferring messages to mobile devices via a web service. The data is an XML string, which I compress with GZipStream and then encode with Base64.
I am getting out-of-memory exceptions in the emulator, so I am trying to optimise the process: I have stopped passing the string around by value and removed unnecessary copies of byte arrays.
Now I'm wondering about the Base64 encoding. It increases the size of the message, the processing and the memory requirements. Is it strictly necessary?
Edit: Here is how I decompress:
public static byte[] ConvertMessageStringToByteArray(ref string isXml)
{
    return fDecompress(Convert.FromBase64String(isXml));
}

public static byte[] fDecompress(byte[] ivBytes)
{
    const int INT_BufferSize = 2048;

    using (MemoryStream lvMSIn = new MemoryStream(ivBytes))
    using (GZipInputStream lvZipStream = new GZipInputStream(lvMSIn, ivBytes.Length))
    using (MemoryStream lvMSOut = new MemoryStream())
    {
        byte[] lvBuffer = new byte[INT_BufferSize];
        int liSize;
        while (true)
        {
            liSize = lvZipStream.Read(lvBuffer, 0, INT_BufferSize);
            if (liSize <= 0)
                break;
            lvMSOut.Write(lvBuffer, 0, liSize);
        }
        return lvMSOut.ToArray();
    }
}

gzip (which is what GZipStream produces internally) outputs binary data, and binary bytes won't fit into a text message (SOAP is a text format) unless you do something like Base64-encode them.
Perhaps the solution is not to gzip/encode (decode/ungzip) a whole buffer at once, but to use streams: connect a gzipping stream to an encoding stream and read the result from the output of the latter (or connect the decoding stream to the un-gzipping stream). That way you have a chance of consuming less memory.
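The idea in a minimal sketch (written in C++ only to keep it language-neutral; Base64ChunkEncoder is a hypothetical helper, not an existing library class): an incremental encoder keeps a carry of at most two bytes, so each block the compressor emits can be encoded and sent as it is produced, and the whole compressed payload never has to sit in memory next to its encoded copy.

#include <cstdint>
#include <string>

// Incremental Base64: feed it whatever block sizes the compressor produces;
// it buffers at most 2 leftover bytes between calls and pads on finish().
class Base64ChunkEncoder {
public:
    void feed(const uint8_t* data, size_t len, std::string& out) {
        for (size_t i = 0; i < len; ++i) {
            carry_[carryLen_++] = data[i];
            if (carryLen_ == 3) {
                emitGroup(out, 3);
                carryLen_ = 0;
            }
        }
    }
    void finish(std::string& out) {
        if (carryLen_ > 0) {
            emitGroup(out, carryLen_);
            carryLen_ = 0;
        }
    }
private:
    void emitGroup(std::string& out, size_t n) {
        static const char* tbl =
            "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
        uint32_t v = uint32_t(carry_[0]) << 16;
        if (n > 1) v |= uint32_t(carry_[1]) << 8;
        if (n > 2) v |= uint32_t(carry_[2]);
        out += tbl[(v >> 18) & 0x3F];
        out += tbl[(v >> 12) & 0x3F];
        out += (n > 1) ? tbl[(v >> 6) & 0x3F] : '=';
        out += (n > 2) ? tbl[v & 0x3F] : '=';
    }
    uint8_t carry_[3] = {0, 0, 0};
    size_t  carryLen_ = 0;
};

Decoding works the same way in reverse: decode each incoming piece of text and feed the resulting bytes straight into the decompression stream instead of materialising the whole byte array first.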

Related

How do I clear a stream in C++ for a nanoPB protocol buffer to use?

I'm using nanopb in a project on an ESP32, in PlatformIO. It's an Arduino-flavoured C++ codebase.
I'm using protobufs to encode data for transfer, and I've set up the memory that the protobufs will use at the root level to avoid re-allocating it every time a message is sent.
// variables to store the buffer/stream the data will render into...
uint8_t buffer[MESSAGE_BUFFER_SIZE];
pb_ostream_t stream = pb_ostream_from_buffer(buffer, sizeof(buffer));
// object to hold the data on its way into the encode action...
TestMessage abCounts = TestMessage_init_zero;
Then I've got my function that encodes data into this stream via protobufs (using nanoPB)...
void encodeABCounts(int32_t button_a, int32_t button_b, String message)
{
    // populate our data structure...
    abCounts.a_count = button_a;
    abCounts.b_count = button_b;
    strcpy(abCounts.message, message.c_str());

    // encode the data!
    bool status = pb_encode(&stream, TestMessage_fields, &abCounts);
    if (!status)
    {
        Serial.println("Failed to encode");
        return;
    }

    // and here's some debug code I'll discuss below....
    Serial.print("Message Length: ");
    Serial.println(stream.bytes_written);
    for (int i = 0; i < stream.bytes_written; i++)
    {
        Serial.printf("%02X", buffer[i]);
    }
    Serial.println("");
}
Ok. So the first time this encode action occurs this is the data I get in the serial monitor...
Message Length: 14
Message: 080110001A087370656369616C41
And that's great - everything looks good. But the second time I call encodeABCounts(), and the third time, and the fourth, I get this...
Message Length: 28
Message: 080110001A087370656369616C41080210001A087370656369616C41
Message Length: 42
Message: 080110001A087370656369616C41080210001A087370656369616C41080310001A087370656369616C41
Message Length: 56
Message: 080110001A087370656369616C41080210001A087370656369616C41080310001A087370656369616C41080410001A087370656369616C41
...etc
So it didn't clear out the buffer/stream when the new data went in. Each time the buffer/stream is just getting longer as new data is appended.
How do I reset the stream/buffer to a state where it's ready for new data to be encoded and stuck in there, without reallocating the memory?
Thanks!
To reset the stream, simply re-create it. Now you have this:
pb_ostream_t stream = pb_ostream_from_buffer(buffer, sizeof(buffer));
You can recreate it by assigning again:
stream = pb_ostream_from_buffer(buffer, sizeof(buffer));
Though you can also move the initial stream declaration to inside encodeABCounts() to create it every time, if you don't have any particular reason to keep it around after use. The stream creation is very lightweight, as it just stores the location and size of the buffer.
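For example, that second option might look like this (a sketch reusing buffer, abCounts and TestMessage_fields from the question; only the stream becomes local):

void encodeABCounts(int32_t button_a, int32_t button_b, String message)
{
    // populate our data structure...
    abCounts.a_count = button_a;
    abCounts.b_count = button_b;
    strcpy(abCounts.message, message.c_str());

    // Fresh stream over the same statically allocated buffer, so
    // bytes_written starts at 0 again and nothing is re-allocated.
    pb_ostream_t stream = pb_ostream_from_buffer(buffer, sizeof(buffer));

    if (!pb_encode(&stream, TestMessage_fields, &abCounts))
    {
        Serial.println("Failed to encode");
        return;
    }

    Serial.print("Message Length: ");
    Serial.println(stream.bytes_written);
}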

Using byte array (cpp) in GRPC

I am having a lot of trouble with gRPC when using a byte array. This is my .proto:
message myType {
    int32 format = 1;
    bytes data = 2;
}
I am using C++ for the server implementation and Java for the client. Using ByteString in Java is a breeze, but I cannot deserialize it in C++ (the byte[] is changed from what was sent from Java).
buffer is a byte array declared as byte buffer[<large_size>]. I'm converting the byte array (it's an image) into a smaller byte array, and it crashes when trying to convert the byte[] received from gRPC. The conversion function in C++ is good, as I used it with the same image before using gRPC.
This is the deserialization code in C++. Here "req" is a myType object and buffer is a byte[]:
myFormat = req->format();
dataLen = req->data().length();
memcpy(buffer, req->data().c_str(), dataLen);
From what I understand, req->data() returns a std::string in C++.
On the client side, you should pass both the parameter and its length:
parameter.set_framemat(mat, 12);
Do check whether the length of the array on the server side is zero. Note that bytes arrives as a char array and gRPC unmarshals it like a string, so if the array is filled with null characters the reported data length comes out as zero.
I was trying a similar use case
Proto file
message Parameters {
    bytes framemat = 16;
};
Client Snippet
const char mat[12] = {0};
parameter.set_framemat(mat);
stream->Write(parameter);
Server Snippet
std::thread reader([&]() {
    ::nokia::nas::Parameters request;
    while (stream->Read(&request))
    {
        // get the params
        grpc::string mat = request.framemat();
        logger->info("Length of bytes = {}", mat.length());
    }
});
Output was zero!
So I changed the client input to check with some char strings, and sure enough the data length came out as 4 on the server side, because this time it was a valid string and the string length matched.
const char mat[12] = "sdsd";
A better way is to specify both the data and the data length in the proto:
message StreamBytes {
    bytes data_bytes = 1;
    int32 data_length = 2;
};
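Alternatively, the length problem can be avoided by never going through a NUL-terminated C string: the generated setter for a bytes field has an overload taking a pointer and a size, and on the server the field is a std::string whose size() is the real byte count. A rough fragment (Parameters/framemat as in the snippets above):

// Client: pass the byte count explicitly, so embedded '\0' bytes are kept.
unsigned char mat[12] = {0};
parameter.set_framemat(mat, sizeof(mat));
stream->Write(parameter);

// Server: copy using data()/size(), never strlen()- or c_str()-style assumptions.
const std::string& payload = request.framemat();
std::vector<unsigned char> buffer(payload.begin(), payload.end());
// buffer.size() == 12 here even though every byte is zero.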

Is java.util.zip.GZIPOutputStream's output byte array portable?

I have the following code to serialize and compress a String:
private byte[] toZip(String xml) {
    try {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        GZIPOutputStream gz = new GZIPOutputStream(bos);
        ObjectOutputStream oos = new ObjectOutputStream(gz);
        oos.writeObject(xml);
        oos.flush();
        oos.close();
        return bos.toByteArray();
    } catch (IOException e) {
        log.error("Error", e);
        if (log.isEnabledFor(MucamLogger.FINEST)) log.finest(xml);
        return null;
    }
}
Is the returned byte[] portable? I store it in a BLOB field in the database. Could it be retrieved and decompressed by a non-Java program (C++, .NET)? Would such a program recover the original String text?
Yes, gzip streams are portable and the original uncompressed data will be recovered exactly. Assuming of course that you are faithfully transporting the binary compressed data. I often see that get messed up due to end-of-line conversions or unicode text conversions that could be easily avoided.
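One caveat about the code above: oos.writeObject(xml) wraps the String in Java's object-serialization format, so what comes out of the gzip stream is a serialization record (header bytes plus a string record), not raw text. If the consumer isn't Java, it is simpler to write xml.getBytes("UTF-8") straight to the GZIPOutputStream. The gzip layer itself is readable anywhere; for instance, a C++ consumer could decompress the blob with zlib roughly like this (a sketch, assuming the blob bytes are already in memory):

#include <vector>
#include <zlib.h>

// Inflate a gzip-wrapped blob; windowBits = 15 + 16 tells zlib to expect
// a gzip header rather than a raw zlib stream.
std::vector<unsigned char> gunzip(const std::vector<unsigned char>& in)
{
    std::vector<unsigned char> out;
    unsigned char chunk[4096];

    z_stream strm = {};
    if (inflateInit2(&strm, 15 + 16) != Z_OK)
        return out;

    strm.next_in  = const_cast<unsigned char*>(in.data());
    strm.avail_in = static_cast<uInt>(in.size());

    int ret;
    do {
        strm.next_out  = chunk;
        strm.avail_out = sizeof(chunk);
        ret = inflate(&strm, Z_NO_FLUSH);
        if (ret != Z_OK && ret != Z_STREAM_END)
            break;                                   // corrupt or truncated data
        out.insert(out.end(), chunk, chunk + (sizeof(chunk) - strm.avail_out));
    } while (ret != Z_STREAM_END);

    inflateEnd(&strm);
    return out;
}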

Encrypting flexible amount of data with GPGME

I'm currently writing a C++ application and would like to use GPGME for message signing, encryption and key management. I know I can encrypt data in this way:
err = gpgme_op_encrypt(mContext, recipients, ...);
if (err) {
    // .. error handling
}
result = gpgme_op_encrypt_result(mContext);
if (result->invalid_recipients) {
    // error handling
}
nbytes = gpgme_data_seek(encrypted_text, 0, SEEK_SET);
if (nbytes == -1) {
    // error handling
}
buffer = malloc(MAXLEN);
nbytes = gpgme_data_read(encrypted_text, buffer, MAXLEN);
But as you can see, I would have to use MAXLEN as the limit for reading the encrypted data into my buffer. Is there a way to determine in advance how long the encrypted result will be (given the plaintext), or will I have to accept the static limit?
I'm not familiar with this particular API, but the gpgme_data_seek and gpgme_data_read calls look like they behave like read() and lseek() from the file I/O system.
(1) Simply allocate as much buffer as you can afford (let's say N bytes).
(2) Call n = gpgme_data_read(..., N) repeatedly until n != N.
(3) Check for error conditions (my guess is n < 0).
Proceed until you have processed all the data you are interested in.
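In other words, the exact output length doesn't need to be known in advance if you read in chunks and grow the destination as you go. A rough sketch (using encrypted_text from the snippet above; a std::vector stands in for the malloc'd buffer):

#include <vector>

std::vector<char> cipher;
char chunk[4096];
ssize_t n;

// gpgme_data_read behaves like read(): >0 bytes read, 0 at end of data, -1 on error.
while ((n = gpgme_data_read(encrypted_text, chunk, sizeof(chunk))) > 0) {
    cipher.insert(cipher.end(), chunk, chunk + n);
}
if (n < 0) {
    // error handling (errno is set, as with read())
}
// cipher.size() is now the exact length of the encrypted output.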

Audio over socket c++

I am capturing some audio from my microphone using SFML.
The data is being stored in samples of type Int16*.
Int16* samples;
My question is: what should I do with these samples to stream them over a socket so they can be played somewhere else? I'm asking about the data type. Do I need to convert this Int16 array to another type, or can I just send the Int16* data as it is?
EDIT
void BroadcastRecorder::loadBufferFromSamples()
{
    // m_samples is of type vector<Int16*>
    if (!m_samples.empty()) {
        m_buffer.loadFromSamples(&m_samples[0], m_samples.size(), 1, getSampleRate());
        m_samples.clear();
    }
}

void Broadcaster::Send()
{
    // load the buffer with the samples
    if (!m_recorder->empty()) {
        m_recorder->loadBufferFromSamples();
        const sf::SoundBuffer& buffer = m_recorder->getBuffer();
        size_t dataLength = m_recorder->GetSamplesSize();

        wxSocketClient* socket = new wxSocketClient(wxSOCKET_NOWAIT);
        socket->Notify(false);

        // ------------- DATA ----------------------
        wxString data = "";
        wxString strToPrepend(_("--myboundary\r\nContent-Type: audio/wav\r\n"));
        wxString strToAppend(_("\r\n--myboundary\r\n"));

        // ------------- HEADER -----------------------
        wxString header = "";
        header.append("POST ");
        header.append("/cgi-bin/operator/transmit");
        header.append(" HTTP/1.0\r\n");
        header.append("Content-Type: multipart/form-data; boundary=--myboundary\r\n");
        header.append("Content-Length: " + wxString::Format(wxT("%i"), (dataLength + strToPrepend.Len() + strToAppend.Len())) + "\r\n");
        header.append("Authorization: Basic keykeykeykey\r\n");
        header.append("\r\n");

        // -------------- CONNECTION ---------------
        wxString host = _("192.168.50.11");
        wxIPV4address* address = new wxIPV4address();
        address->Hostname(host);
        address->Service(8084);

        if (socket->Connect(*address)) {
            // Write header
            socket->Write(header.c_str(), header.Len());
            // Write data
            socket->Write(strToPrepend.c_str(), strToPrepend.Len());
            const sf::Int16* samples = buffer.getSamples();
            const char* bytesData = reinterpret_cast<const char*>(samples);
            socket->Write(bytesData, dataLength);
            socket->Write(strToAppend.c_str(), strToAppend.Len());
            socket->Close();
        }

        delete socket;
        delete address;
    }
}
I am getting only some noises between gaps.
BTW. The audio is being sent to an IP camera p2 connector.
The data format is just the way your application treats the bytes; after all, you send raw bytes over a socket, and you can do that with anything you want:
Int16 data;
const char* pBytesOfData = (const char*) &data;
int size = sizeof(Int16);
send(socket, pBytesOfData, size, flags);
When the bytes arrive at the other end it is up to you to interpret them correctly. Probably you will want to treat them as Int16 again. You need a protocol (a common way of communicating) to do it right (maybe send the size of the data at the beginning of the transmission, etc.).
You can also take a look on libraries that ease serialization: Boost.Asio and Boost.Serialization.
Firstly, you need to create and bind a socket. Then you have to send the data stored in "samples" to the other peer using the socket API. The send API takes the data to send as a char*, so you need to cast your data to char*. For more information about sending, see the documentation for send on Windows, or the send man page on Unix.
Int16* is a pointer. The samples you get should also have an associated length. Your data will likely lie between the addresses [samples, samples + length) (where samples is the address of the first sample).
To play the samples remotely (actual code will depend on what APIs you use):
- open a socket
- in a loop:
  - get samples from your microphone
  - transmit the data over the socket
On the server, you will have to read samples in a loop and send them to whatever sound output API you use.
Sockets work with bytes, so in the end you will send bytes. As long as the way you interpret these bytes on the receiving side matches the data you sent, you can send whatever you want in those bytes.
In this case sending the samples directly without conversion seems the most straightforward thing to do, but you will probably need to send the size of the sample data first, most likely in a fixed-length format, for example:
[size on 4 bytes][samples on `size` bytes]
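A minimal sketch of that framing with POSIX sockets (function and variable names such as sendSamples, sock, samples and sampleCount are placeholders, not SFML or wxWidgets API):

#include <arpa/inet.h>
#include <sys/socket.h>
#include <cstdint>

bool sendSamples(int sock, const int16_t* samples, uint32_t sampleCount)
{
    uint32_t byteLen = sampleCount * sizeof(int16_t);
    uint32_t netLen  = htonl(byteLen);              // fixed 4-byte, big-endian size prefix

    if (send(sock, &netLen, sizeof(netLen), 0) != sizeof(netLen))
        return false;

    // send() may write fewer bytes than requested, so loop until done.
    const char* bytes = reinterpret_cast<const char*>(samples);
    uint32_t sent = 0;
    while (sent < byteLen) {
        ssize_t n = send(sock, bytes + sent, byteLen - sent, 0);
        if (n <= 0)
            return false;
        sent += static_cast<uint32_t>(n);
    }
    return true;
}

The receiver then reads exactly 4 bytes, converts them back with ntohl, and reads that many bytes before handing them to the sound output API; whether the sample bytes themselves need byte-order handling depends on whether both ends share the same endianness.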