Using a byte array in gRPC (C++)

I am having a lot of trouble with gRPC when using a byte array. This is my .proto:
message myType {
    int32 format = 1;
    bytes data = 2;
}
I am using C++ for the server implementation and Java for the client. Using ByteString in Java is a breeze, but I cannot deserialize it correctly in C++ (the byte[] arrives changed from what was sent by Java).
buffer is declared as byte buffer[<large_size>]. I'm converting the byte array (it's an image) into a smaller byte array, and the program crashes when converting the byte[] received from gRPC. The C++ conversion function itself is fine, since I used it with the same image before introducing gRPC.
This is the C++ deserialization code. Here req is a myType object and buffer is a byte[]:
myFormat = req->format();
dataLen = req->data().length();
memcpy(buffer, req->data().c_str(), dataLen);
From what I understand, req->data() is returned as a C++ std::string.

On the client side, you should pass both the parameter and its length.
parameter.set_framemat(mat, 12);

Do check that the length of the array on the server side is not zero. Note that bytes maps to a char array, and gRPC demarshals it as a string, so if the array is filled with null characters the data length comes out as zero.
I was trying a similar use case.
Proto file:
message Parameters {
    bytes framemat = 16;
};
Client Snippet
const char mat[12] = {0};
parameter.set_framemat(mat);
stream->Write(parameter);
Server Snippet
std::thread reader([&]() {
    ::nokia::nas::Parameters request;
    while (stream->Read(&request))
    {
        // get the params
        grpc::string mat = request.framemat();
        logger->info("Length of Bytes = {}", mat.length());
    }
});
Output was zero!
So I changed the client input to a short character string, and sure enough the data length came out as 4 on the server side, because this time it was a valid string and the string length matched.
const char mat[12] = "sdsd";
A better way is to specify both the data and the data length in the proto:
message StreamBytes {
    bytes data_bytes = 1;
    int32 data_length = 2;
};
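For illustration, here is a minimal sketch of how a client and server might use StreamBytes (the stream, buffer, and buffer_len names are placeholders, not from the original code); the important part is using the setter overload that takes an explicit length, and data()/size() on the server, so embedded null bytes are preserved:
// Client side: copy the raw buffer with an explicit length, so embedded
// '\0' bytes are not dropped by a strlen()-based overload.
StreamBytes msg;
msg.set_data_bytes(buffer, buffer_len);
msg.set_data_length(static_cast<int32_t>(buffer_len));
stream->Write(msg);

// Server side: the bytes field is held as a std::string that may contain '\0',
// so use data()/size() rather than c_str() with strlen().
StreamBytes request;
while (stream->Read(&request)) {
    const std::string& payload = request.data_bytes();
    memcpy(buffer, payload.data(), payload.size());   // buffer must be large enough
}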

Related

Is there a way to force Protocol Buffers to use a constant field size?

I have a file with constant-sized (or so I hoped) protobuf messages, defined here:
message FrameData {
    required int32 index = 1;
    required bytes timeStamp = 2;
    required int32 timeStampSize = 3;
    required bytes frame = 4;
    required int32 frameSize = 5;
}
The file contains hundreds of protobuf messages, and all the frames should always be the same size. When I load the file, however, I sometimes get corrupted data, usually when index has a wide dynamic range.
Protobuf shrinks the data as much as possible, packing ints based on their value; I suspect this causes my FrameData objects to have slightly different sizes.
Is there a way to force protobuf to use a constant field size? Specifically for int32?
(Another option is to use bytes type for all fields, but I'd like to avoid that)
If you want the integers to have a fixed length, you can use the corresponding fixed-size integer types: int32 -> sfixed32, uint32 -> fixed32, and so on.
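As a quick illustration (assuming the message is regenerated with sfixed32 in place of int32; this is not from the original post), the encoded size then no longer depends on the magnitude of the values:
// Hypothetical check, assuming FrameData was regenerated with sfixed32 fields.
FrameData small, large;
small.set_index(1);
large.set_index(1 << 30);   // 5 bytes as a varint, but always 4 bytes as sfixed32
// With sfixed32, both messages serialize to the same number of bytes.
assert(small.ByteSizeLong() == large.ByteSizeLong());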
However, I don't think it's a good idea to 'guess' the length of the serialized protobuf message. Instead, you should also save the length in your file. For example:
FILE *fp = fopen("data", "w");
FrameData frame;
string serialized;
frame.SerializeToString(&serialized);
// write length first
size_t length = serialized.size();
fwrite(reinterpret_cast<const char*>(&length), sizeof(length), 1, fp);
// then write the serialized data
fwrite(serialized.c_str(), 1, serialized.size(), fp);
// write other protobuf messages
When parsing the file:
FILE *fp = fopen("data", "r");
size_t length = 0;
// read length first
fread(&length, sizeof(length), 1, fp);
// then read serialized data
char *buf = new char[length];
fread(buf, length, 1, fp);
// Parse protobuf
FrameData frame;
frame.ParseFromArray(buf, length);
// read other messages.

How to read an HTTP request (an HTTP proxy server)

I have to make an HTTP proxy. On the proxy side I have to parse the HTTP request sent by the user.
The question is: how do I read binary data from the client so that in the end I get an array of char containing the full request?
What I did is read 1 byte at a time:
char c;
int n = read(con,&c,1);
I saw many implementations that use 1024 bytes as the buffer size, but how can we be sure that the size of the request will not exceed 1024?
Normally I have to allocate memory for the buffer array first, so how can I know the size of the request in order to allocate the right amount of memory?
My full methods:
#include <cstdio>
#include <unistd.h>

void readToken(int con, char *token) {
    char c;
    do {
        read(con, &c, 1);
        *token++ = c;
    } while (c != ' ' && c != '\n');
    *token = '\0';              // terminate so the token can be printed as a string
}

void readLine(int con, char *line) {
    char c;
    do {
        read(con, &c, 1);
        *line++ = c;
    } while (c != '\n');
    *line = '\0';               // terminate the line
}

void handleRequest(int con) {
    char resource[30];
    char version[5];
    char method[5];
    char hostLine[100];         // buffer for the host line (size chosen arbitrarily)
    // read 4 bytes to get the method type
    int n = read(con, method, 4);
    method[n] = '\0';
    // read until I get a blank space
    readToken(con, resource);
    //readToken(con, version);
    printf("the method is %s\n", method);
    printf("the resource asked is %s\n", resource);
    //printf("the version asked is %s\n", version);
    readLine(con, hostLine);
    printf("the host line read is %s", hostLine);
}
Reading one character at a time is terribly inefficient and slows you down tremendously. Instead, you should read chunks of an appropriate size (1024 seems as good an initial guess as any) in a loop and append each chunk to the total data read so far. This is extremely easy to do with a C++ std::vector.
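A minimal sketch of that approach (POSIX read(); the chunk size and stopping condition are just placeholders):
#include <unistd.h>
#include <vector>

// Read the request in fixed-size chunks and append each chunk to a growing buffer.
std::vector<char> readRequest(int con)
{
    std::vector<char> request;
    char chunk[1024];
    ssize_t n;
    while ((n = read(con, chunk, sizeof(chunk))) > 0) {
        request.insert(request.end(), chunk, chunk + n);
        // A real proxy would stop once the end of the headers ("\r\n\r\n") is seen,
        // rather than waiting for the client to close the connection.
    }
    return request;
}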
Parsing an HTTP request is quite a complicated task; it may be easier to use a library like http-parser, which does the parsing for you very efficiently.

Audio over a socket (C++)

I am capturing some audio from my microphone using SFML.
The data is being stored in samples of type Int16*.
Int16* samples;
My question is: what should I do to these samples to stream them over a socket so they can be played somewhere else? I'm asking specifically about the data type. Do I need to convert this Int16 array to another type, or can I just send the Int16* data as it is?
EDIT
void BroadcastRecorder::loadBufferFromSamples()
{
    // m_samples is of type vector<Int16*>
    if (!m_samples.empty()) {
        m_buffer.loadFromSamples(&m_samples[0], m_samples.size(), 1, getSampleRate());
        m_samples.clear();
    }
}

void Broadcaster::Send()
{
    // load the buffer with the samples
    if (!m_recorder->empty()) {
        m_recorder->loadBufferFromSamples();
        const sf::SoundBuffer& buffer = m_recorder->getBuffer();
        size_t dataLength = m_recorder->GetSamplesSize();

        wxSocketClient * socket = new wxSocketClient(wxSOCKET_NOWAIT);
        socket->Notify(false);

        // ------------- DATA -----------------------
        wxString data = "";
        wxString strToPrepend(_("--myboundary\r\nContent-Type: audio/wav\r\n"));
        wxString strToAppend(_("\r\n--myboundary\r\n"));

        // ------------- HEADER ---------------------
        wxString header = "";
        header.append("POST ");
        header.append("/cgi-bin/operator/transmit");
        header.append(" HTTP/1.0\r\n");
        header.append("Content-Type: multipart/form-data; boundary=--myboundary\r\n");
        header.append("Content-Length: " + wxString::Format(wxT("%i"), (dataLength + strToPrepend.Len() + strToAppend.Len())) + "\r\n");
        header.append("Authorization: Basic keykeykeykey\r\n");
        header.append("\r\n");

        // ------------- CONNECTION -----------------
        wxString host = _("192.168.50.11");
        wxIPV4address * address = new wxIPV4address();
        address->Hostname(host);
        address->Service(8084);

        if (socket->Connect(*address)) {
            // Write header
            socket->Write(header.c_str(), header.Len());
            // Write data
            socket->Write(strToPrepend.c_str(), strToPrepend.Len());
            const sf::Int16* samples = buffer.getSamples();
            const char* bytesData = reinterpret_cast<const char*>(samples);
            socket->Write(bytesData, dataLength);
            socket->Write(strToAppend.c_str(), strToAppend.Len());
            socket->Close();
        }

        delete socket;
        delete address;
    }
}
I am getting only noise with gaps in between.
BTW, the audio is being sent to an IP camera p2 connector.
The data format is just the way your application interprets the bytes. In the end you send raw bytes over a socket, and you can do that with any data you want:
Int16 data;
const char* pBytesOfData = (const char*) &data;
int size = sizeof(Int16);
send(socket, pBytesOfData, size, flags);
When the bytes arrive at the other end, it is up to you to interpret them correctly; most likely you will want to treat them as Int16 again. You need a protocol (a common way of communicating) to do this right (for example, send the size of the data at the beginning of the transmission).
You can also take a look at libraries that ease serialization: Boost.Asio and Boost.Serialization.
First, you need to create and bind a socket. Then you have to send the data stored in samples to the other peer using the socket API. To send the data with the socket API you need to convert it to char*, since the socket send API takes the data to be sent as a char*. For more information about sending you can go through this link (that reference is for Windows); for Unix you can check the man page for the send API.
Int16* is a pointer. The samples you get should also have an associated length. Your data will likely be between the addresses [samples, samples + length) (where samples is the address of the first sample).
To play the samples remotely (the actual code will depend on which APIs you use):
- open a socket
- in a loop:
  - get samples from your microphone
  - transmit the data over the socket
On the server, you will have to read samples in a loop and send them to whatever sound output API you use.
Sockets work with bytes, so in the end you will send bytes. As long as the way you interpret these bytes on the receiving side matches the data you sent, you can send whatever you want in those bytes.
In this case, sending the samples directly without conversion seems the simplest thing to do, but you will probably need to send the size of the sample buffer first, most likely in a fixed-length format, for example:
[size: 4 bytes][sample data: `size` bytes]
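A rough sketch of that layout over POSIX sockets (sock, samples, and count are placeholder names; error handling omitted): the sender writes a fixed 4-byte length in network byte order followed by the raw samples, and the receiver reads the length first and then exactly that many bytes:
#include <arpa/inet.h>
#include <sys/socket.h>
#include <cstdint>
#include <vector>

// Sender: 4-byte length prefix in network byte order, followed by the raw samples.
void sendSamples(int sock, const int16_t* samples, uint32_t count)
{
    uint32_t byteLen = count * sizeof(int16_t);
    uint32_t prefix = htonl(byteLen);
    send(sock, &prefix, sizeof(prefix), 0);
    send(sock, samples, byteLen, 0);
}

// Receiver: read the prefix, then read exactly that many bytes of sample data.
std::vector<int16_t> recvSamples(int sock)
{
    uint32_t prefix = 0;
    recv(sock, &prefix, sizeof(prefix), MSG_WAITALL);
    uint32_t byteLen = ntohl(prefix);
    std::vector<int16_t> samples(byteLen / sizeof(int16_t));
    recv(sock, samples.data(), byteLen, MSG_WAITALL);
    return samples;
}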

Sending a struct containing a char* over TCP

I have this struct:
typedef struct pkt {
    unsigned int pktid;
    unsigned int pkt_leng;
    unsigned int strleng;
    char* str;
};
In my code I'm doing this:
std::string mystring = someFunctionWhoReturnString();
pkt* mypkt = (pkt*) malloc(sizeof(pkt));
mypkt->pktid = htonl(C_MYPKT);
mypkt->pkt_leng = htonl(sizeof(pkt));
mypkt->strleng = htonl(mystring.length());
mypkt->str = (char*) malloc(mystring.length());
memcpy(mypkt->str, mystring.c_str(), mystring.length());
After this, if I check what is stored in mypkt->str, the string is there.
But when I receive the pkt, I get just garbage (the size is fine) where the char* contents are supposed to be (the rest of the data in the packet arrives fine).
Is there a smart way to accomplish this without using a char[] of static size?
I'm working in VC++ 2010.
Realize that sizeof(pkt) is just the size of three integers plus a pointer; it does not include the length of the data the pointer points to. What you are sending over is three integers and the address of the string on the sending machine, an address that is completely useless to the receiving machine, of course.
Instead, you need to prepare a flat buffer with the three integers immediately followed by the character data.
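A minimal sketch of that flattening (the field layout follows the struct above; the byte ordering and the send call in the comment are illustrative, not from the original post):
#include <cstring>
#include <string>
#include <vector>
#include <winsock2.h>   // htonl on Windows; <arpa/inet.h> on POSIX

// Pack three 32-bit integers followed by the string bytes into one contiguous buffer.
std::vector<char> packPkt(unsigned int pktid, const std::string& str)
{
    unsigned int id   = htonl(pktid);
    unsigned int len  = htonl(static_cast<unsigned int>(3 * sizeof(unsigned int) + str.size()));
    unsigned int slen = htonl(static_cast<unsigned int>(str.size()));

    std::vector<char> buf(3 * sizeof(unsigned int) + str.size());
    std::memcpy(&buf[0],                        &id,   sizeof(id));
    std::memcpy(&buf[sizeof(id)],               &len,  sizeof(len));
    std::memcpy(&buf[2 * sizeof(unsigned int)], &slen, sizeof(slen));
    std::memcpy(&buf[3 * sizeof(unsigned int)], str.data(), str.size());
    return buf;   // then: send(sock, &buf[0], buf.size(), 0);
}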

Do I need to encode using Base 64 in my web service?

I am transferring messages to mobile devices via a web service. The data is an XML string, which I compress using GZipStream and then encode using Base64.
I am getting out-of-memory exceptions in the emulator, and in looking to optimise the process I have stopped passing the string around by value and removed unnecessary copies of byte arrays.
Now I'm wondering about the Base64 encoding. It increases the size of the message, the processing and the memory requirements. Is it strictly necessary?
Edit: Here is how I decompress:
public static byte[] ConvertMessageStringToByteArray(ref string isXml)
{
    return fDecompress(Convert.FromBase64String(isXml));
}

public static byte[] fDecompress(byte[] ivBytes)
{
    const int INT_BufferSize = 2048;
    using (MemoryStream lvMSIn = new MemoryStream(ivBytes))
    using (GZipInputStream lvZipStream = new GZipInputStream(lvMSIn, ivBytes.Length))
    using (MemoryStream lvMSOut = new MemoryStream())
    {
        byte[] lvBuffer = new byte[INT_BufferSize];
        int liSize;
        while (true)
        {
            liSize = lvZipStream.Read(lvBuffer, 0, INT_BufferSize);
            if (liSize <= 0)
                break;
            lvMSOut.Write(lvBuffer, 0, liSize);
        }
        return lvMSOut.ToArray();
    }
}
gzip (which is what GZipStream produces) yields binary data; it will not fit into a 7-bit text message (SOAP is a text message) unless you do something like Base64 encoding on it.
Perhaps the solution is not to gzip/encode (decode/ungzip) a whole buffer at once, but to use streams for that: connect a gzipping stream to an encoding stream and read the result from the output of the latter (or connect the decoding stream to the un-gzipping stream). This way you have a chance of consuming less memory.