I have a vector (just a wrapper over a char array) that is the input. The pkzip was created using C#'s SharpZipLib.
I stored the data in a file and ran it through a hex editor's zip template, which checked out. The input is good, not malformed. This is everything but the compressed data:
50 4B 03 04 14 00 00 00 08 00 51 B2 8B 4A B3 B6
6C B0 F6 18 00 00 40 07 01 00 07 00 00 00 2D 33
31 2F 31 32 38
<compressed data (6390 bytes)>
50 4B 01 02 14 00 14 00 00 00 08 00 51 B2 8B 4A
B3 B6 6C B0 F6 18 00 00 40 07 01 00 07 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 2D 33
31 2F 31 32 38 50 4B 05 06 00 00 00 00 01 00 01
00 35 00 00 00 1B 19 00 00 00 00
I have another vector which is to be the output. The inflated data will be about 67-68 KB, so I know it fits into the buffer.
For the life of me, I cannot get minizip to inflate the former and store it into the latter.
This is what I have so far:
#include "minizip\zlib.h"
#define ZLIB_WINAPI
std::vector<unsigned char> data;
/*...*/
std::vector<unsigned char> outBuffer(1024 * 1024);
z_stream stream;
stream.zalloc = Z_NULL;
stream.zfree = Z_NULL;
stream.opaque = Z_NULL;
stream.data_type = Z_BINARY;
stream.avail_in = data.size();
stream.avail_out = outBuffer.size();
stream.next_in = &data[0];
stream.next_out = &outBuffer[0];
int ret = inflateInit(&stream);
ret = inflate(&stream, 1);
ret = inflateEnd(&stream);
I used the debugger to step through the method and monitor the ret. inflate returned value -3 with message "incorrect header check".
This is a pkzip file, which wraps zlib-style deflate data, and minizip is a wrapper library around zlib that should support pkzip, shouldn't it?
What do I have to modify to make this work?
Since it starts with 50 4B 03 04 it is a PKZIP file, according to https://users.cs.jmu.edu/buchhofp/forensics/formats/pkzip.html
If these are zip files, then inflate is the wrong function. The zlib, gzip, and zip formats are all different. You can read zip with zlib, if you use the right functions to do so. If you don't have the contrib directory, download and rebuild zlib.
Here's some old code I have which works for zip files, using the zlib library. I may have moved some headers around, because the official zlib has them under zlib/contrib/minizip.
The arguments are filenames, so you'll have to modify the code, or write your array to a file.
// #include <zlib/unzip.h>
#include <zlib/contrib/minizip/unzip.h>
/// return list of filenames in zip archive
std::list<std::string> GetZipFilenames(const char *szZipArchive){
    std::list<std::string> results;
    unzFile zip = unzOpen(szZipArchive);
    if (zip){
        unz_global_info info;
        int rv = unzGetGlobalInfo(zip, &info);
        if (UNZ_OK == unzGoToFirstFile(zip)){
            do {
                char szFilename[BUFSIZ];
                if (UNZ_OK == unzGetCurrentFileInfo(zip, NULL, szFilename, sizeof(szFilename), NULL, 0, NULL, 0))
                    results.push_back(std::string(szFilename));
            } while (UNZ_OK == unzGoToNextFile(zip));
        }
        unzClose(zip); // don't leak the archive handle
    }
    return results;
}
/// extract the contents of szFilename inside szZipArchive
bool ExtractZipFileContents(const char *szZipArchive, const char *szFilename, std::string &contents){
    bool result = false;
    unzFile zip = unzOpen(szZipArchive);
    if (zip){
        if (UNZ_OK == unzLocateFile(zip, szFilename, 0)){
            if (UNZ_OK == unzOpenCurrentFile(zip)){
                char buffer[BUFSIZ];
                int bytes; // unzReadCurrentFile returns int; a negative value means error
                while (0 < (bytes = unzReadCurrentFile(zip, buffer, sizeof(buffer)))){
                    contents += std::string(buffer, bytes);
                }
                unzCloseCurrentFile(zip);
                result = (bytes == 0); // 0 = clean EOF, negative = read error
            }
        }
        unzClose(zip);
    }
    return result;
}
If I understand the question correctly, you are trying to decompress the 6390 bytes of compressed data. That compressed data is a raw deflate stream, which has no zlib header or trailer. For that you would need to use inflateInit2(&stream, -MAX_WBITS) instead of inflateInit(&stream). The -MAX_WBITS requests decompression of raw deflate.
Zip-Utils
std::vector<unsigned char> inputBuffer;
std::vector<unsigned char> outBuffer(1024 * 1024);
HZIP hz = OpenZip(&inputBuffer[0], inputBuffer.size(), 0);
ZIPENTRY ze;
GetZipItem(hz, 0, &ze);
UnzipItem(hz, 0, &outBuffer[0], 1024 * 1024);
outBuffer.resize(ze.unc_size);
If inputBuffer contains a pkzip file, this snippet will unzip it and store contents in outBuffer.
That is all.
This is very strange! I looked around and found nothing. My test code is below:
#include "pch.h"
#include "Windows.h"
#include "openssl/ssl.h"
#pragma comment(lib,"../Include/lib/libssl.lib")
#pragma comment(lib,"../Include/lib/libcrypto.lib")
#pragma comment(lib,"Crypt32.lib")
#pragma comment(lib,"ws2_32.lib")
#include <stdlib.h>
#include <crtdbg.h>
#define _CRTDBG_MAP_ALLOC
const char* priKey = "607BC8BA457EC0A1B5ABAD88061502BEA5844E17C7DD247345CD57E501B3300B4B8889D3DFCF5017847606508DF8B283C701F35007D5F4DBA96023DD3B0CCE062D8F63BCC16021B944A1E88861B70D456BAA1E0E69C90DFCA13B4D4EA5403DA25FEBF94B0515644A7D2DF88299189688DA5D8951512ADC3B1A35BAEEC147F69AB101048B9029E65A77A438A05AE30337E82B0571B80A955C72EA0DB3B457ECC8B81F346624E3D22755FEB3D84F810431974803E13E26BF95AF7AB590E17C0113BFE9B36BE12BE16D89273348E0CC094526CAF54ABF8044565EC9500EBF657265474BC362EBDFD78F513282AAF0EEF9BA0BB00F7FF9C7E61F00465BBD379D6201";
const char* pubKey = "AE7DF3EB95DF1F864F86DA9952BB44E760152986731253C96C135E5391AEFF68F5C1640552F1CCC2BA2C12C0C68C051E343B786F13215CEFC8221D9FA97D50E895EAF50D1AF32DC5EB40C9F1F8CA5145B43CEF83F2A89C9661AFA73A70D32951271C6BEFE1B5F24B512520DA7FD4EEC982F2D8057FE1938FA2FB54D8D282A25D8397298B75E154739EF16B8E2F18368F5BEEAD3D18528B9B1A63C731A71735CDB6102E187EF3377B72B58A124FA280891A79A2BD789D5DBA3618BBD74367F5C50A220204D90A59828C3C81FDBD9D2A91CBF6C8563C6FE987BE21B19BBC340DE4D42290D63909AD5D856D13B8CDC91D5701570045CE3609D4F8884F69120AD9A3";
void rsa_test(const char* n, const char* d)
{
    RSA* rsa = RSA_new();
    BIGNUM* bn = BN_new();
    BIGNUM* bd = BN_new();
    BIGNUM* be = BN_new();
    BN_set_word(be, 0x10001);
    if (n) BN_hex2bn(&bn, n);
    if (d) BN_hex2bn(&bd, d);
    RSA_set0_key(rsa, bn, be, bd);
    //calc hash
    const char* msg = "hello,rsa!!";
    BYTE shaResult[SHA256_DIGEST_LENGTH] = { 0 };
    SHA256((unsigned char*)msg, strlen(msg), shaResult);
    //sign
    unsigned int olen;
    unsigned char sign[256] = { 0 };
    RSA_sign(NID_sha256, shaResult, SHA256_DIGEST_LENGTH, sign, &olen, rsa);
    RSA_free(rsa);
}
DWORD thread_test(LPVOID lpParam)
{
    rsa_test(pubKey, priKey);
    return 0;
}
int main()
{
    //_CrtSetBreakAlloc(159);
    _CrtSetDbgFlag(_CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF);
    //CreateThread(nullptr, 0, thread_test, nullptr, 0, nullptr);
    //rsa_test(pubKey, priKey);
    system("pause");
}
Calling rsa_test(pubKey, priKey) directly in the main thread does NOT cause memory leaks!
Calling CreateThread(nullptr, 0, thread_test, nullptr, 0, nullptr) produces memory leaks!!!
Console output as follows:
Detected memory leaks!
Dumping objects ->
{173} normal block at 0x000002BDE6D44000, 264 bytes long.
Data: < _W- :_ s ! 6> D8 89 FD 5F 57 2D C5 3A 5F 82 73 F1 00 21 FA 36
{162} normal block at 0x000002BDE6D43AC0, 264 bytes long.
Data: < > 00 01 02 03 04 05 06 07 08 09 0A 0B 0C 0D 0E 0F
{161} normal block at 0x000002BDE6D2E2A0, 160 bytes long.
Data: <` > 60 F1 0F 91 F6 7F 00 00 00 00 00 00 00 00 00 00
{160} normal block at 0x000002BDE6D2DBA0, 160 bytes long.
Data: <` > 60 F1 0F 91 F6 7F 00 00 00 00 00 00 00 00 00 00
{159} normal block at 0x000002BDE6D48230, 352 bytes long.
Data: < P > 00 00 00 00 00 00 00 00 50 AB D3 E6 BD 02 00 00
{158} normal block at 0x000002BDE6D286B0, 12 bytes long.
Data: < > 00 00 00 00 00 00 00 00 01 00 00 00
Object dump complete.
Then I use _CrtSetBreakAlloc(159) (or another memory id) to locate the allocation, and here is my call stack screenshot:
my vs2017 showing the break point at CRYPTO_zalloc(crypto/mem.c)
So my question is: how do I free that leaked memory in the thread?
Thanks a lot!!
Download my test code(build with visual studio 2017 x64)
https://www.openssl.org/docs/man1.1.0/man3/OPENSSL_thread_stop.html
OPENSSL_thread_stop();
will do it for you. You can call it like below
DWORD thread_test(LPVOID lpParam)
{
    rsa_test(pubKey, priKey);
    OPENSSL_thread_stop();
    return 0;
}
I'm trying to send a user-defined struct which contains substructs over AMQP from one node to another. I am using the Apache Qpid library at the moment.
(I'm currently still testing my code on the PC before I rebuild it for my other devices.)
My current method consists of converting the struct to a byte string, sending that over AMQP, and deserializing it on the other side.
I do the following:
//user defined struct
enum Quality
{
    /// <summary>Value is reliable.</summary>
    QUALITY_GOOD,
    /// <summary>Value not reliable.</summary>
    /// <remarks>
    /// A variable may be declared unreliable if a sensor is not calibrated or
    /// if the last query failed but older samples are still usable.
    /// </remarks>
    QUALITY_UNCERTAIN,
    /// <summary>Value is invalid.</summary>
    /// <remarks>
    /// A variable may be declared bad if the measured value is out of range or
    /// if a timeout occurred.
    /// </remarks>
    QUALITY_BAD
};
struct Payload
{
    /// <summary>Identifier that uniquely points to a single instance.</summary>
    DdsInterface::Id id = DdsInterface::Id();
    /// <summary>Human readable short name.</summary>
    std::string name = "default";
    /// <summary>Actual value.</summary>
    long long value;
    /// <summary>Quality of the Value.</summary>
    Quality quality = QUALITY_GOOD;
    /// <summary>Detailed quality of the variable.</summary>
    QualityDetail qualityDetail = 0;
    /// <summary>Unit of measure.</summary>
    PhysicalQuantity quantity = 0;
    Payload();
    Payload(const DdsInterface::Id id, const std::string topic, const uint64_t counter);
};
//sender function
void QpidAMQP::AMQPPublish(const Payload& payload, bool durability, bool sync)
{
    qpid::messaging::Message message;
    message.setDurable(durability);
    char b[sizeof(payload)];
    memcpy(b, &payload, sizeof(payload));
    //create stream of bytes to send over the line
    message.setContent(b);
    //message.setContent("testIfSend");
    std::string temp = message.getContent();
    print_bytes(temp.c_str(), sizeof(temp)); // used to check the byte data
    this->sender.send(message);
    this->session.sync(sync);
}
//receiver functions
void *check_for_incoming_messages(QpidAMQP* amqp_instance) //called via pthread
{
    qpid::messaging::Message message;
    std::cout << "check for incoming messages" << std::endl;
    while (amqp_instance->getReceiver()->fetch(message, qpid::messaging::Duration::FOREVER))
    {
        amqp_instance->on_message(&message);
    }
    return nullptr;
}
void QpidAMQP::on_message(qpid::messaging::Message *message)
{
    /// make sure message topic and payload are copied!!!
    if (this->handler)
    {
        std::string temp = message->getContent();
        print_bytes(temp.c_str(), sizeof(temp)); // used to check the byte data
        Payload payload; //Re-make the struct
        memcpy(&payload, message->getContent().c_str(), message->getContentSize());
        handler->ReceivedIntegerValue(payload.id.variableId, payload.value);
    }
}
I did check the byte data, and they were vastly different.
sender:
[ 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 63 00 00 00 01 00 00 00 00 00 00 00 00 00 00 00 60 32 bf 74 ff 7f 00 00 05 00 00 00 00 00 00 00 74 6f 70 69 63 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ]
receiver:
>[ 74 65 73 74 49 66 53 65 6e 64 00 00 00 7f 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 a0 ed ff 43 57 7f 00 00 07 00 00 00 00 00 00 00 64 65 66 61 75 6c 74 00 6d 05 77 4b 57 7f 00 00 ff ff ff ff ff ff ff ff 00 00 00 00 00 00 00 00 ]
I used the following code to print this out
void print_bytes(const void *object, size_t size)
{
    // This is for C++; in C just drop the static_cast<>() and assign.
    const unsigned char * const bytes = static_cast<const unsigned char *>(object);
    size_t i;
    printf("[ ");
    for (i = 0; i < size; i++)
    {
        printf("%02x ", bytes[i]);
    }
    printf("]\n");
}
When I send only a string instead of the payload, it arrives on the other end. But for some reason a user-defined struct doesn't work.
I want to avoid remapping everything against the qpid map because I would lose the depth of my Payload.id.
If someone has any suggestions to overcome this I would appreciate it.
Thanks in advance,
Nick
I solved the issue.
The problem was that instead of the string's contents it copied a pointer. By changing std::string name = "default"; into char name[20] = "default"; it copies the real character string.
This is how the publisher and subscriber encode and decode the message now:
void QpidAMQP::AMQPPublish(const Payload& payload, bool durability, bool sync)
{
    //create stream of bytes to send over the line
    qpid::messaging::Message message;
    message.setDurable(durability);
    std::string b;
    b.resize(sizeof(Payload));
    std::memcpy(const_cast<char*>(b.c_str()), &payload, b.size());
    message.setContent(b);
    std::string temp = message.getContent();
    print_bytes(temp.c_str(), temp.size());
    this->sender.send(message);
    this->session.sync(sync);
}
void QpidAMQP::on_message(qpid::messaging::Message *message)
{
    /// make sure message topic and payload are copied!!!
    if (this->handler != nullptr)
    {
        const std::string temp = message->getContent();
        print_bytes(temp.c_str(), temp.size());
        Payload payload;
        std::memcpy(&payload, temp.c_str(), temp.size());
        handler->ReceivedIntegerValue(payload.id.variableId, payload.value);
    }
}
I have some sample code that reads some binary data from a file and then writes the content into a stringstream.
#include <sstream>
#include <cstdio>
#include <fstream>
#include <cstdlib>
std::stringstream * raw_data_buffer;
int main()
{
    std::ifstream is;
    is.open("1.raw", std::ios::binary);
    char * buf = (char *)malloc(40);
    is.read(buf, 40);
    for (int i = 0; i < 40; i++)
        printf("%02X ", buf[i]);
    printf("\n");
    raw_data_buffer = new std::stringstream("", std::ios_base::app | std::ios_base::out | std::ios_base::in | std::ios_base::binary);
    raw_data_buffer->write(buf, 40);
    const char * tmp = raw_data_buffer->str().c_str();
    for (int i = 0; i < 40; i++)
        printf("%02X ", tmp[i]);
    printf("\n");
    delete raw_data_buffer;
    return 0;
}
With a specific input file I have, the program doesn't function correctly. You could download the test file here.
So the problem is, I write the file content into raw_data_buffer and immediately read it back, and the content differs. The program's output is:
FFFFFFC0 65 59 01 00 00 00 00 00 00 00 00 00 00 00 00 FFFFFFE0 0A 40 00 00 00 00 00 FFFFFF80 08 40 00 00 00 00 00 70 FFFFFFA6 57 6E FFFFFFFF 7F 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 FFFFFFE0 0A 40 00 00 00 00 00 FFFFFF80 08 40 00 00 00 00 00 70 FFFFFFA6 57 6E FFFFFFFF 7F 00 00
The content FFFFFFC0 65 59 01 is overwritten with 0. Why so?
I suspect this is a symptom of undefined behavior from using deallocated memory. You're getting a copy of the string from the stringstream but you're only grabbing a raw pointer to the internals, which is then immediately destroyed. (the link actually warns against this exact case)
const char* tmp = raw_data_buffer->str().c_str();
// ^^^^^ returns a temporary that is destroyed
// at the end of this statement
// ^^^ now a dangling pointer
Any use of tmp would exhibit undefined behavior and could easily cause the problem you're seeing. Keep the result of str() in scope.
I am extremely new to manipulating bitmap data in C++, and I have a problem. I'm trying to follow this example from wikipedia. Here is the code I'm using:
#include <iostream>
#include <fstream>
#include <Windows.h>
using namespace std;
int main()
{
    //fileheader
    BITMAPFILEHEADER* bf = new BITMAPFILEHEADER;
    bf->bfType = 66;
    bf->bfSize = 70;
    bf->bfOffBits = 54;
    //infoheader
    BITMAPINFOHEADER* bi = new BITMAPINFOHEADER;
    bi->biSize = 40;
    bi->biWidth = 2;
    bi->biHeight = 2;
    bi->biPlanes = 1;
    bi->biBitCount = 24;
    bi->biCompression = 0;
    bi->biSizeImage = 16;
    bi->biXPelsPerMeter = 2835;
    bi->biYPelsPerMeter = 2835;
    bi->biClrUsed = 0;
    bi->biClrImportant = 0;
    //image data
    unsigned char* thedata = new unsigned char;
    thedata[0] = 0;
    thedata[1] = 0;
    thedata[2] = 255;
    thedata[3] = 255;
    thedata[4] = 255;
    thedata[5] = 255;
    thedata[6] = 0;
    thedata[7] = 0;
    thedata[8] = 255;
    thedata[9] = 0;
    thedata[10] = 0;
    thedata[11] = 0;
    thedata[12] = 255;
    thedata[13] = 0;
    thedata[14] = 0;
    thedata[15] = 0;
    //dc
    HDC dc = GetDC(NULL);
    //bitmap info
    BITMAPINFO* bmi = (BITMAPINFO*)bi;
    //handle to bitmap
    HBITMAP hbmp = CreateDIBitmap(dc, bi, CBM_INIT, thedata, bmi, DIB_RGB_COLORS);
    //output to bmp....?
    ofstream outFile;
    outFile.open("outtestbmp.bmp");
    outFile << hbmp;
    outFile.close();
}
I've been searching Google for the past couple days trying to figure out how to get this done but I still can't seem to make it work.
This compiles without errors, but the resulting outtestbmp.bmp file is unopenable. Are there any huge mistakes I'm making (probably dozens) that are preventing this from working? (I have high suspicions that using ofstream to output my bmp data is wrong.)
EDIT: I've been told that setting bfType to 66 is wrong. What is the correct value?
Also, I've created a .bmp file of what the output should be. Here is the hex data for that bmp:
42 4D 46 00 00 00 00 00 00 00 36 00 00 00 28 00 00 00 02 00 00 00 02 00 00 00 01 00 18
00 00 00 00 00 10 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 FF FF
FF FF 00 00 FF 00 00 00 FF 00 00 00
and here is the data for my .bmp I'm outputting:
42 00 46 00 00 00 CD CD CD CD 36 00 00 00 28 00 00 00 02 00 00 00 02 00 00 00 01 00 18
00 00 00 00 00 10 00 00 00 13 0B 00 00 13 0B 00 00 00 00 00 00 00 00 00 00 00 00 FF FF
FF FF 00 00 FF 00 00 00 FF 00 00 00
bf->bfType = 66;
This is the wrong value: the magic value is the two bytes 'BM', which reads as 0x4D42 on a little-endian machine, not 66 (66 is just the single byte 'B').
Then:
unsigned char* thedata = new unsigned char;
Only allocating 1 char?
And this:
outFile << hbmp;
Doesn't do what you think it does. You're just writing out a handle. You didn't need to create a bitmap, you just need to write out the data structures you've initialised (once you've made them correct).
And why are you new'ing everything? There's no need for this stuff to be dynamically allocated (and you're not delete'ing it either).
I would generally do it something like this (some code removed for brevity):
BITMAPFILEHEADER bf;
bf.bfType = 0x4d42;
bf.bfSize = 70;
bf.bfOffBits = 54;
//infoheader
BITMAPINFOHEADER bi;
bi.biSize = 40;
bi.biWidth = 2;
etc...
//image data
unsigned char* thedata = new unsigned char[16]; // Should really calculate this properly
thedata[0] = 0;
....
//output to bmp....?
ofstream outFile;
outFile.open("outtestbmp.bmp");
outFile.write((char *)&bf,sizeof(bf));
outFile.write((char *)&bi,sizeof(bi));
outFile.write((char *)thedata, 16);
outFile.close();
First, you don't allocate enough memory for all your pixels; it should be something along the lines of
unsigned char* thedata = new unsigned char[numPixels * bytesPerPixel];
In your case, with a 2 by 2 picture and 24 bpp (bits per pixel), that is 12 bytes of pixel data (each row is then padded to a multiple of 4 bytes, which brings the total to 16 for this image).
Each pixel corresponds to three consecutive bytes for its blue, green and red channels.
Note: I've never used Window's bitmap creation process but I worked with many libraries to manipulate images, including FreeImage, SDL and GD
HBITMAP is a handle to a bitmap, not an actual bitmap; the real bitmap is stored in kernel memory. You don't need to make any calls to the GDI if you are just writing the bitmap to a file. Even if it were an actual bitmap, the stream would need an operator overload to write the memory structure to a file, since a bitmap isn't stored in memory the same way it is on disk.
All you need to do is write the BITMAPFILEHEADER to a file, then write the BITMAPINFOHEADER to the file, and then write the data (if we are talking RGB, non-indexed).
Here is more info on how bitmaps are actually stored on disk.
Bitmap Storage (MSDN)
Reading a bitmap from a file is just the same, in reverse.
First, you need to allocate enough data for your pixels:
unsigned char* thedata = new unsigned char[2 * 2 * 3];
Then you need to use write and open the file as binary.
Instead of:
outFile.open("outtestbmp.bmp");
outFile << hbmp;
Something like:
outFile.open("outtestbmp.bmp", ios::binary | ios::out);
outFile.write(reinterpret_cast<char *>(bf), sizeof(BITMAPFILEHEADER));
outFile.write(reinterpret_cast<char *>(bi), sizeof(BITMAPINFOHEADER));
outFile.write(reinterpret_cast<char *>(thedata), 2 * 2 * 3);
outFile.close();
I've made a dll hook that sends me the address of a byte array I need every time the event occurs in the main app.
Display code:
void receivedPacket(char *packet){
    short header = packet[2];
    TCHAR str[255];
    _stprintf(str, _T("Header : %lX\n"), header); // This works fine. It returns 0x38 as it should.
    WriteConsole(GetStdHandle(STD_OUTPUT_HANDLE), str, strlen(str), 0, 0);
    WriteConsole(GetStdHandle(STD_OUTPUT_HANDLE), packet, 34, 0, 0);
}
This is an example of the byte array at that address, and this is how I want it to be displayed:
05 01 38 00 60 00 9D 01 00 00 00 00 00 70 C9 7D 0E 00 00 00 00 00 00 00 FF 20 79 40 00 00 00 00
But with my current code, it only displays weird symbols. So how can I convert all those bytes to hex?
I'm actually new to C++.
Format each byte of the packet as two hex digits:
static TCHAR const alphabet[] = _T("0123456789ABCDEF");
for (int i = 0; i < 34; ++i)
{
    unsigned char b = static_cast<unsigned char>(packet[i]);
    TCHAR const s[3] = { alphabet[b / 16], alphabet[b % 16], _T(' ') };
    WriteConsole(GetStdHandle(STD_OUTPUT_HANDLE), s, 3, 0, 0);
}