Sudden Memory Leaks Issue - C++

Environment:
Windows 10 Home 10.0.16299
MS Visual Studio Community 2017 Version 15.6.6
Compile Settings: 64 bit - Multi-Byte Char Set
For a few days now I've been facing a very annoying memory leak issue. I've tried a lot, but the problem still remains.
Suddenly, I get the following messages on application exit:
{7196} normal block at 0x000001FD42D0EA90, 66 bytes long.
Data: <` f ) ) > 60 C6 CC 66 F7 7F 00 00 29 00 00 00 29 00 00 00
f:\dd\vctools\vc7libs\ship\atlmfc\src\mfc\oleinit.cpp(84) : {749} client block at 0x000001FD40F735F0, subtype c0, 104 bytes long.
f:\dd\vctools\vc7libs\ship\atlmfc\src\mfc\dumpcont.cpp(23) : atlTraceGeneral - a CCmdTarget object at $000001FD40F735F0, 104 bytes long
f:\dd\vctools\vc7libs\ship\atlmfc\src\mfc\plex.cpp(29) : {747} normal block at 0x000001FD40F78750, 248 bytes long. Data: < > 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
f:\dd\vctools\vc7libs\ship\atlmfc\src\mfc\occmgr.cpp(794) : {746} normal block at 0x000001FD40F5F1D0, 8 bytes long.
Data: < f > F0 BB A3 66 F7 7F 00 00
I also get the following messages on application exit (12 of them, in fact):
{348} normal block at 0x00000236C88FE4A0, 69 bytes long.
Data: <` > , , > 60 C6 3E BE F6 7F 00 00 2C 00 00 00 2C 00 00 00
So, when I add the following line:
_CrtSetBreakAlloc(348);
the debugger breaks with the message "The xxxx.exe has triggered a breakpoint."
Always at the same place: the line
'pData = (CStringData*)malloc( nTotalSize );' in CAfxStringMgr::Allocate:
CStringData* CAfxStringMgr::Allocate( int nChars, int nCharSize ) throw()
{
    size_t nTotalSize;
    CStringData* pData;
    size_t nDataBytes;

    ASSERT(nCharSize > 0);

    if (nChars < 0)
    {
        ASSERT(FALSE);
        return NULL;
    }

    nDataBytes = (nChars+1)*nCharSize;
    nTotalSize = sizeof( CStringData )+nDataBytes;
    pData = (CStringData*)malloc( nTotalSize );
    if (pData == NULL)
        return NULL;

    pData->pStringMgr = this;
    pData->nRefs = 1;
    pData->nAllocLength = nChars;
    pData->nDataLength = 0;

    return pData;
}
The problem arose with functions that take a CString as a reference parameter, like this one:
int CFoo::GetData(CString strMainData, CString& strData, int nPos, int nLen)
{
    strData = strMainData.Mid(nPos + 2, nLen);
    return (nPos + 2 + nLen);
}
Any clue why I'm getting this? I've been working on this app for several weeks and never experienced this issue before. I modified the function to use char* as a parameter, and then copied into the CString in the calling function, but the problem remains. This behaviour appeared two weeks ago (though I didn't notice it right away). Could it be a compiler setting?
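As a side note on the GetData function above: strMainData is taken by value, so every call copies the source string (CString's reference counting makes this cheap, but it is still avoidable). A minimal sketch of the same shape, using std::string as a stand-in for the MFC-only CString, passing the source by const reference:

```cpp
#include <string>

// Hypothetical reworking of the question's GetData, with std::string
// standing in for CString (substr() plays the role of CString::Mid).
// The source string is passed by const reference, so no copy is made.
int GetData(const std::string& mainData, std::string& data, int pos, int len)
{
    data = mainData.substr(pos + 2, len); // skip a 2-byte header, take len chars
    return pos + 2 + len;                 // position just past the extracted field
}
```

This doesn't change the leak report by itself; it only removes the by-value copy on each call.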

Related

Using CMemoryState to detect memory leaks in C++/MFC app

I have a C++/MFC app that has a memory leak, running under 64-bit Windows 10 using Visual Studio 2019. Using the CMemoryState class, I'm trying to find the exact location of the leak with code like this:
CMemoryState oldState;
oldState.Checkpoint();

// code from my app runs here

CMemoryState newState;
newState.Checkpoint();

// Compare to previous
CMemoryState diff;
BOOL differs = diff.Difference(oldState, newState);
if (differs)
{
    diff.DumpStatistics();
    diff.DumpAllObjectsSince();
}
This produces output like
0 bytes in 0 Free Blocks.
19239 bytes in 472 Normal Blocks.
176 bytes in 2 CRT Blocks.
0 bytes in 0 Ignore Blocks.
272 bytes in 2 Client Blocks.
Largest number used: 43459889 bytes.
Total allocations: 215080520 bytes.
Dumping objects ->
{1314867} normal block at 0x00000181073C31C0, 16 bytes long.
Data: < s > D0 7F DD AC 73 00 00 00 00 00 00 00 00 00 00 00
{1272136} normal block at 0x0000018179C19750, 16 bytes long.
Data: < s > D0 7F DD AC 73 00 00 00 00 00 00 00 00 00 00 00
{1265825} normal block at 0x0000018179C18CB0, 16 bytes long.
Data: < _ s > B0 5F DD AC 73 00 00 00 00 00 00 00 00 00 00 00
with thousands of lines like the above. Are these significant, i.e., an actual memory leak, or are they some artifact of MFC?

C++ OpenSSL 1.1.1: running RSA algorithm in a thread causes memory leaks

This is very strange! I've looked around and found nothing. My test code is below:
#include "pch.h"
#include "Windows.h"
#include "openssl/ssl.h"
#pragma comment(lib,"../Include/lib/libssl.lib")
#pragma comment(lib,"../Include/lib/libcrypto.lib")
#pragma comment(lib,"Crypt32.lib")
#pragma comment(lib,"ws2_32.lib")
#define _CRTDBG_MAP_ALLOC // must be defined before <crtdbg.h> to take effect
#include <stdlib.h>
#include <crtdbg.h>
const char* priKey = "607BC8BA457EC0A1B5ABAD88061502BEA5844E17C7DD247345CD57E501B3300B4B8889D3DFCF5017847606508DF8B283C701F35007D5F4DBA96023DD3B0CCE062D8F63BCC16021B944A1E88861B70D456BAA1E0E69C90DFCA13B4D4EA5403DA25FEBF94B0515644A7D2DF88299189688DA5D8951512ADC3B1A35BAEEC147F69AB101048B9029E65A77A438A05AE30337E82B0571B80A955C72EA0DB3B457ECC8B81F346624E3D22755FEB3D84F810431974803E13E26BF95AF7AB590E17C0113BFE9B36BE12BE16D89273348E0CC094526CAF54ABF8044565EC9500EBF657265474BC362EBDFD78F513282AAF0EEF9BA0BB00F7FF9C7E61F00465BBD379D6201";
const char* pubKey = "AE7DF3EB95DF1F864F86DA9952BB44E760152986731253C96C135E5391AEFF68F5C1640552F1CCC2BA2C12C0C68C051E343B786F13215CEFC8221D9FA97D50E895EAF50D1AF32DC5EB40C9F1F8CA5145B43CEF83F2A89C9661AFA73A70D32951271C6BEFE1B5F24B512520DA7FD4EEC982F2D8057FE1938FA2FB54D8D282A25D8397298B75E154739EF16B8E2F18368F5BEEAD3D18528B9B1A63C731A71735CDB6102E187EF3377B72B58A124FA280891A79A2BD789D5DBA3618BBD74367F5C50A220204D90A59828C3C81FDBD9D2A91CBF6C8563C6FE987BE21B19BBC340DE4D42290D63909AD5D856D13B8CDC91D5701570045CE3609D4F8884F69120AD9A3";
void rsa_test(const char* n, const char* d)
{
    RSA* rsa = RSA_new();
    BIGNUM* bn = BN_new();
    BIGNUM* bd = BN_new();
    BIGNUM* be = BN_new();
    BN_set_word(be, 0x10001);
    if (n) BN_hex2bn(&bn, n);
    if (d) BN_hex2bn(&bd, d);
    RSA_set0_key(rsa, bn, be, bd);
    // calc hash
    const char* msg = "hello,rsa!!";
    BYTE shaResult[SHA256_DIGEST_LENGTH] = { 0 };
    SHA256((unsigned char*)msg, strlen(msg), shaResult);
    // sign
    unsigned int olen;
    unsigned char sign[256] = { 0 };
    RSA_sign(NID_sha256, shaResult, SHA256_DIGEST_LENGTH, sign, &olen, rsa);
    RSA_free(rsa);
}
DWORD WINAPI thread_test(LPVOID lpParam)
{
    rsa_test(pubKey, priKey);
    return 0;
}
int main()
{
    //_CrtSetBreakAlloc(159);
    _CrtSetDbgFlag(_CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF);
    //CreateThread(nullptr, 0, thread_test, nullptr, 0, nullptr);
    //rsa_test(pubKey, priKey);
    system("pause");
}
Calling rsa_test(pubKey, priKey) directly in the main thread does NOT cause memory leaks!
Calling CreateThread(nullptr, 0, thread_test, nullptr, 0, nullptr) does produce memory leaks!!!
Console output as follows:
Detected memory leaks!
Dumping objects ->
{173} normal block at 0x000002BDE6D44000, 264 bytes long.
Data: < _W- :_ s ! 6> D8 89 FD 5F 57 2D C5 3A 5F 82 73 F1 00 21 FA 36
{162} normal block at 0x000002BDE6D43AC0, 264 bytes long.
Data: < > 00 01 02 03 04 05 06 07 08 09 0A 0B 0C 0D 0E 0F
{161} normal block at 0x000002BDE6D2E2A0, 160 bytes long.
Data: <` > 60 F1 0F 91 F6 7F 00 00 00 00 00 00 00 00 00 00
{160} normal block at 0x000002BDE6D2DBA0, 160 bytes long.
Data: <` > 60 F1 0F 91 F6 7F 00 00 00 00 00 00 00 00 00 00
{159} normal block at 0x000002BDE6D48230, 352 bytes long.
Data: < P > 00 00 00 00 00 00 00 00 50 AB D3 E6 BD 02 00 00
{158} normal block at 0x000002BDE6D286B0, 12 bytes long.
Data: < > 00 00 00 00 00 00 00 00 01 00 00 00
Object dump complete.
Then I used _CrtSetBreakAlloc(159) (or another allocation ID) to locate it, and here is my call stack screenshot:
My VS2017 shows the breakpoint at CRYPTO_zalloc (crypto/mem.c).
So my question is: how do I free this leaked memory in the thread?
Thanks a lot!!
Download my test code (built with Visual Studio 2017, x64).
https://www.openssl.org/docs/man1.1.0/man3/OPENSSL_thread_stop.html
OPENSSL_thread_stop();
will do it for you. You can call it like below
DWORD WINAPI thread_test(LPVOID lpParam)
{
    rsa_test(pubKey, priKey);
    OPENSSL_thread_stop();
    return 0;
}
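If the thread function has more than one exit path, a small scope guard can guarantee the cleanup call runs on every one of them. This is a generic C++17 sketch (ScopeGuard is a hypothetical helper, not part of OpenSSL); the OpenSSL-specific usage appears only in comments, since it needs the real headers:

```cpp
#include <utility>

// Minimal scope guard: runs the given callable when the enclosing
// scope exits, including on early return. Not part of any library.
template <typename F>
class ScopeGuard {
public:
    explicit ScopeGuard(F f) : f_(std::move(f)) {}
    ~ScopeGuard() { f_(); }
    ScopeGuard(const ScopeGuard&) = delete;
    ScopeGuard& operator=(const ScopeGuard&) = delete;
private:
    F f_;
};

// Intended usage in the thread function from the question:
//
// DWORD WINAPI thread_test(LPVOID)
// {
//     ScopeGuard guard([] { OPENSSL_thread_stop(); });
//     rsa_test(pubKey, priKey);
//     return 0; // guard fires here, releasing per-thread OpenSSL state
// }
```

The guard's destructor fires when the scope unwinds, so the per-thread OpenSSL state is released no matter which return statement is taken.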

Embedded Google v8 memory leak with sample

I'm trying to embed V8, but I have some memory leaks. The following minimal code reproduces them:
int main(int argc, char* argv[])
{
    // V8 version 7.1.302.33
    v8::V8::InitializeICUDefaultLocation(argv[0]);
    v8::V8::InitializeExternalStartupData(argv[0]);
    std::unique_ptr<v8::Platform> platform = v8::platform::NewDefaultPlatform();
    v8::V8::InitializePlatform(platform.get());
    v8::V8::Initialize();

    v8::Isolate::CreateParams create_params;
    create_params.array_buffer_allocator = v8::ArrayBuffer::Allocator::NewDefaultAllocator();
    v8::Isolate* pIsolate = v8::Isolate::New(create_params); // If I remove this line and the next one, the memory leak disappears
    pIsolate->Dispose();

    v8::V8::Dispose();
    v8::V8::ShutdownPlatform();
    delete create_params.array_buffer_allocator;
}
With this code, my Visual Studio 2017 prints these leaks in the output:
Detected memory leaks!
Dumping objects ->
{5565} normal block at 0x000001BA6F417950, 8 bytes long.
Data: < i > 00 00 E8 69 18 00 00 00
{5564} normal block at 0x000001BA6F416960, 16 bytes long.
Data: <8 >o > 38 D7 3E 6F BA 01 00 00 00 00 00 00 00 00 00 00
{5563} normal block at 0x000001BA6F3ED720, 56 bytes long.
Data: < >o > o > A0 D6 3E 6F BA 01 00 00 A0 D6 3E 6F BA 01 00 00
{989} normal block at 0x000001BA6F4194E0, 128 bytes long.
Data: < >o > o > A0 D6 3E 6F BA 01 00 00 A0 D6 3E 6F BA 01 00 00
{988} normal block at 0x000001BA6F416CD0, 16 bytes long.
Data: < p ` > 70 B4 60 0A FF 7F 00 00 00 00 00 00 00 00 00 00
{987} normal block at 0x000001BA6F417270, 16 bytes long.
Data: < X ` > 58 B4 60 0A FF 7F 00 00 00 00 00 00 00 00 00 00
{986} normal block at 0x000001BA6F3ED6A0, 56 bytes long.
Data: < >o > o > 20 D7 3E 6F BA 01 00 00 20 D7 3E 6F BA 01 00 00
Object dump complete.
So, do you know what I forgot?
Thanks in advance for your help ;)
I copied your code and ran it in a Linux environment with the -fsanitize=address flag set, and didn't get any memory leak error.
Here is a dirty workaround that removes most of the memory leaks.
You have to redeclare the internal class IsolateAllocator so that you can call the private function FreeProcessWidePtrComprCageForTesting().
A first way is to add a new function Free() that calls the private function:
namespace v8::internal
{
    class IsolateAllocator
    {
    public:
        static void Free() { FreeProcessWidePtrComprCageForTesting(); }
    private:
        static void FreeProcessWidePtrComprCageForTesting();
    };
} // namespace v8::internal
Or, without adding a function to IsolateAllocator, you can use a friend class, as is done in the V8 unit tests (I just renamed the friend class):
namespace v8::internal
{
    class IsolateAllocator
    {
    private:
        friend class SequentialUnmapperTest;
        static void FreeProcessWidePtrComprCageForTesting();
    };

    class SequentialUnmapperTest
    {
    public:
        static void Free() { IsolateAllocator::FreeProcessWidePtrComprCageForTesting(); }
    };
} // namespace v8::internal
Call the Free() function before v8::V8::Dispose():
pIsolate->Dispose();
v8::internal::IsolateAllocator::Free();
v8::V8::Dispose();
v8::V8::ShutdownPlatform();
delete create_params.array_buffer_allocator;
There are still a few memory leaks: one due to calling v8::V8::InitializeICUDefaultLocation() (which does some allocation when calling udata_setCommonData()), and another due to base::LazyInstance<std::weak_ptr<CodeRange>>::type process_wide_code_range_ in code-range.cc, which stores a std::weak_ptr<> that is never destroyed...

Does the CLR have a fat or small exception frame?

How do I detect whether IMAGE_COR_ILMETHOD_SECT_EH uses the Small or the Fat layout?
I'm also interested in other internal CLR structure/opcode details. The answer below addresses this and many other questions.
/* RVA:0 */
typedef union IMAGE_COR_ILMETHOD {
    IMAGE_COR_ILMETHOD_TINY Tiny;
    IMAGE_COR_ILMETHOD_FAT Fat;
} IMAGE_COR_ILMETHOD;

/* PC = RVA + sizeof(IMAGE_COR_ILMETHOD) = 12 or 4 bytes */ ... Code

/* EH = PC + CodeSize */
typedef union IMAGE_COR_ILMETHOD_SECT_EH {
    IMAGE_COR_ILMETHOD_SECT_EH_SMALL Small;
    IMAGE_COR_ILMETHOD_SECT_EH_FAT Fat;
} IMAGE_COR_ILMETHOD_SECT_EH;
https://github.com/dotnet/coreclr/blob/master/src/inc/corhdr.h
For example:
public static void Main(string[] args)
{
    int i = 0;
    try
    {
        Console.Write("OK");
    }
    catch (Exception)
    {
        i++;
    }
}
0000 4D 5A 90 00 MZ-header
0250 2A 02 17 8C 06 00 00 01 51 2a 00
RVA: 1B 30 02 00 // IMAGE_COR_ILMETHOD_FAT
1D 00 00 00 CodeSize= 29
01 00 00 11 Locals = 11000001
PC0: 00 16 0A i=0
PC3 00 72 01 00 00 70 try{
28 04 00 00 0A call Console.Write
00 00 DE 09
PC12:26 00 06 17 58 0A 00 DE 00 00 2A (2A is ret command)
00 00 00 01 10 00 // IMAGE_COR_ILMETHOD_SECT_EH ??? 1=count
00 00 CorExceptionFlag Flags
03 00 TryOffset
0F TryLength
12 00 HandlerOffset
09 HandlerLength
08 00 00 01 ClassToken
In this case we have a small EH frame. How do I detect whether we have a small or a fat frame?
typedef struct IMAGE_COR_ILMETHOD_SECT_EH_CLAUSE_SMALL {
    CorExceptionFlag Flags : 16;
    unsigned TryOffset : 16;
    unsigned TryLength : 8;      // relative to start of try block
    unsigned HandlerOffset : 16;
    unsigned HandlerLength : 8;  // relative to start of handler
    union {
        DWORD ClassToken;
        DWORD FilterOffset;
    };
} IMAGE_COR_ILMETHOD_SECT_EH_CLAUSE_SMALL;
typedef struct IMAGE_COR_ILMETHOD_SECT_EH_CLAUSE_FAT {
    CorExceptionFlag Flags;
    DWORD TryOffset;
    DWORD TryLength;      // relative to start of try block
    DWORD HandlerOffset;
    DWORD HandlerLength;  // relative to start of handler
    union {
        DWORD ClassToken;   // used for type-based exception handlers
        DWORD FilterOffset; // used for filter-based exception handlers (COR_ILEXCEPTION_FILTER is set)
    };
} IMAGE_COR_ILMETHOD_SECT_EH_CLAUSE_FAT;
This is covered in partition II section 25.4.5 of ECMA-335.
If the CorILMethod_Sect_FatFormat bit (0x40) is set in the Kind field (the first byte of the structure), then you should use Fat; otherwise use Small. The Kind field can be accessed via Small.SectSmall.Kind or Fat.SectFat.Kind; either will work.
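The check can be sketched as plain code. The value 0x40 is CorILMethod_Sect_FatFormat from corhdr.h, redefined locally here so the snippet is self-contained:

```cpp
#include <cstdint>

// CorILMethod_Sect_FatFormat flag value, as defined in corhdr.h.
constexpr uint8_t kSectFatFormat = 0x40;

// Decide the EH section layout from a pointer to its first byte.
// The first byte is the Kind field in both the small and fat layouts,
// so it is safe to read it before knowing which variant we have.
bool IsFatEHSection(const uint8_t* section)
{
    return (section[0] & kSectFatFormat) != 0;
}
```

In the question's dump, the section starts with 01 (EHTable without the fat bit), so the small layout applies.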

C++ display array of bytes in hex

I've made a DLL hook that sends me the address of a byte array every time the event occurs in the main app.
Display code:
void receivedPacket(char *packet)
{
    short header = packet[2];
    TCHAR str[255];
    _stprintf(str, _T("Header : %lX\n"), header); // This works fine; it returns 0x38 as it should.
    WriteConsole(GetStdHandle(STD_OUTPUT_HANDLE), str, strlen(str), 0, 0);
    WriteConsole(GetStdHandle(STD_OUTPUT_HANDLE), packet, 34, 0, 0);
}
This is an example of the byte array at that address, and this is how I want it to be displayed:
05 01 38 00 60 00 9D 01 00 00 00 00 00 70 C9 7D 0E 00 00 00 00 00 00 00 FF 20 79 40 00 00 00 00
But with my current code, it only displays weird symbols. So how can I convert all those bytes to hex?
I'm fairly new to C++.
Format each byte of the packet individually. Note that each byte must be read as unsigned, and the length must be passed explicitly, because the packet may contain zero bytes:
static TCHAR const alphabet[] = _T("0123456789ABCDEF");
for (int i = 0; i < 34; ++i)
{
    unsigned char b = (unsigned char)packet[i];
    TCHAR const s[3] = { alphabet[b / 16], alphabet[b % 16], _T(' ') };
    WriteConsole(GetStdHandle(STD_OUTPUT_HANDLE), s, 3, 0, 0);
}
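As a portable alternative, you could build the whole hex string first and write it out with a single call. This is a sketch; ToHex is a hypothetical helper, not part of any API:

```cpp
#include <cstdio>
#include <string>

// Format a byte buffer as a space-separated uppercase hex string,
// e.g. {0x05, 0x01, 0x38} -> "05 01 38".
std::string ToHex(const unsigned char* data, size_t len)
{
    std::string out;
    char buf[3];
    for (size_t i = 0; i < len; ++i) {
        // "%02X" zero-pads each byte to two uppercase hex digits
        std::snprintf(buf, sizeof(buf), "%02X", data[i]);
        if (i) out += ' ';
        out += buf;
    }
    return out;
}
```

The resulting string can then be passed to WriteConsole (or printf) in one call instead of one call per byte.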