How to use LocalAlloc and LocalReAlloc correctly - C++

I am learning how to use LocalAlloc and LocalReAlloc from the Win32 API. I have written the following code, but it gives me an exception, and I don't know what is wrong with it.
#include <Windows.h>
#include <iostream>
namespace code
{
namespace memory
{
void allocation()
{
char* string = reinterpret_cast<char*>(LocalAlloc(LPTR, 6 + 1));
CopyMemory(string, "WINAPI", 6);
std::printf("%s\n", string);
string = reinterpret_cast<char*>(LocalReAlloc(string, 6 + 13 + 1, LMEM_MOVEABLE));
CopyMemory(string + 6, "IS THE BEST", 13);
std::printf("%s\n", string);
delete string;
}
}
}
int main(int argc, char* argv[])
{
code::memory::allocation();
return 0;
}
When I compile the above program, it doesn't give me any errors, but when I run it, it throws an exception. The following message is from the exception:
---------------------------
Microsoft Visual C++ Runtime Library
---------------------------
Debug Assertion Failed!
Program: ...Windows\00 Windows API Programming\Debug\52_DynamicMemory.exe
File: minkernel\crts\ucrt\src\appcrt\heap\debug_heap.cpp
Line: 904
Expression: _CrtIsValidHeapPointer(block)
For information on how your program can cause an assertion
failure, see the Visual C++ documentation on asserts.
(Press Retry to debug the application)
---------------------------
Abort Retry Ignore
---------------------------

There are several issues with your code.
A complete lack of error handling.
If LocalReAlloc() fails, you are leaking the memory allocated by LocalAlloc().
The second CopyMemory() copies 13 bytes, but the string literal "IS THE BEST" is only 12 bytes including its null terminator, so the copy reads past the end of the literal. And even if the copy were correct, you are not ensuring the reallocated memory is null-terminated for the following printf().
You are not freeing the allocated memory correctly. You must use LocalFree(), not delete.
Try this instead:
#include <Windows.h>
#include <cstdio>
#include <cstring>
namespace code
{
namespace memory
{
void allocation()
{
char* string = static_cast<char*>(LocalAlloc(LMEM_FIXED, 6 + 1));
if (!string) return;
std::strcpy(string, "WINAPI");
std::printf("%s\n", string);
char* newstring = static_cast<char*>(LocalReAlloc(string, 6 + 12 + 1, LMEM_MOVEABLE));
if (!newstring) { LocalFree(string); return; }
string = newstring;
std::strcpy(string + 6, " IS THE BEST");
std::printf("%s\n", string);
LocalFree(string);
}
}
}
int main()
{
code::memory::allocation();
return 0;
}

LocalAlloc allocates the specified number of bytes from the heap. If it fails, the return value is NULL; call GetLastError for extended error information.
If LocalAlloc succeeds, it allocates at least the amount requested. To free the allocated memory, call LocalFree.
LocalReAlloc reallocates memory that was allocated by either LocalAlloc or LocalReAlloc. Note that if LocalReAlloc fails, the original memory is not freed, and the original handle and pointer are still valid.

Related

fopen returning NULL in gdb

I'm trying to solve a binary exploitation problem from picoCTF, but I'm having trouble with gdb.
Here is the source code of the problem (I've commented some stuff to help me).
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <sys/types.h>
#include <wchar.h>
#include <locale.h>
#define BUF_SIZE 32
#define FLAG_LEN 64
#define KEY_LEN 4
void display_flag() {
char buf[FLAG_LEN];
FILE *f = fopen("flag.txt","r");
if (f == NULL) {
printf("'flag.txt' missing in the current directory!\n");
exit(0);
}
fgets(buf,FLAG_LEN,f);
puts(buf);
fflush(stdout);
}
// loads value into key, global variables ie not on stack
char key[KEY_LEN];
void read_canary() {
FILE *f = fopen("/problems/canary_3_257a2a2061c96a7fb8326dbbc04d0328/canary.txt","r");
if (f == NULL) {
printf("[ERROR]: Trying to Read Canary\n");
exit(0);
}
fread(key,sizeof(char),KEY_LEN,f);
fclose(f);
}
void vuln(){
char canary[KEY_LEN];
char buf[BUF_SIZE];
char user_len[BUF_SIZE];
int count;
int x = 0;
memcpy(canary,key,KEY_LEN); // copies "key" to canary, an array on the stack
printf("Please enter the length of the entry:\n> ");
while (x<BUF_SIZE) {
read(0,user_len+x,1);
if (user_len[x]=='\n') break;
x++;
}
sscanf(user_len,"%d",&count); // gives count the value of the len of user_len
printf("Input> ");
read(0,buf,count); // reads count bytes to buf from stdin
// compares canary (variable on stack) to key
// if overwriting need to get the value of key and maintain it, i assume its constant
if (memcmp(canary,key,KEY_LEN)) {
printf("*** Stack Smashing Detected *** : Canary Value Corrupt!\n");
exit(-1);
}
printf("Ok... Now Where's the Flag?\n");
fflush(stdout);
}
int main(int argc, char **argv){
setvbuf(stdout, NULL, _IONBF, 0);
int i;
gid_t gid = getegid();
setresgid(gid, gid, gid);
read_canary();
vuln();
return 0;
}
When I run this normally, with ./vuln, I get normal execution. But when I open it in gdb with gdb ./vuln and then run it with run, I get the [ERROR]: Trying to Read Canary message. Is this something that is intended to make the problem challenging? I don't want the solution, I just don't know if this is intended behaviour or a bug. Thanks
I don't want the solution, I just don't know if this is intended behaviour or a bug.
I am not sure whether you'll consider it intended behavior, but it's definitely not a bug.
Your ./vuln is a set-gid program. As such, it runs as group canary_3 when run outside of GDB, but as your own group when run under GDB (for obvious security reasons).
We can assume that the canary_3 group has read permissions on the canary.txt, but you don't.
P.S. If you printed strerror(errno) (as the comments suggested), the resulting Permission denied would have made the failure obvious.

How to run machine code as a function in C++

system: Windows 10
compiler: MinGW
error: Segmentation fault
I'm trying to run machine code as a function in C++. Here is my code:
#include <iostream>
int main()
{
int(*fun_ptr)(void) = ((int(*)())("\xB8\x0C\x00\x00\x00\xC3"));
std::cout << fun_ptr();
return 0;
}
In online compilers like ideone.com the program successfully prints 12 and exits. On my computer I receive a "Segmentation fault" error. Can anyone help me?
A string literal such as "\xB8\x0C\x00\x00\x00\xC3" is an object of static storage duration [lex.string]/15. A compiler will typically place such string literal objects in the .rdata section of your binary, i.e., into read-only, non-executable memory. As a consequence, trying to execute the bytes of a string literal will result in an access violation. If you want to execute machine code bytes contained in a global array object, you have to make sure your object is allocated in a section that is executable. For example (targeting Windows with Visual C++):
#include <iostream>
#pragma section("runstuff", read, execute)
__declspec(allocate("runstuff"))
const unsigned char code[] = {
0xB8, 0x0C, 0x0, 0x0, 0x0, 0xC3
};
int main()
{
auto fun_ptr = reinterpret_cast<int(*)()>(&code[0]);
std::cout << fun_ptr();
return 0;
}
Note that stuff like that is inherently not portable and has implementation-defined behavior at best. If you know at build time what machine code you want to run, consider using an assembler and just linking the resulting object file to your executable. If you want to dynamically generate machine code on Windows, you will have to allocate executable memory. To do so, either create a large-enough array in executable (and also writeable) memory (e.g., analogously to my example above) into which you can place your code, or dynamically allocate executable memory, e.g. using VirtualAlloc or using HeapAlloc from a Heap with the executable flag set. You will also want to be aware of the FlushInstructionCache API…
You can do that by using an inline assembler:
#include <iostream>
int code() {
// mov eax, 12 -- the result is left in EAX, which holds the return
// value in the calling convention (there is no explicit return
// statement, so this relies on compiler-specific behaviour)
__asm (
".byte 0xB8, 0x0C, 0x00, 0x00, 0x00"
);
}
int main() {
std::cout << code() << std::endl;
return 0;
}
I found a method:
#include <iostream>
#include <windows.h>
using namespace std;
int main() {
unsigned char bytes[] = "\xB8\x0C\x00\x00\x00\xC3";
// Pagefile-backed mapping with execute permission
HANDLE mem_handle = CreateFileMappingA(INVALID_HANDLE_VALUE, NULL, PAGE_EXECUTE_READWRITE, 0, sizeof(bytes), NULL);
if (mem_handle == NULL) return 1;
void *mem_map = MapViewOfFile(mem_handle, FILE_MAP_ALL_ACCESS | FILE_MAP_EXECUTE, 0x0, 0x0, sizeof(bytes));
if (mem_map == NULL) { CloseHandle(mem_handle); return 1; }
memcpy(mem_map, bytes, sizeof(bytes));
int result = ((int (*)(void))mem_map)();
cout << "result:\n" << result << '\n';
UnmapViewOfFile(mem_map);
CloseHandle(mem_handle);
return 0;
}

Why does this code not cause a buffer overflow?

I want to write code that causes a buffer overflow. Could you give me some advice on why my test code below does not cause a buffer overflow?
I think a buffer overflow should occur on this line:
#include <cstdio>
#include <string>
#include <cstring>
using namespace std;
#define buffer_size 10
int main(){
string m_string = "My name is String";
char* node_reference = new char[buffer_size];
unsigned int len = strlen(m_string.c_str());
unsigned int buffer_len = sizeof(node_reference)/sizeof(char);
std::printf("len: %d\n", len);
std::printf("buffer len: %d\n", buffer_len);
for (unsigned int i = 0; i < strlen(m_string.c_str()) + 1; i++) {
node_reference[i] = m_string[i];
std::printf("index: %d and value: %c\n", i, node_reference[i]);
}
return 0;
}
EDIT
I removed the typo de_referece line. Thank you for your kind answers to my insufficient question.
sizeof(node_reference) does not do what you think it does, since node_reference is allocated dynamically. You are obtaining the size of the pointer itself, not the size of the allocated buffer being pointed at. Your printf() of buffer_len should have indicated that to you. sizeof would yield the buffer size only if node_reference had been declared as an actual array (e.g. char node_reference[buffer_size]) instead of a pointer.
That being said, your code does cause a buffer overflow. You are allocating 10 bytes for node_reference and then writing 18 bytes into it. The behavior is undefined, so anything can happen. A crash is not guaranteed, if that is what you are expecting. You are certainly corrupting memory, though it may not be memory you would normally see since it is outside of your allocation.

Getting system env in c++ from windows

I am trying to fetch some system environment variables on Windows from my C++ application. I tried getenv and GetEnvironmentVariable, but both get stuck. The program compiles, but when I run it I see a blinking cursor for some time, nothing is displayed, and then the program crashes with the message:
RUN FAILED (exit value -1 073 741 819, total time: 10s)
I tried a lot of examples from the net and all of them give the same result. Some examples I tried:
char l_strSingleVal[20];
GetEnvironmentVariable("PATH", l_strSingleVal,20);
printf("VariableName: %s\n",l_strSingleVal);
or:
std::string string_variable;
const std::string MY_VAR = "PATH";
char const* temp = std::getenv(MY_VAR.c_str());
if(temp != NULL)
{
string_variable = std::string(temp);
}
You have undefined behaviour:
From GetEnvironmentVariable spec (l_strSingleVal is equivalent to lpBuffer): http://msdn.microsoft.com/en-us/library/windows/desktop/ms683188(v=vs.85).aspx
If lpBuffer is not large enough to hold the data, the return value is
the buffer size, in characters, required to hold the string and its
terminating null character and the contents of lpBuffer are undefined.
Accessing lpBuffer in your case is UB. A 20-character buffer for PATH is way too small. You need to check the return value of GetEnvironmentVariable (which in your case will tell you the size of the buffer required for a successful invocation).
Have you tried the C standard library function getenv? It works for me on my Windows PC.
Example:
#include <stdlib.h>
#include <stdio.h>
int main()
{
char * src = getenv("PATH");
if (src)
printf("value of PATH is: %s", src);
else
printf("empty or not defined");
return 0;
}

Memory leak through new[] without ever calling new

I'm getting a memory leak from the following function:
int ReadWrite(int socket, char *readfile) {
FILE *rf = NULL;
rf = fopen(readfile, "rb");
fseek(rf, 0, SEEK_END);
int len = ftell(rf);
rewind(rf);
char readbuf[len + 1];
int res = fread(readbuf, len, 1, rf);
readbuf[len + 1] = '\0';
fclose(rf);
while (1) {
int wres = write(socket, readbuf, res);
if (wres == 0) {
cerr << "socket closed prematurely" << endl;
close(socket);
return EXIT_FAILURE;
}
if (res == -1) {
if (errno == EINTR)
continue;
cerr << "socket write failure: " << strerror(errno) << endl;
close(socket);
return EXIT_FAILURE;
}
break;
}
return EXIT_SUCCESS;
}
Valgrind tells me I leak the number of bytes that are in readfile (the actual file, not the name of readfile) through this operation:
Address 0x4c3b67e is 0 bytes after a block of size 14 alloc'd
at 0x4A07C84: operator new[](unsigned long) (vg_replace_malloc.c:363)
What's confusing me is I don't ever use new[] in my code. I checked fopen, ftell, and fread to see if they have hidden "gotcha's" where they call new[] somewhere but didn't find anything in the documentation on cplusplus.com. I've tried all different combinations of new char[]/delete[], malloc/free, and stack-allocated variables (above) but I get the same valgrind message every time. Any ideas? Thanks.
You call
char readbuf[len + 1];
and then later
readbuf[len + 1] = '\0';
The valid indices of that array are 0 through len, so wouldn't that write overflow the array? It should be readbuf[len] = '\0';
Well, you are declaring your readbuf array with a non-constant size (i.e. with a run-time size). This is formally illegal in C++. Such a feature exists in C99, but not in C++. Your code will not even compile on a pedantic C++ compiler. And your question is tagged [C++].
But it is quite possible that your compiler implements this feature as a non-standard extension, and that it creates such arrays through an implicit call to new[]. This is why you get the error message that refers to new[], even though you are not using new[] explicitly.
Of course, it is the compiler's responsibility to deallocate such arrays when they end their lifetime. I suspect that the compiler does all it has to do, but valgrind is confused by something in the compiler's actions, which makes it conclude that there is a memory leak.
Moreover, as others already noted, you are making out of bounds access to your array, which can also lead to absolutely any problems at run-time, including the strange report from valgrind.
I found out the problem actually had to do with the Makefile I was using. Thanks for the insight on my slip-up with the char[] bounds though!