Drawing Text with MFC CDC - C++

I cannot set text alignment correctly. For instance, if I do this, then bottom alignment gets lost
memDC.SetTextAlign(TA_BOTTOM);
memDC.SetTextAlign(TA_RIGHT);
memDC.TextOutW(textRect.left, textRect.top, _T("HELLo"));
And if I do this, then right alignment gets lost.
memDC.SetTextAlign(TA_RIGHT);
memDC.SetTextAlign(TA_BOTTOM);
memDC.TextOutW(textRect.left, textRect.top, _T("HELLo"));
There does not seem to be a way to keep both alignments. Any suggestions for fixing this?

They're bit flags; each call to SetTextAlign replaces the previous value, so combine them with bitwise OR:
memDC.SetTextAlign(TA_RIGHT | TA_BOTTOM);
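One detail worth noting (my reading of the TA_* flags, not something stated in the question): the alignment also determines which point TextOutW treats as the reference, so with bottom-right alignment you would normally pass the rectangle's bottom-right corner:
memDC.SetTextAlign(TA_RIGHT | TA_BOTTOM);
// the point passed in is now the bottom-right corner of the rendered text
memDC.TextOutW(textRect.right, textRect.bottom, _T("HELLo"));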


C++ Memory Writing for the PS3

I'm learning how to hack PS3 games, where I need to edit what is stored at certain memory addresses. Here is an example of what I see people do to achieve this:
*(char*)0x1786418 = 0x40;
This line of code turns on super speed for COD Black Ops II.
I'm not 100% sure what is going on here. I know that 0x1786418 is the address and 0x40 sets the value at that address. But I'm not so sure what *(char*) does, or how 0x40 turns on super speed.
An explanation of this syntactically and how it turns on super speed would be much appreciated.
Thanks!
You should consider understanding the basics of the programming language before you try to go into reverse engineering. That's definitely an advanced topic that you don't want to use as a way to get started; it'll make things unnecessarily difficult for you.
I'm not 100% sure what is going on here. I know that 0x1786418 is the address and 0x40 sets the value at that address.
This is as much as anyone here might be able to tell you, unless the person who reverse-engineered the software shows up here and explains it.
But I'm not so sure what *(char*) does
This is a way to take the address and interpret it as a pointer to a byte (a char in C is 1 byte of memory). The outer * then dereferences that pointer so the value it refers to can be modified; in this case, it is set to 0x40.
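If it helps, here is that one-liner broken into steps (a sketch; the address is only meaningful inside the hacked game's process, and uintptr_t comes from <cstdint>):
uintptr_t address = 0x1786418;  // the raw address as a plain integer
char* p = (char*)address;       // reinterpret that integer as a pointer to one byte
*p = 0x40;                      // dereference it and store the byte 0x40 there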
and how does 0x40 turn on super speed?
This is very specific to the game itself. Someone must've figured out where data about player movement speed is stored in memory (specifically for the PS3) and is updating it this way.
Something like this could easily be broken by a simple patch, because code changes can make certain things end up at different addresses, requiring additional reverse-engineering effort.
If anyone is seeing this and wants to know how to set prestiges or enable red boxes or whatnot, I'll explain how (MW3 will be my example).
For a prestige it would be like:
*(char*)0x01C1947C = 20;
That would set prestige 20. If you don't understand the 20: you can write either decimal or hex, so 20 and 0x14 both equal prestige 20; say you want prestige 12, you could do 12 or 0xC. If you don't know the hex, just search for the prestige you want in hex. :)
Now for stuff like red boxes (assuming you know about bools / if statements and voids; I'm not going to cover them, only how you would set it). To enable red boxes you would do (p.s. bytesOn can be called anything):
char bytesOn[] = { 0x60, 0x00, 0x00, 0x00 };
write_process(0x65D14, bytesOn, sizeof(bytesOn));
whateverYourBoolIsCalled = true;
(Those four bytes, 0x60 0x00 0x00 0x00, happen to be the PowerPC encoding of a no-op, so a patch like this typically overwrites an instruction to disable a check.) Turning it off works the same, except you have to get the other bytes. :)
I'll add one more example: if you want to set a name in an SPRX:
char name[] = { "name here" };
write_process(0x01BBBC2C, name, sizeof(name));
There are shorter ways of doing this, but I think doing it this way is the best way to understand it. :)
So yeah, this has been my tut. :)
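For what it's worth, write_process is not a standard API; it's a helper from whatever SDK you're using. On a jailbroken console it usually boils down to copying bytes into the game's address space, roughly like this hypothetical sketch:
#include <cstdint>
#include <cstring>
// Hypothetical stand-in for the write_process helper used above: copies
// `size` bytes to `address` inside the current (already hooked) process.
void write_process(uintptr_t address, const void* bytes, size_t size)
{
    std::memcpy(reinterpret_cast<void*>(address), bytes, size);
}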

Failure at displaying Bitmaps in wxWidgets

My application has two Pictures embedded in the Frame. My code is as follows:
wxMemoryInputStream istream1(Bild_png, sizeof Bild_png);
wxImage Bild_png(istream1, wxBITMAP_TYPE_PNG);
new wxStaticBitmap(p_img, wxID_ANY, wxBitmap(Bild_png));
vbox->Add(p_img ,0);
(vbox is the Sizer)
When I start the app, there is a "T-" in the upper-left corner of both bitmaps. When I change the notebook item ("screen") and get back to the first screen (where the bitmaps are), the "T-" has disappeared...
How can I fix it so that I never see the glitch?
I had to call Layout() on the topmost sizer. That solved my problem. It means, at the end:
vbox->Layout();
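Put together with the snippet from the question, that gives roughly the following (a sketch; I've renamed the wxImage so it no longer shadows the Bild_png data array):
wxMemoryInputStream istream1(Bild_png, sizeof Bild_png);  // Bild_png is the raw PNG byte array
wxImage img(istream1, wxBITMAP_TYPE_PNG);
new wxStaticBitmap(p_img, wxID_ANY, wxBitmap(img));       // p_img is the parent window
vbox->Add(p_img, 0);
vbox->Layout();  // re-run the sizer so the bitmap paints correctly on first show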
@catalin, I don't think that posting around 2000 lines of source code would have been better. I chose this little snippet because it says everything that's needed. An expert with wxWidgets was able to give me, with these four lines, the hint that something was wrong with the sizer, not with the pic.
Haven't you been advised before to search the samples? For example the widgets sample; just grep for wxStaticBitmap and I'm sure you'll find something useful.
This is just a poor way of asking a question.
In your C++ snippet you're using Bild_png even before it is declared. Really? Then you mention both Bitmaps and notebookitem("screen"), which are just unknown items to anyone else but you.
IMO it is just too... wrong to receive a good answer...

Write information into empty window with c++

So I'm building this game engine thingy, and found it VERY hard to create some kind of overlay with debug information in the main game window with D3D11, or to draw text at all, so I thought I'd create another window to contain my debug data.
I got the window created fine and all, but I have no idea how to write my debug info into it. I do not want to use the Windows Forms designer, as that would convert my project into a CLR project, which I do not want.
I have been googling for at least 3 hours now (honest) and tried various solutions, but none of them really seemed practical to use, or they simply did not work.
The debug info I'd like to write originates from global float values. An example would be CAM_POS_X, which holds a floating-point value indicating which X coordinate the camera is currently at.
Something like this is desired:
|SiriusAlpha 0.1 Debug window_ |
|Current X position: CAM_POS_X|
|Current Y position: CAM_POS_Y|
|Current Z position: CAM_POS_Z|
|Current YAW: CAM_YAW______|
|Current PITCH: CAM_PITCH___|
|Current FPS: CUR_FPS_______|
These values are not all necessarily floating-point variables. They could be strings, doubles, integers, or even booleans.
If anyone would be willing to explain how to do this in D3D11, so I could skip the whole debug-window shenanigans, I'd be even happier.
Otherwise, I'd be delighted if somebody could explain to me how this is done.
Have you tried TextOut()? Read the article on MSDN. You should already have a device context; the rest is quick and easy.
The TextOut function writes a character string at the specified location, using the currently selected font, background color, and text color.
Printing to a string is trivial.
wchar_t buf[128];
swprintf(buf, 128, L"Current X position: %f", CAM_POS_X);
TextOut(yourDC, screenXPos, screenYPos, buf, (int)wcslen(buf));
I haven't tested this, but from the MSDN documentation this should work fine.
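In a plain Win32 window, the natural place for those calls is the WM_PAINT handler. A minimal sketch, assuming a global float CAM_POS_X and a standard window procedure:
case WM_PAINT:
{
    PAINTSTRUCT ps;
    HDC hdc = BeginPaint(hWnd, &ps);
    wchar_t buf[128];
    swprintf(buf, 128, L"Current X position: %f", CAM_POS_X);
    TextOutW(hdc, 10, 10, buf, (int)wcslen(buf));  // draw at client coords (10, 10)
    EndPaint(hWnd, &ps);
    return 0;
}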

Passing D3DFMT_UNKNOWN into IDirect3DDevice9::CreateTexture()

I'm kind of wondering about this, if you create a texture in memory in DirectX with the CreateTexture function:
HRESULT CreateTexture(
    UINT                Width,
    UINT                Height,
    UINT                Levels,
    DWORD               Usage,
    D3DFORMAT           Format,
    D3DPOOL             Pool,
    IDirect3DTexture9** ppTexture,
    HANDLE*             pSharedHandle
);
...and pass in D3DFMT_UNKNOWN as the format, what is supposed to happen, exactly? If I try to get the surface of the first or second level, will it cause an error? Can it fail? Will the graphics device just choose a format of its own? Could this cause problems between different graphics card models/brands?
I just tried it out, and it mostly does not fail.
When Usage is set to D3DUSAGE_RENDERTARGET or D3DUSAGE_DYNAMIC, the format consistently came out as D3DFMT_A8R8G8B8, no matter what I did to the back-buffer format or other settings. I don't know whether that depends on my graphics card. My guess is that specifying unknown means "pick for me", and that the 32-bit format is the easiest for my card.
When the usage was D3DUSAGE_DEPTHSTENCIL, it failed consistently.
So my best conclusion is that specifying D3DFMT_UNKNOWN as the format gives DirectX the choice of what it should be. Or perhaps it always just defaults to D3DFMT_A8R8G8B8.
Sadly, I can't confirm any of this in any documentation anywhere. :|
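For anyone who wants to reproduce the experiment, this is roughly how you can probe it (a sketch assuming an initialized IDirect3DDevice9* device):
IDirect3DTexture9* tex = nullptr;
HRESULT hr = device->CreateTexture(256, 256, 1, D3DUSAGE_RENDERTARGET,
                                   D3DFMT_UNKNOWN, D3DPOOL_DEFAULT, &tex, nullptr);
if (SUCCEEDED(hr))
{
    D3DSURFACE_DESC desc;
    tex->GetLevelDesc(0, &desc);  // desc.Format shows what the runtime picked
    tex->Release();
}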
MSDN doesn't say. But I'm pretty sure you'd get "D3DERR_INVALIDCALL" as a result.
If the method succeeds, the return value is D3D_OK. If the method fails, the return value can be one of the following: D3DERR_INVALIDCALL, D3DERR_OUTOFVIDEOMEMORY, E_OUTOFMEMORY.
I think this falls into the "undefined" category. Some drivers will fail the allocations, while others may default to something. I've never seen anything in the WDK that says that this condition needs to be handled. I'm guessing if you enable the debug DX runtime you will see an error message.

Problem clearing Listview Header image on Vista

I'm having a problem on Vista with the Listview control, in particular setting custom icons on the header. Normally under XP or any of the previous version of Windows, if I added an icon (in C++), I could do so with the following:
HDITEM HeaderItem = {0};  // declared earlier in the real code
HeaderItem.mask = HDI_FORMAT | HDI_IMAGE;
Header_GetItem(HeaderHWND, Column, &HeaderItem);
TurnOn(HeaderItem.fmt, HDF_IMAGE);  // TurnOn is a helper that sets a bit in fmt
HeaderItem.iImage = Image;
if (Header_SetItem(HeaderHWND, Column, &HeaderItem) == 0)
    printf("Failed to set header [%d:%.8X]\n", GetLastError(), GetLastError());
and then to remove the image from a particular column, I could use the same process, but instead of turning on the HDF_IMAGE bit, you just turn it off.
On Vista, however, when I turn it off, the change doesn't actually seem to be accepted. So, for instance, when I start, my fmt is:
0x4000 (or basically HDF_STRING)
I turn on the icon, and it becomes:
0x5800 (or basically HDF_STRING | HDF_IMAGE | HDF_BITMAP_ON_RIGHT)
I then turn it off again, but the result is:
0x4800 (or basically HDF_STRING | HDF_IMAGE)
I've checked, and I am setting it to HDF_STRING only, but once HDF_IMAGE is set, it seems to be impossible to remove. Header_SetItem doesn't return any errors, so I'm at a loss as to what to do. I've also tried removing the image list from the control, but it still leaves the space as if there were still an image there.
At the end of the day I need to be able to add and remove icons from the header, and when they are removed I need all the header space available again (as it was before any were displayed). Any help would be greatly appreciated - thanks in advance!
If you read the documentation (http://msdn.microsoft.com/en-us/library/bb775247(VS.85).aspx): if you include HDI_IMAGE in the mask, then iImage must be a valid index; you have to set it to I_IMAGENONE in order to remove the image.
If you want to remove an image, you have to do something like this:
HDITEM HeaderItem = {0};
HeaderItem.mask = HDI_FORMAT | HDI_IMAGE;
Header_GetItem(HeaderHWND, Column, &HeaderItem);
HeaderItem.fmt &= ~(HDF_IMAGE | HDF_BITMAP_ON_RIGHT);  // clear the image bits
HeaderItem.iImage = I_IMAGENONE;                       // and point at no image
Header_SetItem(HeaderHWND, Column, &HeaderItem);
Ugh, I just figured it out: they've slightly changed the way the passed parameters work.
Before, I always set iImage to 0 when removing the HDF_IMAGE attribute, but it looks like now, if you perform a Set and your mask includes HDI_IMAGE, it will not remove the HDF_IMAGE bit, even though you explicitly clear it.
So, the solution is to make sure not to send anything image-related if you're trying to remove the image. Since I scoured the net and couldn't find anything about this, hopefully this post will help anyone else who has a similar problem.
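In code, the removal that works looks something like this (a sketch of what I described, using the same names as above):
HDITEM HeaderItem = {0};
HeaderItem.mask = HDI_FORMAT;  // note: no HDI_IMAGE in the mask this time
Header_GetItem(HeaderHWND, Column, &HeaderItem);
HeaderItem.fmt &= ~(HDF_IMAGE | HDF_BITMAP_ON_RIGHT);
Header_SetItem(HeaderHWND, Column, &HeaderItem);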