Copy char* to tchar* - c++

I am trying to copy a char* into a TCHAR* that is inside a struct.
typedef struct data {
    int count;
    TCHAR* msg;
    COLORREF colour;
} DATA, *PDATA; // DATA non-pointer, PDATA pointer
pdata = (PDATA)malloc(sizeof(pdata));
pdata->colour = RGB(255, 0, 0);
pdata->msg = (TCHAR*)malloc(sizeof(TCHAR) * 20);
// copy tchar to tchar in pdata->msg
// I have tried _tcscpy and _tcscpy_s but it is not working
SetWindowLongPtr(hwnd, 0, (LONG)pdata);
Then I am trying to get the data through
(PDATA)GetWindowLongPtr(hwnd, 0);
This is being called in a separate file and then used to show the text on the screen.
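A likely set of fixes, as a sketch rather than a confirmed solution: sizeof(pdata) is only the size of a pointer, so the malloc should use sizeof(DATA); _tcscpy_s takes the destination size in TCHARs; and the SetWindowLongPtr cast should go through LONG_PTR so it stays correct on 64-bit builds. This assumes the window class reserved sizeof(LONG_PTR) extra bytes via cbWndExtra:
PDATA pdata = (PDATA)malloc(sizeof(DATA)); // size of the struct, not of the pointer
pdata->count = 0;
pdata->colour = RGB(255, 0, 0);
pdata->msg = (TCHAR*)malloc(sizeof(TCHAR) * 20);
_tcscpy_s(pdata->msg, 20, TEXT("hello")); // buffer size is in TCHARs, not bytes
SetWindowLongPtr(hwnd, 0, (LONG_PTR)pdata);
// and later, in the other file:
PDATA p = (PDATA)GetWindowLongPtr(hwnd, 0);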

I cannot convert a GDI+ bitmap to base 64 in C++ (Microsoft Visual C++)

I cannot convert a GDI+ bitmap to base64 in C++. So far I have:
Downloaded and included this library: https://github.com/ReneNyffenegger/cpp-base64
Written the function that you can see below.
Called it, passing in a bitmap that I know has data. I know that the bitmap has data because I am using it in another function called directly after this one.
The problem is that the charPixels array is always full of zeros.
Secondly, can you please explain what the safe way of unlocking the bits is? I fear that if I just do it in the finally block and the bits aren't actually locked at that point, I will get an exception.
Stride is a positive number; it is 1280.
Also, I have access to 'finally' because it is an MS extension to C++.
[Note: Code is updated because I pasted the wrong code in by mistake]
std::string GdiBitmapToBase64(Gdiplus::Bitmap* gdiBitmap, int width, int height)
{
    unsigned char* charPixels = nullptr;
    try
    {
        Gdiplus::Rect rect = Gdiplus::Rect(0, 0, width, height);
        Gdiplus::BitmapData gdiBitmapData;
        gdiBitmap->LockBits(&rect, Gdiplus::ImageLockMode::ImageLockModeRead, PixelFormat32bppARGB, &gdiBitmapData);
        auto stride = gdiBitmapData.Stride;
        if (stride < 0) stride = -stride;
        charPixels = new unsigned char[height * stride];
        memset(charPixels, 0, height * stride);
        memcpy(charPixels, gdiBitmapData.Scan0, stride);
        std::string ret = base64_encode(charPixels, stride);
        gdiBitmap->UnlockBits(&gdiBitmapData);
        return ret;
    }
    finally
    {
        if (charPixels != nullptr)
        {
            delete[] charPixels;
        }
    }
}
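Two things are worth noting about this function: memcpy and base64_encode only process stride bytes, i.e. a single row of the height * stride buffer, and UnlockBits is only safe once LockBits has succeeded. A minimal sketch of both fixes, keeping the names above and assuming the positive stride reported in the question:
Gdiplus::Rect rect(0, 0, width, height);
Gdiplus::BitmapData gdiBitmapData;
// Unlock only if the lock actually succeeded - this addresses the "safe unlock" worry.
if (gdiBitmap->LockBits(&rect, Gdiplus::ImageLockModeRead, PixelFormat32bppARGB, &gdiBitmapData) == Gdiplus::Ok)
{
    const int stride = gdiBitmapData.Stride; // positive here (1280)
    std::vector<unsigned char> pixels(height * stride);
    memcpy(pixels.data(), gdiBitmapData.Scan0, pixels.size()); // whole image, not one row
    std::string encoded = base64_encode(pixels.data(), pixels.size());
    gdiBitmap->UnlockBits(&gdiBitmapData);
    return encoded;
}
return std::string();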
Here is the code which calls this method. This may help:
void CLIScumm::Wrapper::ScreenUpdated(const void* buf, int pitch, int x, int y, int w, int h, PalletteColor* color)
{
    const unsigned char* bufCounter = static_cast<const unsigned char*>(buf);
    for (int heightCounter = 0; heightCounter < h; heightCounter++, bufCounter = bufCounter + pitch)
    {
        for (int widthCounter = 0; widthCounter < w; widthCounter++)
        {
            PalletteColor currentColor = *(color + *(bufCounter + widthCounter));
            gdiBitmap->SetPixel(x + widthCounter, y + heightCounter, Gdiplus::Color(currentColor.r, currentColor.g, currentColor.b));
        }
    }
    _screenUpdated->Invoke(gcnew System::String(GdiBitmapToBase64(gdiBitmap, DISPLAY_DEFAULT_WIDTH, DISPLAY_DEFAULT_HEIGHT).c_str()));
}
And the declarations:
namespace CLIScumm {
    public ref class Wrapper {
        ...
    private:
        ...
        Gdiplus::Graphics* gdiGraphics;
        Gdiplus::Bitmap* gdiBitmap;
        ...
    };
}
And the initialization:
void CLIScumm::Wrapper::init()
{
    if (!hasStarted)
    {
        try
        {
            if (!hasStarted)
            {
                ...
                Gdiplus::GdiplusStartupInput gdiplusStartupInput;
                Gdiplus::GdiplusStartup(&m_gdiplusToken, &gdiplusStartupInput, NULL);
                (malloc(sizeof(100000) * DISPLAY_DEFAULT_HEIGHT * DISPLAY_DEFAULT_WIDTH));
                gdiBitmap = new Gdiplus::Bitmap(DISPLAY_DEFAULT_WIDTH, DISPLAY_DEFAULT_HEIGHT, PixelFormat32bppARGB);
                gdiGraphics = new Gdiplus::Graphics(gdiBitmap);
                InitImage();
                ...
            }
        }
        ...
    }
}
Thank you all for your help, but I solved my own problem.
I was confusing bitmap pixel data with the actual bytes of a bitmap file, and Scan0 gives the former. The reason I was getting only zeros is that the first few frames were black.
I followed the example from C++ gdi::Bitmap to PNG Image in memory to get the actual bitmap bytes.
I modified the example function to:
bool SavePngMemory(Gdiplus::Bitmap* gdiBitmap, std::vector<BYTE>& data)
{
    //write to IStream
    IStream* istream = nullptr;
    CreateStreamOnHGlobal(NULL, TRUE, &istream);
    //note: {557cf400-...} is the BMP encoder CLSID; the PNG encoder would be {557cf406-...}
    CLSID clsid_bmp;
    CLSIDFromString(L"{557cf400-1a04-11d3-9a73-0000f81ef32e}", &clsid_bmp);
    Gdiplus::Status status = gdiBitmap->Save(istream, &clsid_bmp);
    if (status != Gdiplus::Status::Ok)
    {
        istream->Release(); //don't leak the stream on failure
        return false;
    }
    //get memory handle associated with istream
    HGLOBAL hg = NULL;
    GetHGlobalFromStream(istream, &hg);
    //copy IStream to buffer
    int bufsize = GlobalSize(hg);
    data.resize(bufsize);
    //lock & unlock memory
    LPVOID pimage = GlobalLock(hg);
    memcpy(&data[0], pimage, bufsize);
    GlobalUnlock(hg);
    istream->Release();
    return true;
}
I was then able to convert data to base64 by calling:
base64_encode(&data[0], data.size());
I can't vouch for the quality of the code, because I don't understand how everything works.

Populate a Virtual ListView with std::vector<std::string>

I have a vector of thousands of strings:
std::vector<std::wstring> a;
filled in by some algorithms.
Following the method described here, here is how I create a ListView as a "virtual list":
hList = CreateWindowEx(0, WC_LISTVIEW, L"", WS_CHILD | WS_VISIBLE | LVS_REPORT | LVS_OWNERDATA, 0, 0, 800, 400, hWnd, (HMENU)ID_LISTVIEW, hInst, NULL);
LV_COLUMN lvcol;
...
ListView_InsertColumn(hList, 0, &lvcol);
ListView_SetItemCountEx(hList, 100000, LVSICF_NOSCROLL);
...
// in the message loop
case WM_NOTIFY:
    pdi = (NMLVDISPINFO*) lParam;
    pi = pdi->item;
    switch (pdi->hdr.code)
    {
    case LVN_GETDISPINFO:
        {
            pi.mask = LVIF_TEXT;
            pi.pszText = a[pi.iItem]; // the nth item should be the nth string in the vector
        }
    }
I tried a lot of variations on:
pi.pszText = a[pi.iItem];
but they all failed with this kind of error:
Error C2440: '=' : cannot convert from 'std::basic_string<wchar_t,std::char_traits<wchar_t>,std::allocator<wchar_t>>' to 'LPWSTR'
What would be the right way to do this?
Note: in fact I would like to display on row n of the ListView the nth string of vector a concatenated with the number n, e.g. Blabla217 on row 217.
Note 2: even after Igor's suggestion (i.e. the cast pi.pszText = LPWSTR(a[pi.iItem].c_str());), the ListView is still empty instead of displaying the elements.
I'm not exactly sure about the problem you're facing, but one thing is for sure: you're passing a multi-byte string (std::string, using char) where a wide-char string (std::wstring, using WCHAR) is expected.
Here is a handy function that converts std::string to std::wstring.
inline std::wstring WideFromMulti(
    std::string const & multi,
    UINT codepage)
{
    int cchWide = MultiByteToWideChar(codepage, 0, multi.c_str(), -1, nullptr, 0);
    LPWSTR szWide = new wchar_t[cchWide];
    MultiByteToWideChar(codepage, 0, multi.c_str(), -1, szWide, cchWide);
    std::wstring wide(szWide);
    delete[] szWide;
    return wide;
}
inline std::wstring WideFromUtf8(
    std::string const & utf8)
{
    return WideFromMulti(utf8, CP_UTF8);
}
Then you can get LPCWSTR by c_str().
std::string test_str;
std::wstring test_wstr = WideFromUtf8(test_str);
LPCWSTR wszTest = test_wstr.c_str();
What about LPWSTR? Well, if you're sure that the string won't get modified, you can cast with const_cast<LPWSTR>(wszTest). If you're strongly against const_cast, you may create a temporary writable copy for the LPWSTR like this:
std::wstring test(L"Hello world");
LPCWSTR szTestConst = test.c_str();
int cchMax = ::lstrlenW(szTestConst) + 1;
std::vector<WCHAR> v(cchMax);
::lstrcpynW(&v[0], szTestConst, cchMax);
LPWSTR szTest = &v[0];
I don't really know why, but this solved it:
case WM_NOTIFY:
    pdi = (NMLVDISPINFO*) lParam;
    //pi = pdi->item;
    switch (pdi->hdr.code)
    {
    case LVN_GETDISPINFO:
        {
            //pi.mask = LVIF_TEXT;
            pdi->item.mask = LVIF_TEXT;
            //pi.pszText = a[pi.iItem];
            pdi->item.pszText = const_cast<LPWSTR>(a[pdi->item.iItem].c_str());
        }
    }
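The reason this works: LVN_GETDISPINFO expects the text to be written back through pdi->item, and assigning to the local copy pi changed nothing the control could see. A slightly more defensive variant (a sketch, not from the original post) copies into the buffer the control supplies via pszText/cchTextMax, which also makes the Blabla217 concatenation from the note above straightforward:
case LVN_GETDISPINFO:
{
    NMLVDISPINFO* pdi = (NMLVDISPINFO*)lParam;
    if (pdi->item.mask & LVIF_TEXT)
    {
        // e.g. row 217 shows "Blabla217"
        swprintf_s(pdi->item.pszText, pdi->item.cchTextMax, L"%s%d",
                   a[pdi->item.iItem].c_str(), pdi->item.iItem);
    }
    break;
}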

[MFC/C++] Sending CBitmap's bits over a socket, and re-constructing it on the receiver side

I am new to MFC and am trying to learn it with an MFC dialog-based project on VS2008. Here is what I have achieved so far:
First, I managed to display a list of pictures from a folder in a Listbox control. After that, I also handled the click event on each line of the listbox to load and show the picture in the Picture control (type Bitmap) on the right side. You can see the image below for easy understanding: Please click here for the image of my MFC dialog
Here is the code. Note m_ListCtrl and static_picture are variables of the listbox and the picture control:
void CMyClientDlg::OnLbnSelchangeList1(){
    CString imagePath;
    m_ListCtrl.GetText(m_ListCtrl.GetCurSel(), imagePath);
    CImage picture;
    picture.Load(imagePath);
    if (!picture.IsNull())
    {
        float screenWidth = 200, screenHeight = 200;
        float imageWidth = picture.GetWidth();
        float imageHeight = picture.GetHeight();
        //scaling:
        float pictureRatio = imageWidth / imageHeight;
        float newImageWidth;
        float newImageHeight;
        int aligmentX = 0;
        int aligmentY = 0;
        if (pictureRatio <= 1)
        {
            newImageWidth = imageWidth * (screenHeight / imageHeight);
            newImageHeight = screenHeight;
            aligmentX = (screenWidth - newImageWidth) / 2;
        }
        else
        {
            newImageWidth = screenWidth;
            newImageHeight = imageHeight * (screenWidth / imageWidth);
            aligmentY = (screenHeight - newImageHeight) / 2;
        }
        //end scaling.
        CDC *screenDC = GetDC();
        CDC mDC;
        mDC.CreateCompatibleDC(screenDC);
        CBitmap bitMap;
        bitMap.CreateCompatibleBitmap(screenDC, screenWidth, screenHeight);
        CBitmap *pob = mDC.SelectObject(&bitMap);
        mDC.SetStretchBltMode(HALFTONE);
        picture.StretchBlt(mDC.m_hDC, aligmentX, aligmentY, newImageWidth, newImageHeight, 0, 0, imageWidth, imageHeight, SRCCOPY);
        mDC.SelectObject(pob);
        /*.......code to convert bitmap to BYTE* ........*/
        /*.......code to send BYTE* over socket........*/
        //display the bit map
        static_picture.SetBitmap((HBITMAP)bitMap.Detach());
        //clean up
        ReleaseDC(screenDC);
    }
}
So now I would like to advance one more step and have tried to work with sockets... and yes, I successfully sent and received simple char* or CString data over a socket.
What I want to do is: instead of showing the picture on this dialog, show the image on the other dialog (the server).
Somehow I learned that there are two functions that sound like they would work: SetBitmapBits() and GetBitmapBits() (I honestly just read about them in some source and have no idea if they are suitable for my goal here).
So, I added this piece of code to turn the above bitmap into an array of BYTE, bmpBuffer:
BITMAP bmpProperties;
bitMap.GetBitmap(&bmpProperties);
int bmpDemension = bmpProperties.bmWidthBytes*bmpProperties.bmHeight;
BYTE* bmpBuffer=(BYTE*)GlobalAlloc(GPTR, bmpDemension);
bitMap.GetBitmapBits(bmpDemension,bmpBuffer);
Then send that array over socket:
UpdateData(TRUE);
char *socketBuffer = reinterpret_cast<char*>(bmpBuffer);
send(m_ClientSocket, socketBuffer, sizeof(socketBuffer), 0);
//clean up after send
GlobalFree((HGLOBAL)bmpBuffer);
On the other dialog (note: I have hardcoded the dimension of the bitmap to 160000, just to simplify the problem):
void CMyServer2Dlg::OnReceive(){
    char *socketBuffer = new char[1025];
    int iLen;
    iLen = recv(m_sConnected, socketBuffer, 1025, NULL);
    if (iLen == SOCKET_ERROR)
    {
        AfxMessageBox("Could not Receive");
    }
    else
    {
        BYTE* bmpBuffer = reinterpret_cast<BYTE*>(socketBuffer);
        //re-construct the bitmap
        CBitmap clone;
        CDC *screenDC = GetDC();
        CDC mDC;
        mDC.CreateCompatibleDC(screenDC);
        clone.CreateCompatibleBitmap(screenDC, 200, 200);
        clone.SetBitmapBits(160000, bmpBuffer);
        //Picture control(type bitmap) has variable "static_picture"
        static_picture.SetBitmap((HBITMAP)clone.Detach());
        UpdateData(FALSE);
        ReleaseDC(screenDC);
        GlobalFree((HGLOBAL)bmpBuffer);
    }
    delete socketBuffer;
}
And it just doesn't work... Please tell me where I messed it up. And sorry for the long post...
I think the most likely reason is that your receiver doesn't get all of the picture's data. I suggest you put the size of the bitmap into the packet when sending it, so the receiver knows the correct size.
Here is some sample code. Be aware it is just to show the idea; you may need some debugging to make sure it works.
Step 1: Pack the size of the bitmap. I assume here that the size fits in an int; if it may be bigger, you may want to use an INT64.
int bmpDemension = bmpProperties.bmWidthBytes*bmpProperties.bmHeight;
int bufferSize = bmpDemension + sizeof(int);
BYTE* bmpBuffer=(BYTE*)GlobalAlloc(GPTR, bufferSize );
bitMap.GetBitmapBits(bmpDemension,bmpBuffer + sizeof(int));
memcpy(bmpBuffer, &bmpDemension, sizeof(int)); // put the size into the head of package.
Step 2: Send it out.
Be aware that I use bufferSize here, because sizeof(bmpBuffer) returns the size of the pointer (4 on a 32-bit build), not the size of the allocation.
UpdateData(TRUE);
char *socketBuffer = reinterpret_cast<char*>(bmpBuffer);
send(m_ClientSocket, socketBuffer, bufferSize , 0);
//clean up after send
GlobalFree((HGLOBAL)bmpBuffer);
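One caveat the original answer doesn't cover: send() may also write fewer bytes than requested, so a send loop mirroring the receive loop below is the robust form. A sketch (sendAll is a hypothetical helper, not part of the original code):
// Hypothetical helper: loop until the whole buffer has been sent.
int sendAll(SOCKET s, const char* buf, int len)
{
    int total = 0;
    while (total < len)
    {
        int sent = send(s, buf + total, len - total, 0);
        if (sent == SOCKET_ERROR)
            return SOCKET_ERROR;
        total += sent;
    }
    return total;
}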
At the receiver side:
First you read the size of the bitmap, then receive according to the size of the data.
void CMyServer2Dlg::OnReceive(){
    char socketBuffer[1025];
    int iLen;
    iLen = recv(m_sConnected, socketBuffer, sizeof(int), NULL); //read the bitmap size
    if (iLen == SOCKET_ERROR)
    {
        AfxMessageBox("Could not Receive");
    }
    else
    {
        int dimension = *((int *) socketBuffer);
        char * bitmapBuffer = new char[dimension];
        int readSize = dimension;
        char * pBuffer = bitmapBuffer;
        while (readSize > 0)
        {
            int sizeToRead = readSize > sizeof(socketBuffer) ? sizeof(socketBuffer) : readSize;
            iLen = recv(m_sConnected, socketBuffer, sizeToRead, NULL);
            memcpy(pBuffer, socketBuffer, iLen);
            pBuffer += iLen;
            readSize -= iLen;
        }
        // when the loop is done, you shall have all the data in bitmapBuffer.
        ....
        // I leave the remaining code to you.
Again, this code is just to demo the idea.
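For completeness, one possible shape of that remaining step, reusing the reconstruction code from the question itself (a sketch only; dimension and bitmapBuffer come from the loop above):
// Rebuild the bitmap from the fully assembled buffer, as in the question's code.
CDC *screenDC = GetDC();
CDC mDC;
mDC.CreateCompatibleDC(screenDC);
CBitmap clone;
clone.CreateCompatibleBitmap(screenDC, 200, 200);
clone.SetBitmapBits(dimension, (BYTE*)bitmapBuffer);
static_picture.SetBitmap((HBITMAP)clone.Detach());
ReleaseDC(screenDC);
delete[] bitmapBuffer;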

NULL pointer, though it seems to be initialized

I get
Debug assertion failed.
p!=0
and it points to:
_NoAddRefReleaseOnCComPtr<T>* operator->() const throw()
{
    ATLASSERT(p!=NULL);
    return (_NoAddRefReleaseOnCComPtr<T>*)p;
}
in 'atlcomcli.h'
From what I understand it means I have forgotten to initialize a pointer somewhere, but all of them seem to be initialized.
When I use normal pointers instead of CComPtr, it throws 'Access Violation Reading Location' at font->DrawTextA in D3DFont::Draw in D3DFont.cpp.
//D3DFont.h:
#include <D3DX10.h>
#include <atlbase.h>
#include <string>
class D3DFont
{
public:
    D3DFont(void);
    ~D3DFont(void);
    bool Create(ID3D10Device *device, std::string name, int width,
        int height, int weight, int mipLevels, bool italic, BYTE charset,
        BYTE quality, BYTE pitchAndFamily);
    void Draw(LPD3DX10SPRITE sprite, std::string text, int charCount,
        LPRECT rect, UINT format, D3DXCOLOR color);
private:
    CComPtr<ID3DX10Font> font;
};
//D3DFont.cpp:
#include "D3DFont.h"
D3DFont::D3DFont(void){}
D3DFont::~D3DFont(void){}
bool D3DFont::Create( ID3D10Device *device, std::string name,
    int width, int height, int weight, int mipLevels, bool italic,
    BYTE charset, BYTE quality, BYTE pitchAndFamily )
{
    D3DX10_FONT_DESC fd;
    ZeroMemory(&fd, sizeof(D3DX10_FONT_DESC));
    fd.Height = height;
    fd.Width = width;
    fd.Weight = weight;
    fd.MipLevels = mipLevels;
    fd.Italic = italic;
    fd.CharSet = charset;
    fd.Quality = quality;
    fd.PitchAndFamily = pitchAndFamily;
    strcpy_s(fd.FaceName, name.c_str());
    // INITIALIZING FONT HERE
    D3DX10CreateFontIndirect(device, &fd, &font);
    return true;
}
void D3DFont::Draw( LPD3DX10SPRITE sprite, std::string text,
    int charCount, LPRECT rect, UINT format, D3DXCOLOR color )
{
    // ERROR HERE
    font->DrawTextA(sprite, text.c_str(), charCount, rect, format, color);
}
And my use of the above functions:
if( !font.Create(d3d.GetDevice(), "Impact", 0, 175, 0, 1, false,
    OUT_DEFAULT_PRECIS, DEFAULT_QUALITY, DEFAULT_PITCH | FF_DONTCARE) )
{
    MessageBox(0, "Could not create font.", "Error!", MB_OK | MB_ICONERROR);
}
// later on...
RECT r = {35, 50, 0, 0};
font.Draw(0, "Test", -1, &r, DT_NOCLIP, d3d.GetColorObj(1.0f, 1.0f, 0.0f, 1.0f));
What could I be missing?
D3DX10CreateFontIndirect returns 0x8876086C.
I can't find what it means, but some Google threads relate it to the d3dDevice, so I guess it must be connected to that. I will update when I have more info.
Calling D3DX10CreateFontIndirect doesn't actually guarantee that your pointer will be initialized.
Rule of thumb: ALWAYS check HRESULTs when using DirectX functions that initialize a pointer:
HRESULT hr = D3DX10CreateFontIndirect(device, &fd, &font);
if(FAILED(hr)){
    //Get the last error, display a message, etc.
    //Eventually propagate the error if the code can't continue
    //with the font pointer uninitialized.
}
When the function returns a failure code such as E_FAIL, do not try to use the pointer afterward. Chances are that the parameter values are simply incorrect (here, your device pointer might be null or your font description might be invalid).
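In this code, that means Create should propagate the failure instead of always returning true, so that the MessageBox check at the call site actually fires. A sketch of that change (for what it's worth, 0x8876086C is D3DERR_INVALIDCALL, which fits the bad-parameter theory):
bool D3DFont::Create( ID3D10Device *device, std::string name, ... )
{
    ...
    HRESULT hr = D3DX10CreateFontIndirect(device, &fd, &font);
    return SUCCEEDED(hr); // was: return true; regardless of the result
}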

Issue with Int to LPWSTR function

I am making a win32 program that is a level editing tool to go with the library I am creating for a 2D tile system.
I want to create a dialog box displaying the map's properties when the user selects it from the menu. This means a conversion from int to a wchar_t array. I have created a function that I hoped would do this; however, it currently just returns the blank string that the return variable is initialized with. This conversion is necessary to work with the SetDlgItemText() function called by the map-properties dialog box.
Here is the function I have currently:
LPWSTR IntToLPWSTR(int value)
{
    std::ostringstream convert;
    std::string out;
    convert << value;
    out = convert.str();
    const char* in;
    in = out.c_str();
    LPWSTR ret = L"";
    MultiByteToWideChar(CP_ACP, MB_COMPOSITE, in, strlen(in), ret, wcslen(ret));
    return ret;
}
It is being called from here:
case WM_INITDIALOG:
    if (mapToEdit)
    {
        SetDlgItemText(hDlg, IDC_TILE_WIDTH_LBL, IntToLPWSTR(mapToEdit->TileWidth()));
        SetDlgItemText(hDlg, IDC_TILE_HEIGHT_LBL, L"");
        SetDlgItemText(hDlg, IDC_MAP_WIDTH_LBL, L"");
        SetDlgItemText(hDlg, IDC_MAP_HEIGHT_LBL, L"");
    }
    else
    {
        EndDialog(hDlg, LOWORD(wParam));
        MessageBox(hWnd, L"You must create a map first", L"Error", 1);
    }
mapToEdit is simply a pointer to my own map class that contains the properties I want to display. The bottom three calls to SetDlgItemText() pass L"" as their string; the intention is that they will also use the function once it works.
std::to_wstring is simpler, but to point out the problem in your code, you never created a buffer. LPWSTR ret = L""; makes ret a pointer to an array held in static memory. This array cannot be modified.
Here is one way to fix the code by using std::wstring as the buffer:
std::wstring IntToWstring(int value)
{
    std::ostringstream convert;
    std::string out;
    convert << value;
    out = convert.str();
    std::wstring ret;
    // Find proper length
    int length = MultiByteToWideChar(CP_ACP, 0, out.c_str(), out.length(), nullptr, 0);
    ret.resize(length);
    // Probably should also check for errors (got rid of MB_COMPOSITE flag)
    MultiByteToWideChar(CP_ACP, 0, out.c_str(), out.length(), &ret[0], length);
    return ret;
}
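Called from the dialog handler above, that would look like this (hypothetical call, mirroring the question's code):
SetDlgItemText(hDlg, IDC_TILE_WIDTH_LBL, IntToWstring(mapToEdit->TileWidth()).c_str());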
If you don't want to use std::wstring you could dynamically allocate a buffer with LPWSTR ret = new WCHAR[length]; (and remember to delete[] it).
EDIT
Also, keep in mind that you could simplify the code to the following:
std::wstring IntToWstring(int value)
{
    std::wostringstream convert;
    convert << value;
    return convert.str();
}
You don't need to go to a lot of effort to convert an int into a const wchar_t *. Since C++11, you can take a two-step approach: std::to_wstring gives you a std::wstring, and c_str() gives you the const wchar_t * from there:
SetDlgItemText(hDlg, IDC_TILE_WIDTH_LBL, std::to_wstring(mapToEdit->TileWidth()).c_str());
Sure, you could put that into a function to make it one step, but keep in mind that you must not let the std::wstring be destroyed before you are done using the pointer.
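To illustrate that lifetime caveat (an added sketch, not part of the original answer):
LPCWSTR dangling = std::to_wstring(42).c_str(); // BAD: the temporary wstring dies at the semicolon
std::wstring kept = std::to_wstring(42);
LPCWSTR ok = kept.c_str();                      // fine for as long as 'kept' is alive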