Why GetKeyState changed the behavior of ToUnicodeEx? - c++

In the code below:
BYTE ks[256];
auto keyboard_layout = GetKeyboardLayout(0);
GetKeyboardState(ks);
auto w = WCHAR(malloc(1));
ToUnicodeEx(wParam, MapVirtualKey(wParam, MAPVK_VK_TO_VSC), ks, LPWSTR(&w), 1, 0, keyboard_layout);
wcout << "KEY:" << w << endl;
The output shows only lowercase letters, such as:
KEY:a
KEY:b
KEY:2
Even when pressing SHIFT+A or SHIFT+2
But after adding GetKeyState(VK_SHIFT) and/or GetKeyState(VK_CAPITAL), as in the code below:
auto shifted = false;
auto caps = false;
if (GetKeyState(VK_SHIFT) < 0)
{
shifted = true;
cout << "Shifted!" << endl;
}
if (GetKeyState(VK_CAPITAL) < 0)
{
shifted = true;
cout << "Caps!" << endl;
}
BYTE ks[256];
auto keyboard_layout = GetKeyboardLayout(0);
GetKeyboardState(ks);
auto w = WCHAR(malloc(1));
ToUnicodeEx(wParam, MapVirtualKey(wParam, MAPVK_VK_TO_VSC), ks, LPWSTR(&w), 1, 0, keyboard_layout);
wcout << "KEY:" << w << endl;
The behavior of the code changed immediately; pressing SHIFT+A or SHIFT+2 now gives:
KEY:A
KEY:B
KEY:#
I tried this with ToUnicode, ToAsciiEx, and ToAscii, and they all show the same behavior as above.
I used WH_KEYBOARD hook in a separate DLL file called hook.dll and linked with a console application.
So my question is: why does the GetKeyState function enable the detection of the SHIFT and Caps Lock keys?

Using auto w = WCHAR(malloc(1)) is wrong. malloc() dynamically allocates a block of bytes, not characters; WCHAR is 2 bytes in size, but you are allocating only 1 byte. That doesn't really matter, though, because you never use the allocated block: you type-cast the returned pointer to a single WCHAR, truncating the pointer value, and then even that value is discarded when you pass &w to ToUnicodeEx(), which overwrites w. You are also leaking the allocated memory, since you never call free() to deallocate it.
You don't need the malloc() at all:
WCHAR w;
ToUnicodeEx(..., &w, 1, ...);
wcout << "KEY:" << w << endl;
However, ToUnicodeEx() can potentially return more than one character, so you should allocate extra room to account for that. Just use a local fixed-size array, like you do for GetKeyboardState(). And do pay attention to the return value; it contains important information.
As for the key states, since you are calling GetKeyboardState(), you don't need to use GetKeyState().
Try something more like this:
BYTE ks[256];
auto keyboard_layout = GetKeyboardLayout(0);
GetKeyboardState(ks);
if (ks[VK_SHIFT] & 0x80) wcout << L"Shifted!" << endl;
if (ks[VK_CAPITAL] & 0x80) wcout << L"Caps!" << endl;
WCHAR w[5] = {};
int ret = ToUnicodeEx(wParam, MapVirtualKey(wParam, MAPVK_VK_TO_VSC), ks, w, 4, 0, keyboard_layout);
switch (ret)
{
case -1:
wcout << L"DEAD KEY:" << w << endl;
break;
case 0:
wcout << L"NO TRANSLATION" << endl;
break;
case 1:
wcout << L"KEY:" << w << endl;
break;
case 2:
case 3:
case 4:
w[ret] = 0;
wcout << L"KEYS:" << w << endl;
break;
}
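For context, here is a rough sketch of where the snippet above might live inside a WH_KEYBOARD hook procedure (assuming the usual <windows.h> and <iostream> includes); the hook setup and DLL plumbing are assumptions, not the poster's actual hook.dll:
LRESULT CALLBACK KeyboardProc(int nCode, WPARAM wParam, LPARAM lParam)
{
    // Only translate when asked to process the message and on key-down
    // (bit 31 of lParam is the transition state: 0 = key is being pressed).
    if (nCode == HC_ACTION && !(lParam & 0x80000000))
    {
        BYTE ks[256];
        auto keyboard_layout = GetKeyboardLayout(0);
        GetKeyboardState(ks);
        WCHAR w[5] = {};
        int ret = ToUnicodeEx(wParam, MapVirtualKey(wParam, MAPVK_VK_TO_VSC),
                              ks, w, 4, 0, keyboard_layout);
        if (ret > 0)
            wcout << L"KEY:" << w << endl; // w stays null-terminated because it was zero-initialized
    }
    return CallNextHookEx(NULL, nCode, wParam, lParam);
}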

Related

How to use CRT batch technique in Microsoft SEAL 3.1?

Can you please tell me whether SEAL 3.1 supports PolyCRTBuilder class? I am trying to run the following program but failed because the class is not declared in this scope.
/**
Suppose I have two arrays x = [1,2,3,4,5] and xMean = [3,3,3,3,3]. I composed and encrypted the two arrays using PolyCRTBuilder (xCiphertext and xMeanCiphertext). If I subtract the two ciphertexts (xCiphertext MINUS xMeanCiphertext), I should get xResult = [-2, -1, 0, 1, 2], but after the homomorphic subtraction I am getting xResultDecrypted = [40959, 40960, 0, 1, 2]. I can relate the overflow result to the plain modulus set, but is there a workaround for this problem? Here is the code:
*/
#include <iostream>
#include "seal/seal.h"
using namespace std;
using namespace seal;
/*
Helper function: Prints the parameters in a SEALContext.
*/
void print_parameters(shared_ptr<SEALContext> context)
{
// Verify parameters
if (!context)
{
throw invalid_argument("context is not set");
}
auto &context_data = *context->context_data();
/*
Which scheme are we using?
*/
string scheme_name;
switch (context_data.parms().scheme())
{
case scheme_type::BFV:scheme_name = "BFV";
break;
case scheme_type::CKKS:scheme_name = "CKKS";
break;
default:
throw invalid_argument("unsupported scheme");
}
cout << "/ Encryption parameters:" << endl;
cout << "| scheme: " << scheme_name << endl;
cout << "| poly_modulus_degree: " << context_data.parms().poly_modulus_degree() << endl;
/*
Print the size of the true (product) coefficient modulus.
*/
cout << "| coeff_modulus size: " << context_data.
total_coeff_modulus_bit_count() << " bits" << endl;
/*
For the BFV scheme print the plain_modulus parameter.
*/
if (context_data.parms().scheme() == scheme_type::BFV)
{
cout << "| plain_modulus: " << context_data.
parms().plain_modulus().value() << endl;
}
cout << "\\ noise_standard_deviation: " << context_data.
parms().noise_standard_deviation() << endl;
cout << endl;
}
int main(){
cout << "\nTotal memory allocated from the current memory pool: "<< (MemoryManager::GetPool().alloc_byte_count() >> 20) << " MB" << endl;
EncryptionParameters parms(scheme_type::BFV);
//EncryptionParameters parms;
parms.set_poly_modulus_degree(4096);
parms.set_coeff_modulus(coeff_modulus_128(4096));
parms.set_plain_modulus(40961); // Make the plain modulus a prime congruent to 1 mod 2n to enable CRT batching
auto context = SEALContext::Create(parms);
print_parameters(context);
IntegerEncoder encoder(parms.plain_modulus());
KeyGenerator keygen(context);
PublicKey public_key = keygen.public_key();
SecretKey secret_key = keygen.secret_key();
// SEALContext context(parms);
// KeyGenerator keygen(context);
// auto public_key = keygen.public_key();
// auto secret_key = keygen.secret_key();
Encryptor encryptor(context, public_key);
Evaluator evaluator(context);
Decryptor decryptor(context, secret_key);
PolyCRTBuilder crtbuilder(context);
int slot_count = crtbuilder.slot_count();
int row_size = slot_count / 2;
vector<uint64_t> x_pod_matrix(slot_count, 0);
x_pod_matrix[0] = 1;
x_pod_matrix[1] = 2;
x_pod_matrix[2] = 3;
x_pod_matrix[3] = 4;
x_pod_matrix[4] = 5;
Plaintext x_plain_matrix;
crtbuilder.compose(x_pod_matrix, x_plain_matrix);
Ciphertext x_encrypted_matrix;
encryptor.encrypt(x_plain_matrix, x_encrypted_matrix);
vector<uint64_t> x_mean_pod_matrix(slot_count, 0);
x_mean_pod_matrix[0] = 3;
x_mean_pod_matrix[1] = 3;
x_mean_pod_matrix[2] = 3;
x_mean_pod_matrix[3] = 3;
x_mean_pod_matrix[4] = 3;
Plaintext x_mean_plain_matrix;
crtbuilder.compose(x_mean_pod_matrix, x_mean_plain_matrix);
Ciphertext x_mean_encrypted_matrix;
encryptor.encrypt(x_mean_plain_matrix, x_mean_encrypted_matrix);
evaluator.sub_plain(x_encrypted_matrix, x_mean_encrypted_matrix);
// Decrypt x_encrypted_matrix
Plaintext x_plain_result;
decryptor.decrypt(x_encrypted_matrix, x_plain_result);
vector<uint64_t> pod_result;
crtbuilder.decompose(x_plain_result, pod_result);
for(int i = 0; i < 5; i++) {
std::cout << pod_result[i] << '\n';
}
return 0;
}
PolyCRTBuilder has been renamed to BatchEncoder. Take a look at the src/examples directory in SEAL v3.1 (or native/examples in a newer version) and you'll see plenty of examples.
Kind of related to your question: the coeff_modulus_128 function hasn't existed in SEAL for quite a while; the same functionality is provided by the CoeffModulus::BFVDefault function. With these changes, your code might even work in SEAL 3.5.
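For illustration, here is a rough, untested sketch of how the batching parts of your program could look against the SEAL 3.5 API, just to show the renamed calls (BatchEncoder replacing PolyCRTBuilder, encode/decode replacing compose/decompose, CoeffModulus::BFVDefault replacing coeff_modulus_128):
#include "seal/seal.h"
#include <iostream>
#include <vector>
using namespace std;
using namespace seal;
int main()
{
    EncryptionParameters parms(scheme_type::BFV);
    parms.set_poly_modulus_degree(4096);
    parms.set_coeff_modulus(CoeffModulus::BFVDefault(4096)); // was coeff_modulus_128(4096)
    parms.set_plain_modulus(40961); // prime and congruent to 1 mod 2*4096, so batching is enabled
    auto context = SEALContext::Create(parms);
    KeyGenerator keygen(context);
    Encryptor encryptor(context, keygen.public_key());
    Decryptor decryptor(context, keygen.secret_key());
    BatchEncoder batch_encoder(context); // was PolyCRTBuilder crtbuilder(context)
    vector<uint64_t> x_pod_matrix(batch_encoder.slot_count(), 0);
    x_pod_matrix[0] = 1; x_pod_matrix[1] = 2; x_pod_matrix[2] = 3; x_pod_matrix[3] = 4; x_pod_matrix[4] = 5;
    Plaintext x_plain_matrix;
    batch_encoder.encode(x_pod_matrix, x_plain_matrix); // was crtbuilder.compose(...)
    Ciphertext x_encrypted_matrix;
    encryptor.encrypt(x_plain_matrix, x_encrypted_matrix);
    Plaintext x_plain_result;
    decryptor.decrypt(x_encrypted_matrix, x_plain_result);
    vector<uint64_t> pod_result;
    batch_encoder.decode(x_plain_result, pod_result); // was crtbuilder.decompose(...)
    for (int i = 0; i < 5; i++) cout << pod_result[i] << '\n';
    return 0;
}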

Why does creating 2 variables cause a crash in custom STL, C++ VS2019?

Hello, I'm trying to write my own memory manager and STL (nothing fancy, just some basic vector and string features) and I'm getting some strange behaviour. I'm trying to get experience in the memory management field because I'm a high school student with time to spare. The problem is that creating my first variable works perfectly, but once I create a second variable, the program crashes while creating the first one.
String.h/.cpp
class String {
char* pointer_toBuffer = nullptr;
size_t buffer_length = 0;
IAllocator* Allocator;
public:
String(const char* text, IAllocator* Allocator);
};
String::String(const char* text, TuranAPI::MemoryManagement::IAllocator* MemoryAllocator) : Allocator(MemoryAllocator) {
std::cout << "String creation has started: " << text << std::endl;
unsigned int i = 0;
while (text[i] != 0) {
i++;
}
buffer_length = i + 1;
pointer_toBuffer = (char*)Allocator->Allocate_MemoryBlock(buffer_length * sizeof(char));//When I write the Second String part, FirstString crashes directly. I use VSDebug and it says access violation here while creating FirstString. It is successful if I delete the SecondString part.
for (unsigned int letterindex = 0; letterindex < i; letterindex++) {
pointer_toBuffer[letterindex] = text[letterindex];
}
pointer_toBuffer[i] = 0;
}
MemoryManagement.h/cpp
TAPIMemoryAllocator::TAPIMemoryAllocator(MemoryBlockInfo MemoryPool_toUse){
std::cout << "TAPIMemoryAllocator is created!\n";
std::cout << "MemoryPool's start pointer: " << MemoryPool_toUse.address << std::endl;
MemoryPool.address = MemoryPool_toUse.address;
MemoryPool.size = MemoryPool_toUse.size;
SELF = this;
}
void* TAPIMemoryAllocator::Allocate_MemoryBlock(size_t size) {
std::cout << "MemoryPool's start pointer: " << MemoryPool.address << std::endl;
std::cout << "A buffer of " << size << " bytes allocation request found in TAPIMemoryAllocator!\n";
if (SELF == nullptr) {
TMemoryManager First(1024 * 1024 * 1024 * 1);
MemoryBlockInfo FirstMemoryBlock;
FirstMemoryBlock.address = SELF->MemoryPool.address;
FirstMemoryBlock.size = size;
Allocated_MemoryBlocks[0] = FirstMemoryBlock;
return (char*)SELF->MemoryPool.address;
}
void* finaladdress = SELF->MemoryPool.address;
for (unsigned int blockindex = 0; blockindex < MAX_MEMORYBLOCKNUMBER; blockindex++) {
MemoryBlockInfo& MemoryBlock = Allocated_MemoryBlocks[blockindex];
finaladdress = (char*)finaladdress + MemoryBlock.size;
if (size <= MemoryBlock.size && MemoryBlock.address == nullptr) {
std::cout << "Intended block's size is less than found memory block!\n";
MemoryBlock.address = finaladdress;
//You shouldn't change Memory Block's size because all of the allocations before this are based upon the previous size!
//You should move all the previous allocated memory to set the size (which is not ideal!)
//If I'd want to find memory leaks causing this, I could write code here to log the leaks!
return MemoryBlock.address;
}
else if (MemoryBlock.size == 0 && MemoryBlock.address == nullptr) {
std::cout << "An empty block is created for intended block! Block's Array index is: " << blockindex << "\n";
std::cout << "MemoryPool's start pointer: " << MemoryPool.address << std::endl << "MemoryBlock's pointer: " << finaladdress << std::endl;
//This means this index in the Allocated_MemoryBlocks has never been used, so we can add the data here!
MemoryBlock.address = finaladdress;
MemoryBlock.size = size;
return MemoryBlock.address;
}
}
//If you arrive here, that means there is no empty memory block in the Allocated_MemoryBlocks array!
std::cout << "There is no empty memory block in the Allocated_MemoryBlocks array, so nullptr is returned!\n";
return nullptr;
}
TMemoryManager::TMemoryManager(size_t Main_MemoryBlockSize) {
if (SELF != nullptr) {
std::cout << "You shouldn't create a MemoryManager!";
return;
}
std::cout << "TMemoryManager is created!\n";
MainMemoryBlock.address = malloc(Main_MemoryBlockSize);
MainMemoryBlock.size = Main_MemoryBlockSize;
SELF = this;
std::cout << "Main Memory Block's start pointer: " << MainMemoryBlock.address << std::endl;
MemoryBlockInfo TuranAPI_MemoryPool;
TuranAPI_MemoryPool.address = MainMemoryBlock.address;
std::cout << "TuranAPI_MemoryPool.address: " << TuranAPI_MemoryPool.address << std::endl;
TuranAPI_MemoryPool.size = 1024 * 1024 * 10;
TAPIMemoryAllocator Create(TuranAPI_MemoryPool);
}
TMemoryManager* TMemoryManager::SELF = nullptr;
TMemoryManager First(1024 * 1024 * 1024 * 1);
Main.cpp
String FirstString("How are you?", TAPIMemoryAllocator::SELF);
std::cout << FirstString << std::endl; //If I delete the below, it prints "How are you?" as expected
String SecondString("I'm fine, thanks!", TAPIMemoryAllocator::SELF);
std::cout << SecondString << std::endl;
Solved: the problem was in the Allocator. When the allocator goes out of scope, its Allocate_MemoryBlock function (a virtual member function, not a static one) can no longer be called through SELF. I don't know why this doesn't happen when only one String is created (maybe a compiler optimization), but storing the Allocator itself (all of the variables were already static) and assigning SELF to the stored object's address solved the problem.
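To make the dangling-pointer mechanism concrete, here is a minimal, self-contained sketch of the same situation with simplified, made-up names (not the actual TuranAPI classes):
#include <cstddef>
#include <iostream>
// A static SELF pointer is set to a local object that is destroyed shortly afterwards.
struct Allocator {
    static Allocator* SELF;
    Allocator() { SELF = this; }
    virtual void* Allocate_MemoryBlock(std::size_t size) {
        std::cout << "allocating " << size << " bytes\n";
        return nullptr;
    }
};
Allocator* Allocator::SELF = nullptr;
void setup() {
    Allocator local_allocator;                    // SELF = &local_allocator
}                                                 // local_allocator destroyed here, SELF now dangles
int main() {
    setup();
    // Allocator::SELF->Allocate_MemoryBlock(16); // undefined behaviour: SELF points at a destroyed object
    // The fix described above: give the allocator static storage duration,
    // so the object SELF points at lives until the end of the program.
    static Allocator stored_allocator;            // constructor sets SELF = &stored_allocator
    Allocator::SELF->Allocate_MemoryBlock(16);    // OK
    return 0;
}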

vector of structs with weird behavior c++

I have a problem with an assignment I'm working on. I have to read a .ts file, read the packets inside it, and extract the header information from each packet.
I have created a struct Packet that holds all the header info, and I also have a vector into which I push_back each Packet.
The problem is that the for loop stops for some reason on the 163rd iteration. If I loop only up to, let's say, i = 160, the code exits the loop, but when I print vector.size() I get a really huge number which doesn't make sense; I would expect an integer as large as the number of Packets pushed back. Here is the code I have so far:
int main() {
FILE *ts_file = NULL;
ts_file = fopen64("/home/ddd/Desktop/Assignment/Streams/ddd.ts", "rb");
if (ts_file == NULL){
cout << "No file detected on this path, try again" << endl; // prints !!!Hello World!!!
}
TS_Analyzer *ts_analyzer;
ts_analyzer->parse_file(ts_file);
cout << "Finished main" << endl;
return 0;
}
void TS_Analyzer::parse_file(FILE *ts_file){
cout << "Inside parser" << endl;
fseek(ts_file,0,SEEK_END);
long file_size = ftell(ts_file);
rewind (ts_file);
number_of_packets = file_size/PACKET_SIZE;
unsigned int current_header_add = 0;
unsigned int i=0;
for (unsigned int j=1; i<number_of_packets; j++)
{
i++;
unsigned char TS_raw_header[4];
cout << "current position " << int(current_header_add) << endl;
current_header_add = ftell(ts_file);
fread(&TS_raw_header, sizeof(TS_raw_header), 1, ts_file);
Packet current_packet;
current_packet.sync_byte = TS_raw_header[0];
current_packet.transport_error_indicator = (TS_raw_header[1] & 0x80) >> 7;
current_packet.payload_start_indicator = (TS_raw_header[1] & 0x40) >> 6;
current_packet.transport_priority = (TS_raw_header[1] & 0x20) >> 5;
current_packet.PID = ((TS_raw_header[1] & 31) << 8) | TS_raw_header[2];
current_packet.transport_scrambling_control = (TS_raw_header[3] & 0xC0);
current_packet.adaption_field_control = (TS_raw_header[3] & 0x30) >> 4;
current_packet.continuity_counter = (TS_raw_header[3] & 0xF);
stream_packets.push_back(current_packet);
//cout << hex << int(current_packet.PID) << endl;
//cout << dec << "continuity counter " << int(current_packet.continuity_counter) << endl;
cout << " i " << int(i) << endl;
fseek(ts_file, 184, SEEK_CUR);
}
cout << "##" << endl;
cout << stream_packets.size() << endl;
}
class TS_Analyzer: public Analyzer {
public:
TS_Analyzer();
~TS_Analyzer();
struct Packet {
unsigned char sync_byte;
unsigned char transport_error_indicator;
unsigned char payload_start_indicator;
unsigned char transport_priority;
unsigned int PID;
unsigned char transport_scrambling_control;
unsigned char adaption_field_control;
unsigned char continuity_counter;
};
std::vector<Packet>stream_packets;
int number_of_packets = 0;
void parse_file(FILE *);
};
Any ideas why the vector push_back breaks the for loop and why I cannot get a correct vector size?
If I put this code through the clang compiler, I get an error on the following code:
TS_Analyzer *ts_analyzer;
ts_analyzer->parse_file(ts_file);
>> variable 'ts_analyzer' is uninitialized when used here
I guess you are encountering undefined behavior: since ts_analyzer is an uninitialized pointer holding a random value, the data in its members is also random.
I'm actually surprised that this code runs at all without crashing, though you can always be lucky.
If you'd like to fix this, avoid the pointer by creating the object on the stack:
TS_Analyzer ts_analyzer;
ts_analyzer.parse_file(ts_file);
or, if you really need dynamically allocated memory, at least initialize the pointer:
auto ts_analyzer = std::make_unique<TS_Analyzer>();
ts_analyzer->parse_file(ts_file);
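Putting the stack-object version into your main() would look roughly like this (a sketch; bailing out when the file is missing is my addition, since calling parse_file(NULL) would be the next crash):
int main() {
    FILE *ts_file = fopen64("/home/ddd/Desktop/Assignment/Streams/ddd.ts", "rb");
    if (ts_file == NULL) {
        cout << "No file detected on this path, try again" << endl;
        return 1;                      // don't go on to parse a null FILE*
    }
    TS_Analyzer ts_analyzer;           // a real object, not an uninitialized pointer
    ts_analyzer.parse_file(ts_file);
    fclose(ts_file);
    cout << "Finished main" << endl;
    return 0;
}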

How to implement VERIFY command on NIST PIV cards?

I must be doing something wrong, but I can't see what.
I'm trying to get the VERIFY command to show the number of attempts remaining. (I was trying to enter the PIN as well, but cut back to this when I couldn't get anything to work.) Here's the code fragment that I've been trying:
for (unsigned int basebyte = 0x00; basebyte != 0x100; basebyte += 0x80) {
for (unsigned char add = 0x01; add != 0x20; ++add) {
smartcard::bytevector_t b;
b.push_back(0x00); // CLA
b.push_back(0x20); // INS
b.push_back(0x00); // P1
b.push_back(basebyte + add); // P2 ("the sensible ranges are 0x01..0x1F and 0x81..0x9F")
//b.push_back(0x00); // Lc field -- length of the following data field
b = card.rawTransmit(b);
if (!card.status()) {
cout << "Received error '" << card.status() << "'" << endl;
} else {
if (b[0] == 0x6a && b[1] == 0x88) {
// "Referenced data not found"
continue;
}
cout << " Attempts remaining (" << std::hex << (basebyte + add) << std::dec << "): ";
cout << std::hex;
for (smartcard::bytevector_t::const_iterator i = b.begin(), ie = b.end();
i != ie; ++i) cout << std::setfill('0') << std::setw(2) << int(*i) << ' ';
cout << std::dec << endl;
}
}
}
The rawTransmit function...
bytevector_t rawTransmit(bytevector_t sendbuffer) {
SCARD_IO_REQUEST pioSendPci, pioRecvPci;
if (mProtocol.value() == SCARD_PROTOCOL_T0) {
pioSendPci = pioRecvPci = *SCARD_PCI_T0;
} else if (mProtocol.value() == SCARD_PROTOCOL_T1) {
pioSendPci = pioRecvPci = *SCARD_PCI_T1;
} else {
std::ostringstream out;
out << "unrecognized protocol '" << mProtocol.str() << "'";
throw std::runtime_error(out.str());
}
DWORD rlen = 256;
bytevector_t recvbuffer(rlen);
mResult = SCardTransmit(mHandle, &pioSendPci, &sendbuffer[0],
DWORD(sendbuffer.size()), &pioRecvPci, &recvbuffer[0], &rlen);
recvbuffer.resize(rlen);
return recvbuffer;
}
(bytevector_t is defined as std::vector<unsigned char>.)
All the cards using protocol T0 return 0x6a 0x88 ("Referenced data not found") for all P2 values. All the cards using T1 do the same, except when P2 is 0x81 -- then they say 0x69 0x84 ("Command not allowed, referenced data invalidated").
The cards in question definitely DO have PINs, and I can verify the PIN in the "Security Token Configurator" program provided by the middleware vendor, so I know that the card, reader, and middleware stuff are all working.
It's probably obvious, but I'm new to smartcard programming. Can anyone give me a clue where I'm going wrong?
The Global PIN has ID 00 and the PIV Card Application PIN has ID 80 (hex), so your tests do not include the known PIV card PIN IDs.
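Using the question's own wrapper, the loop could be reduced to just those two PIN references. A rough sketch (untested; card and rawTransmit are taken from the question as-is). A VERIFY with an empty body reports the retry counter as SW1=63, SW2=CX, where X is the number of attempts remaining:
const unsigned char pin_ids[] = { 0x00, 0x80 };   // 00 = Global PIN, 80 = PIV Card Application PIN
for (unsigned char p2 : pin_ids) {
    smartcard::bytevector_t b;
    b.push_back(0x00); // CLA
    b.push_back(0x20); // INS: VERIFY
    b.push_back(0x00); // P1
    b.push_back(p2);   // P2: PIN reference
    // No Lc/data field: an empty VERIFY only queries the retry counter.
    b = card.rawTransmit(b);
    if (b.size() == 2 && b[0] == 0x63 && (b[1] & 0xF0) == 0xC0) {
        cout << "PIN 0x" << std::hex << int(p2) << ": "
             << int(b[1] & 0x0F) << " attempts remaining" << std::dec << endl;
    }
}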

Trying to make an ASCII table in C++, cannot get the "special characters" to display properly

I'm working on an assignment where I need to print out the ASCII table in the table format exactly like the picture below.
http://i.gyazo.com/f1a8625aad1d55585df20f4dba920830.png
I currently can't get the special words/symbols to display (8, 9, 10, 13, 27, 32, 127).
Here it is running:
http://i.gyazo.com/80c8ad48ef2993e93ef9b8feb30e53af.png
Here is my current code:
#include <iomanip>
#include <iostream>
using namespace std;
int main()
{
cout<<"ASCII TABLE:"<<endl;
cout<<endl;
for (int i = 0; i < 128; i++)
{
if (i <= 32)
cout << "|" << setw(2)
<<i
<< setw(3)
<< "^" << char (64+i) <<"|";
if (i >= 33)
cout << "|" << setw(3)
<<i
<< setw(3)
<<char (i) << "|";
if((i+1)%8 == 0) cout << endl;
}
return 0;
}
8 Back Space
9 Horizontal Tab
10 New Line
13 carriage return
27 Escape (Esc)
32 Space
127 Del
As shown above, these ASCII characters don't display any visible, printed character. That's why you might think you are not getting these values.
I'm not sure what your real problem is, but you haven't gotten an answer yet about how to print the special codes.
Running your programme I see that you have some minor alignment problems. If that's the problem, note that setw(3) only applies to the next element:
cout << setw(3) << "^" << char (64+i); // prints " ^A" instead of " ^A".
If you try to correct into
cout << setw(3) << "^"+ char (64+i); // ouch !!!!
you'll get undefined behaviour (garbage), because "^" is a pointer to a string literal, and adding char(64+i) is understood as adding an offset of 64+i to that pointer. Since the result is a rather random address, you'll get garbage. Use a std::string instead.
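For example, something like this (just a sketch of the idea) lets setw() pad the combined token:
cout << setw(3) << (string("^") + char(64 + i)); // build a std::string first, then pad "^X" as a whole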
The other difference I see between your programme's output and the expected result is that you don't print the codes of the special chars. If that's the problem, either use a switch statement (very repetitive here), a lot of if/else, or an associative map.
Here is an alternative proposal putting all this together (in addition to <iostream> and <iomanip>, it also needs <map>, <string>, and <cctype>):
map<char, string>special{ { 8, "BS " }, { 9, "\\t " }, { 10, "\\n " }, { 13, "CR " }, { 27, "ESC" }, { 32, "SP " }, { 127, "DEL" } };
cout << "ASCII TABLE:" << endl << endl;
for (int i = 0; i < 128; i++) {
cout << "|" << setw(3)<<i<<setw(4); // setw() only applies to next
if (iscntrl(i) || isspace(i)) { // if its a control char or a space
auto it = special.find(i); // look if there's a special translation
if (it != special.end()) // if yes, use it
cout << it->second;
else cout << string("^") + char(64 + i)+ string(" "); // if not, ^x, using strings
}
else if (isprint(i)) // Sorry I'm paranoïd: but I always imagine that there could be a non printable ctrl ;-)
cout << char(i)+string(" ") ; // print normal char
cout << "|";
if ((i + 1) % 8 == 0) cout << endl;
}
Now some additional advice:
take the effort to indent
instead of manually categorizing chars, use iscntrl(), isspace(), isprint(). As long as you only use ASCII, it's manageable to do it like you did, but as soon as you move to internationalisation and wide chars it becomes increasingly cumbersome, whereas there are easy wide equivalents like iswcntrl(), iswspace(), iswprint().
also be rigorous with two consecutive ifs: if you know that only one of the two should apply, make the effort to write if ... else if; these four additional letters can save you hours of debugging later.