How to read user data on a smart card? - C++

First off, I'm still relatively new to working with smart cards and I don't know exactly how data is stored, or which data is protected, on a smart card.
I'm trying to read my student identification smart card, which is PIN protected. I've been programming in C++ with the default Windows smart card library (winscard.lib).
I've successfully read the ATR header, but as far as I know, the ATR contains information on how to communicate with the card, not user information.
I've tried reading binary from the card, but the APDU always returns 6E 00, which indicates "Class not supported" or "Wrong instruction". Here is the code:
switch (dwProtocol)
{
case SCARD_PROTOCOL_T0:
{
    pioSendPci = *SCARD_PCI_T0;
    break;
}
case SCARD_PROTOCOL_T1:
{
    pioSendPci = *SCARD_PCI_T1;
    break;
}
default:
{
    printf("Detecting protocol failed!");
    printf("Press <ENTER> key to terminate!\n");
    nResponse = getchar();
    lRet = SCardReleaseContext(hContext);
    return -1;
}
}
lRet = SCardTransmit(hCard,
                     &pioSendPci,
                     (LPCBYTE)&cmdRead,
                     sizeof(cmdRead),
                     NULL,
                     (LPBYTE)&recvbuffer,
                     &atrLen);
printf("APDU return code:\n");
printf("=================\n");
for (i = 0; i < 2; i++)
{
    printf("%02X ", recvbuffer[i]);
}
printf("\n");
if (lRet != SCARD_S_SUCCESS)
{
    printf("Transmission failed! ErrorCode = 0x%08X\n", lRet);
    printf("Press <ENTER> key to terminate!\n");
    nResponse = getchar();
    lRet = SCardReleaseContext(hContext);
    return -1;
}
Where cmdRead is as follows:
BYTE cmdRead[] = { 0x00, 0xB0, 0x00, 0x00, 0x00, 0x00, 0xFF };
What could be wrong? Do I need to verify the card first in order to read binary? Is READ BINARY the right command for reading basic data like a student ID?

Without a specification of the student application on the card, this is a lengthy and boring process.
Assuming you have a file system on the card (as opposed to a Java Card), you need to know in which file the user data is stored, so that you can SELECT the appropriate file before issuing the READ BINARY, or READ RECORD if it is a record-oriented file. You can try to find the correct file ID by trial and error, but... Note that on smart cards the access conditions are defined with very fine granularity, so there may be files which can be read without any authentication, and, at the other extreme, there may be files only readable after a secure channel has been established for secure messaging (encrypted and MAC-protected command and/or response).
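For illustration, a minimal APDU-level sketch of that select-then-read sequence (the two-byte file ID 00 01 is purely a placeholder; the real ID comes from the card's specification):
BYTE cmdSelectEf[] = { 0x00, 0xA4, 0x00, 0x0C, 0x02, 0x00, 0x01 }; // SELECT EF by file ID (placeholder ID 0001)
BYTE cmdReadBin[]  = { 0x00, 0xB0, 0x00, 0x00, 0x10 };             // READ BINARY: offset 0, Le = 0x10 bytes
Each command is sent with SCardTransmit exactly like in the question's code; the card answers with data (if any) followed by a two-byte status word, where 90 00 means success.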

I can think of two reasons why the card returns 6E00.
The currently selected application is the card manager or some other applet aside from the one you want to use. You can try to perform a SELECT by AID command before sending the READ command; however, you need to know the instance AID of the applet you want to select to do so (a sketch of such a command follows this answer).
The file you are trying to read is protected by secure messaging and your APDU command should be encrypted/MACed which would change the CLA byte to '0C' for example. However, you need to establish a secure channel first before you can do this.
Like #guidot said, this will be very difficult without a specification.
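A sketch of the SELECT-by-AID command mentioned in the first point (the AID bytes below are a made-up placeholder; you need the real instance AID of the applet):
// SELECT by name (AID): CLA=00, INS=A4, P1=04, P2=00, Lc, then the AID bytes
BYTE cmdSelectAid[] = { 0x00, 0xA4, 0x04, 0x00, 0x07,
                        0xA0, 0x00, 0x00, 0x00, 0x01, 0x02, 0x03 }; // placeholder 7-byte AID
A successful selection is answered with status word 90 00, possibly preceded by FCI data.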

First of all, as #guidot mentioned, it's a tedious process. Not only do you need complete information on Java Cards, you also need to know how to do something like hacking a smart card, because you don't have any card vendor specification and the card probably uses security mechanisms whose keys you would have to know.
But for your information, according to ISO 7816-4, 0x6E00 means "Class not supported". You can check the complete list of APDU responses here.
The class (CLA) byte is usually 0x00, 0xA0, 0xC0 or 0xF0, and is sometimes masked with 0x0C, which indicates secure messaging on some cards.
To reach data inside the applet firewall you have to select that applet; applet selection occurs when the JCRE (Java Card Runtime Environment) receives a SELECT APDU whose data field matches the AID of the applet.
If a security domain is installed, you will also need its security keys in order to achieve a successful applet selection.
For a list of APDU commands for communicating with the card reader, check this link.
There's lots of information here about writing a smart card library in C++ which uses WinSCard.dll to communicate with the reader.
This link covers the file system structure in Java Card, which is useful if the applet stores its data in files,
and this link is an example of selecting a file in Java Card.
If you want to go further into Java Card applet implementation, here's a guide on how to implement a Java Card applet.
Finally, don't forget to read the most important reference documents: GlobalPlatform and ISO 7816.

As Chooch said, on a Java Card:
1. You should perform AID selection first; then,
2. since you are reading a binary file, select the binary EF.
2a) Since you are using P1 = 00, I assume you have already selected the particular EF.
Note: even so, I think your command is wrong for reading binary data from an ISO 7816 / ISO 14443 smart card.
Per ISO 7816-4, READ BINARY should be:
CLA INS = 00 B0
P1 - short file ID, with the MSB set to 1: your SFI is 3, so it should be 83.
P2 - the offset / start byte: starting from 0 means 00; starting from byte 10 means 0A.
Le - the number of bytes you want to read: e.g. 20 bytes means 14.
So the command should be: 00 B0 83 0A 14, and that is it; no more bytes are needed to read a binary file. If you have already selected the EF, you can give 00 instead of 83.
Note: this assumes there are no security conditions. If there are, you have to satisfy them before you can read the file.
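Tying this back to the question's code, a minimal sketch of sending such a READ BINARY (the SFI of 3, the offset and the length are taken from the example above and are only assumptions for illustration):
BYTE cmdReadBinary[] = { 0x00, 0xB0, 0x83, 0x0A, 0x14 }; // READ BINARY from SFI 3, offset 0x0A, 0x14 bytes
BYTE recvBuffer[258];
DWORD recvLen = sizeof(recvBuffer);
LONG rc = SCardTransmit(hCard, &pioSendPci,
                        cmdReadBinary, sizeof(cmdReadBinary),
                        NULL, recvBuffer, &recvLen);
if (rc == SCARD_S_SUCCESS && recvLen >= 2)
{
    // The last two bytes are the status word; 90 00 means success
    printf("SW: %02X %02X\n", recvBuffer[recvLen - 2], recvBuffer[recvLen - 1]);
}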

Related

Parsing NMEA sentences from serial

I want to use TinyGPS++ on an Arduino to parse NMEA data and display information on an OLED display. But, instead of using software serial and the TX/RX pins, the NMEA data will be received by USB.
I followed the examples from TinyGPS++, but I encountered two problems:
1)
Only the first 64 characters are received by the Arduino when I send one NMEA sentence over the serial monitor (Windows, Arduino 1.6.9). How can I overcome this restriction? I currently work around it by deleting a couple of decimal places, but this is not the preferred way to go.
2)
In the TinyGPS++ BasicExample a sample NMEA string is defined in the read-only memory:
// A sample NMEA stream.
const char *gpsStream =
"$GPRMC,045103.0,A,3014.0,N,09748.0,W,36.88,65.02,030913,,,A*7C\r\n"
"$GPGGA,045104.0,3014.0,N,09749.0,W,1,09,1.2,211.6,M,-22.5,M,,*62\r\n"
"$GPRMC,045200.0,A,3014.0,N,09748.0,W,36.88,65.02,030913,,,A*77\r\n"
"$GPGGA,045201.0,3014.0,N,09749.0,W,1,09,1.2,211.6,M,-22.5,M,,*6C\r\n"
"$GPRMC,045251.0,A,3014.0,N,09748.0,W,36.88,65.02,030913,,,A*7D\r\n"
"$GPGGA,045252.0,3014.0,N,09749.0,W,1,09,1.2,211.6,M,-22.5,M,,*6F\r\n";
and parsed by
while (*gpsStream) {
    Serial.print(*gpsStream);
    gps.encode(*gpsStream++);
}
I receive my NMEA (unfortunately only one line) this way:
if (Serial.available()) {
    while (Serial.available() > 0) {
        if (index < 80)
        {
            inChar = Serial.read();
            inData[index] = inChar;
            index++;
            inData[index] = '\0';
        }
    }
}
and try to parse it by:
index = 0;
while (index < 80) {
    gps.encode(inData[index]);
    Serial.print(inData[index]);
    index++;
}
But this does not work as desired: checking the location with isValid() never returns true.
Unfortunately, there are several possible sources for this undesired behavior:
Sentences that are too short (unlikely).
An incorrect way of reading the data over serial.
I only submit one line.
Something else.
I am not experienced with either NMEA or serial data communication, and I have only little experience with Arduino/C. Can you point me in a direction to solve these problems?
Basically, you do not need to accumulate NMEA characters. Just feed them to the GPS library as you receive them. You don't provide the entire loop, but it is very common to have a problem there, too.
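For example, a minimal sketch of that approach with the TinyGPS++ library the question already uses (the baud rate and the use of Serial as the input port are assumptions):
#include <TinyGPS++.h>

TinyGPSPlus gps;

void setup() {
  Serial.begin(9600);               // USB serial carries the NMEA sentences
}

void loop() {
  // Feed every incoming character straight to the parser; no buffering needed
  while (Serial.available() > 0) {
    gps.encode(Serial.read());
  }
  if (gps.location.isUpdated()) {
    Serial.print(gps.location.lat(), 6);
    Serial.print(",");
    Serial.println(gps.location.lng(), 6);
  }
}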
After struggling with several GPS libraries and their examples, I eventually wrote NeoGPS. It is faster and smaller than all other libraries, it validates the checksum, and the examples are structured correctly. Unlike other libraries, NeoGPS does not store GPS values as floating-point values, so it is able to retain the full accuracy of your GPS device.
If you'd like to try it, be sure to follow the Installation instructions. The NMEA.ino example will emit one line of info (CSV format) for each batch of GPS sentences that you send, ending with the default RMC sentence. Be sure to modify it to use the Serial object instead of gps_port, or simply define it that way:
#define gps_port Serial
It will also show the number of characters that have been parsed, how many good sentences have been received, and how many sentences had checksum errors. That could help with debugging if you are not generating the checksum correctly. This site is useful, too.
Those CSV lines will be sent back over the USB port (to the PC), but you can easily change it to send specific fields to the OLED (see NMEAloc.ino).
Although it is possible to develop something on a PC and then port it to an embedded environment like the Arduino, you have to be careful about (1) linear program structure and (2) resource limits (program size, MCU speed and RAM). There are a number of quirks with the Arduino environment that usually make it frustrating to port a "sketch" to/from a PC. :P

Exporting Plaintext AES 128 Key to buffer/file Windows Crypto API c++

I'm having a lot of difficulty understanding and implementing the Windows Crypto API to Import and Export Keys in c++.
Despite reading through the MSDN documentation many many times I can't seem to get it to work in the way I want.
Below is a snippet of the code I'm working on.
if (CryptAcquireContext(&CryptoHandle, NULL, provPointer, PROV_RSA_AES, 0xF0000000))
{
    HCRYPTKEY aesKey;
    // We now have a context on Enhanced AES
    if (CryptGenKey(CryptoHandle, CALG_AES_128, CRYPT_EXPORTABLE, &aesKey))
    {
        DWORD dwBlobLen;
        BYTE* pbKeyBlob;
        CryptExportKey(aesKey, 0, PLAINTEXTKEYBLOB, 0, NULL, &dwBlobLen);
        if (pbKeyBlob = new BYTE[dwBlobLen])
        {
            if (CryptExportKey(aesKey, NULL, PLAINTEXTKEYBLOB, 0, pbKeyBlob, &dwBlobLen))
            {
                // Blah Blah
            }
        }
    }
}
(Where provPointer is a pointer to the Enhanced crypto API provider string.)
As you might be able to tell from the snippet, I'm trying to export an AES-128 key as plaintext.
In the debugger it all executes fine (no visible errors), but I don't understand the outcome at all.
The first call to CryptExportKey fills dwBlobLen with 28 (what does this mean? why?).
After the second CryptExportKey call I've tried writing pbKeyBlob (which I assume points to the key) to a file, but I just end up with a constant set of bytes (the same on every try) followed by a set of bytes that is different every time (I assume this is part of the key), adding up to 28 bytes in total.
I'd really appreciate it if someone could identify where I've gone wrong. I'm pretty clueless about the whole crypto lingo (sessions, machine keys, blobs, etc.).
In the future I'd like to be able to generate an AES key, use it and export it into a file in a form where I can import it again later.
Thanks in advance.
I'm not an expert on the Windows Cryptography API (or on cryptography in general) but I believe I can shed some light on what's going on here.
The first call to CryptExportKey puts 28 in dwBlobLen because that is the size of the blob that will be created when the key is exported. This is in the MSDN docs: http://msdn.microsoft.com/en-us/library/windows/desktop/aa379931%28v=vs.85%29.aspx
As far as what you're doing wrong: you aren't doing anything wrong. You are asking CryptExportKey to export a plaintext blob, which has the following layout:
typedef struct _PLAINTEXTKEYBLOB {
    BLOBHEADER hdr;
    DWORD dwKeySize;
    BYTE rgbKeyData[];
} PLAINTEXTKEYBLOB, *PPLAINTEXTKEYBLOB;
As you can see, the blob starts with a header and a key size (which is the constant set of bytes which you have reported, and should be 12 bytes long), followed by the key data (which is the data that changes every time, and should be 16 bytes long). Remember you are generating a 128 bit key (which is 16 bytes).
The BLOBHEADER has the following layout:
typedef struct _BLOBHEADER {
    BYTE bType;
    BYTE bVersion;
    WORD Reserved;
    ALG_ID aiKeyAlg;
} BLOBHEADER;
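To make the 28 bytes concrete, a small sketch (assuming the second CryptExportKey call above succeeded) that walks the blob and prints the raw key bytes (8 bytes of BLOBHEADER + 4 bytes of key size + 16 bytes of key = 28):
// pbKeyBlob / dwBlobLen come from the CryptExportKey call above
BLOBHEADER* hdr = (BLOBHEADER*)pbKeyBlob;                   // 8 bytes: type, version, reserved, ALG_ID
DWORD keySize = *(DWORD*)(pbKeyBlob + sizeof(BLOBHEADER));  // 4 bytes: should be 16 for AES-128
BYTE* keyData = pbKeyBlob + sizeof(BLOBHEADER) + sizeof(DWORD);
printf("bType=%u, aiKeyAlg=0x%08X, key size=%lu bytes\n", hdr->bType, hdr->aiKeyAlg, keySize);
for (DWORD i = 0; i < keySize; i++)
    printf("%02X ", keyData[i]);                            // these bytes are the actual AES key
printf("\n");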
By the way, from the doc on the CryptImportKey function, you can't import the PLAINTEXTBLOB directly, because the BYTE array that you pass to CryptImportKey does not include the keysize. You need to pass a buffer with the BLOBHEADER followed by the key data.

ERROR_NOT_ENOUGH_MEMORY Error when writing INI using WritePrivateProfileString, after 200k calls

I'm making a simple DLL packet sniffer in C++ that hooks into an application and writes the received packets into an INI file. Unfortunately, after 20-30 minutes it crashes the main application.
When a packet is received, receivedPacket() is called. After 20+ minutes, the WriteCount value is around 150,000-200,000 and I start getting a C++ runtime error/crash. The GetLastError() code I get is 0x8, which is ERROR_NOT_ENOUGH_MEMORY, and WritePrivateProfileStringA() returns 0.
void writeToINI(LPCSTR iSec, LPCSTR iKey, int iVal)
{
    sprintf(inival, _T("%d"), iVal);
    WritePrivateProfileStringA(iSec, iKey, inival, iniloc);
    //sprintf(strc, _T("%d \n"), WriteCount);
    //WriteConsole(GetStdHandle(STD_OUTPUT_HANDLE), strc, strlen(strc), 0, 0);
    WriteCount++;
}
void receivedPacket(char *packet, WORD size)
{
    switch (packet[2])
    {
    case 0x30:
        // Size : 0x5F
        ID = *(signed char*)&packet[0x10];
        X = *(signed short*)&packet[0x20];
        Y = *(signed short*)&packet[0x22];
        Z = *(signed short*)&packet[0x24];
        sprintf(inisec, _T("PACKET_%d"), (ID + 1));
        writeToINI(inisec, "id", ID);
        writeToINI(inisec, "x", X);
        writeToINI(inisec, "y", Y);
        writeToINI(inisec, "z", Z);
    }
    [.....OTHER CASES.....]
}
Thanks :)
WritePrivateProfileString() and GetPrivateProfileString() are very slow (they parse the whole INI file on every call). Instead you can:
use one of the existing parsing libraries, though I am not sure about their memory efficiency or support for sequential writes;
write your own sequential INI writer:
read the file (or only part by part, if it is too big),
find the section and key (if not found, create a new section at the end of the file, or find the insertion position if you want sorted sections), and save the file position of the key and of the next key,
change the value,
save the result (the beginning of the original file up to the position of the key + the changed key + the original file from the position of the next key to the end). If a new section is created at the end, you can simply append it to the original file. If packets rewrite the same ID often, you can pad each value with whitespace, large enough to hold any value of the desired type (example: change X=1---\n to X=100-\n, where - stands for whitespace); this gives every key a constant size, so you only have to update that part of the file;
use a database, for example MySQL;
write a binary file (the fastest solution, see the sketch after this answer) and make a program to read the values, or to convert them to text.
A little note: I used GetPrivateProfileString() a few years ago to read a settings file (about 1 KB in size); reading from HDD took 50 ms, reading from a USB flash disk took 1000 ms! After changing it to (1) read the file into memory and (2) run my own parser, it ran in 1 ms on both HDD and USB.
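A minimal sketch of the binary-file idea (the record layout is just an assumption based on the fields parsed in the question's receivedPacket()):
#include <cstdio>

// Hypothetical fixed-size record matching the fields in the question
#pragma pack(push, 1)
struct PacketRecord {
    signed char id;
    short x, y, z;
};
#pragma pack(pop)

void writeRecord(const PacketRecord& rec)
{
    // Appending fixed-size records avoids re-parsing a text file on every packet
    FILE* f = fopen("packets.bin", "ab");
    if (f) {
        fwrite(&rec, sizeof(rec), 1, f);
        fclose(f);
    }
}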
Thanks for the reply guys, but it looks like the problem didn't come from WritePrivateProfileStringA() after all.
I just needed to add extra size in the malloc() for the hook.
:)

Using IMFSourceResolver::CreateObjectFromByteStream

I am trying to use the IMFSourceResolver::CreateObjectFromByteStream method to create an IMFMediaSource instance for a DRM-protected WMA file. I am adapting the ProtectedPlayback sample from the Windows SDK as a playground. The end goal I wish to achieve is to have the playback process fed by a custom implementation of IMFByteStream that I will write.
However, I cannot get either my simple IMFByteStream implementation or the implementations returned by the MFCreateFile function to work. Each returns a HRESULT of MF_E_UNSUPPORTED_BYTESTREAM_TYPE when passed to CreateObjectFromByteStream.
I tested the sample project in its default state (using CreateObjectFromUrl on a file) with a DRM protected WMA file and it worked fine. The file is not corrupt and the license is valid. I don't understand why substituting this bit of code with CreateObjectFromByteStream( MFCreateFile() ) does not work. I have been able to find little documentation that covers using custom byte streams or what the resolver expects from a byte stream instance.
If anybody has any experience with this stuff or any idea what I am doing wrong, some pointers would be appreciated.
The code I am using is here:
IMFByteStream* stream = NULL;
HRESULT hr2 = MFCreateFile(
MF_ACCESSMODE_READ,
MF_OPENMODE_FAIL_IF_NOT_EXIST,
MF_FILEFLAGS_NONE,
L"C:\\IFB.wma",
&stream);
CHECK_HR(hr = pSourceResolver->CreateObjectFromByteStream(
stream,
NULL,
MF_RESOLUTION_MEDIASOURCE,
NULL,
&ObjectType,
&pSource));
I've not included the whole thing because its basically the same as the sample, I've only changed this part.
Thanks,
Steve
#pisomojado
Thanks for the response, I totally forgot I had posted this question.
The problem was, if I remember correctly, that CreateObjectFromByteStream needs a way to identify the content type. You can either do this by passing in a URL as well as the byte stream instance (pwszURL parameter) or by making the byte stream class implement IMFAttributes and handle the call to GetAllocatedString that asks for the content type. Since I was doing neither of these things, the resolver was just rejecting the stream.
I would have thought that the resolver would attempt to recognize the stream content type via the first few bytes like you suggested in your answer, but for me it did not appear to do so. Not sure why this is, but nevermind.
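For reference, a minimal sketch of the first option (the file path doubles as the pwszURL hint; both values are placeholders taken from the question):
IMFByteStream* stream = NULL;
HRESULT hr = MFCreateFile(MF_ACCESSMODE_READ,
                          MF_OPENMODE_FAIL_IF_NOT_EXIST,
                          MF_FILEFLAGS_NONE,
                          L"C:\\IFB.wma",
                          &stream);
if (SUCCEEDED(hr))
{
    // Passing the URL alongside the stream gives the resolver a content-type hint
    hr = pSourceResolver->CreateObjectFromByteStream(
        stream,
        L"C:\\IFB.wma",            // pwszURL hint so the .wma extension can be used
        MF_RESOLUTION_MEDIASOURCE,
        NULL,
        &ObjectType,
        &pSource);
}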
Steve
Some ideas for debugging what's going on here:
First off, do an IMFSourceResolver::CreateObjectFromUrl on your c:\ifb.wma file; make sure that's happy.
Assuming it is, then it's on to looking at what happens in your IMFByteStream while inside the CreateObjectFromByteStream call. Typically, CreateObjectFromByteStream will try to read a couple bytes off the beginning of the IMFByteStream, since there's usually some kind of identifying byte sequence there. Set some breakpoints or do some logging from your IMFByteStream::[Begin]Read to see what you're being asked for, and whether you're faithfully delivering the right bytes.
FWIW, all WMA files (and WMV, and ASF) start like this (it's the ASF header GUID).
30 26 b2 75 8e 66 cf 11 a6 d9 00 aa 00 62 ce 6c
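As a quick sanity check (a sketch, assuming a byte stream positioned at the start of the file), you can read the first 16 bytes yourself and compare them against that GUID before handing the stream to the resolver:
static const BYTE ASF_HEADER_GUID[16] = {
    0x30, 0x26, 0xB2, 0x75, 0x8E, 0x66, 0xCF, 0x11,
    0xA6, 0xD9, 0x00, 0xAA, 0x00, 0x62, 0xCE, 0x6C
};

BYTE head[16] = {0};
ULONG cbRead = 0;
if (SUCCEEDED(stream->Read(head, sizeof(head), &cbRead)) &&
    cbRead == 16 && memcmp(head, ASF_HEADER_GUID, 16) == 0)
{
    // Looks like a valid ASF/WMA/WMV header; remember to seek back to 0 afterwards
    QWORD pos = 0;
    stream->Seek(msoBegin, 0, 0, &pos);
}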

How to hide strings in an exe or a dll?

I discovered that it is possible to extract the hard-coded strings from a binary.
For example, the properties view of Process Explorer displays all the strings with more than 3 characters.
Here is the code of a simple executable that I wrote to simply test it:
#ifndef _WIN32_WINNT
#define _WIN32_WINNT 0x0501
#endif
#include <stdio.h>
#include <tchar.h>
#include <Windows.h>
int _tmain(int argc, _TCHAR* argv[])
{
    _TCHAR* hiddenString1 = _T("4537774B-CC80-4eda-B3E4-7A9EE77991F5");
    _TCHAR* hiddenString2 = _T("hidden_password_or_whatever");
    for (int i = 0; i < argc; i++) {
        if (0 == _tcscmp(argv[i], hiddenString1)) {
            _tprintf(_T("The guid argument is correct.\n"));
        }
        else if (0 == _tcscmp(argv[i], hiddenString2)) {
            _tprintf(_T("Do something here.\n"));
        }
    }
    _tprintf(_T("This is a visible string.\n"));
    // Keep running
    Sleep(60000);
    return 0;
}
The strings can clearly be extracted from the corresponding executable.
I think that it is a little too easy to find the strings.
My questions are:
How can I simply hide hiddenString1 or hiddenString2 in the executable?
Is there a more secure way to use a "cheat code" than with some obscure hidden input?
Welcome to the wider world of defensive programming.
There are a couple of options, but I believe all of them depend on some form of obfuscation, which, although not perfect, is at least something.
Instead of a straight string value you can store the text in some other binary form (hex?).
You can encrypt the strings that are stored in your app, then decrypt them at run time.
You can split them across various points in your code, and reconstitute later.
Or some combination thereof.
Bear in mind that some attacks go further than looking at the actual binary. Sometimes they will investigate the memory address space of the program while it's running. MS came up with something called a SecureString in .NET 2.0, the purpose being to keep the strings encrypted while the app is running.
A fourth idea is to not store the string in the app itself, but rather rely on a validation code to be submitted to a server you control. On the server you can verify if it's a legit "cheat code" or not.
There are many ways to obscure data in an executable. Others here have posted good solutions -- some stronger than others. I won't add to that list.
Just be aware: it's all a cat-and-mouse game: it is impossible to guarantee that nobody will find out your "secret".
No matter how much encryption or other tricks you use; no matter how much effort or money you put into it. No matter how many "NASA/MIT/CIA/NSA" types are involved in hiding it.
It all comes down to simple physics:
If it were impossible for any user to pull out your secret from the executable and "unhide" it, then the computer would not be able to unhide it either, and your program wouldn't be able to use it. Any moderately skilled developer with enough incentive will find the way to unhide the secret.
The moment that you have handed your executable to a user, they have everything they need to find out the secret.
The best you can hope for is to make it so hard to uncover the secret that any benefits you can get from knowing the secret become not worth the hassle.
So, it's OK to try to obscure the data if it's merely "not-nice" for it to be public, or if the consequences of it becoming public would just be "inconvenient". But don't even think of hiding in your program "the password to your master client database", a private key, or some other critical secret. You just can't.
If you have truly critically secret information that your program will somehow need but should NEVER become public information (like a private key), then you will need to have your program talk to a remote server under your control, apply appropriate authentication and authorization controls (that is, make sure only the approved people or computers are able to make the request to the server), and have that server keep the secret and use it.
The simplest way is to encrypt them with something trivial like xor or rot-13, and then decrypt them on the fly when they're used. That will eliminate casual viewing of them, but it won't stop anyone with much experience at reversing.
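For illustration, a tiny sketch of the XOR idea (the key byte and the word "hunt" are arbitrary); only the XOR-ed bytes end up in the binary, never the plain string:
// String stored XOR-ed with a single key byte; decoded only when needed
const char kKey = 0x5A;
char secret[] = { 'h' ^ kKey, 'u' ^ kKey, 'n' ^ kKey, 't' ^ kKey, 0 };

void decode(char* s)
{
    for (; *s; ++s)
        *s ^= kKey;   // XOR with the same key restores the original characters
}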
In addition to those methods Chris mentions you could also use a hashing algorithm. If all you want to do is check if the correct ID was specified you don't actually need to store the whole ID in your program.
Create a hash (MD5, SHA, etc.) of the string/password/ID you want to compare against, and maybe add a 'salt' value to it. Store this in your program.
When the program is run, do the same algorithm on the input string/password/id and compare the two hashes to see if they match.
This way the actual text is never stored in your program and they cannot reverse engineer your program to find out what the original text was because hash algorithms are one-way only.
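A sketch of that check using the Windows CryptoAPI already used elsewhere on this page (the 32-byte expected digest is a placeholder you would precompute; SHA-256 is just one possible choice):
#include <windows.h>
#include <wincrypt.h>
#include <cstring>

// Returns true if SHA-256(input) matches the expected 32-byte digest
bool matchesSecret(const BYTE* input, DWORD inputLen, const BYTE expected[32])
{
    HCRYPTPROV hProv = 0;
    HCRYPTHASH hHash = 0;
    BYTE digest[32];
    DWORD digestLen = sizeof(digest);
    bool match = false;

    if (CryptAcquireContext(&hProv, NULL, NULL, PROV_RSA_AES, CRYPT_VERIFYCONTEXT) &&
        CryptCreateHash(hProv, CALG_SHA_256, 0, 0, &hHash) &&
        CryptHashData(hHash, input, inputLen, 0) &&
        CryptGetHashParam(hHash, HP_HASHVAL, digest, &digestLen, 0))
    {
        match = (digestLen == 32 && memcmp(digest, expected, 32) == 0);
    }
    if (hHash) CryptDestroyHash(hHash);
    if (hProv) CryptReleaseContext(hProv, 0);
    return match;
}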
There are URLs for http requests that I would like to hide too.
If your app is making the request, there is no point hiding this. Running an app like fiddler, http analyzer, or one of dozens of other free and readily available methods will show all the traffic your app is creating.
Here is the method I use for this purpose. First, I use the Strings tool by Sysinternals to display the strings in an EXE or DLL.
I then use the following small tool (see article) to replace these strings with a scrambled array of characters stored as arithmetic expressions. For example, instead of the string:
"this is a test"
I will place the following code: (which is automatically generated by this tool)
WCHAR T1[28];
T1[22] = 69;
T1[15] = 121 - 17;
T1[9] = L':' + -26;
T1[12] = L't' - 1;
T1[6] = 116 - 1;
T1[17] = 117 - 12;
T1[3] = 116 - 1;
T1[14] = 3 - 3; // the original listing had an unprintable character here; the value is 0 (string terminator)
T1[13] = L'w' - 3;
T1[23] = 69;
T1[26] = L'Y' + 3;
T1[19] = 111 + 0;
T1[21] = L'k' - 34;
T1[27] = L'\\' - 8;
T1[20] = L'B' + 32;
T1[4] = 42 + -10;
T1[25] = L'm' - 17;
T1[16] = L'H' + 18;
T1[18] = L'A' + 56;
T1[24] = 68;
T1[1] = 105 - 1;
T1[11] = L'k' - 6;
T1[10] = 66 + 50;
T1[2] = 105;
T1[0] = 117 - 1;
T1[5] = L'k' - 2;
T1[8] = 89 + 8;
T1[7] = 32;
There are many solutions to this problem and none of them (including mine) is perfect; however, there are ways to scramble, disguise, and hide the sensitive strings. You can of course encrypt them and decrypt them at runtime (see this article), but I find it more important to make these strings disappear among the bits and bytes of the executable file, and it works. After running my tool, you won't find "this is a test" in the executable file.
Will all your secret codes be GUIDs or was that just an example?
Perhaps store your secret as a binary guid:
const GUID SecretGuid =
{ 0x4537774B, 0xCC80, 0x4eda, { 0xB3, 0xE4, 0x7A, 0x9E, 0xE7, 0x79, 0x91, 0xF5 } };
Then convert your supplied guid from string to binary format and compare the two binary guids.
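A sketch of that comparison (assuming the supplied code arrives as a brace-wrapped GUID string, which is the form CLSIDFromString expects; link against Ole32.lib):
#include <windows.h>
#include <objbase.h>

// SecretGuid as defined above
bool checkCode(const wchar_t* userInput)
{
    GUID supplied = {0};
    // CLSIDFromString expects the "{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}" form
    if (FAILED(CLSIDFromString(userInput, &supplied)))
        return false;
    return IsEqualGUID(supplied, SecretGuid) != FALSE;
}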
If there's a specific string you don't want people to be able to see, then encrypt it and decrypt at runtime.
If you don't want people to see your GUID, then construct it from bytes, rather than from a string:
const GUID SecretGuid =
{ 0x4537774B, 0xCC80, 0x4eda, { 0xB3, 0xE4, 0x7A, 0x9E, 0xE7, 0x79, 0x91, 0xF5 } };
The best you can do is to store your password, or any other string you want to hide, as a char array. For example:
std::string s1 = "Hello"; // This will show up in exe in hex editor
char* s2 = "World"; // this will show up in exe in hex editor
char s3[] = {'G', 'O', 'D'}; // this will not show up in exe in hex editor.