32 char secret key generation in ColdFusion

In ColdFusion 9 there is a GenerateSecretKey function. Can we generate a key of the desired length using GenerateSecretKey? I need to generate a 32-character secret key for my application.

GenerateSecretKey generates a key based on the specified algorithm; you can pass a key size in bits, which changes the length of the generated key. To generate a 32-character key you can use CreateUUID.

If you just need a random string 32 characters long you can use createUUID(), however:
Returns
A ColdFusion format UUID, in the format
xxxxxxxx-xxxx-xxxx-xxxxxxxxxxxxxxxx, where x is a hexadecimal digit
(0-9 or A-F). (The character groups are 8-4-4-16.)
Including the dashes, that's 35 characters, but any of:
replace(createUUID(), "-", "", "all")
left(createUUID(), 32)
right(createUUID(), 32)
will give you a string 32 characters long that may work for you.

Related

Can SAS support numeric values longer than 16 digits?

I have a requirement whereby some SAS numeric columns must be able to store numeric values of more than 16 digits. For example:
123,456,789,123,456,789,123,123.9996
In SAS format terms that is roughly a 24.4 value (24 digits before the decimal point and 4 after).
I've studied a few pages, such as:
http://www.sfu.ca/sasdoc/sashtml/unixc/z0344718.htm
https://documentation.sas.com/doc/en/pgmsascdc/9.4_3.5/lrcon/p0ji1unv6thm0dn1gp4t01a1u0g6.htm
http://v8doc.sas.com/sashtml/win/numvar.htm#:~:text=The%20maximum%20number%20of%20variables,can%20be%20is%20160%20bytes.
It seems to me that the maximum numeric length SAS supports is 8 bytes, which can only hold whole numbers of up to about 16 digits. Is there a way to store a numeric value like the 24.4 example above?
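Since SAS stores numerics as 8-byte floating point, the 16-digit ceiling is the same one any IEEE-754 double has. A quick illustration in C++ (not SAS code): a double carries a 53-bit mantissa, so integers above 2^53 can no longer be represented exactly.
#include <cstdio>

int main() {
    double big = 9007199254740992.0;              // 2^53, the last exactly representable integer
    std::printf("%.1f\n%.1f\n", big, big + 1.0);  // both lines print ...4740992.0
}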

MSSQL ODBC refusing valid TINYINT values (Numeric value out of range)

I'm trying to insert into a TINYINT column which according to MSDN docs should take numbers between 0-255. Unfortunately it only takes 0-127. If I try to insert 128 or higher it complains:
[22003] (native 0): [Microsoft][ODBC Driver 11 for SQL Server]Numeric value out of range
The data is bound as SQL_C_SBIGINT because it's general code meant to take all integers, but I think that shouldn't matter, because it works correctly for numbers 0-127. There's also an option to send it as a SQL_C_CHAR string, but that returns the same error.
Anyone knows where the problem might lie?
So, the problem was actually happening because, as part of a test, I was selecting the inserted data back using SQLFetch() and SQLGetData(), and as the target datatype for SQLGetData() I used SQL_C_STINYINT, i.e. the signed version. The problem therefore occurred not when inserting the data but when I tried to select it back.
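For reference, a minimal sketch of the fix (not the asker's actual code; hstmt is assumed to be an executed statement handle and the column number is hypothetical): fetch the TINYINT column with the unsigned target type SQL_C_UTINYINT, since the signed SQL_C_STINYINT rejects 128-255 as out of range.
#include <windows.h>
#include <sql.h>
#include <sqlext.h>

// Fetch the given column of the current row as an unsigned one-byte integer.
SQLRETURN fetchTinyInt(SQLHSTMT hstmt, SQLUSMALLINT col, unsigned char &out) {
    SQLLEN indicator = 0;
    // SQL_C_UTINYINT keeps 128..255 intact; SQL_C_STINYINT would fail here.
    return SQLGetData(hstmt, col, SQL_C_UTINYINT, &out, sizeof(out), &indicator);
}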
The answer is actually already given, but not highlighted.
Interpreted as a one-byte signed integer, all values from hex 00 to 7F (binary 0000 0000 to 0111 1111, decimal 0 to 127) are positive, while the bit patterns hex 80 to FF (binary 1000 0000 to 1111 1111), which carry the unsigned values 128 to 255, read as negative.
I'm afraid that for any value greater than 127 you'll have to use a two-byte signed integer, SMALLINT.
And SMALLINT is supported by many more database platforms than TINYINT is.
If you really want a single-byte integer, you'll have to cast your 64-bit integer to unsigned char in C++ and put it into an unsigned char host variable.
But then you'll only see the "correct" value (if it's above 127) in your front end; SELECT-ing from the database with any other tool, you'll see garbage.
Good luck -
Marco the Sane
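A minimal illustration of the truncation this answer warns about (my own example, assuming a 64-bit source value): casting to unsigned char keeps only the low byte.
#include <cstdint>
#include <iostream>

int main() {
    int64_t a = 200, b = 300;
    unsigned char ca = static_cast<unsigned char>(a);  // 200 fits in one byte
    unsigned char cb = static_cast<unsigned char>(b);  // 300 & 0xFF = 44
    std::cout << +ca << ' ' << +cb << '\n';            // prints "200 44"
}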

C++ - A few questions about my textbook's ASCII table

In my c++ textbook, there is an "ASCII Table of Printable Characters."
I noticed a few odd things that I would appreciate some clarification on:
Why do the values start at 32? I tested a simple program with the lines char ch = 1; std::cout << ch << "\n"; and nothing printed out, so I am curious why the values start at 32.
I noticed the last value, 127, was "Delete." What is this for, and what does it do?
I thought char could store 256 values, so why does the table stop at 127? (Please let me know if I have this wrong.)
Thanks in advance!
The printable characters start at 32. Below 32 there are non-printable (control) characters, such as BEL, TAB, NEWLINE, etc.
DEL is a non-printable control character; historically, punching all seven holes on paper tape was used to obliterate (delete) a character.
char can indeed store 256 values, but its signedness is implementation defined. If you need to store values from 0 to 255, you need to explicitly specify unsigned char; similarly, for -128 to 127 you have to specify signed char.
EDIT
The so-called extended ASCII characters with codes >127 are not part of the ASCII standard. Their representation depends on the "code page" chosen by the operating system. For example, MS-DOS used such extended ASCII characters for drawing directory trees, window borders, etc. If you changed the code page, you could also display non-English characters.
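To see the boundary for yourself, here is a small demo that walks the printable range; codes below 32 (and 127) are control characters, which is why char ch = 1 printed nothing visible.
#include <iostream>

int main() {
    for (int code = 32; code < 127; ++code)            // the printable ASCII range
        std::cout << code << " -> " << static_cast<char>(code) << '\n';
}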
ASCII is a mapping between integers and characters, plus "control characters" like space, line feed, and carriage return that are interpreted by display devices (possibly virtual). As such the assignment is arbitrary, but the codes are organized by their binary values.
32 is a power of 2, and the printable characters start there.
Delete is the signal sent by your keyboard's delete key.
At the time the code was designed, only 7 bits were standard; not all bytes (parts of machine words) were 8 bits.

Store 32 bit value as C string in most efficient form

I am trying to find the most efficient way to encode 32 bit hashed string values into text strings for transmission/logging in low bandwidth environments. Complex compression can't be used because the hash values need to be contained in human readable text strings when logged and sent between client and host.
Consider the following contrived examples:
given the key/value map
table[0xFE12ABCD] = "models/texture/red.bmp";
table[0x3EF088AD] = "textures/diagnostics/pink.jpg";
and the string formats:
"Loaded asset (0x%08x)"
"Replaced (0x%08x) with (0x%08x)"
they could be printed as:
"Loaded asset models/texture/red.bmp"
"Replaced models/texture/red.bmp with textures/diagnostics/pink.jpg"
Or if the key/value map is known by the client and server:
"Loaded asset (0xFE12ABCD)"
"Replaced (0xFE12ABCD) with (0x3EF088AD)"
The receiver can then scan for the (0xNNNNNNNN) pattern and expand it locally.
This is what I am doing right now but I would like to find a way to represent the 32 bit value more efficiently. A simple step would be to use a better identifying token:
"Loaded asset $FE12ABCD"
"Replaced $1000DEEE with $3EF088AD"
That already reduces the length of each token; $ is not used anywhere else, so it is a reasonable choice.
However, what other options are there to make that 32 bit value even smaller? I can't use an index - it has to be a full 32 bit value because in some cases the generator of the string has the hash and sometimes it has a string it will hash immediately.
A common solution is to use Base-85 coding. You can code four bytes into five Base-85 digits, since 85^5 > 2^32. Pick 85 printable characters and assign them to the digit values 0..84. Then do base conversion to go either way. Since there are 94 printable characters in ASCII, it is usually easy to find 85 that are "safe" in whatever constrains your strings to be "readable".
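A sketch of what that might look like in C++ (the alphabet and function names are my own choices, not prescribed by the answer; $ is deliberately left out of the alphabet so it can remain the token marker):
#include <cstdint>
#include <cstring>
#include <iostream>
#include <string>

static const char kAlphabet[] =
    "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"
    "abcdefghijklmnopqrstuvwxyz"
    "!#%&()*+-.;<=>?@^_`{|}~";               // 36 + 26 + 23 = 85 characters

std::string encode85(uint32_t value) {
    std::string out(5, ' ');
    for (int i = 4; i >= 0; --i) {           // emit least significant digit last
        out[i] = kAlphabet[value % 85];
        value /= 85;
    }
    return out;
}

uint32_t decode85(const std::string &digits) {
    uint32_t value = 0;
    for (char c : digits)                    // assumes every char is in the alphabet
        value = value * 85 + static_cast<uint32_t>(std::strchr(kAlphabet, c) - kAlphabet);
    return value;
}

int main() {
    std::cout << '$' << encode85(0xFE12ABCD) << '\n';  // five digits instead of eight hex chars
}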

C/C++: How to convert 6bit ASCII to 7bit ASCII

I have a set of 6 bits that represent a 7-bit ASCII character. How can I get the correct 7-bit ASCII code out of the 6 bits I have? Just append a zero and do a bitwise OR?
Thanks for your help.
Lennart
ASCII is inherently a 7-bit character set, so what you have is not "6-bit ASCII". What characters make up your character set? The simplest decoding approach is probably something like:
char From6Bit( char c6 ) {
    // array of all 64 characters that appear in your 6-bit set
    static const char SixBitSet[] = { 'A', 'B', ... };
    return SixBitSet[ c6 ];
}
A footnote: 6-bit character sets were quite popular on old DEC hardware, some of which, like the DEC-10, had a 36-bit architecture where 6-bit characters made some sense.
You must tell us how your 6-bit character set looks; I don't think there is a standard.
The easiest way to do the reverse mapping would probably be to just use a lookup table, like so:
static const char sixToSeven[] = { ' ', 'A', 'B', ... };
This assumes that space is encoded as (binary) 000000, capital A as 000001, and so on.
You index into sixToSeven with one of your six-bit characters, and get the corresponding 7-bit character back.
I can't imagine why you'd be getting old DEC-10/20 SIXBIT, but if that's what it is, then just add 32 (decimal). SIXBIT took the ASCII characters starting with space (32), so just add 32 to the SIXBIT character to get the ASCII character.
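In code, that offset is a one-liner (the function name is mine):
// Valid only if the input really is DEC SIXBIT: code 0 is ASCII space (32).
char sixbitToAscii(unsigned char sixbit) {
    return static_cast<char>(sixbit + 32);   // defined for sixbit in 0..63
}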
The only recent 6-bit code I'm aware of is base64. This uses four 6-bit printable characters to store three 8-bit values (6x4 = 8x3 = 24 bits).
The 6-bit values are drawn from the characters:
ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/
which are the values 0 thru 63. Four of these (say UGF4) are used to represent three 8-bit values.
UGF4 = 010100 000110 000101 111000
= 01010000 01100001 01111000
= Pax
If this is how your data is encoded, there are plenty of snippets around that will tell you how to decode it (and many languages have the encoder and decoder built in, or in an included library), and Wikipedia has a good article on it.
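For illustration, here is a minimal decoder for one four-character group, matching the UGF4 example above (the helper name is mine, and real code would need to handle padding and invalid input):
#include <cstdint>
#include <cstring>
#include <iostream>

static const char kB64[] =
    "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

// Decode four base64 characters (24 bits) into three 8-bit bytes.
void decodeQuad(const char *in, unsigned char out[3]) {
    uint32_t bits = 0;
    for (int i = 0; i < 4; ++i)              // four 6-bit digits -> 24 bits
        bits = (bits << 6) | static_cast<uint32_t>(std::strchr(kB64, in[i]) - kB64);
    out[0] = (bits >> 16) & 0xFF;
    out[1] = (bits >> 8) & 0xFF;
    out[2] = bits & 0xFF;
}

int main() {
    unsigned char bytes[3];
    decodeQuad("UGF4", bytes);
    std::cout << bytes[0] << bytes[1] << bytes[2] << '\n';  // prints "Pax"
}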
If it's not base64, then you'll need to find out the encoding scheme. Some older schemes used shift-in/shift-out (SI/SO) codes for choosing a page within character sets, but I think that was more for choosing extended (e.g., Japanese DBCS) characters rather than normal ASCII characters.
If I were to give you the value of a single bit, and I claimed it was taken from Windows XP, could you reconstruct the entire OS?
You can't. You've lost information. There is no way to reconstruct that, unless you have some knowledge about what was lost. If you know that, say, the most significant bit was chopped off, then you can set that to zero, and you've reconstructed at least half the characters correctly.
If you know how 'a' and 'z' are represented in your 6-bit encoding, you might be able to guess at what was removed by comparing them to their 7-bit representations.