Authenticating DotNetNuke Users in ColdFusion

Is there any way to authenticate users from other web apps using the DNN logins?
We have a main site that is using DNN and user logins are stored in the asp net membership table. From what I have been reading, the passwords are encrypted using the machine key and then salted. I see where this info is, but can't seem to encrypt passwords correctly using this method.
I'm trying this from a ColdFusion web application on the same server as our DNN site, but it doesn't work. You'd think it would be straightforward with the ColdFusion encryption function:
Encrypt(passwordstring, key [, algorithm, encoding, IVorSalt, iterations])
No matter what I try, I never get a matching value.
Any help, insight or pointing me in the right direction would be greatly appreciated!

(Edit: Original answer did not work in all cases. Substantially revised ...)
From what I have read, DNN uses an "SHA1" hash by default. The thread #barnyr posted shows it simply hashes the concatenated salt and password, but with a few twists.
DNN uses UTF-16LE to extract the password bytes, rather than CF's typical UTF-8.
It also extracts the salt and password bytes separately, which may produce different results than just decoding everything as a single string, which is what hash() does. (See demo below)
Given that CF9's Hash function does not accept binary (supported in CF11), I do not think it is possible to duplicate the results with native CF functions alone. Instead I would suggest decoding the strings into binary, then using java directly:
Code:
<cfscript>
thePassword = "DT!#12";
base64Salt = "+muo6gAmjvvyy5doTdjyaA==";
// extract bytes of the salt and password
saltBytes = binaryDecode(base64Salt, "base64");
passBytes = charsetDecode(thePassword, "UTF-16LE" );
// next combine the bytes. note, the returned arrays are immutable,
// so we cannot use the standard CF tricks to merge them
ArrayUtils = createObject("java", "org.apache.commons.lang.ArrayUtils");
dataBytes = ArrayUtils.addAll( saltBytes, passBytes );
// hash binary using java
MessageDigest = createObject("java", "java.security.MessageDigest").getInstance("SHA-1");
MessageDigest.update(dataBytes);
theBase64Hash = binaryEncode(MessageDigest.digest(), "base64");
WriteOutput("theBase64Hash= "& theBase64Hash &"<br/>");
</cfscript>
Demo of Differences:
<cfscript>
theEncoding = "UTF-16LE";
thePassword = "DT!#12";
base64Salt = "+muo6gAmjvvyy5doTdjyaA==";
// extract the bytes SEPARATELY
saltBytes = binaryDecode(base64Salt, "base64");
passBytes = charsetDecode(thePassword, theEncoding );
ArrayUtils = createObject("java", "org.apache.commons.lang.ArrayUtils");
separateBytes = ArrayUtils.addAll( saltBytes, passBytes );
// concatenate first, THEN extract the bytes
theSalt = charsetEncode( binaryDecode(base64Salt, "base64"), theEncoding );
concatenatedBytes = charsetDecode( theSalt & thePassword, theEncoding );
// these are the raw bytes BEFORE hashing
WriteOutput("separateBytes= "& arrayToList(separateBytes, "|") &"<br>");
WriteOutput("concatenatedBytes"& arrayToList(concatenatedBytes, "|") );
</cfscript>
Results:
separateBytes = -6|107|-88|-22|0|38|-114|-5|-14|-53|-105|104|77|-40|-14|104|68|0|84|0|33|0|64|0|49|0|50|0
concatenatedBytes = -6|107|-88|-22|0|38|-114|-5|-14|-53|-105|104|-3|-1|68|0|84|0|33|0|64|0|49|0|50|0
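For completeness, the salt and stored hash come from the ASP.NET membership tables DNN uses: base64Salt above would be the PasswordSalt column, and the computed theBase64Hash is compared to the Password column. A rough query sketch (the datasource name and join details are assumptions; Password, PasswordSalt and PasswordFormat are the standard aspnet_Membership columns):
<cfquery name="qMember" datasource="dnn_dsn">
    SELECT m.Password, m.PasswordSalt, m.PasswordFormat
    FROM aspnet_Membership m
    INNER JOIN aspnet_Users u ON u.UserId = m.UserId
    WHERE u.UserName = <cfqueryparam value="#form.username#" cfsqltype="cf_sql_varchar">
</cfquery>
<!--- PasswordFormat: 0 = clear, 1 = hashed, 2 = encrypted --->
<cfif qMember.recordCount and qMember.PasswordFormat eq 1 and compare(theBase64Hash, qMember.Password) eq 0>
    <!--- hashes match: the user is authenticated --->
</cfif>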

Most likely the password is not encrypted; it is hashed. Hashing is different from encryption because it is not reversible.
You would not use ColdFusion's encrypt() function for this; you would use its hash() function.
So the questions you'll need to answer to figure out how to hash the passwords in CF to be able to auth against the DNN users are:
What algorithm is DNN using to hash the passwords?
How is the salt being used with the password prior to hashing?
Is DNN iterating over the hash X number of times to improve security?
All of those questions must be answered to determine how CF must use the hash() function in combination with the salt and user-submitted passwords.
I'll make some assumptions to provide an answer.
If we assume that no iteration is being done and that the salt is simply appended to the password prior to hashing with SHA1, then you'd be able to reproduce the hash digest like this:
<cfset hashDigest = hash(FORM.usersubmittedPassword & saltFromDB, "SHA1") />
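Note that ColdFusion's hash() returns a hex string while the membership table stores the digest base64-encoded, so convert before comparing. A minimal sketch (storedPasswordFromDB is an assumed variable holding the Password column value):
<!--- convert the hex digest to base64 before comparing it to the stored value --->
<cfset base64Digest = binaryEncode( binaryDecode(hashDigest, "hex"), "base64" ) />
<cfif compare(base64Digest, storedPasswordFromDB) eq 0>
    <!--- the submitted password matches --->
</cfif>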

(Posting a new response to keep the "encrypted" process separate from "hashing")
For "encrypted" keys, the DNN side uses the standard algorithms ie DES, 3DES or AES - depending on your machineKey settings. But with a few differences you need to match in your CF code. Without knowing your actual settings, I will assume you are using the default 3DES for now.
Data To Encrypt
The encrypted value is a combination of the salt and password. But as with hashing, DNN uses UTF-16LE. Unfortunately, ColdFusion's Encrypt() function always assumes UTF-8, which will produce a very different result. So you need to use the EncryptBinary function instead.
// sample values
plainPassword = "password12345";
base64Salt = "x7le6CBSEvsFeqklvLbMUw==";
hexDecryptKey = "303132333435363738393031323334353637383930313233";
// first extract the bytes of the salt and password
saltBytes = binaryDecode(base64Salt, "base64");
passBytes = charsetDecode(plainPassword, "UTF-16LE" );
// next combine the bytes. note, the returned arrays are immutable,
// so we cannot use the standard CF tricks to merge them
ArrayUtils = createObject("java", "org.apache.commons.lang.ArrayUtils");
dataBytes = ArrayUtils.addAll( saltBytes, passBytes );
Encryption Algorithm
With block ciphers, ColdFusion defaults to ECB mode (see Strong Encryption in ColdFusion), whereas .NET defaults to CBC mode, which requires an additional IV value. So you must adjust your CF code to match.
// convert DNN hex key to base64 for ColdFusion
base64Key = binaryEncode(binaryDecode( hexDecryptKey, "hex"), "base64");
// create an IV and initialize it with all zeroes
// block size: 16 => AES, 8 => DES or TripleDES
blockSize = 8;
iv = javacast("byte[]", listToArray(repeatString("0,", blocksize)));
// encrypt using CBC mode
bytes = encryptBinary(dataBytes, base64Key, "DESede/CBC/PKCS5Padding", iv);
// result: WBAnoV+7cLVI95LwVQhtysHb5/pjqVG35nP5Zdu7T/Cn94Sd8v1Vk9zpjQSFGSkv
WriteOutput("encrypted password="& binaryEncode( bytes, "base64" ));

Related

Coldfusion 2018 No SOF segment Multi-Page TIFF Error

I have a new CF18 server and I'm getting some errors reading and converting some old images that were readable on my previous CF11 server. FYI GetReadableImageFormats results in "BMP,GIF,JPEG,JPEG 2000,JPEG2000,JPG,PNG,PNM,RAW,TIF,TIFF,WBMP"
Normally I read the files as a Binary and put it into memory for manipulation
<cffile action="readBinary" file="#file_location#" variable="binImage" />
<cfimage action="read" source="#binImage#" name="objImage" isbase64="no">
This now results in an error:
"An exception occurred while trying to read the image. No SOF segment in stream"
Reading the file with action="read" and dumping left(binImage, 999) shows:
"...2015:10:07 17:46:58 Kofax standard Multi-Page TIFF Storage Filter v3.03.000,..."
Then I tried reading it into java using:
<cfset tifFileName="#file_location#">
<cfscript>
ss = createObject("Java","com.sun.media.jai.codec.FileSeekableStream").init(tifFileName);
//create JAI ImageDecoder
decoder = createObject("Java","com.sun.media.jai.codec.ImageCodec").createImageDecoder("tiff", ss, JavaCast("null",""));
</cfscript>
Which yields an error:
"Decoding of old style JPEG-in-TIFF data is not supported."
I found this...
Decoding of old style JPEG-in-TIFF data is not supported
Do you think using TwelveMonkeys ImageIO is the best path to follow for my issue?
UPDATE: Based on the suggestion that there is an invalid marker 0xFF9E I tried the following:
<cffile action="readBinary" file="#file_location#" variable="binImage" />
<cfset hexEncoding = binaryEncode(binImage, "hex")>
<cfset new_hexEncoding = replaceNoCase(hexEncoding, 'FF9E', 'FFE9', 'ALL')>
<cfset binImage = binaryDecode(new_hexEncoding, "hex")>
isImage(binImage) returns "NO" and the "No SOF segment in stream" error persists. I looped over the hexEncoding and found the FF9E string 23 times. I've never edited raw image data, so I'm not sure my replace is correct.
Edit: At this point I'm fairly certain my search-and-replace logic (hexEncoding, 'FF9E', 'FFE9') is flawed; there is no occurrence of 0xFF9E in the binaryEncoded binImage.
This was driving me nuts. I tried everything I could find short of installing extra Java libraries or routing it through other executables to make the conversion. In my case there is only one JPEG in each TIFF, so I wrote something that literally grabs the binary data for the JPEG out of the TIFF (it doesn't account for pages) and serves it up.
Once you have the binary of the JPEG you can write it to a file, do conversions on it, even stream it directly to the browser. Here ya go, future people who need this. I didn't write it to do pages or detect what kind of TIFF it is, since for my uses I already know all that. These things are .bin files, but they are all the same single-page JPEG in a TIFF and I needed a way to serve them up quickly in a format that browsers don't hate.
This runs fast enough to be served up on the fly. Is there a better way? Probably, but this works, is self contained, can be copied and pasted, and makes complete sense to anyone that needs to edit it.
<cfscript>
strFileName = "test.tiff";
blnOutputImageToBrowser = true;
blnSaveToFile = true;
strSaveFile = GetDirectoryFromPath(GetCurrentTemplatePath())&"test.jpg"
imgByteArray = FileReadBinary(strFileName);
//Convert to HEX String
hexString = binaryEncode(imgByteArray,"hex"); //Convert binary to HEX String, so we can pattern search it
//Set HEX Length
hexLength = arraylen(imgByteArray);
//Find Start of JPG Data in HEX String
jpegStartHEX = find("FFD8FF",hexString);
jpegStartBIN = (jpegStartHEX-1)/2; //-1 because CF string positions start at 1 while byte offsets start at 0. /2 because the HEX string positions are double the byte array positions
objByteBuffer = CreateObject("java","java.nio.ByteBuffer"); //Init JAVA byte buffer class for us to use later (this makes it go faster than trying to convert the hex string back to binary)
//Find Stop of JPG Data
jpegStopHEX = 0;
jpegStopBIN = 0;
intSearchIDX = jpegStartHEX+6; //Might as well start after the JPEG start block
blnStop = false;
while (intSearchIDX < len(hexString) && jpegStopHEX == 0 && !blnStop) {
newIDX = find("FFD9",hexString,intSearchIDX);
if (newIDX == 0) {
blnStop=true;
}
else {
if (newIDX%2 == 0) { //bad search try again (due to indexing in CF starting on 1 instead of 0, the even numbers are in between hex code [they are pairs like 00 and FF])
intSearchIDX = newIDX+1;
}
else { //Found ya
jpegStopHEX = newIDX;
blnStop=true;
}
}
}
jpegStopBIN = (jpegStopHEX-1)/2; //-1 because CF string positions start at 1 while byte offsets start at 0. /2 because the HEX string positions are double the byte array positions
//Dump JPG Binary into ByteArray from the start and stop positions we discovered
jpegLengthBIN = jpegStopBIN+2-jpegStartBIN;
objBufferImage = objByteBuffer.Allocate(JavaCast( "int", jpegLengthBIN ));
objBufferImage.Put(imgByteArray,JavaCast( "int", jpegStartBIN ),JavaCast( "int", jpegLengthBIN ));
if (blnSaveToFile) { //Dump byte array into test file
fileWrite(strSaveFile,objBufferImage.Array());
}
if (blnOutputImageToBrowser) {
img = ImageNew( objBufferImage.Array() );
ImageResize(img,"1200","","highestPerformance"); //Because we might as well show an example of resizing
outputImage(toBinary(toBase64(img))); //You could skip loading the byte array as an image object and just plop the binary in directly if you don't need to manipulate it any
}
</cfscript>
<cffunction name="outputImage" returntype="void">
<cfargument name="binInput" type="binary">
<cfcontent variable="#binInput#" type="image/png" reset="true" />
<cfreturn>
</cffunction>

QT Blowfish Charset

I need to use Blowfish to check a password against a database used by a CMS.
I am working with QBlowfish because I thought it was a basic, easy-to-use class, but I have a problem:
I encrypt the same string ("AfuYeew7l") with the same salt ("$2y$04$pzmGaIWodB/7jTZ8Y.082u").
When I use C++, I get: ("G?(?"?WIk???*?")
When I use PHP, I get the correct hash: ("$2y$04$pzmGaIWodB/7jTZ8Y.082uA15Z.5VwrLXiBPebxNZ2ET75NqX1QOq")
My slot that does the hashing:
QByteArray key = NULL;
QByteArray crypt = NULL;
QByteArray pass(ui->_Input_Pass_Clair->text().toLocal8Bit()); // Clear Password
QByteArray hash(ui->_Label_Retour_Hash->text().toLocal8Bit()); // Crypt Password
key.push_back(ui->_Label_Retour_ID->text().toLocal8Bit()); // SALT
key.push_back(ui->_Label_Retour_CA->text().toLocal8Bit()); // SALT
key.push_back(ui->_Label_Retour_SEL->text().toLocal8Bit()); // SALT
key.push_back("$"); // SALT
QBlowfish fish(key);
fish.setPaddingEnabled(true);
crypt = fish.encrypted(pass);
ui->_Label_Retour_Pass_Crypte->setText(crypt);
I think crypt is not formatted correctly, or the key is not formatted correctly, but I have no idea how to fix this.
I can attach the source code if necessary.

Parametrising encoded web service requests

I am trying to parametrise web service requests in a web performance test. Using Fiddler2 I have recorded a sequence of over 60 web service requests for a transaction performed by my desktop application and saved them as a .webtest file. This web test runs without any errors and the responses that I have checked look correct.
When the web service requests are viewed in Visual Studio 2012 they appear in plain text, so I should be able to edit them to parametrise the values in the SOAP requests. For example, most of the requests contain the text <Database>db1a</Database> and I want to change them to get the database name from a context parameter. There are several other items to replace with parameters. For this one transaction there are over 60 web service requests and I have other transactions to record. The .webtest file contains XML and the requests look like:
<Request Method="POST" Version="1.1" Url="http://example.com/somewhere.asmx" ThinkTime="83" Timeout="60" ParseDependentRequests="True" FollowRedirects="True" RecordResult="True" Cache="False" ResponseTimeGoal="0" Encoding="utf-8">
<Headers>
<Header Name="Content-Type" Value="text/xml; charset=utf-8" />
<Header Name="SOAPAction" Value=""http://example.com/webservices/VariousActionNamesHere"" />
</Headers>
<StringHttpBody ContentType="text/xml; charset=utf-8">PAA/AHgAbQBsACAAdg
... lots more characters not shown
+AA==</StringHttpBody>
</Request>
The StringHttpBody field contains an encoded version of the SOAP request. Visual Studio shows it as plain text. What is the encoding of this field and how can I decode and encode it?
I have installed Release 3.0 of the “Web and Load Test Plugins for Visual Studio Team Test” from http://teamtestplugins.codeplex.com/ . They provide a slightly better interface for editing the SOAP requests one at a time. But they do not allow mass changes.
Converting the web test to a coded web test (ie into C#) shows the SOAP requests as simple text and they could be edited there but I would prefer to keep the flexibility of a .webtest file.
Update: I have posted a partial answer to the question. Whilst it works, it feels the wrong way to do the work because it feels too complicated. So I am looking for a better overall approach.
StringHttpBody is base64 encoded. The raw body of the request is converted to a UTF-16 byte array and then base64 encoded, like so:
Convert.ToBase64String(Encoding.Unicode.GetBytes(oSession.GetRequestBodyAsString()));
For a quick view, you can copy/paste this string into Fiddler's Tools > TextWizard screen then use the From Base64 option to decode.
Here is part of an answer to working with the StringHttpBody fields. This is about decoding and encoding the fields to allow easier understanding and modification.
Read the input XML and find the contents of the StringHttpBody fields. Replace each field's contents with the result of calling the following routine on the original contents. Write all the lines to a new intermediate file. The byte array contains UTF-16 characters as low and high byte pairs. (All the characters I have seen so far have the high byte zero.)
private string DecodeBody(string source) {
byte[] outBytes = Convert.FromBase64String(source);
StringBuilder sb = new StringBuilder();
Assert( (outBytes.Length % 2) == 0 ); // UTF-16 data is an even number of bytes
for (int ix = 0; ix < outBytes.Length; ix += 2) {
Assert(outBytes[ix + 1] == 0); // high byte is zero for the characters seen so far
sb.Append((char)outBytes[ix]); // the low byte holds the character
}
return sb.ToString();
}
We now have a file containing a simple text version of the .webtest file. This file can easily be edited to parameterise fields of the requests. Use a routine similar to the one above, writing to another intermediate file. The routine has statements such as:
source = source.Replace("<Database>db1a</Database>", "<Database>{{DatabaseName}}</Database>");
The final intermediate file is then reencoded to create a new .webtest file. Just as before the contents of the StringHttpBody fields are found and replaced with the result of calling a routine. The encoding routine is:
private string EncodeBody(string source) {
StringBuilder sb = new StringBuilder();
byte[] outBytes = new byte[2 * source.Length];
for (int ix = 0; ix < source.Length; ix++) {
char ch = source[ix];
outBytes[2 * ix] = (byte)(((int)ch) & 0xFF);
outBytes[2 * ix + 1] = (byte)((((int)ch) / 256) & 0xFF);
}
sb.Append(Convert.ToBase64String(outBytes));
return sb.ToString();
}
The flow of files is thus:
decode original.webtest > intermediate1
parameterise intermediate1 > intermediate2
encode intermediate2 > final.webtest
On the small number of .webtest files I have tried, the encode operation is the inverse of the decode operation: the original file before decoding is identical to the file after encoding. Having the two intermediate files allows easy checking and searching of the contents of the unencoded file and the effect of the parameterise step.
For
[StringHttpBody ContentType="application/json"]
To Decode the body:
var encodedString = childNode.InnerText;
var encodedStringBytes = Convert.FromBase64String(encodedString);
var decodedString = Encoding.Unicode.GetString(encodedStringBytes);
JObject jsonString =JsonConvert.DeserializeObject(decodedString);
To Encode the body:
childNode.InnerText =
Convert.ToBase64String(Encoding.Unicode.GetBytes(JsonConvert.SerializeObject(jsonString)));
I think this might help.

Trying to use ColdFusion to create HMAC-SHA1 hash for API authentication

I am at my wit's end on this one, I just can't find the right combination of code to make this work. I'm trying to create an authentication digest for an API query. I've tried many CFML functions (for example: Coldfusion HMAC-SHA1 encryption and HMAC SHA1 ColdFusion), but I'm not coming up with the same results that are cited in the API documentation. Here's that example (basically elements of the request header with line breaks as delimiters.):
application/xml\nTue, 30 Jun 2009 12:10:24 GMT\napi.summon.serialssolutions.com\n/2.0.0/search\ns.ff=ContentType,or,1,15&s.q=forest\n
and here's the key:
ed2ee2e0-65c1-11de-8a39-0800200c9a66
which according to the documentation should result in:
3a4+j0Wrrx6LF8X4iwOLDetVOu4=
when the HMAC hash is converted to Base64. Any ideas would be most appreciated!
The problem is your input string, not the functions. The first one works fine, though I would change the charset to UTF-8, or make it an argument. Otherwise the results depend on the JVM default, which may not always be correct and can change, which would break the code.
Verify you are constructing the sample string correctly. Are you using chr(10) for new lines? Note: It must also end with a new line.
Code:
<cfscript>
headers = [ "application/xml"
, "Tue, 30 Jun 2009 12:10:24 GMT"
, "api.summon.serialssolutions.com"
, "/2.0.0/search"
, "s.ff=ContentType,or,1,15&s.q=forest"
];
theText = arrayToList(headers, chr(10)) & chr(10);
theKey = "ed2ee2e0-65c1-11de-8a39-0800200c9a66";
theHash = binaryEncode( hmacEncrypt(theKey, theText), "base64");
writeDump(theHash);
</cfscript>
Result:
3a4+j0Wrrx6LF8X4iwOLDetVOu4=
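If you are on ColdFusion 10 or later, the native hmac() function can replace the hmacEncrypt() UDF. It returns a hex string, so re-encode it as base64 to match the documentation's sample (a sketch using the same theText and theKey as above):
<cfscript>
// ColdFusion 10+ only: hmac() returns hex, so convert the digest to base64
hexDigest = hmac(theText, theKey, "HMACSHA1", "UTF-8");
writeOutput( binaryEncode( binaryDecode(hexDigest, "hex"), "base64" ) );
</cfscript>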

output variable stored in database

I'm storing the page content in a database table. The page content also includes some CF variables (for example "...this vendor provides services to #VARIABLES.vendorLocale#").
VARIABLES.vendorLocal is set on the page based on a URL string.
Next a CFC is accessed to get the corresponding page text from the database.
And this is then output on the page: #qryPageContent.c_content#
But #VARIABLES.vendorLocale# is showing up as is, not as the actual variable value. Is there any way to get a "variable within a variable" to be output correctly?
This is on a CF9 server.
If you have a string i.e.
variables.vendorLocal = 'foo';
variables.saveMe = 'This is a string for supplier "#variables.vendorLocal#"';
WriteOutput(variables.saveMe); // This is a string for supplier "foo"
then ColdFusion will attempt to parse that to insert whatever value variables.vendorLocal holds. To get around this, you can use a placeholder string that is not likely to be used elsewhere. Commonly you'll see [[NAME]] used for this purpose, so in this example
variables.saveMe = 'This is a string for supplier "[[VENDORLOCALE]]"';
WriteOutput(variables.saveMe); // This is a string for supplier "[[VENDORLOCALE]]"
Now you've got that you can then later on replace it for your value
variables.vendorLocal = 'bar';
variables.loadedString = Replace(variables.saveMe,'[[VENDORLOCALE]]',variables.vendorLocal);
WriteOutput(variables.loadedString); // This is a string for supplier "bar"
I hope this is of help
There are lots of reasons storing code itself in the database is a bad idea, but that's not your question, so I won't go into that. One way to accomplish what you want is to take the code you have stored as a string, write it to a temporary file, include that file in the page, then delete the temporary file. For instance, here's a little UDF that implements that concept:
<cfscript>
function dynamicInclude(cfmlcode){
var pathToInclude = createUUID() & ".cfm";
var pathToWrite = expandPath(pathToInclude);
fileWrite(pathToWrite,arguments.cfmlcode);
include pathToInclude;
fileDelete(pathToWrite);
}
language = "CFML";
somecfml = "This has some <b>#language#</b> in it";
dynamicInclude(somecfml); // the included template writes its output directly
</cfscript>