Silverlight 4 image upload problem - web-services

I am using Silverlight 4 with Java web services in a JSP page. I want to save an image to the server, so I am trying to do this through a Java web service. I am using the lines of code below, but the output is damaged and I don't understand why. Please help me, this is really important for me. When I try to open the 3 MB JPEG file, I get: "Windows Photo Viewer can't open this picture because the file appears to be damaged, corrupted or is too large."
Client-Side Code
WriteableBitmap wb = new WriteableBitmap(bitmapImage);
byte[] bb = ToByteArray(wb);

public byte[] ToByteArray(WriteableBitmap bmp)
{
    int[] p = bmp.Pixels;    // raw decoded ARGB data, one int per pixel
    int len = p.Length * 4;  // four bytes per pixel
    byte[] result = new byte[len];
    Buffer.BlockCopy(p, 0, result, 0, len);
    return result;
}
Web Service Code
@WebMethod(operationName = "saveImage")
public Boolean saveImage(@WebParam(name = "img") byte[] img,
                         @WebParam(name = "path") String path) {
    try {
        FileOutputStream fos = new FileOutputStream("C:\\Users\\TheIntersect\\Desktop\\sharp_serializer_dll\\saved.jpg");
        fos.write(img);
        fos.close();
        return true;
    } catch (Exception e) {
        return false;
    }
}

I found my answer on forums.silverlight.net.
It is very interesting: when I call ReadFully(Stream) right after the stream is defined it works, but when I call it ten lines of code later it returns all zeros.
Function
public static byte[] ReadFully(Stream input)
{
    // Note: input.Length requires a seekable stream; file streams are fine here.
    byte[] buffer = new byte[input.Length];
    using (MemoryStream ms = new MemoryStream())
    {
        int read;
        while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            ms.Write(buffer, 0, read);
        }
        return ms.ToArray();
    }
}
Failing Code
using (Stream str = opd.File.OpenRead())
{
    BitmapImage bitmapImage = new BitmapImage();
    bitmapImage.SetSource(str); // reads str to its end
    image.Tag = bitmapImage.UriSource.ToString();
    image.Source = bitmapImage;
    image.Width = width;
    image.Height = height;
    image.Stretch = Stretch.Uniform;
    container.Child = image;
    rtb.Selection.Insert(container);
    ServiceReference1.webWordWebServiceClient s = new ServiceReference1.webWordWebServiceClient();
    byte[] bb = ReadFully(str); // str is already at the end, so this returns all zeros
    s.saveImageCompleted += new EventHandler<ServiceReference1.saveImageCompletedEventArgs>(s_saveImageCompleted);
    s.saveImageAsync(bb, "gungorrrr");
}
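The failing version reads nothing because BitmapImage.SetSource consumes the stream: after it runs, the position sits at the end, so ReadFully starts there and gets zero bytes back, and the server dutifully saves a file full of zeros. (The very first approach had a second problem on top of that: WriteableBitmap.Pixels is raw ARGB data, not JPEG, so writing it to a .jpg could never produce a valid image.) Besides reordering the calls as in the successful code below, a minimal sketch of an alternative fix, assuming the stream is seekable (file streams from OpenFileDialog are), is to rewind before reading:

using (Stream str = opd.File.OpenRead())
{
    BitmapImage bitmapImage = new BitmapImage();
    bitmapImage.SetSource(str); // reads the stream to its end

    str.Position = 0;           // rewind before grabbing the raw file bytes
    byte[] bb = ReadFully(str); // now returns the actual JPEG contents
    // ... hook up saveImageCompleted and call saveImageAsync(bb, ...) as before ...
}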
Successful Code
using (Stream str = opd.File.OpenRead())
{
    byte[] bb = ReadFully(str); // read the file bytes before anything else consumes the stream
    BitmapImage bitmapImage = new BitmapImage();
    bitmapImage.SetSource(str);
    image.Tag = bitmapImage.UriSource.ToString();
    image.Source = bitmapImage;
    image.Width = width;
    image.Height = height;
    image.Stretch = Stretch.Uniform;
    container.Child = image;
    rtb.Selection.Insert(container);
    ServiceReference1.webWordWebServiceClient s = new ServiceReference1.webWordWebServiceClient();
    s.saveImageCompleted += new EventHandler<ServiceReference1.saveImageCompletedEventArgs>(s_saveImageCompleted);
    s.saveImageAsync(bb, "gungorrrr");
}
Link: http://forums.silverlight.net/forums/p/234126/576070.aspx#576070

Related

C++ - EncryptMessage not encrypting the correct data

I've been following an SSL/TLS tutorial online (well, it's more a matter of reading the author's source code and following along), but I've hit a bumpy road with the EncryptMessage part, because it shifts the data out of the way and encrypts the wrong bytes.
The pbloBuffer that I send it is:
GET / HTTP/1.1\r\n
HOST: www.google.com\r\n\r\n
But when I do pbMessage = pbloBuffer + Sizes.cbHeader; (even the Microsoft documentation says to do this), I end up with:
1\r\n
HOST: www.google.com\r\n\r\n
That truncated text is what pbMessage points to, and it is what gets placed in SECBUFFER_DATA, so the full data never makes it in. From what I understand, SECBUFFER_DATA is the "user" data that the web server will decode and process.
Can you tell me how to fix this and properly send the encrypted data?
Full source (this code is experimental, as I am trying to understand it before I make changes):
int Adaptify::EncryptSend(PBYTE pbloBuffer, int Size) {
    SECURITY_STATUS scRet{ 0 };
    SecBufferDesc Message{ 0 };
    SecBuffer Buffers[4]{ 0 };
    DWORD cbMessage = 0, cbData = 0;
    PBYTE pbMessage = nullptr;
    SecPkgContext_StreamSizes Sizes = { 0 };

    QueryContextAttributesW(&hContext, SECPKG_ATTR_STREAM_SIZES, &Sizes);

    pbMessage = pbloBuffer + Sizes.cbHeader;
    cbMessage = (DWORD)strlen((const char*)pbMessage);

    Buffers[0].BufferType = SECBUFFER_STREAM_HEADER;
    Buffers[0].cbBuffer = Sizes.cbHeader;
    Buffers[0].pvBuffer = pbloBuffer;

    Buffers[1].BufferType = SECBUFFER_DATA;
    Buffers[1].pvBuffer = pbMessage;
    Buffers[1].cbBuffer = cbMessage;

    Buffers[2].BufferType = SECBUFFER_STREAM_TRAILER;
    Buffers[2].cbBuffer = Sizes.cbTrailer;
    Buffers[2].pvBuffer = pbMessage + cbMessage;

    Buffers[3].BufferType = SECBUFFER_EMPTY;
    Buffers[3].cbBuffer = SECBUFFER_EMPTY;
    Buffers[3].pvBuffer = SECBUFFER_EMPTY;

    Message.cBuffers = 4;
    Message.pBuffers = Buffers;
    Message.ulVersion = SECBUFFER_VERSION;

    scRet = EncryptMessage(&hContext, 0, &Message, 0);

    if (send(hSocket, (const char*)pbloBuffer, Buffers[0].cbBuffer + Buffers[1].cbBuffer + Buffers[2].cbBuffer, 0) < 0) {
        MessageBox(0, L"Send error", 0, 0);
    }
    return 0;
}
First, you only need to call QueryContextAttributesW once, after InitializeSecurityContextW returns SEC_E_OK; there is no sense in calling it every time you send data. Save the result, for example by inheriting your class from SecPkgContext_StreamSizes (class Adaptify : SecPkgContext_StreamSizes) and calling QueryContextAttributesW(&hContext, SECPKG_ATTR_STREAM_SIZES, this) once, at the end of the handshake.
As for sending the data: in your case Buffers[1].pvBuffer must of course point to your actual data, pbloBuffer, and not to pbloBuffer + Sizes.cbHeader. The code can look like this:
int Adaptify::EncryptSend(PBYTE pbloBuffer, int Size) {
    SECURITY_STATUS ss = SEC_E_INSUFFICIENT_MEMORY;
    if (PBYTE Buffer = new BYTE[cbHeader + Size + cbTrailer]) {
        // copy the payload past the space reserved for the TLS record header
        memcpy(Buffer + cbHeader, pbloBuffer, Size);
        SecBuffer sb[4] = {
            { cbHeader,  SECBUFFER_STREAM_HEADER,  Buffer },
            { Size,      SECBUFFER_DATA,           Buffer + cbHeader },
            { cbTrailer, SECBUFFER_STREAM_TRAILER, Buffer + cbHeader + Size },
            // fourth element is zero-initialized, i.e. SECBUFFER_EMPTY
        };
        SecBufferDesc sbd = {
            SECBUFFER_VERSION, 4, sb
        };
        if (SEC_E_OK == (ss = ::EncryptMessage(this, 0, &sbd, 0))) {
            if (SOCKET_ERROR == send(hSocket, (const char*)Buffer, sb[0].cbBuffer + sb[1].cbBuffer + sb[2].cbBuffer + sb[3].cbBuffer, 0))
                ss = WSAGetLastError();
        }
        delete [] Buffer;
    }
    return ss;
}
So you need to allocate a new buffer of cbHeader + Size + cbTrailer bytes (where Size is the size of your actual message) and copy your message to Buffer + cbHeader.

Give a file as input to Pocketsphinx on Android

I am using the latest pocketsphinx Android demo (the "mighty computer" one), which takes input from the microphone. I want to give a WAV file as input to it instead. I tried using the decoder.processRaw() function, but I don't know how to configure the decoder with the hmm, lm, and so on.
Code to process files in pocketsphinx-java
Config c = Decoder.defaultConfig();
c.setString("-hmm", "../../model/en-us/en-us");
c.setString("-lm", "../../model/en-us/en-us.lm.dmp");
c.setString("-dict", "../../model/en-us/cmudict-en-us.dict");
Decoder d = new Decoder(c);

URL testwav = new URL("file:../../test/data/goforward.wav");
FileInputStream stream = new FileInputStream(new File(testwav.getPath()));

d.startUtt();
byte[] b = new byte[4096];
try {
    int nbytes;
    while ((nbytes = stream.read(b)) >= 0) {
        ByteBuffer bb = ByteBuffer.wrap(b, 0, nbytes);
        // Not needed on desktop but required on Android
        bb.order(ByteOrder.LITTLE_ENDIAN);
        short[] s = new short[nbytes / 2];
        bb.asShortBuffer().get(s);
        d.processRaw(s, nbytes / 2, false, false);
    }
} catch (IOException e) {
    fail("Error when reading goforward.wav" + e.getMessage());
}
d.endUtt();
System.out.println(d.hyp().getHypstr());
for (Segment seg : d.seg()) {
    System.out.println(seg.getWord());
}
Adding to the answer from Nikolay, this is how it can be done on Android, adapting the SpeechRecognizer Android implementation example found here: http://cmusphinx.sourceforge.net/wiki/tutorialandroid
// statically load our library
static {
    System.loadLibrary("pocketsphinx_jni");
}

// convert an input stream to text
private void convertToSpeech(final InputStream stream) {
    new AsyncTask<Void, Void, Exception>() {
        @Override
        protected Exception doInBackground(Void... params) {
            try {
                Assets assets = new Assets(WearService.this);
                File assetsDir = assets.syncAssets();
                Config c = Decoder.defaultConfig();
                c.setString("-hmm", new File(assetsDir, "en-us-ptm").getPath());
                c.setString("-dict", new File(assetsDir, "cmudict-en-us.dict").getPath());
                c.setBoolean("-allphone_ci", true);
                c.setString("-lm", new File(assetsDir, "en-phone.dmp").getPath());
                Decoder d = new Decoder(c);
                d.startUtt();
                byte[] b = new byte[4096];
                try {
                    int nbytes;
                    while ((nbytes = stream.read(b)) >= 0) {
                        ByteBuffer bb = ByteBuffer.wrap(b, 0, nbytes);
                        // Not needed on desktop but required on Android
                        bb.order(ByteOrder.LITTLE_ENDIAN);
                        short[] s = new short[nbytes / 2];
                        bb.asShortBuffer().get(s);
                        d.processRaw(s, nbytes / 2, false, false);
                    }
                } catch (IOException e) {
                    e.printStackTrace(); // log the read error (the desktop example used JUnit's fail() here)
                }
                d.endUtt();
                System.out.println(d.hyp().getHypstr());
                for (Segment seg : d.seg()) {
                    // do something with the result here
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
            return null;
        }
    }.executeOnExecutor(AsyncTask.THREAD_POOL_EXECUTOR);
}

Custom C++ ASP .NET Membership Login

Does anyone know how I can hash a user's password using the salt provided by ASP.NET Membership?
I'm developing a C++ Linux application, and I only have access to the SQL Server database.
Thanks,
Here is the encoding algorithm used by ASP.NET Membership, written in C#.
It uses System.Security, so you might want to look at Mono if you need to run it on Linux.
Note: I'm not familiar with Mono.
private string EncodePassword(string pass, int passwordFormat, string salt)
{
    if (passwordFormat == 0) // MembershipPasswordFormat.Clear
        return pass;

    byte[] bIn = Encoding.Unicode.GetBytes(pass);
    byte[] bSalt = Convert.FromBase64String(salt);
    byte[] bRet = null;

    if (passwordFormat == 1)
    { // MembershipPasswordFormat.Hashed
        HashAlgorithm hm = GetHashAlgorithm();
        if (hm is KeyedHashAlgorithm)
        {
            KeyedHashAlgorithm kha = (KeyedHashAlgorithm)hm;
            if (kha.Key.Length == bSalt.Length)
            {
                kha.Key = bSalt;
            }
            else if (kha.Key.Length < bSalt.Length)
            {
                byte[] bKey = new byte[kha.Key.Length];
                Buffer.BlockCopy(bSalt, 0, bKey, 0, bKey.Length);
                kha.Key = bKey;
            }
            else
            {
                byte[] bKey = new byte[kha.Key.Length];
                for (int iter = 0; iter < bKey.Length; )
                {
                    int len = Math.Min(bSalt.Length, bKey.Length - iter);
                    Buffer.BlockCopy(bSalt, 0, bKey, iter, len);
                    iter += len;
                }
                kha.Key = bKey;
            }
            bRet = kha.ComputeHash(bIn);
        }
        else
        {
            byte[] bAll = new byte[bSalt.Length + bIn.Length];
            Buffer.BlockCopy(bSalt, 0, bAll, 0, bSalt.Length);
            Buffer.BlockCopy(bIn, 0, bAll, bSalt.Length, bIn.Length);
            bRet = hm.ComputeHash(bAll);
        }
    }
    else
    {
        byte[] bAll = new byte[bSalt.Length + bIn.Length];
        Buffer.BlockCopy(bSalt, 0, bAll, 0, bSalt.Length);
        Buffer.BlockCopy(bIn, 0, bAll, bSalt.Length, bIn.Length);
        bRet = EncryptPassword(bAll, _LegacyPasswordCompatibilityMode);
    }
    return Convert.ToBase64String(bRet);
}

private string GenerateSalt()
{
    byte[] buf = new byte[SALT_SIZE];
    (new RNGCryptoServiceProvider()).GetBytes(buf);
    return Convert.ToBase64String(buf);
}
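For the common default configuration (passwordFormat="Hashed" with the SHA1 algorithm, which is not a KeyedHashAlgorithm), the routine above reduces to Base64(SHA1(salt + UTF-16LE(password))). A minimal sketch under that assumption, useful as a reference when porting the check to another language:

private static string HashMembershipPassword(string password, string saltBase64)
{
    // Base64(SHA1(salt || UTF-16LE(password))) -- the non-keyed "Hashed" branch above
    byte[] salt = Convert.FromBase64String(saltBase64);
    byte[] pass = Encoding.Unicode.GetBytes(password); // UTF-16LE
    byte[] all = new byte[salt.Length + pass.Length];
    Buffer.BlockCopy(salt, 0, all, 0, salt.Length);
    Buffer.BlockCopy(pass, 0, all, salt.Length, pass.Length);
    using (SHA1 sha1 = SHA1.Create())
    {
        return Convert.ToBase64String(sha1.ComputeHash(all));
    }
}

Compare the result against the stored password hash for the row whose salt you fed in. The same byte layout is straightforward to reproduce in C++ on Linux with any SHA1 implementation (for example OpenSSL's), as long as you encode the password as UTF-16LE first.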

Convert ZipOutputStream to ByteArrayInputStream

I want to compress an InputStream using ZipOutputStream and then get an InputStream from the compressed data without saving a file to disk. Is that possible?
I figured it out:
public InputStream getCompressed(InputStream is) throws IOException {
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    ZipOutputStream zos = new ZipOutputStream(bos);
    zos.putNextEntry(new ZipEntry("")); // single entry; give it a real name if a reader needs one
    int count;
    byte[] data = new byte[2048];
    BufferedInputStream entryStream = new BufferedInputStream(is, 2048);
    while ((count = entryStream.read(data, 0, 2048)) != -1) {
        zos.write(data, 0, count);
    }
    entryStream.close();
    zos.closeEntry();
    zos.close();
    return new ByteArrayInputStream(bos.toByteArray());
}

Why is this encrypted message damaged?

I use the following code to encrypt a string with a key, using the 3-DES algorithm:
private bool Encode(string input, out string output, byte[] k, bool isDOS7)
{
    try
    {
        if (k.Length != 16)
        {
            throw new Exception("Wrong key size exception");
        }
        int length = input.Length % 8;
        if (length != 0)
        {
            length = 8 - length;
            for (int i = 0; i < length; i++)
            {
                input += " ";
            }
        }
        TripleDESCryptoServiceProvider des = new TripleDESCryptoServiceProvider();
        des.Mode = CipherMode.ECB;
        des.Padding = PaddingMode.Zeros;
        des.Key = k;
        ICryptoTransform ic = des.CreateEncryptor();
        byte[] bytePlainText = Encoding.Default.GetBytes(input);
        MemoryStream ms = new MemoryStream();
        CryptoStream cStream = new CryptoStream(ms, ic, CryptoStreamMode.Write);
        cStream.Write(bytePlainText, 0, bytePlainText.Length);
        cStream.FlushFinalBlock();
        byte[] cipherTextBytes = ms.ToArray();
        cStream.Close();
        ms.Close();
        output = Encoding.Default.GetString(cipherTextBytes);
    }
    catch (ArgumentException e)
    {
        output = e.Message;
        //Log.Instance.WriteToEvent("Problem encoding, terminalID= " + objTerminalSecurity.TerminalID + " ,Error" + output, "Security", EventLogEntryType.Error);
        return false;
    }
    return true;
}
I send the output parameter as-is to a WCF http-binding web service, and I noticed that the encoded string that arrives looks different: there are some \t and \n in it, though the characters are about the same.
What is going on? Why does the server get a different encoded string?
Cipher text is usually Base64-encoded so that it stays binary-safe during transmission. Encoding.Default.GetString is not reversible for arbitrary bytes: any byte sequence that has no mapping in the current ANSI code page gets mangled on the way to a string and back, which is why the server sees something different.
Also, I would not use 3DES in ECB mode; that is awful, and you must have copy-pasted it from somewhere. Use AES in CBC mode, and think about adding a CMAC or HMAC.
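A minimal sketch of the change in the Encode method above, with the matching decode on the receiving side (received is just a placeholder name for whatever string the service is handed):

// sender: replace Encoding.Default.GetString(cipherTextBytes) with
output = Convert.ToBase64String(cipherTextBytes);

// receiver: recover the exact cipher-text bytes before decrypting
byte[] cipherTextBytes = Convert.FromBase64String(received);

Base64 round-trips every byte value, so the decrypted plain text will match what was encrypted.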