Crystal: How to write a binary Int to IO

Best practice?
In binary format, please.
And what about big-endian vs. little-endian?

Then it's IO#write_bytes. Endianness can be configured via the format argument, e.g. io.write_bytes(1234_i32, IO::ByteFormat::LittleEndian).

You're looking for IO#<<.

You can convert it to a string, e.g. 3.to_s

Related

C++: read from a binary file and convert to UTF-8

I would like to read data from an application/octet-stream; charset=binary file with fread on Linux and convert it to UTF-8 encoding. I tried iconv, but it doesn't support a binary charset. I haven't found any solution yet. Can anyone help me with it?
Thanks.
According to the MIME type you've given, you're reading data in a non-textual binary format. You cannot convert it with iconv or similar tools, because they are meant for converting text from one (textual) encoding to another. If your data is not textual, then conversion to any character encoding is meaningless and will only corrupt the data, not make it any more readable.
The typical way to present binary data as readable text for inspection is a hex dump. There's an existing answer for implementing one in C++: https://stackoverflow.com/a/16804835/2079303

View UTF-8 tuple element as string in Python

I have a list of tuples containing Unicode strings, like:
((u'\u0d2a\u0d31\u0d1e\u0d4d\u0d1e\u0d41',
u'\u0d15\u0d47\u0d3e\u0d23\u0d4d\u200d\u0d17\u0d4d\u0d30\u0d38\u0d4d'),
7.5860818562067314)
I want to display these as readable strings. I have tried decode, but I am getting an error.
Can anyone please help me out?
Thanks in advance!
You should use the .encode('utf-8') method instead of .decode(), because the values are already unicode objects and you want to get byte strings.
Here is a great how-to about string encoding in Python 2.7; it's a must-read:
http://docs.python.org/2/howto/unicode.html
For example, data[0][0].encode('utf-8') produces a normal result.
Those are not UTF-8; they are already Unicode strings. Just print them:
print data[0][0], data[0][1]

Why does Qt reject a valid JSON?

Using Qt-5.0, I have this JSON string
{"type":"FILE"}
I expected fromBinaryData to accept the string's .toLocal8Bit() as valid input, but it doesn't.
QString j = "{\"type\":\"FILE\"}";
auto doc = QJsonDocument::fromBinaryData(j.toLocal8Bit());
doc.isNull() // true, meaning the input is not considered valid
Did I miss something?
I don't really know Qt, so I googled for a second. Here's what I found:
What you have is a string, a text representation. It's not the binary format Qt uses internally; that binary data would not be human-readable. QJsonDocument::fromBinaryData expects such a binary blob.
What you want to do seems to be achieved with QJsonDocument::fromJson, which expects a UTF-8-encoded JSON string.
Instead of fromBinaryData, use fromJson with the same argument. I had this exact problem yesterday, and that is what worked for me.

How to convert the date format 'MM/DD/YY' to 'DD-MON-YYYY' (month in words), like 13-AUG-2013

I am getting one date field from a text file in the format 'MM/DD/YY' (like '08/13/13'). Now I want to convert it to 'DD-MON-YYYY' (month in words), like 13-AUG-2013.
Please help me with this conversion.
Please use this function:
TO_CHAR(TO_DATE(DATE,'MM/DD/YY'),'DD-MON-YYYY')
Or, using a variable port and an output port:
v_PORT (Date/Time): TO_DATE(TO_CHAR(INPUTPORT),'MM/DD/YY')
o_PORT (String): TO_CHAR(v_PORT,'DD-MON-YYYY')
Both work and produce output like 13-AUG-2013.
Convert the input value to a date and back to a string using to_char(to_date(in_date, 'mm/dd/yy'), 'DD-MON-YYYY')
Credit: Change Date Format
V_PORT = TO_CHAR(TO_DATE(INPUT_PORT,'YYYY-MM-DD'),'DD/MM/YYYY')
NOTE: If the input port data is a timestamp, you have to use the SUBSTR function to get only the date part, e.g. SUBSTR(INPUT_PORT,1,10)
All of these are close. Use:
TO_CHAR(TO_DATE(I_DATE,'MM/DD/YY'),'DD-MON-YYYY')
Try this; it will work:
IIF(ISNULL(HIREDATE), TO_DATE(TO_CHAR(SYSDATE,'MM-DD-YY'),'MM-DD-YY'), HIREDATE)

How to convert ISO-8859-1 to UTF-8 using libiconv in C++

I'm using libcurl to fetch some HTML pages.
The HTML pages contain numeric character references that should decode to Hebrew text like: סלקום
When I read this using libxml2 I'm getting: ׳₪׳¨׳˜׳ ׳¨
Is it the ISO-8859-1 encoding?
If so, how do I convert it to UTF-8 to get the correct word?
Thanks
EDIT: I got the solution. MSalters was right; libxml2 does use UTF-8.
I added this to eclipse.ini:
-Dfile.encoding=utf-8
and finally I got Hebrew characters in my Eclipse console.
Thanks
Have you seen the libxml2 page on i18n? It explains how libxml2 solves these problems.
You will get a ס from libxml2. However, you said that you get something like ׳₪׳¨׳˜׳ ׳¨. Why do you think you got that? You get an xmlChar*; how did you convert that pointer into the string above? Did you perhaps use a debugger, and does that debugger know how to render an xmlChar*? My bet is that the xmlChar* is correct, but you used a debugger that cannot render the Unicode in it.
To answer your last question: an xmlChar* is already UTF-8 and needs no further conversion.
No. Those entities correspond to the decimal values of the Unicode code points of your characters. See this page for an example.
You can therefore store your Unicode values as integers and use an algorithm to transform each integer into a UTF-8 multibyte sequence. See the UTF-8 specification for this.
This answer was given on the assumption that the encoded text is returned as UTF-16 which, as it turns out, isn't the case.
I would guess the encoding is UTF-16 or UCS-2. Specify this as the input encoding for iconv. There might also be an endianness issue; have a look here.
The C-style way would be (error checking omitted for clarity; note that iconv_open takes the destination encoding first, and iconv wants pointers to the buffer pointers and remaining byte counts):
iconv_t ic = iconv_open("UTF-8", "UCS-2");
char *in = myUCS2_Text, *out = myUTF8_Text;
size_t in_left = inputSize, out_left = outputSize;
iconv(ic, &in, &in_left, &out, &out_left);
iconv_close(ic);