Confusion about Array<Any> and Array<Int> - swift3

Starting out with Swift 3, and I am confused by the following:
var o1:Array<Any> = [1,2,3]
var o2:Array<Int> = [1,2,3]
print(type(of:o2))
print(type(of:o2[0]))
o2[0] += 1
print(o2[0])
print(type(of:o1))
print(type(of:o1[0]))
//o1[0] += 1
print(o1[0])
When I compile and run, I get:
Array<Int>
Int
2
Array<Any>
Int
1
If I uncomment that line, I get a compilation error: "Binary operator += cannot be applied to operands of type Any and Int".
OK, so it seems that Swift recognizes that o1[0] is an Int, but I can't += it, while I can on o2[0]. Any insight here as to why Swift says the type is Int but then won't really honor that type?

OK, so it seems that Swift recognizes that o1[0] is an Int, but I can't += it, while I can on o2[0].
Not so. The Swift compiler sees o1[0] as an Any, not an Int. This is why you're getting the error; when you write o1[0] += 1, you're asking Swift to use the += operator on an Any (o1[0]) and an Int (1), and Swift doesn't have any definition of the += operator that takes those two arguments. Since operators are resolved at compile time, this means that even if the value actually is an Int at runtime, the compiler doesn't have a way to know that, so it can't know how to resolve the operator.

Related

Using operator "/=" at 'declare&initialize' part

I'm a student who got interested in computer science recently. I'm studying C++ because I am interested in embedded systems.
I tried to test the operator /= on my own, since I want to learn by doing. The code that I wrote was
int a /= --b + 3;
but the compiler gave me an error message. When I modified it to
int a = 0;
a /= --b + 3;
it worked well. I thought it was related to l-values and r-values. Why does the first example with the operator /= give me an error while the second one is OK? Can you point me to some reference that would give me a hint about this?
PS: When I tested
int t = t / (--a + 3);
it worked too! What is the difference? Can you point me to some documentation about that?
I would like to mention two things.
What is the meaning of this code?
Is it valid C++ syntax?
Let's take a look at both.
when I tested "int a /= --b + 3;", it has an error, but when I modified it to "int a = 0; a /= --b + 3;", it works well.
Unlike Java, C and C++ do not automatically initialize a local int to 0; it holds a garbage value (officially called an "indeterminate value"). So int a /= --b + 3; would be more like int a; a /= --b + 3;, where a would still end up with a meaningless value.
Besides that, the C/C++ grammar simply does not allow /= in a declaration. Here are the usual ways to declare and initialize a variable (I'm not sure whether there are more):
int a = 1;
int a(1);
int a{1}; // since C++11
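To make that concrete, here is a minimal sketch (variable names are just illustrative) of what the grammar does and does not accept:
#include <iostream>
int main() {
    int a1 = 1;      // copy-initialization
    int a2(1);       // direct-initialization
    int a3{1};       // list-initialization, since C++11
    // int a4 /= 2;  // error: "/=" is not part of a declaration's grammar
    std::cout << a1 + a2 + a3 << '\n';   // prints 3
}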
a /= b;
is the same as:
a = a / b;
so this means that this below statement makes no sense:
int a /= (--b + 3);
Because it's equivalent to:
int a = a / (--b + 3);
Assuming that b has already been defined here, the problem is that a has not been given a value yet, and so it can't be used on the right-hand side of its own initialization.
The problem here is the same as the problem with this statement:
int a = a;
This also explains why the following code does work:
int a = 0;
a /= (--b + 3);
Because it's equivalent to this:
int a = 0;
a = a / (--b + 3);
Because a is already known in the second line above, the right-hand side can be evaluated and the new value for a determined.
More generally, operators like /=, *=, +=, -= and %= shouldn't be used during initialisation of a variable. A compiler (such as g++) should respond with an error if you ever try to do this.
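As a quick runnable check of the rule above (the values of a and b are picked arbitrarily):
#include <iostream>
int main() {
    int b = 5;
    int a = 12;              // a must have a value before /= can be used on it
    a /= (--b + 3);          // same as: a = a / (--b + 3), i.e. a = 12 / 7
    std::cout << a << ' ' << b << '\n';   // prints 1 4
}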

Why does it still compile/run after swapping arguments of different types?

Last night I spent an hour debugging. I was writing a Huffman coding program. Part of my wrong code is below:
class HTree
{
public:
HTree* left;
HTree* right;
int weight;
char ch;
HTree(){left = right = NULL; ch = '\0'; weight = 0;}
HTree(HTree* l, HTree* r, char c, int w){left = l; right = r; ch = c; weight = w;}
~HTree(){delete left; delete right;}
bool ISleaf(){return !left && !right;}
};
And this is one piece of code in HTree* BuildTree(int* frequency):
QTree.push(new HTree(NULL, NULL, frequency[i], (char)i));
As can be seen, the order of these two arguments is reversed: the constructor expects char c, int w, but I passed frequency[i], (char)i. I didn't notice this little mistake at first. The output is, of course, wrong.
But my question is: why does it still compile, and why does it still run, even though the output is totally wrong? Given the different types, I thought it could not compile successfully - char != int, am I right?
Can anyone tell me why this happens? Does it automatically convert the int to char and the char to int, so that it compiles and runs but gives a wrong answer?
To be more specific, my main question is not whether the conversion can happen automatically; it's why my answer is wrong if it can.
Thanks in advance!
C++ implicitly converts a char (a narrow integer type, usually 8 bits) to an int, because int is wider (usually 16 or 32 bits depending on the platform) and can represent every char value.
When assigning an int to a char, the conversion is narrowing; the compiler may print a warning message, but the assignment will still work.
Yes, there are implicit conversions from char to int and from int to char.
The output is wrong because, instead of the frequency, every HTree object's weight is initialized with the value of i, and its ch member is initialized with the frequency. Because of the implicit conversions your code compiles and runs, but it produces wrong results all the way through.
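Here is a stripped-down sketch of the effect (the Node struct and the values are hypothetical stand-ins for the Huffman code, not the original):
#include <iostream>

struct Node {
    char ch;
    int  weight;
    Node(char c, int w) { ch = c; weight = w; }
};

int main() {
    int  frequency = 200;    // pretend this is frequency[i]
    char letter    = 'a';    // pretend this is (char)i, i.e. 97

    Node wrong(frequency, letter);   // int silently narrowed to char, char promoted to int
    Node right(letter, frequency);   // intended order

    std::cout << (int)wrong.ch << ' ' << wrong.weight << '\n';   // e.g. -56 97
    std::cout << (int)right.ch << ' ' << right.weight << '\n';   // 97 200
}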

Is using an assignment operator in a function argument undefined behaviour?

I found some code similar to this in an example my university tutor wrote.
#include <iostream>
#include <vector>

int main(){
    int a = 3;
    int b = 5;
    std::vector<int> arr;
    arr.push_back(a *= b);
    std::cout << arr[0] << std::endl;
}
Is there a well-defined behaviour for this? Would arr[0] be 3 or 15 (or something else entirely)?
Visual Studio outputs 15, but I have no idea if that's how other compilers would respond to it.
Before push_back is executed, the expression passed as its argument has to be evaluated. So what is the value of a *= b? It is always a * b, and a is also set to that new value.
It's valid and works as you expect.
The expression is evaluated first and its result is "returned", as if you had written:
auto& temp = (a*=b);
arr.push_back(temp);
The value of an expression with a compound assignment operator is the value of the left operand after the assignment.
So the code you showed is valid. Moreover, in C++ (as opposed to C) the result is an lvalue, so you may even write :)
arr.push_back( ++( a *= b ) );
In this case the cout statement outputs 16. :)
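A small sketch (reusing the question's a and b) that shows the result of a *= b really is an lvalue referring to a:
#include <iostream>
#include <vector>

int main() {
    int a = 3;
    int b = 5;
    std::vector<int> arr;

    arr.push_back(++(a *= b));    // a *= b makes a == 15, ++ makes it 16, and 16 is pushed
    std::cout << arr[0] << ' ' << a << '\n';   // prints 16 16

    (a *= b) = 7;                 // legal: the compound assignment yields an lvalue
                                  // (well defined under C++11's sequencing rules)
    std::cout << a << '\n';       // prints 7
}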

How is this getting evaluated?

I feel very stupid asking this question, but I can't figure out the reason on my own.
#include <iostream>
using namespace std;

int main()
{
    int target;
    int buffer = 10;
    const int source = 15;
    target = (buffer += source) = 20;
    cout << target + buffer;
    return 0;
}
I thought target = (buffer+=source) = 20; would become target = (25) = 20.
But if I write that statement with the literal 25 in my source file, it gives an l-value error.
So how is the value of target+buffer printed as 40?
Some predefined operators, such as +=, require an operand to be an lvalue when applied to basic types [§13.5/7].
buffer += source yields an lvalue referring to buffer, so there is no compile error.
Your statement is evaluated as:
buffer+=source;
buffer=20;
target=20;
Note that modifying buffer twice in one statement would have been undefined behaviour under the old (pre-C++11) sequence-point rules, and another compiler could then have produced a different result; under C++11's sequencing rules for the assignment operators this particular chain appears to be well defined, though I am not completely sure.
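For reference, a sketch that runs both the original one-liner and the spelled-out sequence above (the names with a 2 suffix are just for illustration):
#include <iostream>

int main() {
    int target;
    int buffer = 10;
    const int source = 15;

    target = (buffer += source) = 20;        // buffer -> 25, then buffer -> 20, then target = 20
    std::cout << target + buffer << '\n';    // prints 40

    int buffer2 = 10;                        // the same thing, step by step
    buffer2 += source;                       // buffer2 == 25
    buffer2 = 20;                            // buffer2 == 20
    int target2 = 20;
    std::cout << target2 + buffer2 << '\n';  // prints 40
}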

How to use a Judy array

I am interested in Judy arrays and am trying to use them, but I have been unable to do anything useful with them; every time I get casting errors. Sample C++ code and the error are given below.
#include "Judy.h"
#include <iostream>
using namespace std;
int main()
{
int Rc_int; // return code - integer
Word_t Rc_word; // return code - unsigned word
Word_t Index = 12, Index1 = 34, Index2 = 55, Nth;
Word_t PValue; // pointer to return value
//Pvoid_t PJLArray = NULL; // initialize JudyL array
Pvoid_t JudyArray = NULL;
char String[100];
PWord_t _PValue;
JSLI( JudyArray, _PValue, (uint8_t *) String);
return(0);
} // main()
This gives me the error
m.cpp: In function ‘int main()’:
m.cpp:19: error: invalid conversion from ‘long unsigned int**’ to ‘void**’
m.cpp:19: error: initializing argument 1 of ‘void** JudySLIns(void**, const uint8_t*, J_UDY_ERROR_STRUCT*)’
Can anyone please help me figure out what I'm doing wrong?
Thanks
According to the documentation, you have the _PValue and JudyArray parameters reversed. Make your call look like this:
JSLI( _PValue, JudyArray, (uint8_t *) String);
Also, try compiling it as C code instead of C++. So far, your test uses no C++ features, and I bet it will compile as C. It looks like the Judy library relies on the fact that C will do certain kinds of implicit conversions between void * and other pointer types.
If this is the case, I'm not sure what to do about it. The error messages you're getting tell me that JSLI is a macro. In order to fix the error message you have in the comments on this answer, you'd have to reach inside the macro and add a typecast.
These kinds of implicit conversions are allowed in C because otherwise using malloc would always require ugly casts. C++ purposely disallows them because, given the semantics of new, there is little need to cast the result of an allocation to the correct type.
I don't think this library can be used effectively in C++ for this reason.
It seems that you are passing a wrong first parameter to JudySLIns(void**, const uint8_t*, J_UDY_ERROR_STRUCT*); you'd better check it!
For integer keys there is a C++ wrapper at http://judyhash.sourceforge.net/