Can't use floats in WebGL GLSL shader constant #if expression?

In WebGL GLSL, I'm trying to do something like:
#if (2.0 > 3.0)
// something
#endif
But this errors with:
ERROR: 0:21: 'syntax error' : invalid expression
ERROR: 0:21: '2.0' : unexpected token after conditional expression
The 1.0 spec says:
A constant expression is one of
• a literal value (e.g., 5 or true)
Aren't floats literal values?
Similarly, I'm not sure why this doesn't work, since x is a const variable initialized with a constant expression:
const vec3 x = vec3(1.0);
...
#if (x.x > 1.0)
#endif
ERROR: 0:21: 'x' : unexpected token after conditional expression
ERROR: 0:21: 'syntax error' : invalid expression
ERROR: 0:21: '.' : unexpected token after conditional expression

Ah, from the same document in the Preprocessor section, it says:
Expressions following #if and #elif are restricted to expressions operating on literal integer constants, plus identifiers consumed by the defined operator.
Which is why floats, length(), etc. don't work.
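So a preprocessor conditional that sticks to integer literals does work; anything involving floats has to be expressed in the shader code itself. A minimal sketch (the const bool name is illustrative, not from the original shader):
#if 2 > 3
// never compiled: only literal integer constants are allowed in #if
#endif
// Float comparisons belong in GLSL proper, e.g. folded into a const bool:
const bool use_something = (2.0 > 3.0);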

Related

ternary operator for return switch

According to the accepted answer to a previous post
Declarations are not expressions. There are places where expressions
are allowed, but declarations are not. The left-hand side of ?, the
ternary operator, is one of them.
Now, consider the following code segment:
#include <iostream>
using std::cout;
using std::endl;

enum struct status{invalid=0, valid};

status test (void);

int main (void){
    status s = test();
    cout << static_cast<int>(s) << endl;
    return (0);
}

status test (void){
    static auto invocation_count = 0;
    ++invocation_count;
    //return (invocation_count % 2) ? (status::invalid) : (status::valid);
    (invocation_count % 2) ? (return (status::invalid)) : (return (status::valid));
}
The function test() does not compile (note that the compiler error log shows line numbers from the original source file):
g++ -ggdb -std=c++17 -Wall -Werror=pedantic -Wextra -c code.cpp
code.cpp: In function ‘status test()’:
code.cpp:19:31: error: expected primary-expression before ‘return’
(invocation_count % 2) ? (return (status::invalid)) : (return (status::valid));
^~~~~~
code.cpp:19:31: error: expected ‘)’ before ‘return’
code.cpp:19:83: error: expected ‘:’ before ‘;’ token
(invocation_count % 2) ? (return (status::invalid)) : (return (status::valid));
^
code.cpp:19:83: error: expected primary-expression before ‘;’ token
code.cpp:20:1: warning: no return statement in function returning non-void [-Wreturn-type]
}
^
make: *** [makefile:20: code.o] Error 1
However, if the last line inside test(), which is the source of the error, is commented out and the line above it (currently commented out) is enabled, the code compiles.
Both lines use the ternary operator to switch the return value, albeit differently. And in both cases, the left-hand side of the ? does not include any declaration (in fact it is the same expression in both cases).
So why does one compile while the other doesn't?
This is a legal expression:
{expression} ? {expression} : {expression}
This is a legal statement:
return {expression};
So:
return (invocation_count % 2) ? (status::invalid) : (status::valid);
is:
return {expression} ? {expression} : {expression};
Which has the form:
return {expression};
That's perfectly legal.
On the other hand consider:
(invocation_count % 2) ? (return (status::invalid)) : (return (status::valid));
This has the form:
{expression} ? {statement} : {statement}
That is not legal because the ?: operator requires expressions before and after the colon.
All three parts of the ternary operator must be expressions, and return is not an expression; it is a statement.
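So the fix is exactly the line that is commented out in the question: keep return as the statement, and let the ternary supply the value it returns. A minimal sketch:
status test (void){
    static auto invocation_count = 0;
    ++invocation_count;
    // The ternary is an expression; the whole line is a single return statement.
    return (invocation_count % 2) ? (status::invalid) : (status::valid);
}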

The asterisk is not a character constant?

foo.cpp:
#define ID A
#if ID == A
#warning "hello, world"
#endif
Compilation with g++ -c foo.cpp works fine: (g++ v8.2.0)
foo.cpp:3:2: warning: #warning "hello, world" [-Wcpp]
#warning "hello, world"
^~~~~~~
Now, if I replace #define ID A with #define ID *, then I get:
foo.cpp:1:12: error: operator '*' has no left operand
#define ID *
^
foo.cpp:2:5: note: in expansion of macro ‘ID’
#if ID == A
^~
What is so special about *? Why does it fail in the #if expression?
There are two things of note in your post. The first is that it doesn't work the way you think. This will produce the warning too:
#define ID B
#if ID == A
#warning "hello, world"
#endif
The reason is that, in the context of #if, macros in the expression are expanded, and any identifier that remains and is not a defined macro is replaced by 0. Since A is not defined, it becomes 0; so does ID, via the chain ID -> B -> 0. So the condition evaluates as 0 == 0 and is true here as well.
This also answers why * causes an error. It cannot be expanded further (on account of not being a valid identifier), and therefore you get the comparison * == 0, which is nonsense.
Since your title implies you seek to compare against a character constant, the way to do that would be to define ID to expand into the token sequence of a character constant.
#define ID 'A'
#if ID == 'A'
It should now work as expected. As will #define ID '*'
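Put together, a minimal self-contained sketch of the character-constant comparison (the warning text is just illustrative):
#define ID '*'
#if ID == '*'
#warning "ID is the asterisk character constant"
#endif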
#if does not do what you think it is doing.
In your first example, it tries to evaluate 0 == 0, which is a valid expression with a value of true.
In your second example, it tries to evaluate * == 0, which is not a valid expression.

integer constant is so large that it is unsigned when assigning max type value to enum

Here is my enum declaration:
enum connection_primary_identifier_e : uint64_t
{
    INVALID_IDENTIFIER = std::numeric_limits<std::underlying_type<connection_primary_identifier_e>::type>::max(),
};
(same happens if I use uint64_t directly as the type, also if I use -1 or -1ULL)
When I try to compile the file, I get the following errors/warnings:
error: integer constant is so large that it is unsigned [-Werror]
error: narrowing conversion of ‘18446744073709551615I128’ from ‘__int128’ to ‘unsigned int’ inside { } [-Werror=narrowing]
error: large integer implicitly truncated to unsigned type [-Werror=overflow]
cc1plus: all warnings being treated as errors
The really weird thing is that the errors are reported for lines that don't exist (a line number three past the end of the file) in another file (which uses the enum). I made sure it isn't a missing parenthesis or anything like that.
Update:
Using uint32_t doesn't produce the error.
Using g++ (GCC) 4.8.3
Might be because std::underlying_type was initially underspecified and didn't require a complete type. This unintentionally allowed precisely this code, which uses connection_primary_identifier_e while it's still incomplete.
Starting with C++17, your code is definitely illegal.

Forcing preprocessor error with macro

Is there a way that I can force a preprocessor macro in C++ to emit an error? What I would like to do is define a macro UNKNOWN. I'm writing some code for a robot, and I don't yet know where all of the electronics are being plugged in. I'd like to be able to define the ports in some header file, like
const int MOTOR_PORT = 1;
const int FAN_PORT = 2;
//etc.
However, when I reach a port that I don't yet know, I want to be able to write something like
const int LED_PORT = UNKNOWN;
In debug mode, UNKNOWN would just be defined to some arbitrary value, like 0. However, when compiling in release mode, I want it to throw an error when UNKNOWN is used, so that unassigned ports don't end up in the final release. I know I can use the #error directive to force an error, but can I do something similar in a macro?
I've seen a solution using static_assert, but I unfortunately can't use C++11 for this platform.
Since #error can't result from a macro expansion, you can ensure that the macro expands to something that must be diagnosed, like a syntax error.
For example:
#ifdef RELEASE
#define UNKNOWN #Invalid_use_of_UNKNOWN
#else
#define UNKNOWN 0
#endif
const int MOTOR_PORT = 1;
const int FAN_PORT = 2;
const int LED_PORT = UNKNOWN;
int main(void) {
    int x = LED_PORT;
}
A stray # has no valid use in C or C++ outside a comment, a character constant, a string literal, or a preprocessing directive, so once it appears in the expanded code it should always result in an error message. ($ would work, except that accepting $ in identifiers is a common extension. ` would probably also work, but # stands out better.)
I've defined the macro so it produces a reasonable error message with gcc:
c.c:9:1: error: stray ‘#’ in program
c.c:9:22: error: ‘Invalid_use_of_UNKNOWN’ undeclared here (not in a function)
and with clang:
c.c:9:22: error: expected expression
const int LED_PORT = UNKNOWN;
^
c.c:2:17: note: expanded from:
#define UNKNOWN #Invalid_use_of_UNKNOWN
^
1 error generated.
(There's a _Pragma operator corresponding to the #pragma directive. It would be nice if there were an _Error operator as well, but there isn't.)
Well, this does not produce a compiler error message like #error, but it compiles in debug and fails in release:
#ifdef _DEBUG
# define UNKNOWN 1
#else
# define UNKNOWN
#endif
const int port1 = UNKNOWN; // fail in release
You could make a division by zero which will throw a compiler error:
#define UNKNOWN 0/0
The sizeof operator cannot be applied to an incomplete type, so try this:
// Declared, but not defined anywhere.
struct illegal_use_of_unknown_macro;
#define UNKNOWN (sizeof (illegal_use_of_unknown_macro))
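As a sketch, the incomplete-type trick can be wired into the same debug/release switch used in the first answer (the RELEASE macro and the port name are assumptions carried over from above, not part of this answer):
#ifdef RELEASE
struct illegal_use_of_unknown_macro;   // declared, never defined
#define UNKNOWN (sizeof (illegal_use_of_unknown_macro))
#else
#define UNKNOWN 0
#endif

const int LED_PORT = UNKNOWN;   // errors only when RELEASE is defined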

Strange behaviour with templates and #defines

I have the following definitions:
template<typename T1, typename T2>
class Test2
{
public:
    static int hello() { return 0; }
};

template<typename T>
class Test1
{
public:
    static int hello() { return 0; }
};

#define VERIFY_R(call) { if (call == 0) printf("yea");}
With these, I try to compile the following:
VERIFY_R( Test1<int>::hello() );
this compiles fine
VERIFY_R( (Test2<int,int>::hello()) );
this also compiles fine, notice the parentheses around the call.
VERIFY_R( Test2<int,int>::hello() );
This, without the parentheses, produces a warning and several syntax errors:
warning C4002: too many actual parameters for macro 'VERIFY_R'
error C2143: syntax error : missing ',' before ')'
error C2059: syntax error : ')'
error C2143: syntax error : missing ';' before '}'
error C2143: syntax error : missing ';' before '}'
error C2143: syntax error : missing ';' before '}'
fatal error C1004: unexpected end-of-file found
What's going on here?
This happens with VS2008 SP1.
The comma inside a macro can be ambiguous: an extra set of parentheses (your second example) is one way of disambiguating. Consider a macro
#define VERIFY(A, B) { if ( (A) && (B) ) printf("hi"); }
then you could write VERIFY( foo<bar, x> y ).
Another way of disambiguating is with
typedef Test1<int,int> TestII;
VERIFY_R( TestII::hello() );
The preprocessor is a dumb text replacement tool that knows nothing about C++. It interprets
VERIFY_R( Test1<int,int>::hello() );
as
VERIFY_R( (Test1<int), (int>::hello()) );
which calls VERIFY_R with too many parameters. As you noted, additional parentheses fix this:
VERIFY_R( (Test1<int,int>::hello()) );
The question remains, however, why you need the preprocessor here at all. The macro you used in your question could just as well be an inline function. If your real code doesn't do anything requiring the preprocessor, try to get rid of macros. They just cause pain.
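A minimal sketch of that inline-function alternative (verify_r is an assumed name, not from the original post):
#include <cstdio>

inline void verify_r(int call_result) {
    // Same check as the VERIFY_R macro, but arguments are parsed by the
    // compiler, so commas inside template argument lists are no problem.
    if (call_result == 0) std::printf("yea");
}

// Usage: verify_r(Test2<int,int>::hello());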
The comma in <int, int> is treated as an argument separator for the macro, rather than for the template. The compiler therefore thinks you're calling VERIFY_R with two arguments (Test1<int and int>::hello()), when it requires only one. You need to use variadic macros to expand everything supplied to the macro:
#define VERIFY_R(...) { if ((__VA_ARGS__) == 0) printf("yea");}
It is generally a good idea to wrap macro arguments in parentheses, as well, to prevent other kinds of weird substitution errors.
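With the variadic form, the original call goes through unchanged, because __VA_ARGS__ re-joins the split pieces, comma included (a sketch reusing the question's Test2):
VERIFY_R( Test2<int,int>::hello() );
// expands to: { if ((Test2<int,int>::hello()) == 0) printf("yea");}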
The preprocessor doesn't know that < and > are supposed to be brackets, so it interprets the expression as two macro arguments, Test1<int and int>::hello(), separated by the ,. As you say, it can be fixed by surrounding the entire expression with parentheses, which the preprocessor does recognise as brackets.
I'm not sure if this is an error in your reporting here or the actual problem, but your last VERIFY_R is still referencing Test1, rather than Test2.