I primarily program in Java, but I am taking a graphics course for which I need to use C++. I am trying to create an array of objects so that I can loop through them and draw them to the screen, but I can't for the life of me figure out how to create this array. The code I have now does not produce any compiler errors, but it doesn't seem to work correctly either. The following code is at the top of my Main.cpp file:
Platform ground("wallstone.tga", 40, 16, 4, 144);
Platform platform1("wallstone.tga", 10, 16, 4, 20);
Platform platforms[2] = {ground, platform1};
When I try: fprintf(stdout, "Size of platforms array: %d", sizeof(platforms)/sizeof(Platform)); it prints out 0.0.
I've tried several ways of creating this array and they all seem to produce errors or that same output of 0.0, so I'm not sure what's going on. If any more of my code is needed, I will certainly post it. Of course, if there is a better way of approaching this, I'd be grateful to hear it. Thanks!
It looks like you are doing everything right. My only guess is that size_t on your platform is larger than int, so providing the correct format specifier (%zu instead of %d) may fix the problem:
fprintf(stdout, "Size of platforms array: %zu", sizeof(platforms)/sizeof(Platform));
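As an aside, a minimal sketch of an alternative (not from the original answer, and assuming a C++17 compiler): std::size counts the array elements for you, which avoids repeating the element type in the sizeof division. The Platform objects are the ones from the question.

#include <cstdio>
#include <iterator>   // std::size (C++17)

// ground and platform1 are the Platform objects defined in the question
Platform platforms[2] = {ground, platform1};

void printCount()
{
    // std::size returns a size_t, which matches the %zu format specifier
    std::printf("Size of platforms array: %zu\n", std::size(platforms));
}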
Currently I'm using this font in my C++ program:
-misc-fixed-medium-r-normal--12-*-*-*-*-*-iso8859-15
where '12' is the size, which is also the font size I'm currently using on Linux Mint 18.1.
But when I draw a string in my program it is shown very small! It looks like it has a size of 6!
Do I need to double the font size for my program, or something like that?
TIA
Regards
Earlybite
I spent some hours searching the internet, also here, but I couldn't find a solution. In the previous version of my program I couldn't find the difference either, because there the drawing with Xlib and XDrawString was normal.
I also noticed that even size = 40 made no difference compared to e.g. size = 20, so there had to be a difference in the code.
So I went through the previous version line by line, and at last I found that one little line: XSetFont(), which makes drawn strings appear at the normal size.
E.g. like this:
XSetFont(mDisplay, vGC, this->mFontPtr.fid); // <-- HERE!
vGCVal.foreground = mXForeColorA->X_Color.pixel;
XChangeGC(mDisplay,vGC, GCForeground, &vGCVal);
XDrawString(mDisplay, vPix, vGC, x, y, nDrawString.c_str(), (int) nDrawString.length());
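For context, a minimal sketch of the idea (the placeholder names dpy, win and gc are mine, not from the original code): the font has to be loaded and attached to the GC before XDrawString, otherwise the server's default font is used.

#include <X11/Xlib.h>
#include <string>

// Assumes an open Display* dpy, a Drawable win and a GC gc already exist.
void drawLabel(Display* dpy, Drawable win, GC gc, int x, int y, const std::string& text)
{
    XFontStruct* font = XLoadQueryFont(dpy,
        "-misc-fixed-medium-r-normal--12-*-*-*-*-*-iso8859-15");
    if (font != NULL) {
        XSetFont(dpy, gc, font->fid);   // without this, XDrawString uses the GC's current default font
        XDrawString(dpy, win, gc, x, y, text.c_str(), (int) text.length());
        XFreeFont(dpy, font);           // frees the font and its XFontStruct
    }
}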
I got point sprites working almost immediately, but I'm stuck on one thing: they are rendered as what look like 2x2-pixel sprites, which is not very easy to see, especially when there's other motion. I've tried tweaking all the variables; here's the code that probably works best:
void renderParticles()
{
    for (int i = 0; i < particleCount; i++)
    {
        particlePoints[i] += particleSpeeds[i];
    }

    void* data;
    pParticleBuffer->Lock(0, particleCount * sizeof(PARTICLE_VERTEX), &data, NULL);
    memcpy(data, particlePoints, sizeof(particlePoints));
    pParticleBuffer->Unlock();

    pd3dDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    pd3dDevice->SetRenderState(D3DRS_ZWRITEENABLE, TRUE);
    pd3dDevice->SetRenderState(D3DRS_POINTSPRITEENABLE, TRUE);
    pd3dDevice->SetRenderState(D3DRS_POINTSCALEENABLE, TRUE);
    pd3dDevice->SetRenderState(D3DRS_POINTSIZE, (DWORD)1.0f);
    //pd3dDevice->SetRenderState(D3DRS_POINTSIZE_MAX, (DWORD)9999.0f);
    //pd3dDevice->SetRenderState(D3DRS_POINTSIZE_MIN, (DWORD)0.0f);
    pd3dDevice->SetRenderState(D3DRS_POINTSCALE_A, (DWORD)0.0f);
    pd3dDevice->SetRenderState(D3DRS_POINTSCALE_B, (DWORD)0.0f);
    pd3dDevice->SetRenderState(D3DRS_POINTSCALE_C, (DWORD)1.0f);

    pd3dDevice->SetStreamSource(0, pParticleBuffer, 0, sizeof(D3DXVECTOR3));
    pd3dDevice->DrawPrimitive(D3DPT_POINTLIST, 0, particleCount);

    pd3dDevice->SetRenderState(D3DRS_POINTSPRITEENABLE, FALSE);
    pd3dDevice->SetRenderState(D3DRS_POINTSCALEENABLE, FALSE);
}
Ok, so when I change POINTSCALE_A and POINTSCALE_B, nothing really changes much, same for C. POINTSIZE also makes no difference. When I try to assign something to POINTSIZE_MAX and _MIN, no matter what I assign, it always stops the rendering of the sprites. I also tried setting POINTSIZE with POINTSCALEENABLE set to false, no luck there either.
This looks like something not many people who searched around found an answer to. An explanation of the mechanism exists on MSDN, and yes, I did check Stack Overflow and found a similar question with no answer. Another source only suggested setting the max and min variables, which, as I said, pretty much make my particles disappear.
particlePoints and particleSpeeds are D3DXVECTOR3 arrays, and I get what I expect from them. A book I follow suggested defining a custom vertex with XYZ and diffuse, but to be honest I see no reason for this; it just adds more to an already long list of declarations.
Any help is welcome, thanks in advance.
Edit: Further tweaking showed that when any of the scale values are above 0.99999997f (at least between that and 0.99999998f I see the effect), I get the tiny version; if I put them at that value or lower, I pretty much get the size of the texture. That is still not really good, though, as the texture may be large, and it fails the goal of the size being controllable.
Glad to help :) My comment as an answer:
One more problem that I've seen is your float-to-DWORD cast. The official documentation suggests the conversion *((DWORD*)&Variable) for the value passed to SetRenderState (doc). I'm not very familiar with C++, but I would assume this makes a difference, because your cast converts the float to an actual integer DWORD, while the API expects a float's bit pattern stored in the DWORD's memory space.
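A minimal sketch of what that looks like applied to the render states from the question (FtoDW is just a helper name used here, not part of the D3D API):

// Reinterpret a float's bit pattern as a DWORD, as the D3D9 docs describe for
// float-valued render states; (DWORD)1.0f would pass the integer 1 instead.
inline DWORD FtoDW(float f) { return *((DWORD*)&f); }

pd3dDevice->SetRenderState(D3DRS_POINTSIZE,    FtoDW(1.0f));
pd3dDevice->SetRenderState(D3DRS_POINTSCALE_A, FtoDW(0.0f));
pd3dDevice->SetRenderState(D3DRS_POINTSCALE_B, FtoDW(0.0f));
pd3dDevice->SetRenderState(D3DRS_POINTSCALE_C, FtoDW(1.0f));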
I'm writing a program in C++ that needs to generate graphs and calculate some measures. I'm working with Visual Studio 2013 and the igraph C library. At this point I can create graphs from custom data and calculate some metrics like betweenness and closeness centrality, but when I try to calculate eigenvector centrality, the program crashes and shows me this message:
"Run-Time Check Failure #3 - The variable 'tgetv0' is being used without being initialized."
The tgetv0 variable is used inside dgetv.c from the igraph source.
Here is my code:
void GraphObject::calcEigen()
{
    igraph_arpack_options_t options;
    igraph_real_t value;
    igraph_vector_t weights;

    igraph_vector_init(&weights, igraph_ecount(&cGraph));   // cGraph is already created
    igraph_vector_init(&eigenRes, igraph_vcount(&cGraph));  // all ..Res igraph_vector_t are declared in the header
    igraph_vector_init(&betweennesRes, 0);
    igraph_vector_init(&closenessRes, 0);
    igraph_arpack_options_init(&options);

    igraph_betweenness(&cGraph, &betweennesRes, igraph_vss_all(), 0, 0, 1);
    igraph_closeness(&cGraph, &closenessRes, igraph_vss_all(), IGRAPH_ALL, 0, 1);
    igraph_eigenvector_centrality(&cGraph, &eigenRes, &value, 0, 1, &weights, &options);
}
The closeness and betweenness are correctly calculated and "couted", but it crashes in the eigenvector function.
After a lot of research in the documentation, on the internet and with the debugger, I can't figure out what the problem is, especially since I tried the example code from the documentation http://igraph.org/c/doc/igraph-Structural.html#igraph_eigenvector_centrality (copy/paste) and it does the same thing. Is this a library or example issue, or am I missing something?
When I init the weights vector and then call igraph_null(&weights), it works, but then all the eigenvector centralities come out as 1, which is an incorrect result. What am I doing wrong?
Let us assume that Visual Studio is right and we indeed have a variable named tgetv0 that is being used uninitialized. I scanned igraph's source code and it looks like there are two places where it could indeed be the case. One of them is in src/lapack/dnaupd.c, the other one is in src/lapack/dsaupd.c. Both of these files were converted from Fortran using f2c so it is hard to tell whether the issue was present in the original Fortran code or whether this was introduced during the conversion. Either way, you can probably fix this easily by looking up the lines where tgetv0 is declared in src/lapack/dnaupd.c and src/lapack/dsaupd.c and initializing it to a value of 0. In my version, the lines to change are line 486 in src/lapack/dnaupd.c and line 482 in src/lapack/dsaupd.c.
Please add a comment to confirm whether the solution works for you or not - if it works, I'll commit a patch to the igraph source tree.
I'm trying to resize the terminal window I've been printing in with PDCurses. It only works sometimes. Otherwise it just sets itself to the default size, not even returning an error.
Examples of sizes that work:
resize_term(50, 50);
resize_term(100, 100);
resize_term(51, 100);
resize_term(50, 51);
resize_term(2, 60);
Examples of sizes that don't work:
resize_term(51, 51);
resize_term(51, 50);
resize_term(100, 51);
resize_term(60, 2);
Does anyone know why these certain ranges of sizes don't work?
(Also, bear in mind that resize_term takes the width as the second argument, not the first)
I noticed that curses doesn't resize the terminal when it "thinks" it may go out of the bounds of the (physical, real world) screen.
Sorry for the lack of details, I don't know the underlying mechanics of this behaviour.
EDIT: Here's a quote from the PDCurses documentation:
"resize_term() is effectively two functions: When called with nonzero values for nlines and ncols, it attempts to resize the screen to the given size.[...]"
The emphasis is obviously on "attempts", but the documentation does not give any further information...
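One way to see what the library actually gave you is to ask curses for the screen size right after the call, e.g. (a minimal sketch; the 51x51 request is just one of the sizes from the question):

#include <curses.h>

int main(void)
{
    initscr();
    resize_term(51, 51);            // request 51 rows x 51 columns

    int rows, cols;
    getmaxyx(stdscr, rows, cols);   // the size curses actually ended up with
    printw("rows=%d cols=%d", rows, cols);
    refresh();
    getch();
    endwin();
    return 0;
}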
What would be the easiest, most cross-platform way to create a bitmap (2D array of integers, or a quad-tree) and display it on the screen? I would also like to be able to save it as a file.
Thanks
It has to be said: the easiest and most cross-platform approach is probably to use printf, with something like:
// y and x loops would surround this...
unsigned char grayscaleValue = /* something */;
printf("%c", grayscaleValue < 128 ? ' ' : 'X');
You could use more than two brightness values.
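For example (a small self-contained sketch of that idea; the character ramp and the fake gradient standing in for real pixel data are made up for illustration):

#include <cstdio>

int main()
{
    const int width = 32, height = 16;      // made-up size, just for the demo
    const char ramp[] = " .:-=+*#%@";       // darkest to brightest
    const int last = sizeof(ramp) - 2;      // index of the brightest character

    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            // fake horizontal gradient standing in for your 2D array of integers
            unsigned char v = (unsigned char)(x * 255 / (width - 1));
            std::putchar(ramp[v * last / 255]);
        }
        std::putchar('\n');
    }
    return 0;
}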
I also like both Qt and Juce; they're both relatively straightforward cross-platform GUI toolkits. Either one can be up and running in an evening or two... the ASCII printout (and its variations) can be done in an hour.