CCRenderTexture (alpha) beginWithClear transparent white - cocos2d-iphone

I don't understand how transparency/alpha works with CCRenderTexture.
With this code I would have expected half-transparent white over the red color layer (transparency works fine for the CCLayerColor). What I get is completely opaque white.
This code is just added to the default template at the end of the HelloWorldLayer init method:
CCLayerColor *lc = [CCLayerColor layerWithColor:ccc4(255, 0, 0, 125)];
[self addChild:lc];
CCRenderTexture *rt = [CCRenderTexture renderTextureWithWidth:480.0f
                                                       height:320.0f];
[self addChild:rt];
rt.position = ccp(240.0f, 160.0f);
[rt beginWithClear:1.0f g:1.0f b:1.0f a:0.5f];
[rt end];
If I change to black, I DO get half-transparent black:
[rt beginWithClear:0.0f g:0.0f b:0.0f a:0.5f];
With alpha 0.0f and green 1.0f I get opaque green, where I would have expected a fully transparent clear:
[rt beginWithClear:0.0f g:1.0f b:0.0f a:0.0f];
The real problem is that inside that render texture I can't draw transparent white from a fragment shader:
gl_FragColor = vec4(1.0, 1.0, 1.0, 0.5);
results in completely opaque white.
Any idea?

Try setting a standard (non-premultiplied) blend function on the render texture's sprite:
[renderTexture.sprite setBlendFunc:(ccBlendFunc){GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA}];
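The reason this helps: CCRenderTexture's sprite defaults to a premultiplied-alpha blend function ({GL_ONE, GL_ONE_MINUS_SRC_ALPHA}), which assumes the source color has already been multiplied by its alpha; the cleared white has not been, so it is added at full strength. Here is a minimal standalone C++ sketch (not cocos2d code; the struct and helper names are made up for illustration) that evaluates both blend functions for the question's case, white at 0.5 alpha over red:

#include <algorithm>
#include <cstdio>

struct RGBA { float r, g, b, a; };

// dest_new = src * srcFactor + dest * (1 - src.a), evaluated per channel.
// srcFactor = 1.0f models GL_ONE; srcFactor = src.a models GL_SRC_ALPHA.
static RGBA blendOver(RGBA src, RGBA dst, float srcFactor) {
    auto mix = [&](float s, float d) {
        return std::min(1.0f, s * srcFactor + d * (1.0f - src.a));
    };
    return { mix(src.r, dst.r), mix(src.g, dst.g),
             mix(src.b, dst.b), mix(src.a, dst.a) };
}

int main() {
    RGBA white = {1.0f, 1.0f, 1.0f, 0.5f}; // the render texture's cleared content
    RGBA red   = {1.0f, 0.0f, 0.0f, 1.0f}; // the CCLayerColor underneath

    RGBA pre      = blendOver(white, red, 1.0f);    // GL_ONE (the sprite's default)
    RGBA straight = blendOver(white, red, white.a); // GL_SRC_ALPHA

    std::printf("GL_ONE:       %.2f %.2f %.2f\n", pre.r, pre.g, pre.b);                // 1.00 1.00 1.00 -> solid white
    std::printf("GL_SRC_ALPHA: %.2f %.2f %.2f\n", straight.r, straight.g, straight.b); // 1.00 0.50 0.50 -> pink
}

With GL_ONE the full-strength white saturates every channel, which is exactly the "complete white" described above; with GL_SRC_ALPHA you get the expected 50/50 mix.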

Related

How to make white transparent

I am playing with OpenGL's blending. I have a white background and draw a quad on it. The quad is bound to a white-and-black checkerboard texture. The color array is filled with color and alpha values:
for (i = 0; i < IMAGE_WIDTH; i++) {
    for (j = 0; j < IMAGE_HEIGHT; j++) {
        c = ((((i & 0x8) == 0) ^ ((j & 0x8) == 0))) * 255;
        Image[i][j][0] = (GLubyte) c;
        Image[i][j][1] = (GLubyte) c;
        Image[i][j][2] = (GLubyte) c;
        Image[i][j][3] = (GLubyte) 255;
    }
}
Displaying the textured quad:
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);
glDisable(GL_DEPTH_TEST);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, texCheck);
glLoadIdentity();
glTranslatef(0.0f, 0.0f, -9.0f);
glBegin(GL_QUADS);
    glColor3f(1.0f, 1.0f, 1.0f);
    glTexCoord2f(0.0, 0.0); glVertex3f(-5.0f, -5.0f, 0.0f);
    glTexCoord2f(0.0, 1.0); glVertex3f( 5.0f, -5.0f, 0.0f);
    glTexCoord2f(1.0, 1.0); glVertex3f( 5.0f,  5.0f, 0.0f);
    glTexCoord2f(1.0, 0.0); glVertex3f(-5.0f,  5.0f, 0.0f);
glEnd();
glDisable(GL_TEXTURE_2D);
glEnable(GL_DEPTH_TEST);
glDisable(GL_BLEND);
The black color of the bound texture on the quad is invisible, which works fine. What should I do to make the white color of the bound texture transparent and the black color opaque?
Your blend function should be
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
Supplying GL_ONE as the second parameter means the destination color is taken at full value, without taking the source alpha into account, so no proper blending occurs.
You have to solve 2 issues:
If different texels of a texture should be blended differently, then you have to set different alpha channels. If, as in your case, black texels should be opaque and white texels invisible, then the alpha channel of a black texel has to be 255 and the alpha channel of a white texel has to be 0. Note that any alpha value between 0 and 255 causes a more or less strong transparency effect.
You have to set the blending function to glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA).
Blending calculates a new target color (the fragment color in the frame buffer) as a function of the original target color and the source color (in your case the color of the texel).
If you set glBlendFunc with the factors (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) and use glBlendEquation with the equation GL_FUNC_ADD (the default), then the destination color is calculated as follows:
C_dest_new = C_src * A_src + C_dest * (1 - A_src)
If the alpha channel is equal to 0.0:
C_dest_new = C_src * 0 + C_dest * (1-0) = C_dest
If the alpha channel is equal to 1.0 (255):
C_dest_new = C_src * 1 + C_dest * (1-1) = C_src
Extension to the answer
To set the alpha channel of white texels to 0, the code that generates the texture has to be changed like this:
for (i = 0; i < IMAGE_WIDTH; i++) {
    for (j = 0; j < IMAGE_HEIGHT; j++) {
        c = ((((i & 0x8) == 0) ^ ((j & 0x8) == 0))) * 255;
        Image[i][j][0] = (GLubyte) c;
        Image[i][j][1] = (GLubyte) c;
        Image[i][j][2] = (GLubyte) c;
        Image[i][j][3] = (GLubyte) (255 - c); // white (c == 255) -> alpha 0, black (c == 0) -> alpha 255
    }
}
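Note that the per-texel alpha only survives if the texture is uploaded with an alpha-capable format. A minimal upload sketch, assuming the Image array and the texCheck texture object from the question (the filter choice is arbitrary):

glBindTexture(GL_TEXTURE_2D, texCheck);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
// GL_RGBA as internal and pixel format keeps the alpha channel; GL_RGB would drop it.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, IMAGE_WIDTH, IMAGE_HEIGHT, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, Image);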

OpenGL Fade in / fade out a texture2D

I am writing an application that displays .jpg files stored as Texture2D (RGB) in OpenGL. I want to change smoothly from one texture to the next by fading to black, then fading in the next texture.
After looking for some explanation I wrote something like this:
void renderTexture()
{
    glMatrixMode(GL_MODELVIEW);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, mTexture);
    gluSphere(mQuad, 1.0f, 50, 50);
    glBindTexture(GL_TEXTURE_2D, 0);
}

void fadeToBlack()
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
    for (GLfloat alpha = 1.0; alpha > 0.0; alpha -= 0.05)
    {
        glColor4f(0.0, 0.0, 0.0, alpha);
        renderTexture();
        glFlush();
        glutSwapBuffers();
    }
    glDisable(GL_BLEND);
}
Unfortunately, this does not fade to black but switches to black immediately. I must have some misunderstanding of how GL_BLEND works here. Can somebody please point out what I am doing wrong?
** EDIT: This did the trick. Thanks a lot j-p and Benjamin for the pointers **
void fadeToBlack()
{
    for (GLfloat alpha = 1.0; alpha > 0.0; alpha -= 0.001)
    {
        renderTexture();
        glColor4f(alpha, alpha, alpha, alpha);
        glFlush();
        glutSwapBuffers();
    }
    glColor4f(1.0, 1.0, 1.0, 1.0);
}
The for loop executes so quickly that the texture changes appear to happen instantly.
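A more robust variant drives the alpha from elapsed time and renders one frame per display callback instead of looping and swapping inside a single call. A minimal GLUT sketch reusing renderTexture from the question (the two-second duration and the startFadeToBlack helper are made up):

#include <GL/glut.h>

void renderTexture(); // as defined in the question

static int fadeStartMs = -1; // -1 means no fade in progress

void display()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    GLfloat alpha = 1.0f;
    if (fadeStartMs >= 0) {
        // Ramp from 1 down to 0 over two seconds, clamped at 0.
        float t = (glutGet(GLUT_ELAPSED_TIME) - fadeStartMs) / 2000.0f;
        alpha = (t >= 1.0f) ? 0.0f : 1.0f - t;
    }
    glColor4f(alpha, alpha, alpha, alpha); // modulates the texture toward black
    renderTexture();
    glutSwapBuffers();
    glutPostRedisplay(); // request the next frame
}

void startFadeToBlack()
{
    fadeStartMs = glutGet(GLUT_ELAPSED_TIME);
}

Because each swap happens once per display callback, the fade is spread over real frames instead of being burned through inside one call.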

DX11 Alpha blending when rendering to a texture

FINAL EDIT:
Resolved... just needed to learn how alpha blending works in-depth. I should have had:
oBlendStateDesc.RenderTarget[a].DestBlendAlpha = D3D11_BLEND_ZERO;
...set to D3D11_BLEND_ONE to preserve the alpha.
When rendering once to the backbuffer, the problem is not noticeable, since the colours blend normally and that is the final output. When rendering to a texture, the same blending applies, but when that texture is then rendered to the backbuffer, the incorrect alpha incorrectly blends the texture into the backbuffer.
I then ran into another issue where the alpha seemed to be decreasing. That is because the colour gets blended twice. For example...
Source.RGBA = 1.0f, 0.0f, 0.0f, 0.5f
Dest.RGBA = 0.0f, 0.0f, 0.0f, 0.0f
Render into texture...
Result.RGB = Source.RGB * Source.A + Dest.RGB * (1 - Source.A) = 0.5f, 0.0f, 0.0f
Result.A = Source.A * 1 + Dest.A * 1 = 0.5f
Now...
Source.RGBA = 0.5f, 0.0f, 0.0f, 0.5f
Dest.RGBA = 0.0f, 0.0f, 0.0f, 0.0f
Render into backbuffer...
Result.RGB = Source.RGB * Source.A + Dest.RGB * (1 - Source.A) = 0.25f, 0.0f, 0.0f
Result.A = Source.A * 1 + Dest.A * 1 = 0.5f
To resolve this, when rendering the texture into the backbuffer I use the same blend state but change SrcBlend to D3D11_BLEND_ONE, so the colour is not multiplied by alpha twice (the texture's colour is effectively already premultiplied).
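In code, that second blend state might look like this (a sketch reconstructed from the description above; the variable names are made up, and only SrcBlend plus the already-fixed DestBlendAlpha differ from the original state):

// Blend state for compositing the premultiplied texture into the backbuffer.
D3D11_BLEND_DESC oComposeDesc = {};
oComposeDesc.RenderTarget[0].BlendEnable = TRUE;
oComposeDesc.RenderTarget[0].SrcBlend = D3D11_BLEND_ONE; // colour is already multiplied by alpha
oComposeDesc.RenderTarget[0].DestBlend = D3D11_BLEND_INV_SRC_ALPHA;
oComposeDesc.RenderTarget[0].BlendOp = D3D11_BLEND_OP_ADD;
oComposeDesc.RenderTarget[0].SrcBlendAlpha = D3D11_BLEND_ONE;
oComposeDesc.RenderTarget[0].DestBlendAlpha = D3D11_BLEND_ONE; // preserve the target's alpha
oComposeDesc.RenderTarget[0].BlendOpAlpha = D3D11_BLEND_OP_ADD;
oComposeDesc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

ID3D11BlendState* poComposeState = nullptr;
HRESULT hr = m_poDevice->CreateBlendState(&oComposeDesc, &poComposeState);
m_poDeviceContext->OMSetBlendState(poComposeState, nullptr, 0xffffffff);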
Hopefully this helps anyone else having a similar problem....
EDITEND
To increase performance I'm attempting to render a string of text that never changes into a texture, to avoid rendering each individual character every frame.
Since I'm rendering strictly in 2D, I've disabled the depth & stencil testing while enabling alpha blending.
The problem is that there doesn't seem to be any alpha blending happening; whatever is drawn last overwrites the current pixel with its own data... no blending.
I use a single blend state which I do not change. When rendering to the backbuffer the blending works fine. When rendering the final texture to the backbuffer the blending also works fine. It's just when I render to the texture that blending seems to fail.
Here's how I set up my single blend state:
D3D11_BLEND_DESC oBlendStateDesc;
oBlendStateDesc.AlphaToCoverageEnable = 0;
oBlendStateDesc.IndependentBlendEnable = 0; // false, so the loop below isn't strictly needed... but just in case
for (unsigned int a = 0; a < 8; ++a)
{
    oBlendStateDesc.RenderTarget[a].BlendEnable = 1;
    oBlendStateDesc.RenderTarget[a].SrcBlend = D3D11_BLEND_SRC_ALPHA;
    oBlendStateDesc.RenderTarget[a].DestBlend = D3D11_BLEND_INV_SRC_ALPHA;
    oBlendStateDesc.RenderTarget[a].BlendOp = D3D11_BLEND_OP_ADD;
    oBlendStateDesc.RenderTarget[a].SrcBlendAlpha = D3D11_BLEND_ONE;
    oBlendStateDesc.RenderTarget[a].DestBlendAlpha = D3D11_BLEND_ZERO;
    oBlendStateDesc.RenderTarget[a].BlendOpAlpha = D3D11_BLEND_OP_ADD;
    oBlendStateDesc.RenderTarget[a].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;
}
// Create the blend state from the description
HResult = m_poDevice->CreateBlendState(&oBlendStateDesc, &m_poBlendState_Default);
m_poDeviceContext->OMSetBlendState(m_poBlendState_Default, nullptr, 0xffffffff);
Are there any extra steps I am missing to enable blending when rendering to a texture?
EDIT: If I set AlphaToCoverageEnable to true it blends, but looks terrible. That at least confirms it is using the same blend state... it just works differently depending on whether I render to the backbuffer or to a texture :/ Here's my texture desc...
m_oTexureDesc.Width = a_oDesc.m_uiWidth;
m_oTexureDesc.Height = a_oDesc.m_uiHeight;
m_oTexureDesc.MipLevels = 1;
m_oTexureDesc.ArraySize = 1;
m_oTexureDesc.Format = DXGI_FORMAT_R32G32B32A32_FLOAT;
m_oTexureDesc.SampleDesc.Count = 1; // No multisampling
m_oTexureDesc.SampleDesc.Quality = 0;
m_oTexureDesc.Usage = D3D11_USAGE_DEFAULT; //GPU writes & reads
m_oTexureDesc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
m_oTexureDesc.CPUAccessFlags = 0;
m_oTexureDesc.MiscFlags = 0;
EDIT:
Here's some visualization (screenshots in the original post, numbered in order):
1. Rendering to backbuffer - AlphaBlending enabled.
2. Rendering to texture - AlphaBlending enabled.
3. Rendering to backbuffer - AlphaBlending disabled.
4. Letter T taken from the font file.
* When rendering with AB disabled, the letters match exactly (compare 4 & 3).
* When rendering to the backbuffer with AB enabled, the letters render slightly (hardly noticeably) washed out but still blend (compare 4 & 1).
* When rendering to a texture with AB enabled, the letters render even more noticeably washed out while not blending at all (compare 4 & 2).
Not sure why the colours are washed out with alpha blending enabled... but maybe it's a clue?
EDIT:
If I clear the render-target texture to, say, 0.0f, 0.0f, 1.0f, 1.0f (RGBA, blue)... this is the result:
Only the pixels with alpha > 0.0f and < 1.0f blend with the colour. Another clue, but I have no idea how to resolve this issue...

Making a hole on a CCRenderTexture

I am trying to make a hole on a CCRenderTexture with Cocos2D 2.0.
More specifically I have:
a CCSprite "stars" that shows some stars repeating a png image;
on top of that I have a CCRenderTexture "dark" that completely covers the "stars" sprite.
I want to be able to cut a hole on "dark" in order to show the stars below.
I am using CCRenderTexture (as suggested in some tutorials), but the hole I manage to make is never fully transparent, so the stars visible through the hole are partially obscured.
I really hope someone can show me the light; I've spent over a week on this...
This is the code:
@interface MyBackgroundLayer : CCLayerGradient {
}
@end

@implementation MyBackgroundLayer

CCRenderTexture *dark;
CCSprite *stars;

- (id)init
{
    if (self = [super init]) {
        CGSize size = [[CCDirector sharedDirector] winSize];

        // background
        stars = [CCSprite spriteWithFile:@"stars.png" rect:CGRectMake(0, 0, size.width, size.height)];
        stars.anchorPoint = ccp(0, 0);
        ccTexParams params = {GL_LINEAR, GL_LINEAR, GL_REPEAT, GL_REPEAT};
        [stars.texture setTexParameters:&params];
        [self addChild:stars];

        // dark layer to cover the background
        dark = [CCRenderTexture renderTextureWithWidth:size.width height:size.height];
        [dark clear:0 g:0 b:0 a:1.0f];
        [self addChild:dark];
        dark.position = ccp(size.width / 2, size.height / 2);
        [[dark sprite] setBlendFunc:(ccBlendFunc){GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA}];
    }
    return self;
}
- (void)draw
{
    [super draw];

    [dark beginWithClear:0 g:0 b:0 a:1];
    glColorMask(0, 0, 0, 1);
    // Here I am using 0.5 as the alpha value, which could seem to be the reason the hole
    // is not fully transparent. However, if I change the alpha to 1.0f or 0.0f
    // the hole becomes completely opaque.
    ccColor4F color = ccc4f(1.f, 1.f, 1.f, 0.5f);
    ccDrawSolidRect(ccp(0, 0), ccp(600, 600), color);
    glColorMask(1, 1, 1, 1);
    [dark end];
}

@end
I think what you're looking for is something like this (may need some editing).
As for your code... I think the problem is in the blend function. Check this out to see how blend functions work.
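One common approach (a sketch of the general GL technique, not tested cocos2d code) is to keep the cover opaque and erase alpha where the hole should be: draw the hole shape into the render texture with a blend function whose source factor is GL_ZERO, so it only scales the destination down by the shape's alpha.

// Inside the render texture's begin/end pair, after clearing to opaque black:
glEnable(GL_BLEND);
// dest_new = src * 0 + dest * (1 - src.a): drawing a shape with alpha 1.0
// multiplies the covered pixels by 0, leaving them fully transparent.
glBlendFunc(GL_ZERO, GL_ONE_MINUS_SRC_ALPHA);
// ... draw the hole shape (e.g. a solid rect or disc) with alpha 1.0 ...
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); // restore normal blending

In cocos2d 2.x you may want ccGLBlendFunc instead of raw glBlendFunc so the state cache stays in sync. Combined with the {GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA} blend func already set on dark's sprite, the erased pixels let the stars underneath show through untouched.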

What is wrong with this text color?

I'm using the openFrameworks ofxPango addon to render text with the following code:
ofxPango* pango;
ofxPCContext* context;
ofxPCPangoLayout* layout;
ofImage text_image;
pango = new ofxPango();
context = pango->createContextWithSurface(width, height);
context->color4f(1,1,1, 0.0f);
context->paint();
layout = context->createPangoLayout();
layout->setText(text);
layout->setTextColor(186,34,29, 1.0f);
layout->setWidth(width);
layout->setJustify(true);
//context->paint();
ofxPCPangoFontDescription* fd = new ofxPCPangoFontDescription();
fd->createFromString(font);
layout->setFontDescription(*fd);
layout->show();
text_image.allocate(context->getSurface()->getWidth(), context->getSurface()->getHeight(), OF_IMAGE_COLOR_ALPHA);
text_image.setFromPixels(context->getSurface()->getPixels(), text_image.width, text_image.height, OF_IMAGE_COLOR_ALPHA, true);
I'm having trouble understanding how layout->setTextColor(r,g,b,a) works.
If I run:
0,0,0,1 - text is black as it should be
255,0,0 - text is red as it should be
186,34,29,1 - text appears very light gray (maybe white) when it should be red
186,34,0,1 - text is yellow, although it should be red
Why are these colors coming out wrong?
I think the color values are supposed to be within the range 0.0f to 1.0f, where:
0.0f = no color
0.5f = half color
1.0f = full color
Values above 1.0f get clamped, which explains your results: 186,34,29 clamps to 1,1,1 (white-ish), 186,34,0 clamps to 1,1,0 (yellow), and 255,0,0 only looked right because it clamps to 1,0,0 (red).
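So the 0-255 values most likely just need dividing by 255 first. Assuming setTextColor accepts normalized floats (which the black and red results above suggest), something like:

// 186, 34, 29 out of 255, normalized to the 0.0f - 1.0f range:
layout->setTextColor(186 / 255.0f, 34 / 255.0f, 29 / 255.0f, 1.0f);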
Here are some abridged examples from the Cairo library, which ofxPango calls out to:
color_white: 1.0, 1.0, 1.0, 1.0
color_black: 0.0, 0.0, 0.0, 1.0
color_transparent: 0.0, 0.0, 0.0, 0.0
color_magenta: 1.0, 0.0, 1.0, 1.0