I have a 3D WebGL scene. I am using regl (http://regl.party/), which is a thin wrapper over WebGL, so I am essentially writing straight GLSL.
This is a game project. I have an array of 3D positions [[x, y, z], ...] which are bullets, or projectiles. I want to draw these bullets as a simple cube, sphere, or particle; there is no requirement on the appearance.
How can I make shaders and a draw call for this without having to create a repeated duplicate set of geometry for the bullets?
I would prefer an answer with a vertex and fragment shader example that demonstrates the expected data inputs and can be reverse-engineered to handle the CPU binding layer.
You create a regl command, which encapsulates a bunch of data. You can then call it with an object.
Each uniform can take an optional function to supply its value. That function is passed a regl context as the first argument and the object you passed to the command as the second argument, so you can call the command multiple times with different objects to draw the same thing (same vertices, same shader) somewhere else.
var regl = createREGL()
const objects = [];
const numObjects = 100;
for (let i = 0; i < numObjects; ++i) {
objects.push({
x: rand(-1, 1),
y: rand(-1, 1),
speed: rand(.5, 1.5),
direction: rand(0, Math.PI * 2),
color: [rand(0, 1), rand(0, 1), rand(0, 1), 1],
});
}
function rand(min, max) {
return Math.random() * (max - min) + min;
}
const starPositions = [[0, 0, 0]];
const starElements = [];
const numPoints = 5;
for (let i = 0; i < numPoints; ++i) {
for (let j = 0; j < 2; ++j) {
const a = (i * 2 + j) / (numPoints * 2) * Math.PI * 2;
const r = 0.5 + j * 0.5;
starPositions.push([
Math.sin(a) * r,
Math.cos(a) * r,
0,
]);
}
starElements.push([
0, 1 + i * 2, 1 + i * 2 + 1,
]);
}
const drawStar = regl({
frag: `
precision mediump float;
uniform vec4 color;
void main () {
gl_FragColor = color;
}`,
vert: `
precision mediump float;
attribute vec3 position;
uniform mat4 mat;
void main() {
gl_Position = mat * vec4(position, 1);
}`,
attributes: {
position: starPositions,
},
elements: starElements,
uniforms: {
mat: (ctx, props) => {
const {viewportWidth, viewportHeight} = ctx;
const {x, y} = props;
const aspect = viewportWidth / viewportHeight;
return [.1 / aspect, 0, 0, 0,
0, .1, 0, 0,
0, 0, 0, 0,
x, y, 0, 1];
},
color: (ctx, props) => props.color,
}
})
regl.frame(function () {
regl.clear({
color: [0, 0, 0, 1]
});
objects.forEach((o) => {
o.direction += rand(-0.1, 0.1);
o.x += Math.cos(o.direction) * o.speed * 0.01;
o.y += Math.sin(o.direction) * o.speed * 0.01;
o.x = (o.x + 3) % 2 - 1;
o.y = (o.y + 3) % 2 - 1;
drawStar(o);
});
})
<script src="https://cdnjs.cloudflare.com/ajax/libs/regl/1.3.11/regl.min.js"></script>
You can draw all of the bullets as point sprites, in which case you just need to provide the position and size of each bullet and draw them as GL_POINTS. Each “point” is rasterized to a square based on the output of your vertex shader (which runs once per point). Your fragment shader is called for each fragment in that square and can color the fragment however you want—with a flat color, by sampling a texture, or anything else.
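In regl, the point-sprite route might look something like this (a sketch of my own, untested; `positions` stands in for your [[x, y, z], ...] array, and the view-projection matrix is a placeholder you would supply):
const drawBulletPoints = regl({
  vert: `
  precision mediump float;
  attribute vec3 position;
  uniform mat4 viewProjection;
  void main() {
    gl_Position = viewProjection * vec4(position, 1);
    gl_PointSize = 8.0; // side of the rasterized square, in pixels
  }`,
  frag: `
  precision mediump float;
  void main() {
    // make the square a disc: drop fragments outside the unit circle
    if (distance(gl_PointCoord, vec2(0.5)) > 0.5) discard;
    gl_FragColor = vec4(1.0, 0.6, 0.1, 1.0);
  }`,
  attributes: {
    position: regl.prop('positions'),
  },
  uniforms: {
    viewProjection: regl.prop('viewProjection'),
  },
  count: (ctx, props) => props.positions.length,
  primitive: 'points',
});
// one call draws every bullet:
// drawBulletPoints({positions: bullets, viewProjection: matrix});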
Or you can provide a single model for all bullets, a separate transform for each bullet, and draw them as instanced GL_TRIANGLES or GL_TRIANGLE_STRIP or whatever. Read about instancing on the OpenGL wiki.
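With regl, instancing needs the ANGLE_instanced_arrays extension. A minimal sketch (my own, untested) in which every bullet reuses one shared cube and contributes only a per-instance offset:
const regl = createREGL({extensions: ['angle_instanced_arrays']});

// hypothetical bullet data: your [[x, y, z], ...] array
const bulletPositions = [[0, 0, 0], [0.3, 0.1, -0.2]];
const bulletBuffer = regl.buffer(bulletPositions);

// one shared cube, uploaded once
const cubeCorners = [
  [-1, -1, -1], [+1, -1, -1], [+1, +1, -1], [-1, +1, -1],
  [-1, -1, +1], [+1, -1, +1], [+1, +1, +1], [-1, +1, +1],
];
const cubeElements = [
  [0, 2, 1], [0, 3, 2], [4, 5, 6], [4, 6, 7],
  [0, 1, 5], [0, 5, 4], [1, 2, 6], [1, 6, 5],
  [2, 3, 7], [2, 7, 6], [3, 0, 4], [3, 4, 7],
];

const drawBullets = regl({
  vert: `
  precision mediump float;
  attribute vec3 position; // per-vertex: the shared cube
  attribute vec3 offset;   // per-instance: one bullet's position
  uniform mat4 viewProjection;
  void main() {
    gl_Position = viewProjection * vec4(position * 0.05 + offset, 1);
  }`,
  frag: `
  precision mediump float;
  void main() {
    gl_FragColor = vec4(1.0, 0.8, 0.2, 1.0);
  }`,
  attributes: {
    position: cubeCorners,
    // divisor: 1 makes this attribute advance once per instance
    offset: {buffer: bulletBuffer, divisor: 1},
  },
  elements: cubeElements,
  instances: bulletPositions.length,
  uniforms: {
    viewProjection: regl.prop('viewProjection'),
  },
});
// per frame: re-upload positions if the bullets moved, then one draw call
// bulletBuffer(bulletPositions);
// drawBullets({viewProjection: matrix});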
Not a WebGL coder so read with prejudice...
1. Encode the vertexes in a texture

   Beware of clamping: use a texture format that does not clamp to <0.0, +1.0>, such as GL_LUMINANCE32F_ARB, or use vertexes in that range only. To check for clamping, see: GLSL debug prints

2. Render a single rectangle covering the whole screen

   Use the texture from #1 as input. This ensures that the fragment shader is called exactly once for each pixel of the screen/view.

3. Inside the fragment shader, read the texture and check the distance of the fragment to your vertexes

   Based on that, render your stuff or discard() the fragment (a rough sketch follows this list). Spheres are easy, but boxes and other shapes might be complicated to render based on vertex distance, especially if they can be arbitrarily oriented (which needs additional info in the input texture). To ease this you can prerender them into some texture and use the distance as texture coordinates...
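As a concrete (if simplified) illustration of step 3, here is a fragment-shader sketch of my own, untested, which assumes bullet centers are packed one per texel into a 1024-wide float texture `uBullets`, tests a 2D distance only, and gets the fragment's position passed through from the vertex shader as `vPos`:
precision mediump float;
uniform sampler2D uBullets; // bullet centers, one per texel, row 0
uniform float uCount;       // how many bullets the texture actually holds
uniform float uRadius;      // bullet radius, in the same space as vPos
varying vec2 vPos;          // this fragment's position in that space

void main() {
  bool hit = false;
  // WebGL1 loops need a constant bound, so break out early instead
  for (int i = 0; i < 1024; ++i) {
    if (float(i) >= uCount) break;
    vec2 uv = vec2((float(i) + 0.5) / 1024.0, 0.5);
    vec2 center = texture2D(uBullets, uv).xy;
    if (distance(vPos, center) < uRadius) { hit = true; break; }
  }
  if (!hit) discard; // not near any bullet
  gl_FragColor = vec4(1.0, 0.8, 0.2, 1.0);
}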
This answer of mine is using this technique:
raytrace through 3D mesh
You can sometimes get away with using GL_POINTS with a large gl_PointSize and a customized fragment shader.
The example shown here uses the distance to the point center for the fragment alpha. (You could just as well sample a texture.)
The support for large point sizes might be limited though, so check that before deciding on this route.
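You can query the range a given context supports; the spec only guarantees a maximum point size of 1.0. (The snippet assumes a WebGL context like the `gl` created below.)
var range = gl.getParameter(gl.ALIASED_POINT_SIZE_RANGE);
console.log('supported point sizes:', range[0], 'to', range[1]);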
var canvas = document.getElementById('cvs');
var gl = canvas.getContext('webgl');
var vertices = [
-0.5, 0.75,0.0,
0.0, 0.5, 0.0,
-0.75,0.25,0.0,
];
var vertex_buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vertex_buffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);
gl.bindBuffer(gl.ARRAY_BUFFER, null);
var vertCode =
`attribute vec3 coord;
void main(void) {
gl_Position = vec4(coord, 1.0);
gl_PointSize = 50.0;
}`;
var vertShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertShader, vertCode);
gl.compileShader(vertShader);
var fragCode =
`void main(void) {
mediump float ds = distance(gl_PointCoord.xy, vec2(0.5,0.5))*2.0;
mediump vec4 fg_color=vec4(0.0, 0.0, 0.0,1.0- ds);
gl_FragColor = fg_color;
}`;
var fragShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragShader, fragCode);
gl.compileShader(fragShader);
var shaderProgram = gl.createProgram();
gl.attachShader(shaderProgram, vertShader);
gl.attachShader(shaderProgram, fragShader);
gl.linkProgram(shaderProgram);
gl.useProgram(shaderProgram);
gl.bindBuffer(gl.ARRAY_BUFFER, vertex_buffer);
var coord = gl.getAttribLocation(shaderProgram, "coord");
gl.vertexAttribPointer(coord, 3, gl.FLOAT, false, 0, 0);
gl.enableVertexAttribArray(coord);
gl.viewport(0,0,canvas.width,canvas.height);
gl.drawArrays(gl.POINTS, 0, 3);
<!doctype html>
<html>
<body>
<canvas width = "400" height = "400" id = "cvs"></canvas>
</body>
</html>
Related
I've been trying to produce glossy shading using the Phong model, but instead of a glossy appearance all I get is a big white blotch on the front of the sphere. Initially the model worked for a single sphere, but since I updated the code to draw multiple spheres the model has started to fail, despite applying the same logic, and I don't know why.
(Screenshots: single sphere, diffuse and specular; multiple spheres, diffuse only; multiple spheres, diffuse + specular.)
The main part:
vec color(const ray& r)
{
vector <sphere> objects;
vector <Light> lighting;
objects.push_back(sphere(vec(0,-100.5,-3), 100, vec(0, 1, 0)));
objects.push_back(sphere(vec(0, 0, -1), 0.5, vec(1, 0, 0)));
objects.push_back(sphere(vec(0, 1 ,-1), 0.5, vec(1, 0, 1)));
lighting.push_back(Light(vec(0, 0, -1), vec(0, -1, 0)));
float infinity = 2000.0;
sphere* closest = NULL;
vec background_color( .678, .847, .902);
vec totalLight(0.0, 0.0, 0.0);
int pos = 0;
for(int j = 0; j < objects.size(); j++)
{
float t = objects[j].intersect(r);
if(t > 0.0)
{
if(t < 2000.0)
{
infinity = t;
closest = &objects[j];
pos = j;
}
}
}
if(infinity == 2000.0)
return background_color;
else
{
float a = objects[pos].intersect(r);
vec view_dir = vec(-2, 2, 10) - r.p_at_par(a);
vec normal = unit_vector((r.p_at_par(a) - closest->centre)/closest->radius);
vec light = unit_vector(vec(-2, 0, 0) - r.p_at_par(a));
vec reflection = 2.0*dot(light, normal)*normal - light;
vec specular = vec(1, 1, 1)*pow(max(0.f, dot(reflection, view_dir)), 256);
vec diffuse = (closest->color)*max(0.f, dot(normal, light));
vec total = diffuse + specular;
return total;
}
}
As I understand it, specular = white * dot(reflection, view_dir)^n * ks, and the total lighting is specular + diffuse + ambient.
You are indeed right about your specular contribution. You can see it as how much light is reflected toward your viewing direction.
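For reference, with every vector unit length, the terms the code above computes are:
specular = white * ks * max(0, dot(R, V))^n, with R = 2 * dot(N, L) * N - L
diffuse = surfaceColor * kd * max(0, dot(N, L))
total = ambient + diffuse + specular
(the code above effectively uses ks = kd = 1 and n = 256)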
First of all, I don't see you normalising view_dir. Make sure all vectors are normalised: if a and b both have length 1, then dot(a, b) = cos(theta), which is what the lighting terms assume.
Also, to help debugging in the future, you may want to generate false-colour images. These images can help you see what's going on: e.g. render just the flat colour, the surface normals (map xyz in [-1, 1] to rgb via n * 0.5 + 0.5), the number of light sources affecting a certain pixel, and so on. This may help you spot unexpected behaviours.
Hope this helps.
I'm trying to iterate over a large amount of data in my fragment shader in WebGL. I want to pass a lot of data to it and then iterate over that data on each pass of the fragment shader. I'm having some issues doing that, though. My ideas were the following:
1. Pass the data in uniforms to the frag shader, but I can't send very much data that way.
2. Use a buffer to send data, as I do with verts to the vert shader, and then use a varying to send data to the frag shader. Unfortunately this seems to involve some issues: (a) varyings interpolate between vertices, and I think that'll cause issues with my code (although perhaps this is unavoidable); (b) more importantly, I don't know how to iterate over the data I pass to my fragment shader. I'm already using a buffer for my 3D point coordinates, but how does WebGL handle a second buffer and the data coming through it?
That is: in what order is data fetched from each buffer (my first buffer containing 3D coordinates, and the second buffer I'm trying to add)? And lastly, as stated above, if I want to iterate over all the data passed for every pass of the fragment shader, how can I do that?
I've already tried using a uniform array and iterating over that in my fragment shader, but I ran into limitations; I believe there is a relatively small size limit for uniforms. I'm currently trying the second method mentioned above.
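(For what it's worth, that limit can be queried at runtime, and is often only a few hundred vec4s for fragment shaders; the snippet assumes an existing `gl` context.)
const maxVec4 = gl.getParameter(gl.MAX_FRAGMENT_UNIFORM_VECTORS);
console.log('max fragment uniform vec4s:', maxVec4); // spec minimum is 16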
//pseudo code
vertexCode = `
attribute vec4 3dcoords;
varying vec4 3dcoords;
??? ??? my_special_data;
void main(){...}
`
fragCode = `
varying vec4 3dcoords;
void main(){
...
// perform math operation on 3dcoords for all values in my_special_data variable and store in variable my_results
if( my_results ... ){
gl_FragColor += ...;
}
`
Textures in WebGL are random-access 2D arrays of data, so you can use them to read lots of data.
Example:
const width = 256;
const height = 256;
const vs = `
attribute vec4 position;
void main() {
gl_Position = position;
}
`;
const fs = `
precision highp float;
uniform sampler2D tex;
const int width = ${width};
const int height = ${height};
void main() {
vec4 sums = vec4(0);
for (int y = 0; y < height; ++y) {
for (int x = 0; x < width; ++x) {
vec2 xy = (vec2(x, y) + 0.5) / vec2(width, height);
sums += texture2D(tex, xy);
}
}
gl_FragColor = sums;
}
`;
function main() {
const gl = document.createElement('canvas').getContext('webgl');
// check if we can make floating point textures
const ext1 = gl.getExtension('OES_texture_float');
if (!ext1) {
return alert('need OES_texture_float');
}
// check if we can render to floating point textures
const ext2 = gl.getExtension('WEBGL_color_buffer_float');
if (!ext2) {
return alert('need WEBGL_color_buffer_float');
}
// make a 1x1 pixel floating point RGBA texture and attach it to a framebuffer
const framebufferInfo = twgl.createFramebufferInfo(gl, [
{ type: gl.FLOAT, },
], 1, 1);
// make random 256x256 texture
const data = new Uint8Array(width * height * 4);
for (let i = 0; i < data.length; ++i) {
data[i] = Math.random() * 256;
}
const tex = twgl.createTexture(gl, {
src: data,
minMag: gl.NEAREST,
wrap: gl.CLAMP_TO_EDGE,
});
// compile shaders, link, lookup locations
const programInfo = twgl.createProgramInfo(gl, [vs, fs]);
// create a buffer and put a 2 unit
// clip space quad in it using 2 triangles
const bufferInfo = twgl.createBufferInfoFromArrays(gl, {
position: {
numComponents: 2,
data: [
-1, -1,
1, -1,
-1, 1,
-1, 1,
1, -1,
1, 1,
],
},
});
// render to the 1 pixel texture
gl.bindFramebuffer(gl.FRAMEBUFFER, framebufferInfo.framebuffer);
// set the viewport for 1x1 pixels
gl.viewport(0, 0, 1, 1);
gl.useProgram(programInfo.program);
// calls gl.bindBuffer, gl.enableVertexAttribArray, gl.vertexAttribPointer
twgl.setBuffersAndAttributes(gl, programInfo, bufferInfo);
// calls gl.activeTexture, gl.bindTexture, gl.uniformXXX
twgl.setUniforms(programInfo, {
tex,
});
const offset = 0;
const count = 6;
gl.drawArrays(gl.TRIANGLES, offset, count);
// read the result
const pixels = new Float32Array(4);
gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.FLOAT, pixels);
console.log('webgl sums:', pixels);
const sums = new Float32Array(4);
for (let i = 0; i < data.length; i += 4) {
for (let j = 0; j < 4; ++j) {
sums[j] += data[i + j] / 255;
}
}
console.log('js sums:', sums);
}
main();
<script src="https://twgljs.org/dist/4.x/twgl-full.min.js"></script>
I am starting to learn OpenGL now, using the Go programming language (I just couldn't get C/C++ working on my Windows machine). So far I've managed to display some rotating cubes on the screen with textures, mainly by copying and pasting code from tutorials. I've learned a lot that way, but I just can't get some text on the screen with this code that I wrote on my own. I've looked at many tutorials and questions, but nothing seems to work, and I suspect there is something wrong with the vertices, because I'm pretty sure the texture coordinates are correct and still nothing shows up on the screen. Here's the code:
package game
import (
"fmt"
"io/ioutil"
"image"
"image/draw"
"github.com/go-gl/gl/v3.3-core/gl"
mgl "github.com/go-gl/mathgl/mgl32"
"github.com/golang/freetype/truetype"
"golang.org/x/image/font"
"golang.org/x/image/math/fixed"
)
type GameFont struct {
loaded bool
vao uint32
vbo VBOData
pix float32
Texture *Texture
Shader ShaderProgram
}
// Load a TrueType font from a file and generate a texture
// with all important characters.
func (f *GameFont) Load(path string, pix float32) {
contents, err := ioutil.ReadFile(path)
if err != nil {
fmt.Println("Could not read font file: " + path)
panic(err)
}
fontFace, err := truetype.Parse(contents)
if err != nil {
fmt.Println("Could not parse font file: " + path)
panic(err)
}
// Create a texture for the characters
// Find the next power of 2 for the texture size
size := nextP2(int(pix * 16))
fg, bg := image.White, image.Black
rgba := image.NewRGBA(image.Rect(0, 0, size, size))
draw.Draw(rgba, rgba.Bounds(), bg, image.ZP, draw.Src)
d := &font.Drawer{
Dst: rgba,
Src: fg,
Face: truetype.NewFace(fontFace, &truetype.Options{
Size: float64(pix),
DPI: 72,
Hinting: font.HintingNone,
}),
}
// Some GL preps
gl.GenVertexArrays(1, &f.vao)
gl.BindVertexArray(f.vao)
f.vbo.Create()
f.vbo.Bind()
f.Shader = newShaderProgram("data/shaders/font.vert", "data/shaders/font.frag")
f.Shader.Use()
f.Shader.SetUniform("tex", 0)
// Create vertex data (and coordinates in the texture) for each character
// All characters below 32 are useless
for i := 32; i < 128; i++ {
c := string(rune(i))
x, y := i % 16, i / 16
// Draw the character on the texture
d.Dot = fixed.P(x * int(pix), y * int(pix))
d.DrawString(c)
// Vertices
quads := []float32{
0, 0,
0, pix,
pix, 0,
pix, pix,
}
norm := func(n int) float32 {
return float32(n) / 16.0
}
// Texture coordinates (normalized)
texQuads := []float32{
norm(x), 1.0 - norm(y + 1),
norm(x), 1.0 - norm(y),
norm(x + 1), 1.0 - norm(y + 1),
norm(x + 1), 1.0 - norm(y),
}
for v := 0; v < 8; v += 2 {
vQuads, vTexQuads := quads[v:(v+2)], texQuads[v:(v+2)]
// Data is like (X, Y, U, V)
f.vbo.AppendData(vQuads, 2)
f.vbo.AppendData(vTexQuads, 2)
}
}
// Upload data to GPU and we're done
f.Texture = newTextureFromRGBA(rgba)
f.Texture.Bind()
f.Texture.SetGLParam(gl.TEXTURE_MIN_FILTER, gl.LINEAR)
f.Texture.SetGLParam(gl.TEXTURE_MAG_FILTER, gl.LINEAR)
f.Texture.Upload()
f.vbo.UploadData(gl.STATIC_DRAW)
gl.EnableVertexAttribArray(0)
gl.VertexAttribPointer(0, 2, gl.FLOAT, false, 4*4, gl.PtrOffset(0))
gl.EnableVertexAttribArray(1)
gl.VertexAttribPointer(1, 2, gl.FLOAT, false, 4*4, gl.PtrOffset(2*4))
f.loaded = true
}
// Render a text using the font
func (f *GameFont) Render(text string, x, y int, pix float32, color mgl.Vec4) {
if !f.loaded {
return
}
gl.Disable(gl.DEPTH_TEST)
gl.Enable(gl.BLEND)
gl.BlendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA)
gl.BindVertexArray(f.vao)
f.Shader.Use()
f.Shader.SetUniform("projection", mgl.Ortho2D(0, _screen.Width, 0, _screen.Height))
f.Shader.SetUniform("color", color)
f.Texture.Bind()
scale := pix / f.pix
for i := 0; i < len(text); i++ {
index := rune(text[i])
model := mgl.Ident4().Mul4(mgl.Scale3D(scale, scale, 0))
model = model.Add(mgl.Translate3D(float32(x) + float32(i) * pix, float32(y), 0))
f.Shader.SetUniform("model", model)
gl.DrawArrays(gl.TRIANGLE_STRIP, (32-index)*4, 4)
}
gl.Enable(gl.DEPTH_TEST)
gl.Disable(gl.BLEND)
}
Here are the shaders:
Vertex shader
#version 330
uniform mat4 projection;
uniform mat4 model;
layout (location = 0) in vec2 vert;
layout (location = 1) in vec2 vertTexCoord;
out vec2 fragTexCoord;
void main() {
fragTexCoord = vertTexCoord;
gl_Position = projection * model * vec4(vert, 0, 1);
}
Fragment shader
#version 330
uniform sampler2D tex;
uniform vec4 color;
in vec2 fragTexCoord;
out vec4 outputColor;
void main() {
outputColor = color * texture(tex, fragTexCoord);
}
Every "component" of the GameFont struct is working properly (I've used them with the rotating cubes), so every function calls the GL corresponding one.
Also the texture is being drawed correctly, I've saved it to the disk and it looks like this:
And still, there's no text on the screen.
I'm struggling to render a 2D sprite to a canvas using Dart and WebGL. I can find very few examples of this online; most are either 3D, or contain tons of spaghetti code with no real explanation of what they're doing. I'm trying to do the simplest thing that renders a sprite.
So far, I've managed to render a green square (two triangles) on a canvas. The bit I'm struggling with is how to change this from a green square to using my texture (which is loaded and bound correctly, I believe). I think this will need changes to the shaders (to take texture coords instead of a colour) and something to pass texture coords relating to the vertices in the buffer.
This code also exists in a Gist.
Note: This is just a throwaway sample; most of the code lives in the constructor; I'm not too interested in how tidy the code is for now; I can tidy up when I can see a sprite on the screen!
Note: I'm not interested in using a third-party library; I'm doing this to learn WebGL!
<!DOCTYPE html>
<html>
<head>
<title>MySecondGame</title>
</head>
<body>
<canvas width="1024" height="768"></canvas>
<div style="display: none;">
<img id="img-player" src="assets/player.png" />
</div>
<script id="vertex" type="x-shader">
attribute vec2 aVertexPosition;
void main() {
gl_Position = vec4(aVertexPosition, 0.0, 1.0);
}
</script>
<script id="fragment" type="x-shader">
#ifdef GL_ES
precision highp float;
#endif
uniform vec4 uColor;
void main() {
gl_FragColor = uColor;
}
</script>
<script type="application/dart">
import 'dart:async';
import 'dart:html';
import 'dart:math';
import 'dart:typed_data';
import 'dart:web_gl';
Game game;
main() {
game = new Game(document.querySelector('canvas'));
}
class Game {
RenderingContext _gl;
Buffer vbuffer;
int numItems;
Texture playerTexture;
double elapsedTime;
double fadeAmount;
Game(CanvasElement canvas) {
_gl = canvas.getContext3d();
playerTexture = _gl.createTexture();
_gl.bindTexture(TEXTURE_2D, playerTexture);
_gl.texImage2DUntyped(TEXTURE_2D, 0, RGBA, RGBA, UNSIGNED_BYTE, document.querySelector('#img-player'));
_gl.texParameteri(TEXTURE_2D, TEXTURE_MAG_FILTER, NEAREST);
_gl.texParameteri(TEXTURE_2D, TEXTURE_MIN_FILTER, LINEAR_MIPMAP_NEAREST);
_gl.generateMipmap(TEXTURE_2D);
_gl.bindTexture(TEXTURE_2D, null);
var vsScript = document.querySelector('#vertex');
var vs = _gl.createShader(VERTEX_SHADER);
_gl.shaderSource(vs, vsScript.text);
_gl.compileShader(vs);
var fsScript = document.querySelector('#fragment');
var fs = _gl.createShader(FRAGMENT_SHADER);
_gl.shaderSource(fs, fsScript.text);
_gl.compileShader(fs);
var program = _gl.createProgram();
_gl.attachShader(program, vs);
_gl.attachShader(program, fs);
_gl.linkProgram(program);
if (!_gl.getShaderParameter(vs, COMPILE_STATUS))
print(_gl.getShaderInfoLog(vs));
if (!_gl.getShaderParameter(fs, COMPILE_STATUS))
print(_gl.getShaderInfoLog(fs));
if (!_gl.getProgramParameter(program, LINK_STATUS))
print(_gl.getProgramInfoLog(program));
var aspect = canvas.width / canvas.height;
var vertices = new Float32List.fromList([
-0.5, 0.5 * aspect, 0.5, 0.5 * aspect, 0.5, -0.5 * aspect, // Triangle 1
-0.5, 0.5 * aspect, 0.5,-0.5 * aspect, -0.5, -0.5 * aspect // Triangle 2
]);
vbuffer = _gl.createBuffer();
_gl.bindBuffer(ARRAY_BUFFER, vbuffer);
_gl.bufferData(ARRAY_BUFFER, vertices, STATIC_DRAW);
numItems = vertices.length ~/ 2;
_gl.useProgram(program);
var uColor = _gl.getUniformLocation(program, "uColor");
_gl.uniform4fv(uColor, new Float32List.fromList([0.0, 0.3, 0.0, 1.0]));
var aVertexPosition = _gl.getAttribLocation(program, "aVertexPosition");
_gl.enableVertexAttribArray(aVertexPosition);
_gl.vertexAttribPointer(aVertexPosition, 2, FLOAT, false, 0, 0);
window.animationFrame.then(_gameLoop);
}
_gameLoop(num time) {
elapsedTime = time;
_update();
_render();
window.animationFrame.then(_gameLoop);
}
_update() {
// Use sine curve for fading. Sine is -1-1, so tweak to be 0 - 1.
fadeAmount = (sin(elapsedTime/1000) / 2) + 0.5;
}
_render() {
// Set colour for clearing to.
_gl.clearColor(fadeAmount, 1 - fadeAmount, 0.0, 1.0);
// Clear.
_gl.clear(RenderingContext.COLOR_BUFFER_BIT);
_gl.bindTexture(TEXTURE_2D, playerTexture);
_gl.drawArrays(TRIANGLES, 0, numItems);
_gl.bindTexture(TEXTURE_2D, null);
}
}
</script>
<script src="packages/browser/dart.js"></script>
</body>
</html>
(Tagging this with opengl too because I believe the solution is likely the same for WebGL/OpenGL).
Ok, managed to make this work. You can see the full diff in a gist here.
I might be wrong, but it seems I was expecting to set the attribute data while I was setting the buffers up, and I couldn't find any way to say which data was for which buffer. I split the code into some setup code:
vbuffer = _gl.createBuffer();
_gl.bindBuffer(ARRAY_BUFFER, vbuffer);
_gl.bufferData(ARRAY_BUFFER, vertices, STATIC_DRAW);
numItems = vertices.length ~/ 2;
tbuffer = _gl.createBuffer();
_gl.bindBuffer(ARRAY_BUFFER, tbuffer);
_gl.bufferData(ARRAY_BUFFER, textureCoords, STATIC_DRAW);
aVertexPosition = _gl.getAttribLocation(program, "aVertexPosition");
_gl.enableVertexAttribArray(aVertexPosition);
aTextureCoord = _gl.getAttribLocation(program, "aTextureCoord");
_gl.enableVertexAttribArray(aTextureCoord);
uSampler = _gl.getUniformLocation(program, "uSampler");
and some rendering code:
_gl.bindBuffer(ARRAY_BUFFER, vbuffer);
_gl.vertexAttribPointer(aVertexPosition, 2, FLOAT, false, 0, 0);
_gl.bindBuffer(ARRAY_BUFFER, tbuffer);
_gl.vertexAttribPointer(aTextureCoord, 2, FLOAT, false, 0, 0);
_gl.bindTexture(TEXTURE_2D, playerTexture);
_gl.uniform1i(uSampler, 0);
_gl.drawArrays(TRIANGLES, 0, numItems);
I'm not entirely sure this is correct (it feels like I'm sending the same vertices and texture coords every frame), but it's working.
So I wrote a really simple OpenGL program that draws 100x100x100 points rendered as cubes using the geometry shader. I wanted to benchmark it against what I can currently do in DirectX 11.
With DirectX11, I can easily render these cubes at 60fps (vsync). However, with OpenGL I'm stuck at 40fps.
In both applications, I am:
Using a point topology to represent just the position of each cube (stride = 12 bytes).
Mapping the vertex buffer only once, in the initialise function.
Using only two draw calls in total: one to render the cubes, one to render the frame time.
Using back-face culling and depth testing.
Limiting state changes to the minimum I need to draw the cubes (VBOs/shader program).
Here is my draw call:
GLboolean CCubeApplication::Draw()
{
auto program = m_ppBatches[0]->GetShaders()->GetProgram(0);
program->Bind();
{
glUniformMatrix4fv(program->GetUniform("g_uWVP"), 1, false, glm::value_ptr(m_matMatrices[MATRIX_WVP]));
glDrawArrays(GL_POINTS, 0, m_uiTotal);
}
return true;
}
This function calls glBindVertexArray and glUseProgram
program->Bind();
And the rest is straightforward. My Update function does nothing but update the camera's position and view matrix, and it is identical in the DirectX and OpenGL versions.
My vertex shader is a pass-through, and my fragment shader returns a constant colour. This is my geometry shader:
#version 440 core
// GS_LAYOUT
layout(points) in;
layout(triangle_strip, max_vertices = 36) out;
// GS_IN
in vec4 vOut_pos[];
// GS_OUT
// UNIFORMS
uniform mat4 g_uWVP;
const float f = 0.1f;
const int elements[] = int[]
(
0,2,1,
2,3,1,
1,3,5,
3,7,5,
5,7,4,
7,6,4,
4,6,0,
6,2,0,
3,2,7,
2,6,7,
5,4,1,
4,0,1
);
// GS
void main()
{
vec4 vertices[] = vec4[]
(
g_uWVP * (vOut_pos[0] + vec4(-f,-f,-f, 0)),
g_uWVP * (vOut_pos[0] + vec4(-f,-f,+f, 0)),
g_uWVP * (vOut_pos[0] + vec4(-f,+f,-f, 0)),
g_uWVP * (vOut_pos[0] + vec4(-f,+f,+f, 0)),
g_uWVP * (vOut_pos[0] + vec4(+f,-f,-f, 0)),
g_uWVP * (vOut_pos[0] + vec4(+f,-f,+f, 0)),
g_uWVP * (vOut_pos[0] + vec4(+f,+f,-f, 0)),
g_uWVP * (vOut_pos[0] + vec4(+f,+f,+f, 0))
);
uint uiIndex = 0;
for(uint uiTri = 0; uiTri < 12; ++uiTri)
{
for(uint uiVert = 0; uiVert < 3; ++uiVert)
{
gl_Position = vertices[elements[uiIndex++]];
EmitVertex();
}
EndPrimitive();
}
}
I've seen people talk about instancing and other rendering methods, but I'm primarily interested in understanding why I can't get at least the same performance from OpenGL as I do from DirectX, seeing as the two versions look virtually identical to me: identical data, identical shaders. Help?
UPDATE
So I downloaded gDEBugger, and here is my call stack for one frame:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
// Drawing cubes
glBindVertexArray(1)
glUseProgram(1)
glUniformMatrix4fv(0, 1, FALSE, {matrixData})
glDrawArrays(GL_POINTS, 0, 1000000)
// Drawing text
glBindVertexArray(2);
glUseProgram(5);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, 2);
glBindBuffer(GL_ARRAY_BUFFER, 2);
glBufferData(GL_ARRAY_BUFFER, 212992, {textData}, GL_DYNAMIC_DRAW);
glDrawArrays(GL_POINTS, 0, 34);
// Swap buffers
wglSwapBuffers();