Go OpenGL simple shape - blank screen

I need some help figuring out why this piece of code produces a blank green window. I made it by combining examples from https://github.com/Jragonmiris/mathgl/blob/master/examples/opengl-tutorial/tutorial02/main.go and https://github.com/veandco/go-sdl2/blob/master/examples/opengl3.go. I'm not sure whether this is a bug in the Go SDL/GL bindings or a gap in my OpenGL understanding. All this should draw is a cube.
My code is:
package main
import (
"fmt"
// gl "github.com/chsc/gogl/gl33"
"github.com/veandco/go-sdl2/sdl"
// "math"
"github.com/Jragonmiris/mathgl"
"github.com/go-gl/gl"
"runtime"
"time"
)
// var program gl.Program = 0
// var buffer gl.Buffer = 0
func MakeProgram(vert, frag string) gl.Program {
vertShader, fragShader := gl.CreateShader(gl.VERTEX_SHADER), gl.CreateShader(gl.FRAGMENT_SHADER)
vertShader.Source(vert)
fragShader.Source(frag)
vertShader.Compile()
fragShader.Compile()
prog := gl.CreateProgram()
prog.AttachShader(vertShader)
prog.AttachShader(fragShader)
prog.Link()
prog.Validate()
fmt.Println(prog.GetInfoLog())
return prog
}
func main() {
var window *sdl.Window
var context sdl.GLContext
var event sdl.Event
var running bool
var err error
runtime.LockOSThread()
if 0 != sdl.Init(sdl.INIT_EVERYTHING) {
panic(sdl.GetError())
}
window, err = sdl.CreateWindow(winTitle, sdl.WINDOWPOS_UNDEFINED,
sdl.WINDOWPOS_UNDEFINED,
winWidth, winHeight, sdl.WINDOW_OPENGL)
if err != nil {
panic(err)
}
if window == nil {
panic(sdl.GetError())
}
context = sdl.GL_CreateContext(window)
if context == nil {
panic(sdl.GetError())
}
if gl.Init() != 0 {
panic("gl error")
}
gl.ClearColor(1.0, 1.0, 1.0, .5)
gl.Viewport(0, 0, winWidth, winHeight)
program := MakeProgram(vertexShaderSource, fragmentShaderSource)
defer program.Delete()
matrixID := program.GetUniformLocation("MVP")
Projection := mathgl.Perspective(45.0, 4.0/3.0, 0.1, 100.0)
View := mathgl.LookAt(4.0, 3.0, 3.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0)
Model := mathgl.Ident4f()
MVP := Projection.Mul4(View).Mul4(Model)
gl.Enable(gl.DEPTH_TEST)
gl.DepthFunc(gl.LESS)
gl.Enable(gl.BLEND)
gl.BlendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA)
vertexArray := gl.GenVertexArray()
defer vertexArray.Delete()
vertexArray.Bind()
buffer := gl.GenBuffer()
defer buffer.Delete()
buffer.Bind(gl.ARRAY_BUFFER)
gl.BufferData(gl.ARRAY_BUFFER, len(triangle_vertices)*4, &triangle_vertices, gl.STATIC_DRAW)
running = true
for running {
for event = sdl.PollEvent(); event != nil; event = sdl.PollEvent() {
switch t := event.(type) {
case *sdl.QuitEvent:
running = false
case *sdl.MouseMotionEvent:
fmt.Printf(string(t.Timestamp))
}
}
gl.Clear(gl.COLOR_BUFFER_BIT) // | gl.DEPTH_BUFFER_BIT)
program.Use()
matrixID.UniformMatrix4fv(false, MVP)
attribLoc := gl.AttribLocation(0)
attribLoc.EnableArray()
buffer.Bind(gl.ARRAY_BUFFER)
attribLoc.AttribPointer(3, gl.FLOAT, false, 0, nil)
gl.DrawArrays(gl.TRIANGLES, 0, 3)
attribLoc.DisableArray()
time.Sleep(50 * time.Millisecond)
sdl.GL_SwapWindow(window)
}
sdl.GL_DeleteContext(context)
window.Destroy()
sdl.Quit()
}
const (
winTitle = "OpenGL Shader"
winWidth = 640
winHeight = 480
vertexShaderSource = `
#version 330 core
// Input vertex data, different for all executions of this shader.
layout(location = 0) in vec3 vertexPosition_modelspace;
// Values that stay constant for the whole mesh.
uniform mat4 MVP;
void main(){
gl_Position = MVP * vec4 (vertexPosition_modelspace,1.0);
}
`
fragmentShaderSource = `
#version 330 core
// Output data
out vec3 color;
void main()
{
// Output color = red
color = vec3(1,0,0);
}
`
)
var triangle_vertices = []float32{
-.5, -.5, -.5,
.5, -.5, -.5,
0.0, 0.5, -.5,
}
So I'm still having trouble drawing a simple shape on the screen. I made a few changes, such as simplifying my shape to a triangle. I placed the coordinates further towards the -z axis so I would be able to see them, but that has not worked. I then set the MVP matrix (moving the camera back some) just to make sure. My shaders are simple, as I am only passing in a vec3 vertex position and a mat4 MVP matrix, so I believe the shaders are working correctly. Sorry for all the confusion; I think I may be missing something here.
Update:
I also ran the version commands for opengl:
fmt.Println(gl.GetString(gl.VERSION))
fmt.Println(gl.GetString(gl.VENDOR))
fmt.Println(gl.GetString(gl.RENDERER))
for which the output was:
4.5.0 NVIDIA 347.09
NVIDIA Corporation
GeForce GTX 650 Ti/PCIe/SSE2
Not sure if this has any impact?
Update:
I have looked at some more examples and decided to try adding some SDL attributes, but still no luck:
sdl.GL_SetAttribute(sdl.GL_DOUBLEBUFFER, 1)
sdl.GL_SetAttribute(sdl.GL_RED_SIZE, 8)
sdl.GL_SetAttribute(sdl.GL_GREEN_SIZE, 8)
sdl.GL_SetAttribute(sdl.GL_BLUE_SIZE, 8)
sdl.GL_SetAttribute(sdl.GL_ALPHA_SIZE, 8)
Update:
I have trimmed this post to include only the most recent code, so its length doesn't scare people away (TL;DR).

I finally figured out what my problem was in this code.
The first thing I had to do was call
positionAttrib := program.GetAttribLocation("vertexPosition_modelspace")
for every input variable going into the vertex shader. This was done after binding the VBO for each array (see the sketch below).
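Here is a minimal sketch of that pattern, using the same github.com/go-gl/gl bindings as the code above; the attribute name vertexPosition_modelspace comes straight from the vertex shader, and the rest mirrors the draw loop earlier in the post:

// Bind the VBO first, then look the attribute up by name instead of assuming location 0.
buffer.Bind(gl.ARRAY_BUFFER)
positionAttrib := program.GetAttribLocation("vertexPosition_modelspace")
positionAttrib.EnableArray()
// 3 float32 components per vertex, tightly packed, starting at offset 0.
positionAttrib.AttribPointer(3, gl.FLOAT, false, 0, nil)
gl.DrawArrays(gl.TRIANGLES, 0, 3)
positionAttrib.DisableArray()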
Next, notice this line in my code above:
gl.BufferData(gl.ARRAY_BUFFER, len(triangle_vertices)*4, &triangle_vertices, gl.STATIC_DRAW)
I replaced the address with the triangle_vertices slice itself, not a pointer to it:
gl.BufferData(gl.ARRAY_BUFFER, len(triangle_vertices)*4, triangle_vertices, gl.STATIC_DRAW)
Doing this seemed to fix it.

I would post this as a comment, but I do not yet have enough reputation.
The solution already provided nearly solved my similar issue, but not quite.
Where the provided solution was
gl.BufferData(gl.ARRAY_BUFFER, len(triangle_vertices)*4, triangle_vertices, gl.STATIC_DRAW)
The actual code which solved my issue is
gl.BufferData(gl.ARRAY_BUFFER, len(triangle_vertices)*4, gl.Ptr(triangle_vertices), gl.STATIC_DRAW)
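For anyone on the newer go-gl bindings, the whole upload looks roughly like the sketch below (assuming, for example, github.com/go-gl/gl/v3.3-core/gl, where buffer objects are plain uint32 handles, the size argument is in bytes — hence the *4 for float32 — and gl.Ptr converts the slice into the pointer the C API expects):

var vbo uint32
gl.GenBuffers(1, &vbo)
gl.BindBuffer(gl.ARRAY_BUFFER, vbo)
// Size is in bytes: number of float32 values times 4 bytes each.
gl.BufferData(gl.ARRAY_BUFFER, len(triangle_vertices)*4, gl.Ptr(triangle_vertices), gl.STATIC_DRAW)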
I have also answered a similar question with this, but in more detail, which can be found here:
OpenGL Vertex Buffer doesn't draw anything in golang

Related

Opengl Model Matrix stretches vertices

I am working on a simple OpenGL game to learn more about it, but for some reason, when I try to rotate my cube over time, it becomes stretched. You can see it in the photo:
I think it has to do with my model matrix but I'm not sure. Here is a portion of my code:
model := mgl32.Ident4()
model = model.Mul4(mgl32.HomogRotate3D(float32(glfw.GetTime()) *
mgl32.DegToRad(50.0), mgl32.Vec3{0.5, 1.0, 0.0}))
view := mgl32.Ident4()
view = view.Mul4(mgl32.Translate3D(0.0, 0.0, -3.0))
projection := mgl32.Ident4()
projection = mgl32.Perspective(mgl32.DegToRad(45.0), 800/ float32(600), 0.1, 100)
shader.setMat4("model", model)
shader.setMat4("view", view)
shader.setMat4("projection", projection)
Here is my Minimal, Complete, and Verifiable example. I've removed the texture as it's not necessary to see the problem. I am also using wireframe so that you can see the stretching more clearly.
package main
import(
"github.com/go-gl/glfw/v3.2/glfw"
"github.com/go-gl/gl/v4.5-core/gl"
"log"
"strings"
"fmt"
"github.com/go-gl/mathgl/mgl32"
)
const(
vertexShaderSource = `
#version 450 core
layout (location = 0) in vec3 aPos;
uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;
void main()
{
gl_Position = projection * view * model * vec4(aPos, 1.0);
}` + "\x00"
fragmentShaderSource = `
#version 450 core
out vec4 FragColor;
void main()
{
FragColor = vec4(1.0f, 0.5f, 0.2f, 1.0f);
}` + "\x00"
)
var(
vertices = []float32 {
-0.5,0.5,-0.5,
-0.5,-0.5,-0.5,
0.5,-0.5,-0.5,
0.5,0.5,-0.5,
-0.5,0.5,0.5,
-0.5,-0.5,0.5,
0.5,-0.5,0.5,
0.5,0.5,0.5,
0.5,0.5,-0.5,
0.5,-0.5,-0.5,
-0.5,-0.5,0.5,
-0.5,0.5,0.5,
-0.5,0.5,-0.5,
-0.5,-0.5,0.5,
0.5,-0.5,0.5,
}
indices = []int {
0,1,3, // -z
3,1,2,
5,4,7, // +z
7,6,5,
9,8,7, // +x
7,6,9,
0,1,11, // -x
11,1,10,
12,4,7, // +y
7,3,12,
13,1,14, // -y
14,1,2,
}
)
func main() {
if err := glfw.Init(); err != nil {
log.Fatalln("failed to initialize glfw:", err)
}
defer glfw.Terminate()
glfw.WindowHint(glfw.ContextVersionMajor, 4)
glfw.WindowHint(glfw.ContextVersionMinor, 5)
glfw.WindowHint(glfw.OpenGLProfile, glfw.OpenGLCoreProfile)
glfw.WindowHint(glfw.OpenGLForwardCompatible, glfw.True)
window, err := glfw.CreateWindow(800, 600, "LearnOpenGL", nil, nil)
if err != nil {
panic(err)
}
defer window.Destroy()
window.MakeContextCurrent()
// Initialize Glow
if err := gl.Init(); err != nil {
panic(err)
}
gl.Viewport(0, 0, 800, 600)
//SHADERS
vertexID, err := loadShader(vertexShaderSource, gl.VERTEX_SHADER)
if err != nil {
panic(err)
}
fragmentID, err := loadShader(fragmentShaderSource, gl.FRAGMENT_SHADER)
if err != nil {
panic(err)
}
programID := gl.CreateProgram()
gl.AttachShader(programID, vertexID)
gl.AttachShader(programID, fragmentID)
gl.DeleteShader(vertexID)
gl.DeleteShader(fragmentID)
gl.LinkProgram(programID)
// VBO / VAO / EBO data
var vbo, vao, ebo uint32
gl.GenVertexArrays(1, &vao)
gl.GenBuffers(1, &vbo)
gl.GenBuffers(1, &ebo)
gl.BindVertexArray(vao)
gl.BindBuffer(gl.ARRAY_BUFFER, vbo)
gl.BufferData(gl.ARRAY_BUFFER, len(vertices)*4, gl.Ptr(vertices), gl.STATIC_DRAW)
gl.BindBuffer(gl.ELEMENT_ARRAY_BUFFER, ebo)
gl.BufferData(gl.ELEMENT_ARRAY_BUFFER, len(indices) * 4, gl.Ptr(indices), gl.STATIC_DRAW)
gl.VertexAttribPointer(0, 3, gl.FLOAT, false, 0, gl.PtrOffset(0))
gl.EnableVertexAttribArray(0)
gl.BindBuffer(gl.ARRAY_BUFFER, 0)
gl.BindVertexArray(0)
gl.Enable(gl.DEPTH_TEST)
gl.PolygonMode(gl.FRONT_AND_BACK, gl.LINE)
for !window.ShouldClose() {
gl.ClearColor(0.2, 0.3, 0.3, 1.0)
gl.Clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT)
gl.UseProgram(programID)
model := mgl32.Ident4()
view := mgl32.Ident4()
projection := mgl32.Ident4()
model = model.Mul4(mgl32.HomogRotate3D(float32(glfw.GetTime()) * mgl32.DegToRad(50), mgl32.Vec3{0.5, 1, 0}))
view = view.Mul4(mgl32.Translate3D(0, 0, -3))
projection = projection.Mul4(mgl32.Perspective(mgl32.DegToRad(45), 800/float32(600), 0.1, 100))
gl.UniformMatrix4fv(gl.GetUniformLocation(programID, gl.Str("model\x00")), 1, false, &model[0])
gl.UniformMatrix4fv(gl.GetUniformLocation(programID, gl.Str("view\x00")), 1, false, &view[0])
gl.UniformMatrix4fv(gl.GetUniformLocation(programID, gl.Str("projection\x00")), 1, false, &projection[0])
gl.BindVertexArray(vao)
gl.DrawElements(gl.TRIANGLES, 36, gl.UNSIGNED_INT, gl.PtrOffset(0))
window.SwapBuffers()
glfw.PollEvents()
}
}
func loadShader(file string, shaderType uint32) (uint32, error) {
shader := gl.CreateShader(shaderType)
csources, free := gl.Strs(string(file))
gl.ShaderSource(shader, 1, csources, nil)
free()
gl.CompileShader(shader)
var status int32
gl.GetShaderiv(shader, gl.COMPILE_STATUS, &status)
if status == gl.FALSE {
var logLength int32
gl.GetShaderiv(shader, gl.INFO_LOG_LENGTH, &logLength)
log := strings.Repeat("\x00", int(logLength+1))
gl.GetShaderInfoLog(shader, logLength, nil, gl.Str(log))
return 0,fmt.Errorf("failed to compile %v: %v", file, log)
}
return shader, nil
}
HomogRotate3D creates a 3D rotation Matrix that rotates by (radian) angle about some arbitrary axis given by a Normalized Vector.
mgl32.Vec3{1, 1, 0}.Normalize()
Apparently, normalizing the axis-of-rotation vector is all you needed; that stops the distortion. So your model rotation line becomes:
model = model.Mul4(mgl32.HomogRotate3D(float32(glfw.GetTime()) *
mgl32.DegToRad(50.0), mgl32.Vec3{1.0, 1.0, 0.0}.Normalize()))
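Applied to the question's original axis {0.5, 1, 0}, the fix is the same; the only change is normalizing the axis before building the rotation (a sketch):

// Unit-length axis, so HomogRotate3D produces a pure rotation with no scaling.
axis := mgl32.Vec3{0.5, 1.0, 0.0}.Normalize()
model := mgl32.Ident4().Mul4(mgl32.HomogRotate3D(float32(glfw.GetTime())*mgl32.DegToRad(50.0), axis))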
Have fun!

Go go-gl OpenGL Rendering Issues

I have a program that originally worked completely fine: it draws a triangle using the go-gl OpenGL wrapper for Go. While playing with the code, things started to get weird. Sometimes the shape would be rendered, and then it just wouldn't be. Sometimes saving the file and running the code again would work; sometimes that also failed. No changes were made to the file between it working and breaking. The GLFW window still shows up with a background color, and the vertices array I use is populated. I am not sure whether this is a simple error in my code or something hardware-related.
Not sure if this helps, but I am using the latest Atom editor with the Go-Plus plugin. Thanks in advance for the help!
package main
import (
"fmt"
"log"
"runtime"
"github.com/go-gl/gl/v4.1-core/gl"
"github.com/go-gl/glfw/v3.1/glfw"
)
const vertexSource = `
#version 330
in vec2 position;
void main() {
gl_Position = vec4(position,0.0, 1.0);
}
` + "\x00"
const fragmentSource = `
#version 330
uniform vec3 triangleColor;
out vec4 outputColor;
void main() {
outputColor = vec4(triangleColor,1.0);
}
` + "\x00"
var vao, vbo uint32
var vertices = []float32{
0.0, 0.5,
0.5, -0.5,
-0.5, -0.5,
}
func init() {
runtime.LockOSThread()
}
func main() {
if err := glfw.Init(); err != nil {
log.Fatalln("failed to initalize GL window:", err)
}
defer glfw.Terminate()
glfw.WindowHint(glfw.Resizable, glfw.True)
glfw.WindowHint(glfw.ContextVersionMajor, 4)
glfw.WindowHint(glfw.ContextVersionMinor, 1)
glfw.WindowHint(glfw.OpenGLProfile, glfw.OpenGLCoreProfile)
glfw.WindowHint(glfw.OpenGLForwardCompatible, glfw.True)
windowHeight := 800
windowWidth := 800
window, err := glfw.CreateWindow(windowWidth, windowHeight, "HelloGL2.0", nil, nil)
if err != nil {
panic(err)
}
window.MakeContextCurrent()
if err := gl.Init(); err != nil {
panic(err)
} else {
fmt.Println("OpenGL Version:", gl.GoStr(gl.GetString(gl.VERSION)))
}
gl.GenVertexArrays(1, &vao)
gl.BindVertexArray(vao)
gl.GenBuffers(1, &vbo)
gl.BindBuffer(gl.ARRAY_BUFFER, vbo)
gl.BufferData(gl.ARRAY_BUFFER, len(vertices), gl.Ptr(vertices), gl.STREAM_DRAW)
vertexShader := gl.CreateShader(gl.VERTEX_SHADER)
vcsources, free := gl.Strs(vertexSource)
gl.ShaderSource(vertexShader, 1, vcsources, nil)
free()
gl.CompileShader(vertexShader)
fragmentShader := gl.CreateShader(gl.FRAGMENT_SHADER)
fcsources, free := gl.Strs(fragmentSource)
gl.ShaderSource(fragmentShader, 1, fcsources, nil)
gl.CompileShader(fragmentShader)
shaderProgram := gl.CreateProgram()
gl.AttachShader(shaderProgram, vertexShader)
gl.AttachShader(shaderProgram, fragmentShader)
gl.BindFragDataLocation(shaderProgram, 0, gl.Str("outputColor\x00"))
gl.LinkProgram(shaderProgram)
gl.UseProgram(shaderProgram)
posAttrib := uint32(gl.GetAttribLocation(shaderProgram, gl.Str("position\x00")))
gl.EnableVertexAttribArray(posAttrib)
gl.VertexAttribPointer(posAttrib, 2.0, gl.FLOAT, false, 0.0, gl.PtrOffset(0))
gl.GetUniformLocation(shaderProgram, gl.Str("triangleColor\x00"))
//GL_STATIC_DRAW: The vertex data will be uploaded once and drawn many times (e.g. the world).
//GL_DYNAMIC_DRAW: The vertex data will be changed from time to time, but drawn many times more than that.
//GL_STREAM_DRAW: The vertex data will change almost every time it's drawn (e.g. user interface).
for !window.ShouldClose() {
//gl.Uniform3f(uniColor, 2.0, 0.0, 1.0)
gl.ClearColor(0.9, 1.0, 0.3, 1.0)
gl.Clear(gl.COLOR_BUFFER_BIT)
gl.DrawArrays(gl.TRIANGLES, 0, 3)
window.SwapBuffers()
glfw.PollEvents()
}
gl.DeleteProgram(shaderProgram)
gl.DeleteShader(fragmentShader)
gl.DeleteShader(vertexShader)
gl.DeleteBuffers(1, &vbo)
gl.DeleteVertexArrays(1, &vao)
fmt.Println("Exiting!")
window.Destroy()
}
Your code gives me:
./gl.go:76: undefined: gl.Strs
./gl.go:81: undefined: gl.Strs
Consider using my code for loading and compiling your shaders:
func NewProgram(vertexShaderSource, fragmentShaderSource string) (uint32, error) {
vertexShader, err := CompileShader(vertexShaderSource, gl.VERTEX_SHADER)
if err != nil {
return 0, err
}
fragmentShader, err := CompileShader(fragmentShaderSource, gl.FRAGMENT_SHADER)
if err != nil {
return 0, err
}
program := gl.CreateProgram()
gl.AttachShader(program, vertexShader)
gl.AttachShader(program, fragmentShader)
gl.LinkProgram(program)
var status int32
gl.GetProgramiv(program, gl.LINK_STATUS, &status)
if status == gl.FALSE {
var logLength int32
gl.GetProgramiv(program, gl.INFO_LOG_LENGTH, &logLength)
log := strings.Repeat("\x00", int(logLength+1))
gl.GetProgramInfoLog(program, logLength, nil, gl.Str(log))
return 0, fmt.Errorf("failed to link program: %v", log)
}
gl.DeleteShader(vertexShader)
gl.DeleteShader(fragmentShader)
return program, nil
}
func CompileShader(source string, shaderType uint32) (uint32, error) {
shader := gl.CreateShader(shaderType)
csource := gl.Str(source)
gl.ShaderSource(shader, 1, &csource, nil)
gl.CompileShader(shader)
var status int32
gl.GetShaderiv(shader, gl.COMPILE_STATUS, &status)
if status == gl.FALSE {
var logLength int32
gl.GetShaderiv(shader, gl.INFO_LOG_LENGTH, &logLength)
log := strings.Repeat("\x00", int(logLength+1))
gl.GetShaderInfoLog(shader, logLength, nil, gl.Str(log))
return 0, fmt.Errorf("failed to compile %v: %v", source, log)
}
return shader, nil
}
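For completeness, wiring this into the main above would look roughly like the following; note that the two shader source constants already end in "\x00", which gl.Str requires, and the helpers above also pull in the "strings" and "fmt" packages:

shaderProgram, err := NewProgram(vertexSource, fragmentSource)
if err != nil {
    log.Fatalln(err) // compile or link errors include the shader/program info log
}
gl.UseProgram(shaderProgram)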

Can anybody explain these WebGL snippets?

I am learning WebGL and am fully confused now.
I am going through this website, and the comments written alongside the code only half explain things for a beginner like me.
For example:
var canvas = document.getElementById('canvas');
var gl = getWebGLContext(canvas);
if(!gl) {
return;
}
//Setup GLSL
var program = createProgramFromScripts(gl, ["2d-vertex-shader", "2d-fragment-shader"]);
gl.useProgram(program);
//Look up where the vertex data needs to go
var positionLocation = gl.getAttribLocation(program, 'a_position');
//create a buffer and put a single CLIPSPACE rectangle in it.
var buffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array([-1.0, -1.0,
1.0, -1.0,
-1.0, 1.0,
-1.0, 1.0,
1.0, -1.0,
1.0, 1.0]), gl.STATIC_DRAW);
gl.enableVertexAttribArray(positionLocation);
gl.vertexAttribPointer(positionLocation, 2, gl.FLOAT, false, 0, 0);
//draw
gl.drawArrays(gl.TRIANGLES, 0, 6);
In the above snippet, the line
var positionLocation = gl.getAttribLocation(program, 'a_position');
indicates it looks up where the vertex data needs to go, but I didn't find anything specific in the vertex shader:
attribute vec2 a_position;
void main() {
gl_Position = vec4(a_position, 0, 1);
}
How can we tell where the position is?
Also, why are we using a Float32Array at all? Is there any scenario where we would use it in real code? I am totally confused by these shaders.
I also read GLSL Essentials to get some shader knowledge, but I am still confused by these things. Can somebody shed some light on this?
The Float32Array contains all the vertices that will go through the shading pipeline.
In your vertex shader you assign gl_Position to be a 4-dimensional vector whose x and y come from the vertices you passed in. So a_position contains the values from your array, and the vertex shader is run once for every single vertex.
This shader therefore hardly does anything. In a real application, you would do transformations, lighting operations, etc. here.
If you run this program you should see two triangles being drawn (forming one rectangle). That's because the array contains six 2D vertices that make up two triangles, as broken down below.
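Concretely, the six (x, y) pairs in that Float32Array split into two triangles that together cover the whole clip-space square:
Triangle 1: (-1, -1), (1, -1), (-1, 1)
Triangle 2: (-1, 1), (1, -1), (1, 1)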
Check out more information on the OpenGL pipeline here.

Render Truetype fonts with OpenGL

I am starting to learn OpenGL now, using the Go programming language (I just couldn't get C/C++ working on my Windows machine), and so far I've managed to display some rotating, textured cubes on the screen, mainly by copying and pasting code from tutorials. I've learned a lot, but I just can't get any text on the screen with this code that I wrote on my own. I've looked up many tutorials and questions, but nothing seems to work. I suspect there is something wrong with the vertices, because I'm fairly sure the texture coordinates are correct, and still nothing shows up on the screen. Here's the code:
package game
import (
"fmt"
"io/ioutil"
"image"
"image/draw"
"github.com/go-gl/gl/v3.3-core/gl"
mgl "github.com/go-gl/mathgl/mgl32"
"github.com/golang/freetype/truetype"
"golang.org/x/image/font"
"golang.org/x/image/math/fixed"
)
type GameFont struct {
loaded bool
vao uint32
vbo VBOData
pix float32
Texture *Texture
Shader ShaderProgram
}
// Load a TrueType font from a file and generate a texture
// with all important characters.
func (f *GameFont) Load(path string, pix float32) {
contents, err := ioutil.ReadFile(path)
if err != nil {
fmt.Println("Could not read font file: " + path)
panic(err)
}
fontFace, err := truetype.Parse(contents)
if err != nil {
fmt.Println("Could not parse font file: " + path)
panic(err)
}
// Create a texture for the characters
// Find the next power of 2 for the texture size
size := nextP2(int(pix * 16))
fg, bg := image.White, image.Black
rgba := image.NewRGBA(image.Rect(0, 0, size, size))
draw.Draw(rgba, rgba.Bounds(), bg, image.ZP, draw.Src)
d := &font.Drawer{
Dst: rgba,
Src: fg,
Face: truetype.NewFace(fontFace, &truetype.Options{
Size: float64(pix),
DPI: 72,
Hinting: font.HintingNone,
}),
}
// Some GL preps
gl.GenVertexArrays(1, &f.vao)
gl.BindVertexArray(f.vao)
f.vbo.Create()
f.vbo.Bind()
f.Shader = newShaderProgram("data/shaders/font.vert", "data/shaders/font.frag")
f.Shader.Use()
f.Shader.SetUniform("tex", 0)
// Create vertex data (and coordinates in the texture) for each character
// All characters below 32 are useless
for i := 32; i < 128; i++ {
c := string(rune(i))
x, y := i % 16, i / 16
// Draw the character on the texture
d.Dot = fixed.P(x * int(pix), y * int(pix))
d.DrawString(c)
// Vertices
quads := []float32{
0, 0,
0, pix,
pix, 0,
pix, pix,
}
norm := func(n int) float32 {
return float32(n) / 16.0
}
// Texture coordinates (normalized)
texQuads := []float32{
norm(x), 1.0 - norm(y + 1),
norm(x), 1.0 - norm(y),
norm(x + 1), 1.0 - norm(y + 1),
norm(x + 1), 1.0 - norm(y),
}
for v := 0; v < 8; v += 2 {
vQuads, vTexQuads := quads[v:(v+2)], texQuads[v:(v+2)]
// Data is like (X, Y, U, V)
f.vbo.AppendData(vQuads, 2)
f.vbo.AppendData(vTexQuads, 2)
}
}
// Upload data to GPU and we're done
f.Texture = newTextureFromRGBA(rgba)
f.Texture.Bind()
f.Texture.SetGLParam(gl.TEXTURE_MIN_FILTER, gl.LINEAR)
f.Texture.SetGLParam(gl.TEXTURE_MAG_FILTER, gl.LINEAR)
f.Texture.Upload()
f.vbo.UploadData(gl.STATIC_DRAW)
gl.EnableVertexAttribArray(0)
gl.VertexAttribPointer(0, 2, gl.FLOAT, false, 4*4, gl.PtrOffset(0))
gl.EnableVertexAttribArray(1)
gl.VertexAttribPointer(1, 2, gl.FLOAT, false, 4*4, gl.PtrOffset(2*4))
f.loaded = true
}
// Render a text using the font
func (f *GameFont) Render(text string, x, y int, pix float32, color mgl.Vec4) {
if !f.loaded {
return
}
gl.Disable(gl.DEPTH_TEST)
gl.Enable(gl.BLEND)
gl.BlendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA)
gl.BindVertexArray(f.vao)
f.Shader.Use()
f.Shader.SetUniform("projection", mgl.Ortho2D(0, _screen.Width, 0, _screen.Height))
f.Shader.SetUniform("color", color)
f.Texture.Bind()
scale := pix / f.pix
for i := 0; i < len(text); i++ {
index := rune(text[i])
model := mgl.Ident4().Mul4(mgl.Scale3D(scale, scale, 0))
model = model.Add(mgl.Translate3D(float32(x) + float32(i) * pix, float32(y), 0))
f.Shader.SetUniform("model", model)
gl.DrawArrays(gl.TRIANGLE_STRIP, (32-index)*4, 4)
}
gl.Enable(gl.DEPTH_TEST)
gl.Disable(gl.BLEND)
}
Here's the shaders:
Vertex shader
#version 330
uniform mat4 projection;
uniform mat4 model;
layout (location = 0) in vec2 vert;
layout (location = 1) in vec2 vertTexCoord;
out vec2 fragTexCoord;
void main() {
fragTexCoord = vertTexCoord;
gl_Position = projection * model * vec4(vert, 0, 1);
}
Fragment shader
#version 330
uniform sampler2D tex;
uniform vec4 color;
in vec2 fragTexCoord;
out vec4 outputColor;
void main() {
outputColor = color * texture(tex, fragTexCoord);
}
Every "component" of the GameFont struct is working properly (I've used them with the rotating cubes), so every function calls the GL corresponding one.
Also the texture is being drawed correctly, I've saved it to the disk and it looks like this:
And still, there's no text on the screen.

glVertexAttrib4fv won't pass to shader input at location 0

I'm trying to learn OpenGL and Rust at the same time. I'm using the OpenGL Superbible Sixth Edition, and got stuck in chapter 3 which introduces the function glVertexAttrib4fv to offset the position of a triangle. It worked fine when I did it in C++, but when I tried to translate it to Rust, the triangle disappeared. I've tried to reduce the example as much as possible to the following code (cargo dependencies are glutin = "*" and gl = "*"):
main.rs
extern crate glutin;
extern crate gl;
use std::io::Read;
fn main() {
unsafe {
let win = glutin::Window::new().unwrap();
win.make_current().unwrap();
gl::load_with(|s| win.get_proc_address(s));
let program = build_shader_program();
gl::UseProgram(program);
let mut vao = std::mem::uninitialized();
gl::GenVertexArrays(1, &mut vao);
gl::BindVertexArray(vao);
let red = [1.0, 0.0, 0.0, 1.0];
let mut running = true;
while running {
for event in win.poll_events() {
if let glutin::Event::Closed = event {
running = false;
}
}
win.swap_buffers().unwrap();
gl::ClearBufferfv(gl::COLOR, 0, &red[0]);
let attrib = [0.5, 0.0, 0.0, 0.0];
panic_if_error("before VertexAttrib4fv");
gl::VertexAttrib4fv(0, &attrib[0]);
panic_if_error("after VertexAttrib4fv");
gl::DrawArrays(gl::TRIANGLES, 0, 3);
}
}
}
fn panic_if_error(message: &str) {
unsafe {
match gl::GetError() {
gl::NO_ERROR => (),
_ => panic!("{}", message),
}
}
}
fn load_file_as_cstring(path: &str) -> std::ffi::CString {
let mut contents = Vec::new();
let mut file = std::fs::File::open(path).unwrap();
file.read_to_end(&mut contents).unwrap();
std::ffi::CString::new(contents).unwrap()
}
fn load_and_compile_shader(path: &str, shader_type: u32) -> u32 {
let contents = load_file_as_cstring(path);
unsafe {
let shader_id = gl::CreateShader(shader_type);
let source_ptr = contents.as_ptr();
gl::ShaderSource(shader_id, 1, &source_ptr, std::ptr::null());
gl::CompileShader(shader_id);
let mut result = std::mem::uninitialized();
gl::GetShaderiv(shader_id, gl::COMPILE_STATUS, &mut result);
assert_eq!(result, gl::TRUE as i32);
shader_id
}
}
fn build_shader_program() -> u32 {
let vert = load_and_compile_shader("a.vert", gl::VERTEX_SHADER);
let frag = load_and_compile_shader("a.frag", gl::FRAGMENT_SHADER);
unsafe {
let program_id = gl::CreateProgram();
gl::AttachShader(program_id, vert);
gl::AttachShader(program_id, frag);
gl::LinkProgram(program_id);
let mut result = std::mem::uninitialized();
gl::GetProgramiv(program_id, gl::LINK_STATUS, &mut result);
assert_eq!(result, gl::TRUE as i32);
program_id
}
}
a.frag
#version 430 core
out vec4 color;
void main() {
color = vec4(1.0, 1.0, 1.0, 1.0);
}
a.vert
#version 430 core
layout (location = 0) in vec4 offset;
void main() {
const vec4 vertices[3] =
vec4[3](vec4( 0.25, -0.25, 0.5, 1.0),
vec4(-0.25, -0.25, 0.5, 1.0),
vec4( 0.25, 0.25, 0.5, 1.0));
gl_Position = vertices[gl_VertexID]; // LINE 1
// gl_Position = vertices[gl_VertexID] + offset; // LINE 2
}
This code, as is, produces a white triangle in the middle of a red window.
Now, my expectation is that when I comment out LINE 1 in the vertex shader, and uncomment LINE 2, the triangle should move a quarter of a screen to the right, due to this code in "main.rs":
let attrib = [0.5, 0.0, 0.0, 0.0];
panic_if_error("before VertexAttrib4fv");
gl::VertexAttrib4fv(0, &attrib[0]);
panic_if_error("after VertexAttrib4fv");
But instead, the triangle disappears altogether. The panic_if_error call before and after gl::VertexAttrib4fv ensures that gl::GetError returns gl::NO_ERROR.
Question: Does anybody know why this is happening?
Another thing of note: while I was searching for the answer to this, I came upon this question, where the user is having a similar problem (except in C++, where I had no problem). Anyway, one of the comments there incidentally led me to try changing the location from 0 to 1, like this:
layout (location = 1) in vec4 offset;
for the vertex shader, and this for the call to gl::VertexAttrib4fv:
gl::VertexAttrib4fv(1, &attrib[0]);
Well, that worked, but I have no idea why, and would still like to know what the problem is with using location 0 there (since that's what the book shows, and it worked fine in C++).
You need to make sure that you have a Core Profile context. If you do not specify this, you may be creating a Compatibility Profile context. In the Compatibility Profile, vertex attribute 0 has a special meaning. From the OpenGL 3.2 Compatibility Profile spec:
Setting generic vertex attribute zero specifies a vertex; the four vertex coordinates are taken from the values of attribute zero. A Vertex2, Vertex3, or Vertex4 command is completely equivalent to the corresponding VertexAttrib* command with an index of zero. Setting any other generic vertex attribute updates the current values of the attribute. There are no current values for vertex attribute zero.
In other words, vertex attribute 0 is an alias for the fixed function vertex position in the compatibility profile.
The above does not apply in the Core Profile. There, vertex attribute 0 has no special meaning and can be used like any other vertex attribute.
Based on what you already found, you need to use the with_gl_profile method with argument Core to specify that you want to use the core profile when creating the window.
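For comparison, the Go/GLFW examples earlier on this page request a core profile explicitly via window hints before creating the window; with_gl_profile plays the same role on the glutin side. The Go version, for reference:

// Ask for an OpenGL core profile context, where attribute 0 has no fixed-function meaning.
glfw.WindowHint(glfw.ContextVersionMajor, 4)
glfw.WindowHint(glfw.ContextVersionMinor, 1)
glfw.WindowHint(glfw.OpenGLProfile, glfw.OpenGLCoreProfile)
glfw.WindowHint(glfw.OpenGLForwardCompatible, glfw.True)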