Running a GLSL compute shader without a graphics context [duplicate] - c++

This question already has answers here:
How to render offscreen on OpenGL? [duplicate]
(5 answers)
Creating OpenGL context without window
(1 answer)
Closed 5 years ago.
I'd like to write a GLSL compute shader that works like an OpenCL application. I'm a newbie to GPGPU, so I don't even know if it is possible.
The problem is that I'd like to run the program without needing a graphics server, such as X11 on Unix. Is there a way I can initialize a dummy GL context so I can run it from any terminal, without needing a graphical environment?
The host application is written in C++, if that matters.
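The linked duplicates cover this in more detail, but one approach known to work with Mesa and NVIDIA's proprietary driver is a surfaceless EGL context created through EGL_EXT_platform_device. The following is only a minimal sketch, assuming the driver exposes eglQueryDevicesEXT and EGL_KHR_surfaceless_context:

    #include <EGL/egl.h>
    #include <EGL/eglext.h>

    int main()
    {
        // Load the device-enumeration entry points (EGL_EXT_device_base / EGL_EXT_platform_device).
        auto eglQueryDevicesEXT = reinterpret_cast<PFNEGLQUERYDEVICESEXTPROC>(
            eglGetProcAddress("eglQueryDevicesEXT"));
        auto eglGetPlatformDisplayEXT = reinterpret_cast<PFNEGLGETPLATFORMDISPLAYEXTPROC>(
            eglGetProcAddress("eglGetPlatformDisplayEXT"));

        // Enumerate GPUs without touching any display server.
        EGLDeviceEXT devices[8];
        EGLint numDevices = 0;
        eglQueryDevicesEXT(8, devices, &numDevices);

        // Turn the first device into an EGLDisplay and initialize it.
        EGLDisplay display = eglGetPlatformDisplayEXT(EGL_PLATFORM_DEVICE_EXT, devices[0], nullptr);
        eglInitialize(display, nullptr, nullptr);

        // Any config will do; no window surface is ever created.
        const EGLint configAttribs[] = { EGL_RENDERABLE_TYPE, EGL_OPENGL_BIT, EGL_NONE };
        EGLConfig config;
        EGLint numConfigs = 0;
        eglChooseConfig(display, configAttribs, &config, 1, &numConfigs);

        // Create a desktop-GL context and make it current with no surface
        // (this is what EGL_KHR_surfaceless_context allows).
        eglBindAPI(EGL_OPENGL_API);
        EGLContext context = eglCreateContext(display, config, EGL_NO_CONTEXT, nullptr);
        eglMakeCurrent(display, EGL_NO_SURFACE, EGL_NO_SURFACE, context);

        // At this point you can load GL 4.3 functions (glad/GLEW) and call
        // glDispatchCompute() like in any other GL program.

        eglTerminate(display);
    }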

Related

Creating a GUI in OpenGL, is it possible? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 1 year ago.
I'm trying to create a custom GUI in OpenGL from scratch in C++, but I was wondering whether that is possible or not.
I'm getting started on some code right now, but I'm going to hold off until I get an answer.
YES.
If you play a video game, in general every UI element is implemented through an API like OpenGL, Direct3D, Metal, or Vulkan. The rendering surface runs at a higher frame rate than the OS's UI APIs, so mixing the two slows the game down.
Start by making a view class as the base class, then implement the actual UI classes (button, table, and so on) as subclasses of it.
Making a UI with a graphics API is similar to making a game in that it uses the same graphics techniques, such as texture compression, mipmaps, MSAA, and various special effects. However, font handling is a huge chunk of work on its own, which is why many game developers rely on a game engine or UI libraries.
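As a rough illustration of that class hierarchy (my own sketch; the names View, Button, and onPress are placeholders, not from any particular library):

    #include <functional>
    #include <memory>
    #include <vector>

    // Base class: every widget knows its rectangle, how to draw itself,
    // and how to hit-test a click before passing it to its children.
    struct View {
        float x = 0, y = 0, width = 0, height = 0;
        std::vector<std::unique_ptr<View>> children;

        virtual ~View() = default;

        virtual void draw() {                       // issue the GL draw calls for this view
            for (auto& child : children) child->draw();
        }

        virtual bool onClick(float px, float py) {  // returns true if the click was inside
            if (px < x || px > x + width || py < y || py > y + height) return false;
            for (auto& child : children)
                if (child->onClick(px, py)) return true;
            return true;
        }
    };

    // Concrete widget: a clickable button with a callback.
    struct Button : View {
        std::function<void()> onPress;

        bool onClick(float px, float py) override {
            if (!View::onClick(px, py)) return false;
            if (onPress) onPress();
            return true;
        }
    };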
https://www.twitch.tv/heroseh works on a pure C + OpenGL user interface library daily at about 9 AM (EST).
Here is their GitHub repo for the project:
https://github.com/heroseh/vui
I myself am in the middle of stubbing in a half-assed user interface that
is just a list of clickable buttons. ( www.twitch.com/kanjicoder )
The basic idea I ran with is that both the GPU and CPU need to know about your
data. So I store all the required variables for my UI in a texture and then
sync that texture with the GPU every time it changes.
On the CPU side it's a uint8 array of bytes.
On the GPU side it's an unsigned 32-bit texture.
I have getters and setters in both the GPU (GLSL) and CPU (C99) code that
manage packing and unpacking variables into and out of the texture's pixels.
It's a bit crazy, but I wanted the "lowest-common-denominator" method of creating
a UI so I can easily port this to any graphics library of my choice in the future.
For example, eventually I might want to switch from OpenGL to Vulkan. So if I keep
most of my logic as just manipulations of a big 512x512 array of pixels, I shouldn't
have too much refactoring work ahead of me.
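To illustrate the packing idea described above (my own sketch, not the poster's actual code): each UI variable lives in the bytes of one RGBA8 texel, so the CPU-side setters just write into a uint8 array that is later re-uploaded with glTexSubImage2D.

    #include <cstdint>

    // CPU-side UI state: a 512x512 RGBA8 texture, i.e. 4 bytes per texel.
    static uint8_t uiState[512 * 512 * 4];

    // Pack a 32-bit variable into the 4 bytes of texel (x, y).
    static void uiSetU32(int x, int y, uint32_t value)
    {
        uint8_t* p = &uiState[(y * 512 + x) * 4];
        p[0] = static_cast<uint8_t>(value & 0xFF);
        p[1] = static_cast<uint8_t>((value >> 8) & 0xFF);
        p[2] = static_cast<uint8_t>((value >> 16) & 0xFF);
        p[3] = static_cast<uint8_t>((value >> 24) & 0xFF);
        // After changing the array, re-upload the dirty region with glTexSubImage2D
        // so the GLSL side sees the same bytes in its texel fetch.
    }

    // Unpack it again; the GLSL getter would do the equivalent on its texel.
    static uint32_t uiGetU32(int x, int y)
    {
        const uint8_t* p = &uiState[(y * 512 + x) * 4];
        return static_cast<uint32_t>(p[0])
             | static_cast<uint32_t>(p[1]) << 8
             | static_cast<uint32_t>(p[2]) << 16
             | static_cast<uint32_t>(p[3]) << 24;
    }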

Drawing pixel by pixel in C++ [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
Some years ago, I used to program for MS-DOS in assembly language. One of the things available was to give the BIOS an x coordinate, a y coordinate, and a color (expressed as an integer), then call a function, and the BIOS would plot the pixel immediately.
Of course, this is very hard work and very time consuming, but the trade-off was that you got exactly what you wanted, exactly when you wanted it.
I tried for many years to write to the macOS API, but found it either difficult or impossible, as nothing is documented at all. (What the hell is an NSNumber? Why do all the controls return a useless object?)
I don't really have any specific project in mind right now, but I would like to be able to write C++ that can draw pixels in much the same way. Maybe I'm crazy, but I want that kind of control.
Until I can overcome this, I'm limited to writing programs that run in the console by printing text and scrolling up as the screen gets full.
You could try using Windows GDI:
#include <windows.h>

int main()
{
    // Draw directly onto the console window's device context.
    HWND console = GetConsoleWindow();
    HDC hdc = GetDC(console);
    // Plot a 256x256 gradient, one pixel at a time.
    for (int x = 0; x < 256; ++x)
        for (int y = 0; y < 256; ++y)
            SetPixel(hdc, x, y, RGB(127, x, y));
    ReleaseDC(console, hdc);
}
It is pretty easy to get something drawn (if this is what you are asking), as you can see from the above example.
Modern x86 operating systems no longer run in real mode.
You have several options:
Run a VM and install a real mode OS (e.g. MS-DOS).
Use a layer that emulates the real mode (e.g. DOSBox).
Use a GUI library (e.g. Qt, GTK, wxWidgets, Win32, X11) and use a canvas or a similar control where you can draw.
Use a 2D API (e.g. the 2D components of SDL, SFML, Allegro).
Use a 3D API (e.g. OpenGL, Direct3D, Vulkan, Metal; possibly exposed by SDL, SFML or Allegro if you want it portable) to stream a texture that you have filled pixel by pixel with the CPU each frame (see the sketch at the end of this answer).
Write fragment shaders (either using a 3D API or, much easier, in a web app using WebGL).
If you want to learn how graphics are really done nowadays, you should go with the last 2 options.
Note that, if you liked drawing "pixel by pixel", you will probably love writing fragment shaders directly on the GPU and all the amazing effects you can achieve with them. See ShaderToy for some examples!
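As a concrete illustration of option 5, here is a small sketch using SDL2's streaming textures; the window size and the gradient are arbitrary choices of mine, not part of the original answer:

    #include <SDL.h>
    #include <cstdint>
    #include <vector>

    int main(int, char**)
    {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window* window = SDL_CreateWindow("pixels", SDL_WINDOWPOS_CENTERED,
                                              SDL_WINDOWPOS_CENTERED, 256, 256, 0);
        SDL_Renderer* renderer = SDL_CreateRenderer(window, -1, 0);

        // A streaming texture we can rewrite from the CPU every frame.
        SDL_Texture* texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_ARGB8888,
                                                 SDL_TEXTUREACCESS_STREAMING, 256, 256);
        std::vector<uint32_t> pixels(256 * 256);

        bool running = true;
        while (running) {
            SDL_Event event;
            while (SDL_PollEvent(&event))
                if (event.type == SDL_QUIT) running = false;

            // Fill the buffer pixel by pixel (same gradient as the GDI example above).
            for (int y = 0; y < 256; ++y)
                for (int x = 0; x < 256; ++x)
                    pixels[y * 256 + x] = 0xFF7F0000u | (x << 8) | y;  // A=FF, R=7F, G=x, B=y

            // Upload the CPU buffer and present it.
            SDL_UpdateTexture(texture, nullptr, pixels.data(), 256 * sizeof(uint32_t));
            SDL_RenderClear(renderer);
            SDL_RenderCopy(renderer, texture, nullptr, nullptr);
            SDL_RenderPresent(renderer);
        }

        SDL_DestroyTexture(texture);
        SDL_DestroyRenderer(renderer);
        SDL_DestroyWindow(window);
        SDL_Quit();
        return 0;
    }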

How to set GPU in visual studio / OpenGL [duplicate]

This question already has answers here:
Forcing NVIDIA GPU programmatically in Optimus laptops
(2 answers)
Closed 5 years ago.
I'm using a Surface Book 2 and Visual Studio. I'm trying to make an OpenGL application, and I noticed that it is defaulting to the integrated Intel GPU rather than the discrete NVIDIA GPU that is also in the laptop.
I know that I can use the NVIDIA control panel to set the NVIDIA GPU as the default, but the base setting is to "let the application choose" (I understand that the purpose of this setting is to save battery when the better GPU is not needed). I am trying to find a way to choose the GPU in my application without manually changing settings in the NVIDIA control panel.
I looked around, and it sounds like OpenGL does not support any method of choosing between different GPUs (which is very surprising to me). Is there any way I can select which GPU I want without using a different API and without changing the settings in the NVIDIA control panel?
Find the executable generated by Visual Studio, and set your GPU for it.
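If you would rather request the discrete GPU from code, the duplicate linked above describes exporting vendor-specific symbols from your executable; to the best of my knowledge the following works for NVIDIA Optimus and the AMD equivalent, but treat it as a sketch:

    #include <windows.h>

    // Symbols the NVIDIA and AMD drivers look for when the process starts;
    // a nonzero value requests the high-performance (discrete) GPU.
    extern "C" {
        __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
        __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
    }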

Maximum file size for sprites in Cocos2d [duplicate]

This question already has answers here:
What is the maximum texture size available on the iPad?
(3 answers)
Closed 8 years ago.
What's the maximum size a file can be for a sprite? I have a background for a level that's 9200x640 and when I try to load the page it's on, the app crashes. If this is an issue, what can I do to resolve it? If not, what should I do in order for it to work?
It depends on the device (usually 2048×2048 or 4096×4096). For images as large as the one you are using, you really need to use tiled images. Ray Wenderlich has an example here:
http://www.raywenderlich.com/29458/how-to-make-a-tile-based-game-with-cocos2d-2-x
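If you want to check the limit at runtime rather than hard-coding it per device, you can query the GL context that cocos2d sets up. A small sketch, assuming a GL header is already included and a context is current:

    // Returns the largest texture dimension the current device supports.
    static int maxSpriteTextureSize()
    {
        GLint maxSize = 0;
        glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxSize);
        return maxSize;   // typically 2048 on older iPads, 4096 on newer ones
    }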

Making graphics glow/halo [duplicate]

This question already has answers here:
How to add glowing effect to a line for OpenGL?
Closed 10 years ago.
I would like some help making my graphics glow. I would like to do it the same way as in the game Tron 2.0, without using advanced stuff like shaders, mainly because my video card isn't capable of them.
I know I can get the technique from this site: wanted outcome
But I don't know how to do it in OpenGL with C++. Maybe somebody has code or a step-by-step guide, or can point me to where I can find one. Let's say that my program draws a maze like in the picture below:
And now I would like the maze lines to look something like the picture below:
I have made exactly the same kind of maze game, with glowing lines :)
The way we did it was to draw rectangles around the lines, with glow textures.
Sorry for not adding a step-by-step tutorial, but at least you have the general idea.
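To make that a bit more concrete, here is a rough fixed-function sketch of the idea (my own, not the answerer's code); glowTexture is assumed to be a small pre-made texture containing a soft white blob that fades out toward its edges:

    #include <GL/gl.h>

    // Draw a glowing horizontal line segment from (x0, y) to (x1, y).
    void drawGlowingLine(float x0, float x1, float y, float glowSize, GLuint glowTexture)
    {
        // 1. The crisp core line.
        glDisable(GL_TEXTURE_2D);
        glColor3f(0.6f, 0.9f, 1.0f);
        glBegin(GL_LINES);
        glVertex2f(x0, y);
        glVertex2f(x1, y);
        glEnd();

        // 2. A wider quad around the line, textured with the glow and blended
        //    additively so overlapping glows brighten each other.
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, glowTexture);
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE);          // additive blending
        glColor3f(0.3f, 0.6f, 1.0f);
        glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex2f(x0 - glowSize, y - glowSize);
        glTexCoord2f(1, 0); glVertex2f(x1 + glowSize, y - glowSize);
        glTexCoord2f(1, 1); glVertex2f(x1 + glowSize, y + glowSize);
        glTexCoord2f(0, 1); glVertex2f(x0 - glowSize, y + glowSize);
        glEnd();
        glDisable(GL_BLEND);
    }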