video file + fragment shader under Linux [closed] - c++

Coming from Windows (MSVC++ 2005):
What SDK or similar do you recommend for porting a C++ application (DirectShow + Direct3D) to Linux, playing a video file and using fragment shaders?

Is there any reason you need a fragment shader at all? (Are you doing post-processing on the video images?) You don't need any shader coding to get a video playing with OpenGL.
I would use FFmpeg (libavcodec, specifically) to do the video decoding. Displaying a frame just requires an OpenGL texture and a call to glTexSubImage2D each frame to update it; see the sketch below.
Using FFMPEG in C/C++
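A minimal sketch of the texture-update path, assuming the frame has already been decoded (e.g. by libavcodec) and converted to tightly packed RGB24 (e.g. via sws_scale), with the width and height fixed for the whole stream:

    // Upload a decoded RGB frame into an OpenGL texture each frame.
    #include <GL/gl.h>
    #include <cstdint>

    GLuint createVideoTexture(int width, int height)
    {
        GLuint tex = 0;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        // Allocate storage once; the contents are replaced per frame below.
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, nullptr);
        return tex;
    }

    void uploadFrame(GLuint tex, int width, int height, const uint8_t* rgbPixels)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  // rows are tightly packed
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                        GL_RGB, GL_UNSIGNED_BYTE, rgbPixels);
    }

Draw a textured quad with this texture to show the frame; a fragment shader is only needed if you want to post-process the image (or do the YUV-to-RGB conversion on the GPU instead of with sws_scale).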

You need to use OpenGL instead.
Some tips for the implementation:
- To achieve good performance, make sure a good video card driver is installed.
- If you are not familiar with OpenGL, start with the 'Red Book' - the OpenGL Programming Guide.
- You may need to download the latest extension headers from http://www.opengl.org/registry/
- The GLEW library can help you identify the available extensions (see the sketch after this list).
- Include GL/gl.h and glext.h in your project.
- Link to the driver's OpenGL dynamic library: /usr/lib64/libGL.so or similar.
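A minimal sketch of the GLEW part, assuming a valid OpenGL context has already been created (with GLUT, SDL, GLX, ...) before glewInit() is called:

    #include <GL/glew.h>
    #include <cstdio>

    bool initExtensions()
    {
        GLenum err = glewInit();
        if (err != GLEW_OK) {
            std::fprintf(stderr, "GLEW error: %s\n", glewGetErrorString(err));
            return false;
        }
        // GLEW exposes one boolean flag per extension / core version.
        if (!GLEW_ARB_fragment_shader)
            std::fprintf(stderr, "ARB_fragment_shader not available\n");
        if (GLEW_VERSION_2_0)
            std::printf("OpenGL 2.0 (GLSL shaders) supported\n");
        return true;
    }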

I would also check out the GStreamer framework on Linux if you need to port a more complicated DirectShow application. It also builds a kind of graph for media playback. It is totally different, but if you have experience with, and a need for, complicated DirectShow, you will see some analogy.
GStreamer also has an OpenGL plugin for image effects and shaders; a minimal playback sketch follows the links below.
http://www.gstreamer.net/
http://www.gstreamer.net/releases/gst-plugins-gl/0.10.1.html
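A minimal playback sketch using GStreamer's playbin element (the element is called "playbin" in GStreamer 1.x and "playbin2" in the 0.10 series the links above refer to); the file URI is just a placeholder:

    #include <gst/gst.h>

    int main(int argc, char* argv[])
    {
        gst_init(&argc, &argv);

        GstElement* player = gst_element_factory_make("playbin", "player");
        g_object_set(player, "uri", "file:///tmp/example.avi", NULL);  // placeholder URI

        gst_element_set_state(player, GST_STATE_PLAYING);

        // Block until an error occurs or the stream ends.
        GstBus* bus = gst_element_get_bus(player);
        GstMessage* msg = gst_bus_timed_pop_filtered(
            bus, GST_CLOCK_TIME_NONE,
            (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));

        if (msg) gst_message_unref(msg);
        gst_object_unref(bus);
        gst_element_set_state(player, GST_STATE_NULL);
        gst_object_unref(player);
        return 0;
    }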

Related

Use OpenGL or Qt? [closed]

I don't have much experience with OpenGL beyond creating a two-dimensional scene. I decided to work with 3D, and this question came up: what is the best approach?
1) Use plain OpenGL?
2) Work with OpenGL through Qt?
The main goal is to gain real-world experience working with graphics. These questions came up:
1) Which of the options is used in real game development (when the company writes the engine itself)?
2) Which of the options would be more advantageous to an employer: experience with OpenGL, or with OpenGL ES in Qt?
P.S. Sorry, my English is not very good. I hope for an answer!
Qt is a UI library; it has nothing to do with whether you want to use OpenGL or not.
OpenGL and Qt are not mutually exclusive, though. Eat a burger, or a burger + coke? – rolevax
If you want your application to have a nice UI when you're not in 3D mode (for example, game start screens), you need a UI library.
Game engines are developed on top of 3D APIs like OpenGL, DirectX, Metal, etc.
Neither of the options is more advantageous to an employer. They are just libraries; you can learn either if you're good at coding.
Qt is a GUI library, not a rendering library. You can create an OpenGL context using the QOpenGLWidget class. Although OpenGL is a 3D API, it's very much possible to render 2D graphics by using an orthographic projection matrix and ignoring the Z coordinate (see the sketch below).
For 2D graphics, though, I recommend using a library like SDL, which is simple and high-level. There are also a ton of games that use it.
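A minimal sketch of a Qt OpenGL viewport, assuming Qt 5's QOpenGLWidget; for 2D you would combine it with an orthographic projection (e.g. built with QMatrix4x4::ortho()) and ignore the Z coordinate:

    #include <QApplication>
    #include <QOpenGLWidget>
    #include <QOpenGLFunctions>

    class GLView : public QOpenGLWidget, protected QOpenGLFunctions
    {
    protected:
        void initializeGL() override
        {
            initializeOpenGLFunctions();           // resolve GL entry points
            glClearColor(0.1f, 0.1f, 0.1f, 1.0f);
        }
        void paintGL() override
        {
            glClear(GL_COLOR_BUFFER_BIT);
            // Issue 2D draw calls here (textured quads, sprites, ...).
        }
        void resizeGL(int w, int h) override
        {
            glViewport(0, 0, w, h);
        }
    };

    int main(int argc, char* argv[])
    {
        QApplication app(argc, argv);
        GLView view;
        view.resize(640, 480);
        view.show();
        return app.exec();
    }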

Which video library is OpenCV using under the hood on Linux? [closed]

I'm writing simple software to capture and record webcam images to a compressed video file. I'm using OpenCV's VideoCapture read(frame) and VideoWriter write(frame) in a C++ program.
I'm on the Ubuntu 14.04 LTS operating system.
I would like to know which library OpenCV is using under the hood.
Is it FFmpeg, GStreamer, V4L2, or its own low-level source code?
It seems to change depending on the OpenCV version I'm using (e.g. 2.4.1, 2.4.11, 3.x).
Can somebody give me an overview of what OpenCV is doing to decode/encode video?
What is the typical path of the video data coming from the webcam up to my program in user space?
What is the typical path of the video data going from my program down to the file system?
Right now, this is confusing for me.
OpenCV uses FFmpeg.
I don't know exactly where or how. I know it is used for reading and writing video files. I don't think it is used when getting images from a camera; I think OpenCV reads raw camera data, because it can set webcam properties.
Also, the "video" from the webcam isn't video; each read is an image at the time the frame is captured. Capturing multiple images in order lets them be written out as a video.
When getting video from a file, OpenCV grabs one frame at a time out of the stream. A minimal capture-and-record sketch follows.
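A minimal capture-and-record sketch using the OpenCV 3.x names (in 2.4 the constants are CV_CAP_PROP_* and CV_FOURCC); VideoCapture pulls frames from the webcam (via V4L2 on Linux) and VideoWriter hands them to an encoder backend (typically FFmpeg):

    #include <opencv2/opencv.hpp>

    int main()
    {
        cv::VideoCapture cap(0);                       // first webcam
        if (!cap.isOpened()) return 1;

        int w = (int)cap.get(cv::CAP_PROP_FRAME_WIDTH);
        int h = (int)cap.get(cv::CAP_PROP_FRAME_HEIGHT);

        cv::VideoWriter writer("out.avi",
                               cv::VideoWriter::fourcc('M', 'J', 'P', 'G'),
                               30.0, cv::Size(w, h));
        if (!writer.isOpened()) return 1;

        cv::Mat frame;
        for (int i = 0; i < 300; ++i) {                // ~10 seconds at 30 fps
            if (!cap.read(frame) || frame.empty()) break;
            writer.write(frame);                       // compressed and appended to out.avi
        }
        return 0;                                      // cap and writer are released in their destructors
    }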

Embedded 3d graphic engine with supporting Blender models [closed]

I'm trying to develop a desktop application. The user can load 3D models from Blender with animation (simple object animation - move, rotate, etc. - and NLA tracks) and interact with them (rotate the model, zoom, click on different objects, run animations).
Requirements:
Platforms: Windows, Linux.
High performance.
Blender support.
Application language: C++, C#, maybe another.
I know about Blend4Web (low performance for big models) and Ogre3d (I tried to export a scene from Blender but unfortunately only see a black screen in the app). I wonder if I am missing something useful.
Most graphics libraries can read files in the Wavefront OBJ format, and libObj can parse and read it. You should then be able to upload the models to OpenGL and perform the operations that you require; a hand-rolled parsing sketch follows.
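A hand-rolled sketch (not libObj's actual API) of reading the vertex and face lines of a .obj file, positions and faces only, before uploading them to OpenGL:

    #include <fstream>
    #include <sstream>
    #include <string>
    #include <vector>

    struct ObjMesh {
        std::vector<float>    positions; // x, y, z triples
        std::vector<unsigned> indices;   // zero-based vertex indices
    };

    bool loadObj(const std::string& path, ObjMesh& mesh)
    {
        std::ifstream in(path);
        if (!in) return false;

        std::string line;
        while (std::getline(in, line)) {
            std::istringstream s(line);
            std::string tag;
            s >> tag;
            if (tag == "v") {                        // vertex position
                float x, y, z;
                s >> x >> y >> z;
                mesh.positions.insert(mesh.positions.end(), {x, y, z});
            } else if (tag == "f") {                 // face, e.g. "f 1 2 3" or "f 1/1/1 ..."
                std::string v;
                while (s >> v) {
                    // Keep only the position index before the first '/'.
                    unsigned idx = std::stoul(v.substr(0, v.find('/')));
                    mesh.indices.push_back(idx - 1); // .obj indices are 1-based
                }
            }
        }
        return true;
    }

Note that this ignores normals, texture coordinates, materials and negative indices; a real export from Blender will contain those, which is why a proper library is worth using.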
For a framework to build your app, look at GameKit:
- Uses Ogre for graphics, Bullet for physics, and OpenAL for sound
- OgreKit is the most actively developed backend
- The engine is written in C++ and the game logic can be done in C++, Lua scripting or logic bricks
- Reads all data from Blender .blend files, with FBX import planned for the future
- Free from viral licenses: it only uses components under MIT/BSD/Zlib-style licenses
- CMake cross-platform build system support that works out of the box, see http://cmake.org
- GameKit supports Windows, Mac OS X, Linux, Android and iPhone
While it doesn't directly read .blend files, Godot is a graphical game-building application with Python-like scripting. It has a GUI toolkit that can be used for non-game applications, and they offer a Blender addon for Collada export that is meant to be better than the official one. Being open source, you can also adapt it to your needs.

Libraries for Playing Audio? - C++ [closed]

Are there any C++ (LGPL or LGPL-like) sound libraries which allow me to play a sound purely by specifying things like frequency and volume?
My target platform is Linux/Ubuntu and I don't really care about cross-platform support.
It would be nice if I could send it an array of sound samples and have it compress them into a common sound file format, for example MP3.
I'm looking for something OpenGL-like, where you can just 'draw' the sound and it will be played.
I have heard about OpenAL, but that only seems to be a library for loading and playing sounds.
Most simple and low-level audio libraries (e.g. OpenAL, SDL, etc.) can play PCM waveforms quite easily. You fill a buffer with the wave you want to play, and play it. To build a waveform from things like frequency and volume, you need to write a bit of code (probably a couple of lines for simple waves) and know basic trigonometry; a sketch follows.
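A minimal sketch, assuming SDL2 (2.0.4 or newer for SDL_QueueAudio): synthesize one second of a 440 Hz sine wave into a PCM buffer and hand it to the sound card:

    #include <SDL2/SDL.h>
    #include <cmath>
    #include <cstdint>
    #include <vector>

    int main()
    {
        if (SDL_Init(SDL_INIT_AUDIO) != 0) return 1;

        SDL_AudioSpec want{}, have{};
        want.freq = 44100;                 // sample rate
        want.format = AUDIO_S16SYS;        // 16-bit signed PCM
        want.channels = 1;
        want.samples = 4096;
        SDL_AudioDeviceID dev = SDL_OpenAudioDevice(nullptr, 0, &want, &have, 0);
        if (dev == 0) return 1;

        // "Draw" the sound: amplitude * sin(2*pi*frequency*t) for each sample.
        const double frequency = 440.0, amplitude = 0.25 * 32767;
        std::vector<int16_t> buffer(have.freq);          // one second, mono
        for (size_t i = 0; i < buffer.size(); ++i)
            buffer[i] = (int16_t)(amplitude *
                        std::sin(2.0 * M_PI * frequency * i / have.freq));

        SDL_QueueAudio(dev, buffer.data(), buffer.size() * sizeof(int16_t));
        SDL_PauseAudioDevice(dev, 0);      // start playback
        SDL_Delay(1100);                   // let the tone finish

        SDL_CloseAudioDevice(dev);
        SDL_Quit();
        return 0;
    }

Encoding such a buffer to MP3 is a separate job (e.g. with LAME); neither OpenAL nor SDL will do that for you.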
OpenAL is a cross-platform API targeted at 3D games. Its interface is philosophically similar to OpenGL's; it provides functions to manage audio sources in a virtual 3D environment (position, speed, etc.) and offers some sound and environment effects (reverb, etc.). I know that it can decode some compressed formats (MP3, Vorbis, ...) using extensions, but I'm not sure whether it has any encoding functionality.
SDL (Simple DirectMedia Layer) is also cross-platform and game-oriented, but it offers much more than audio. Any functionality it does offer is very basic; this is intentional and by design, since SDL is a platform abstraction layer. Its audio capabilities are similarly basic and low-level, providing only playback and recording of PCM waves. Of course, there are extension libraries (e.g. SDL_mixer) that add more functionality.
References:
OpenAL on Wikipedia
OpenAL homepage (seems down, as of Aug 5th, 2013)
OpenAL Soft, a fork of the OpenAL library available from Creative Labs
SDL homepage
SDL_mixer
As far as I know, both projects ship documentation and examples along with their source code, so you might want to get their source code and start experimenting.
If I've understood what you want to do correctly, either of these libraries can do what you want rather easily, but in my personal opinion, SDL is simpler and easier to use (unless you want 3D positional audio and effects.)

3d Realtime Software Renderer Open Source [closed]

Is there a good 3D real-time software renderer with features similar to OpenGL/DirectX? Something similar to what Cairo or Anti-Grain Geometry do for 2D, but in 3D.
I actually only know of Mesa, which has a software OpenGL implementation, and Coco3d.
It should be open source :)
You could have a look at the Ogre 3D engine, assuming you want an abstraction over raw GL that already has a lot of the key features. It's open source too.
I believe the OpenSceneGraph has grown to be pretty competent and widely used.
For a pixel rendering engine, why not have a look at the DOOM rendering engine sources?
A smaller implementation of (a subset of) the standard OpenGL API, called TinyGL, could be something to look at too.
Technically OpenGL is just a standard, but there are open-source implementations available for download. I'm not sure you want a reference OpenGL driver, though.
For 3D libraries, there are loads: Irrlicht, Crystal Space and Ogre3D, to name just three from SourceForge's trove list.
The only major open source real-time software renderer besides Mesa I know of is the Quake I engine. However, it's not up to par with current OpenGL or Direct3D capabilities.
If you can do without the source code, you could have a look at the Microsoft WARP10 renderer. It's a high performance implementation of Direct3D 10 on the CPU.
Check out Coin, an implementation of OpenInventor maintained by the company I'm employed by. It's licensed under a dual licensing model - GPL for free/opensource software. It's being actively developed and uses OpenGL to do rendering. It works on "all" platforms and can be easily integrated with Qt.
For standalone alternatives to OpenGL / Direct3D I would look at:
- The open-source implementation of OpenGL: Mesa3D
- Gallium3D
- An implementation of the OpenRT specification: directViz
- Some research into real-time RenderMan: 'RenderMan for realtime' and its progress
OpenGL itself is just a specification, not open source, but open-source implementations such as Mesa fall back to software rendering in the absence of 3D hardware on the system, provided that all the proper libraries are installed.