I see that https://github.com/GStreamer/gst-plugins-bad/tree/1.10/gst-libs/gst/codecparsers contains gstvp9parser.c, but when I compile gst-plugins-bad I don't see any element related to a VP9 parser. Has anybody figured out how to use the VP9 parser?
TIA
Related
I am working on an RMVB playback plugin for GStreamer. I wrote the demuxer and decoders, and they work fine when I link them manually in a pipeline.
But the playback application uses playbin2 to play the videos.
So I wonder if it is possible to add my elements to playbin2 so that playbin2 can play RMVB files.
But I don't know how to do that.
So my questions are:
1. Is it possible to do that?
2. If it is possible, what keywords should I search for?
3. If it is impossible, is there any other way to play RMVB files at the least cost? (It is hard to change the playback application's source code.)
Any help will be appreciated.
Thanks a lot.
Yes
Elements have ranks, and playbin will look for the elements with the highest rank to use. So you need to make sure your element reports RMVB caps (as reported by gst-typefind) on its sink pads and that it has a high enough rank. Ranks are set when the element is registered by the plugin.
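As a rough sketch (GStreamer 1.0 style, assuming a hypothetical element type GST_TYPE_MY_RMVB_DEMUX registered under the name "myrmvbdemux"), the rank is the third argument to gst_element_register in your plugin's plugin_init:

#include <gst/gst.h>

static gboolean
plugin_init (GstPlugin * plugin)
{
  /* A rank above GST_RANK_PRIMARY makes playbin's autoplugger prefer this
   * demuxer over other elements that accept the same caps. */
  return gst_element_register (plugin, "myrmvbdemux",
      GST_RANK_PRIMARY + 1, GST_TYPE_MY_RMVB_DEMUX);
}

GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR,
    myrmvb, "RMVB demuxer and decoders", plugin_init,
    "1.0", "LGPL", "myrmvb", "http://example.org/")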
There should already be RMVB support in GStreamer; maybe you're just missing the proper plugin in your installation? You shouldn't need to write a new one. It should be in gst-plugins-ugly (realmedia is the name of the plugin, IIRC).
Unasked, but please move to 1.0; 0.10 has been dead/obsolete/unmaintained for years now. By using it you won't get much, if any, help from the community these days.
The HTTP file and its contents are already downloaded and present in memory. I just have to pass the content to a decoder in GStreamer and play it. However, I am not able to find the connecting link between the two.
After reading the documentation, I understood that GStreamer uses souphttpsrc for downloading and parsing HTTP files. But in my case I have my own parser as well as my own file downloader that do the same thing: they take the URL and return the data in parts to be used by the decoder. I am not sure how to bypass souphttpsrc and use my parser instead, nor how to link it to the decoder.
Please let me know if anyone knows how this can be done.
You can use appsrc. You can push chunks of your data into the app source as needed.
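A minimal sketch of that, assuming a hypothetical my_get_next_chunk() helper standing in for your own downloader/parser and a decodebin-based pipeline (adjust both to your setup):

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* Hypothetical hook into your own downloader/parser: returns the next chunk
 * of the data already sitting in memory, or NULL when everything has been
 * handed over. */
extern const guint8 *my_get_next_chunk (gsize * size);

static void
on_need_data (GstAppSrc * src, guint length, gpointer user_data)
{
  gsize size;
  const guint8 *chunk = my_get_next_chunk (&size);

  if (chunk == NULL) {
    gst_app_src_end_of_stream (src);          /* no more data to play */
    return;
  }

  GstBuffer *buf = gst_buffer_new_allocate (NULL, size, NULL);
  gst_buffer_fill (buf, 0, chunk, size);
  gst_app_src_push_buffer (src, buf);         /* takes ownership of buf */
}

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  /* decodebin picks a suitable demuxer/decoder for whatever the chunks contain. */
  GstElement *pipeline = gst_parse_launch (
      "appsrc name=src ! decodebin ! videoconvert ! autovideosink", NULL);
  GstElement *src = gst_bin_get_by_name (GST_BIN (pipeline), "src");

  g_signal_connect (src, "need-data", G_CALLBACK (on_need_data), NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (g_main_loop_new (NULL, FALSE));
  return 0;
}

Build against gstreamer-1.0 and gstreamer-app-1.0 (pkg-config gives the flags).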
I've always wanted to try to make a media player, but I don't understand how. I found FFmpeg and GStreamer, and I seem to favor FFmpeg despite its worse documentation, even though I haven't written anything at all yet. That said, I feel I would understand things more if I knew what they were doing. I have no idea how video/audio streams or the various media types work, so that doesn't help. At the end of the day, I'm just 'emulating' some of the code samples.
Where do I start learning how to encode/decode/play back video/audio streams without having to read hundreds of pages of several 'standards'? Ideally I'd also learn enough to play back media without relying on another API. Googling 'basic video audio decoding encoding' doesn't seem to help. :(
This seems to be a black art that nobody is willing to explain.
The first part is extracting the streams from the container. From there, you need to decode the streams into raw audio/video. I recommend finding a small Theora video and seeing how the pieces relate there.
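If you go the GStreamer route, a minimal sketch of that split (assuming an Ogg/Theora file named sample.ogv) is a pipeline where oggdemux does the extraction and theoradec does the decoding:

#include <gst/gst.h>

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  /* oggdemux pulls the elementary streams out of the container, theoradec
   * turns the Theora stream back into raw frames. The audio stream is
   * simply ignored in this sketch. */
  GstElement *pipeline = gst_parse_launch (
      "filesrc location=sample.ogv ! oggdemux ! theoradec "
      "! videoconvert ! autovideosink", NULL);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  g_main_loop_run (g_main_loop_new (NULL, FALSE));
  return 0;
}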
You want us to write one answer that you can read and then be a master of the multimedia domain? That cannot be done in one answer.
First of all, understand this terminology by googling it:
1> container -- muxer/demuxer
2> codec -- coder/decoder
If you like FFmpeg, then go with its basic video player application. It is well documented here: http://dranger.com/ffmpeg/ . It shows how to demux a container and decode any elementary stream with the FFmpeg API. More about this at http://ffmpeg.org/ffplay.html
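As a rough idea of what that tutorial builds up to, here is a minimal libavformat sketch that opens a container and walks its compressed packets (the file name is just a placeholder, and most error handling is omitted):

#include <stdio.h>
#include <libavformat/avformat.h>

int
main (void)
{
  AVFormatContext *fmt = NULL;

  if (avformat_open_input (&fmt, "input.mp4", NULL, NULL) < 0)
    return 1;                                 /* could not open/recognise the file */
  avformat_find_stream_info (fmt, NULL);

  AVPacket *pkt = av_packet_alloc ();
  while (av_read_frame (fmt, pkt) >= 0) {
    /* pkt->stream_index says which elementary stream (audio or video) this
     * compressed packet belongs to; a decoder would consume it here. */
    printf ("stream %d, %d bytes\n", pkt->stream_index, pkt->size);
    av_packet_unref (pkt);
  }

  av_packet_free (&pkt);
  avformat_close_input (&fmt);
  return 0;
}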
I like GStreamer more than FFmpeg. It has good documentation. It would be a good choice to start with GStreamer.
I currently want to write a program that can extract audio from an FLV video using either Python or C++. I have no idea how to go about it. Is there some kind of tutorial or anything that would help me? Please help me learn this.
Thanks!
Actually you can use MPlayer to do this, e.g.
mplayer video.flv -vo null -ao pcm:file=file.wav
So you can use this in combination with calls from Python (or C). Another way is to use FFmpeg, which MPlayer uses internally.
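For example, from C the simplest (if crude) approach is to shell out to that exact command; the file names here are just placeholders:

#include <stdio.h>
#include <stdlib.h>

int
main (void)
{
  char cmd[512];

  /* Same command as above: decode video.flv, discard the video, dump the
   * audio to file.wav. */
  snprintf (cmd, sizeof cmd,
      "mplayer %s -vo null -ao pcm:file=%s", "video.flv", "file.wav");
  return system (cmd);
}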
I found this tutorial about FFmpeg; the thing I do not get is how to encode video.
Can anyone please provide a tutorial with explanations for that? (Not that I don't get the official one, but I'd love to see more comments.)
FFmpeg's developer guide refers to an API sample featuring encoding and decoding of both audio and video. This answer links to it as well.
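In the same spirit as that sample, a bare-bones video encoding loop with libavcodec looks roughly like this (codec name, resolution, frame rate and output file are arbitrary placeholders; error checking is omitted):

#include <stdio.h>
#include <libavcodec/avcodec.h>

static void
encode (AVCodecContext * ctx, AVFrame * frame, AVPacket * pkt, FILE * out)
{
  avcodec_send_frame (ctx, frame);            /* frame == NULL flushes the encoder */
  while (avcodec_receive_packet (ctx, pkt) == 0) {
    fwrite (pkt->data, 1, pkt->size, out);    /* raw elementary stream */
    av_packet_unref (pkt);
  }
}

int
main (void)
{
  const AVCodec *codec = avcodec_find_encoder_by_name ("mpeg2video");
  AVCodecContext *ctx = avcodec_alloc_context3 (codec);

  ctx->width = 352;
  ctx->height = 288;
  ctx->time_base = (AVRational) { 1, 25 };
  ctx->framerate = (AVRational) { 25, 1 };
  ctx->pix_fmt = AV_PIX_FMT_YUV420P;
  avcodec_open2 (ctx, codec, NULL);

  AVFrame *frame = av_frame_alloc ();
  frame->format = ctx->pix_fmt;
  frame->width = ctx->width;
  frame->height = ctx->height;
  av_frame_get_buffer (frame, 0);

  AVPacket *pkt = av_packet_alloc ();
  FILE *out = fopen ("out.m2v", "wb");

  for (int i = 0; i < 25; i++) {
    av_frame_make_writable (frame);
    /* Fill frame->data[0..2] with a real picture here; left out for brevity. */
    frame->pts = i;
    encode (ctx, frame, pkt, out);
  }
  encode (ctx, NULL, pkt, out);               /* drain the remaining packets */

  fclose (out);
  av_frame_free (&frame);
  av_packet_free (&pkt);
  avcodec_free_context (&ctx);
  return 0;
}

The raw MPEG-2 elementary stream written to out.m2v can be checked with any player; muxing it into a container is a separate step handled by libavformat.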