So I have an MP3 track and an MJPEG stream. Is there any container for combining those two and playing them together as one video in one container?
You can mix them into an AVI.
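For example, with ffmpeg (the file names are placeholders, and raw MJPEG input may also need an explicit frame rate via -r):

```
ffmpeg -i video.mjpeg -i audio.mp3 -c copy output.avi
```

Both streams are copied as-is, so nothing is re-encoded; the AVI simply interleaves the MJPEG frames with the MP3 audio.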
I am compressing frames coming from a webcam with libx264. So far I have used YUY2 raw frames and swscale to convert the frames to I420, which is usable by x264.
Now I would like to add support for MJPEG webcams (usually a webcam provides both, but MJPEG allows higher frame rates and resolutions). What can I use to transcode MJPEG to some format that can be used by x264?
If you already use swscale, why not use ffmpeg/libav (libavcodec) for decoding the MJPEG as well?
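A minimal decoding sketch, assuming a reasonably recent FFmpeg (the 3.1+ send/receive API) and that each packet holds one complete JPEG image. The MJPEG decoder typically outputs YUVJ420P, which is I420 data with full-range color that x264 can consume:

```c
#include <libavcodec/avcodec.h>

/* Decode one MJPEG image into an AVFrame; returns NULL on failure.
 * The caller frees the result with av_frame_free(). In production you
 * would keep the codec context alive across frames instead of
 * re-creating it per frame as this sketch does. */
AVFrame *decode_mjpeg(const uint8_t *jpeg_data, int jpeg_size)
{
    const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_MJPEG);
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    avcodec_open2(ctx, codec, NULL);

    AVPacket *pkt = av_packet_alloc();
    pkt->data = (uint8_t *)jpeg_data;   /* one complete JPEG image */
    pkt->size = jpeg_size;

    AVFrame *frame = av_frame_alloc();
    if (avcodec_send_packet(ctx, pkt) < 0 ||
        avcodec_receive_frame(ctx, frame) < 0) {
        av_frame_free(&frame);          /* sets frame to NULL */
    }

    av_packet_free(&pkt);
    avcodec_free_context(&ctx);
    return frame;
}
```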
Using DirectShow filters, I have created an MP4 file writer with two input pins (one for audio and one for video). I was writing the audio samples received through one pin into one track and the video samples received from the other pin into another track. But my video is not playing. If I connect only one pin, either audio or video, I can play the output file; that is, when there is only one track.
I am using an H.264 encoder for video and an MPEG-4 encoder for audio. The encoders are working fine, as I am able to play the audio and the video separately.
I am setting the track count to 2. Is there any information that must be provided in the moov box to make the video play? Or should we tell the decoder which track is audio and which track is video? Since we are already setting those fields in the track information, I don't think that is the issue, but why is my video not playing?
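For what it's worth, a common culprit in hand-written MP4 muxers is the hdlr box: inside each trak/mdia, its handler_type field ("vide" or "soun") is what tells the demuxer which track is video and which is audio. A rough sketch of writing it, with write_u32/write_tag as illustrative big-endian helpers:

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

static void write_u32(FILE *f, uint32_t v)      /* big-endian, per ISO BMFF */
{
    uint8_t b[4] = { v >> 24, v >> 16, v >> 8, v };
    fwrite(b, 1, 4, f);
}

static void write_tag(FILE *f, const char t[4]) { fwrite(t, 1, 4, f); }

/* handler_type is "vide" for the video track, "soun" for the audio track */
static void write_hdlr(FILE *f, const char handler_type[4], const char *name)
{
    uint32_t size = 8 + 4 + 4 + 4 + 12 + (uint32_t)strlen(name) + 1;
    write_u32(f, size);                   /* box size */
    write_tag(f, "hdlr");                 /* box type */
    write_u32(f, 0);                      /* version (0) + flags (0) */
    write_u32(f, 0);                      /* pre_defined */
    write_tag(f, handler_type);           /* "vide" or "soun" */
    for (int i = 0; i < 3; i++)
        write_u32(f, 0);                  /* reserved */
    fwrite(name, 1, strlen(name) + 1, f); /* null-terminated name */
}
```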
I am using a Panda Board, have installed OpenCV, and wrote code for stitching 3 different images from 3 different cams. The stitched image is stored at a matrix location (pointer). The images from the 3 cams are captured and stitched continuously, so it becomes a video. I need to stream that stitched video to an iPhone. Can anyone help me with this? I am really stuck here and need help; it is very important for me.
I would suggest constructing either an MJPEG stream or, better, an RTSP stream based on the RTP protocol (encapsulating MPEG-4, which saves bandwidth).

Say you decide to go with an MJPEG stream. Each of your OpenCV IplImage* frames can then be converted to a JPEG using libjpeg compression; see my answer here: Compressing IplImage to JPEG using libjpeg in OpenCV. You would compress each frame and then build the MJPEG stream from them; see: creating my own MJPEG stream. You would need a web server to run an MJPEG CGI that streams your image sequence; you could look at the lighttpd web server running on the Panda Board. GStreamer is a package that may also be helpful in your situation.

On the decoding side (the iPhone) you can construct a GStreamer decoding pipeline as follows, say for an MJPEG stream:

gst-launch -v souphttpsrc location="http://<ip>:<port>/cgi_bin/<mjpegcginame>.cgi" do-timestamp=true is_live=true ! multipartdemux ! jpegdec ! ffmpegcolorspace ! autovideosink
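If it helps, here is a rough sketch of the multipart wire format such an MJPEG CGI emits. send_all() is a hypothetical wrapper around a write() loop, and jpeg_buf/jpeg_len stand in for the libjpeg-compressed frame from the previous step:

```c
#include <stdio.h>
#include <string.h>

void send_all(int fd, const char *buf, int len);  /* hypothetical write loop */

/* Emit one frame of a multipart/x-mixed-replace MJPEG stream */
void stream_frame(int fd, const unsigned char *jpeg_buf, int jpeg_len,
                  int first_frame)
{
    char hdr[256];
    int n = 0;

    if (first_frame)                    /* HTTP header, once per connection */
        n += snprintf(hdr + n, sizeof(hdr) - n,
                      "Content-Type: multipart/x-mixed-replace; "
                      "boundary=myboundary\r\n\r\n");

    n += snprintf(hdr + n, sizeof(hdr) - n,       /* one part per frame */
                  "--myboundary\r\n"
                  "Content-Type: image/jpeg\r\n"
                  "Content-Length: %d\r\n\r\n", jpeg_len);

    send_all(fd, hdr, n);
    send_all(fd, (const char *)jpeg_buf, jpeg_len);
    send_all(fd, "\r\n", 2);            /* terminate the part */
}
```

The gst-launch pipeline above expects exactly this format: multipartdemux splits the stream on the boundary and jpegdec decodes each part.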
I'm not familiar with libvlc; I'm just wondering if it's possible to stream a sequence of images with libvlc.
It's certainly possible, at least with JPEG-compressed images or raw video.
MJPEG is just all of the individual JPEGs concatenated together, and raw video codecs will play arbitrary uncompressed streams.
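A minimal libvlc sketch along those lines, assuming a file frames.mjpeg containing the concatenated JPEGs. Forcing the demuxer with :demux=mjpeg and hinting the frame rate with :mjpeg-fps are VLC demuxer options, but verify them against your VLC version:

```c
#include <vlc/vlc.h>
#include <unistd.h>

int main(void)
{
    libvlc_instance_t *vlc = libvlc_new(0, NULL);
    libvlc_media_t *m = libvlc_media_new_path(vlc, "frames.mjpeg");
    libvlc_media_add_option(m, ":demux=mjpeg");   /* force MJPEG demuxer */
    libvlc_media_add_option(m, ":mjpeg-fps=25");  /* frame-rate hint */

    libvlc_media_player_t *mp = libvlc_media_player_new_from_media(m);
    libvlc_media_release(m);

    libvlc_media_player_play(mp);
    sleep(10);                                    /* let it play a while */

    libvlc_media_player_stop(mp);
    libvlc_media_player_release(mp);
    libvlc_release(vlc);
    return 0;
}
```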
How do I transcode RGB images into VP8 frames (a keyframe plus some dependent frames)?
So I have created some images; how do I turn them into VP8 now?
The easiest way to go is to use ffmpeg.
The latest release of ffmpeg (0.6) supports the VP8 codec, and building it is now easy.
Then, ffmpeg makes it simple to gather individual frames into a movie. Here is a tutorial, but you can google for more results.
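For example, something like this turns a numbered image sequence into a VP8 WebM (the frame pattern and rate are assumptions, and 0.6-era builds spell the codec option -vcodec libvpx rather than -c:v libvpx):

```
ffmpeg -r 25 -i frame%04d.png -c:v libvpx output.webm
```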
First, you need a codec library for VP8:
http://www.webmproject.org/code/build-prerequisites/
Using the libvpx API, you can then encode your RGB frames into VP8 frames.
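A minimal encoding sketch with the libvpx API. Note that libvpx takes I420 input, not RGB, so each image must first be converted (for example with swscale); the sizes and the keyframe interval below are assumptions:

```c
#include <vpx/vpx_encoder.h>
#include <vpx/vp8cx.h>

/* Encode nframes I420 frames as VP8; error handling omitted for brevity */
void encode_frames(int width, int height, int nframes)
{
    vpx_codec_ctx_t codec;
    vpx_codec_enc_cfg_t cfg;

    vpx_codec_enc_config_default(vpx_codec_vp8_cx(), &cfg, 0);
    cfg.g_w = width;
    cfg.g_h = height;
    cfg.g_timebase.num = 1;             /* 30 fps timebase (assumption) */
    cfg.g_timebase.den = 30;
    vpx_codec_enc_init(&codec, vpx_codec_vp8_cx(), &cfg, 0);

    vpx_image_t img;
    vpx_img_alloc(&img, VPX_IMG_FMT_I420, width, height, 1);

    for (int i = 0; i < nframes; i++) {
        /* ...fill img.planes[VPX_PLANE_Y/U/V] with the converted frame... */

        /* Force a keyframe every 30 frames; the rest depend on it */
        vpx_enc_frame_flags_t flags = (i % 30 == 0) ? VPX_EFLAG_FORCE_KF : 0;
        vpx_codec_encode(&codec, &img, i, 1, flags, VPX_DL_GOOD_QUALITY);

        vpx_codec_iter_t iter = NULL;
        const vpx_codec_cx_pkt_t *pkt;
        while ((pkt = vpx_codec_get_cx_data(&codec, &iter)) != NULL) {
            if (pkt->kind == VPX_CODEC_CX_FRAME_PKT) {
                /* pkt->data.frame.buf / pkt->data.frame.sz hold one
                   compressed VP8 frame, ready to mux or stream */
            }
        }
    }

    vpx_img_free(&img);
    vpx_codec_destroy(&codec);
}
```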