I am trying to stream an audio file in mp3 format using the FFmpeg library to a remote computer located on the same LAN as the sender. The command I used to stream at the sender is given below:
ffmpeg -re -f mp3 -i sender.mp3 -ar 8000 -f mulaw -f rtp rtp://10.14.35.23:1234
I got the command below from the FFmpeg documentation page; it generates audio and streams it to port 1234 on a remote computer:
ffmpeg -re -f lavfi -i aevalsrc="sin(400*2*PI*t)" -ar 8000 -f mulaw -f rtp rtp://10.14.35.23:1234
I thought I had made the relevant changes so that the mp3 streaming command would work, only to encounter the error which reads
"Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height"
Can anyone tell me which parameter is wrong here and how to rectify it?
I figured out how to stream an audio file using FFmpeg. The command is given below:
ffmpeg -re -f mp3 -i sender.mp3 -acodec libmp3lame -ab 128k -ac 2 -ar 44100 -f rtp rtp://10.14.35.23:1234
Here the audio file 'sender.mp3' is located in the same folder as ffmpeg.exe. If it is in a different folder, give the full path in the command.
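A note on the receiving side: when ffmpeg streams to an rtp:// URL, it prints an SDP description of the session to the console (it can also be written out with the -sdp_file option). A common way to play the stream on the receiver, sketched here with stream.sdp as an illustrative file name, is to save that description to a file there and run:
ffplay -protocol_whitelist file,udp,rtp stream.sdp
(The -protocol_whitelist option is only needed on newer FFmpeg builds.)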
I'm working with the oneVPL samples from this GitHub repository (https://github.com/oneapi-src/oneAPI-samples) and I'm trying to build the hello-vpp sample. After running the program with the command in the readme.md file, I wanted to increase the video size to 1280x720. To play the raw output file, I used the command below:
ffplay -video_size 1280x720 rawvideo out.raw
My raw output file appeared damaged: the playback was garbled. How do I change the width and height of the output file? Any suggestions here?
Add the scale filter. Example assuming video.raw is 640x360:
ffplay -f rawvideo -video_size 640x360 -pixel_format rgba -vf scale=1280:720 video.raw
Try the below command:
ffplay -video_size 1280x720 -pixel_format bgra -f rawvideo out.raw
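Raw video has no header, so ffplay cannot discover the geometry or pixel format by itself; -video_size and -pixel_format must match exactly what hello-vpp wrote. Assuming bgra output as above, each 1280x720 frame is 1280 × 720 × 4 = 3,686,400 bytes; with mismatched values the frame boundaries land in the wrong places and playback looks corrupted.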
I'm writing an application which needs to capture the screen. I've looked for a solution, and the internet says that FFmpeg can do it, but I can't find a way to do it IN CODE. The FFmpeg documentation seems to be very poor.
Can anybody please tell me how I can access raw framebuffer data with FFmpeg?
FFmpeg supports input of raw frames through stdin.
With the -f rawvideo argument, ffmpeg will expect frames coming from stdin:
ffmpeg -r 60 -f rawvideo -pix_fmt uyvy422 -s 1280x720 -i - -threads 0 -preset fast -y -pix_fmt yuv420p output.mp4
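The same idea works from any language that can spawn a process and write to its stdin. Here is a minimal Python sketch of that technique (frame size, pixel format, and frame count are illustrative assumptions; match them to your capture source):
import subprocess

# Spawn ffmpeg and feed it raw frames via stdin ("-i -").
width, height = 1280, 720
cmd = ["ffmpeg", "-y",
       "-f", "rawvideo", "-pix_fmt", "uyvy422",
       "-s", "{}x{}".format(width, height), "-r", "60",
       "-i", "-",
       "-pix_fmt", "yuv420p", "output.mp4"]
proc = subprocess.Popen(cmd, stdin=subprocess.PIPE)
frame = bytes(bytearray(width * height * 2))  # uyvy422 is 2 bytes per pixel; a dummy all-zero frame
for _ in range(120):                          # two seconds' worth at 60 fps
    proc.stdin.write(frame)
proc.stdin.close()   # signals EOF so ffmpeg finalizes the file
proc.wait()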
In Qt, you would run ffmpeg in a QProcess, pass -f rawvideo, and write the frames to its stdin with the write() method.
This is roughly how to accomplish it:
QProcess* process = new QProcess(); // give it a parent QObject in real code
QStringList args;
args << "-r" << "60" << "-f" << "rawvideo" << "-pix_fmt" << "uyvy422"
     << "-s" << "1280x720" << "-i" << "-" << "-pix_fmt" << "yuv420p" << "output.mp4";
process->setProcessChannelMode(QProcess::ForwardedChannels); // set before start()
process->start("ffmpeg.exe", args, QProcess::Unbuffered | QProcess::ReadWrite);
process->waitForStarted();
...
void* buffer = nullptr;
videoFrame->GetBytes(&buffer); // e.g. a capture-card frame; any raw pixel buffer works
process->write(static_cast<const char*>(buffer), frameSizeInBytes); // frameSizeInBytes = width * height * bytes per pixel
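When the capture loop is done, close the write channel so ffmpeg can flush and finalize the file: process->closeWriteChannel(); followed by process->waitForFinished();.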
I am able to stream and receive a webcam feed in two terminals via UDP.
Command for streaming:
ffmpeg -i /dev/video0 -b 50k -r 20 -s 858x500 -f mpegts udp://127.0.0.1:2000
Command for receiving:
ffplay udp://127.0.0.1:2000
Now I have to use this received video stream as input in Python/OpenCV. How can I do that?
I will be doing this using RTP and RTSP as well.
But in the case of RTSP it is essential to start the receiving terminal first; if I do that, the port becomes busy and my program is not able to take the feed. How could this be resolved?
I am currently using OpenCV 2.4.13 and Python 2.7 on Ubuntu 14.04.
Use cv2.VideoCapture("udp://127.0.0.1:2000"). You will need to build OpenCV with FFmpeg support so that it works.
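For example, a minimal receiving-side sketch in Python (assuming the sender command from above is already running; the window handling is illustrative):
import cv2

# OpenCV opens the UDP stream through its FFmpeg backend.
cap = cv2.VideoCapture("udp://127.0.0.1:2000")
while True:
    ret, frame = cap.read()
    if not ret:   # stream ended or a frame could not be decoded
        break
    cv2.imshow("udp stream", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()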
I have a C++ program that converts a series of jpg images into a .mp4 video; the command I use is the following:
std::system("ffmpeg -threads auto -y -framerate 1.74659 -i /mnt/ev_ramdsk/1/%05d-capture.jpg -vcodec libx264 -preset ultrafast /mnt/ev_ramdsk/1/video.mp4");
which produces a .mp4 video file like it's supposed to, except it can't be played from anywhere (tested on 2 computers and in an HTML5 video element).
But if, from the same computer where the program runs, I do:
ffmpeg -threads auto -y -framerate 2 -i %05d-capture.jpg -vcodec libx264 -preset ultrafast video.mp4
from the command line, the output video plays wonderfully (except in VLC; for VLC I have to use -vcodec mpeg4).
What can possibly cause this behaviour?
Could the cp command corrupt the file? (It is run after ffmpeg, to move the video out of the ramfs.)
EDIT:
As requested, I ran the whole set of commands one by one in the console, exactly as the program does (the program logs every single command it runs; I just repeated them).
The commands are:
cp -r /var/cache/zoneminder/events/1/16/05/18/23/30/00/ /mnt/ev_ramdsk/1/
ffmpeg -threads auto -y -framerate 1.76729 -i /mnt/ev_ramdsk/1/%5d-capture.jpg -preset ultrafast /mnt/ev_ramdsk/1/video.mp4
cp /mnt/ev_ramdsk/1/video.mp4 /var/cache/evmanager/videos/1/2016_05_18_23_30_00_.mp4
The resulting .mp4 file can be played without any trouble. It is also the only one that shows a preview image in the file explorer.
Thank you very much!
Solved it!
This was the winning answer; I finally got it to work using:
std::system("ffmpeg -threads auto -y -r 1.74659 -i /mnt/ev_ramdsk/1/%05d-capture.jpg -pix_fmt yuv420p -preset ultrafast -r 10 /mnt/ev_ramdsk/1/video.mp4");
(Forcing -pix_fmt yuv420p is very likely what made the difference: from jpg input, x264 otherwise inherits a 4:4:4/full-range pixel format that many players cannot decode.)
Thank you very much!
So below is the command that I am running. It should be converting to mp3, but it still exports as a video in flv. What am I doing wrong?
$cmd = '/usr/local/bin/youtube-dl -o "%(title)s.%(ext)s" -x --audio-format mp3 -- ' . escapeshellarg($url);
youtube-dl will download the video before converting it. Most likely, you don't have ffprobe or ffmpeg installed. Make sure both programs are available (i.e. you get a sensible output for ffprobe --help and ffmpeg --help).
Alternatively, you can download an audio-only stream directly with youtube-dl.
For example, in an Ubuntu terminal: youtube-dl youtube.com/watch?v=qn6CMz18lkQ -f 141. Note that format 141 is a high-quality audio-only stream, but it is m4a (AAC), not mp3; to get an actual .mp3 you still need the -x --audio-format mp3 conversion, which requires ffmpeg.
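To see which format codes are actually available for a particular video, you can list them first with youtube-dl's -F (--list-formats) option:
youtube-dl -F youtube.com/watch?v=qn6CMz18lkQ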