Getting Video using Node.js on AR Drone v1.0 - node.js-stream

How can I get a video stream or PNG stream using Node.js on the AR Drone v1.0?
I have seen the code for v2.0, and even the GitHub maintainers say that those video-stream libraries do not work with v1.
Is there any other way I could get the PNG / video stream on the computer?
Thanks and regards

Good question. I have struggled with producing a PNG (picture) stream or a TCP (video) stream from my Parrot AR Drones through Node.js. I am going to assume you already have Node.js installed, that you have downloaded Felix Geisendörfer's Node.js package for AR Drones, and that you have been able to fly it. To get the PNG stream, make sure that you have 7zip installed (you'll need it), and download FFMPEG so that your computer can easily parse the drone's stream before outputting it to your browser. Use 7zip to extract the FFMPEG files after downloading. Create a Node.js file (.js) and copy and paste the following source code into it:
var arDrone = require('ar-drone');
var http = require('http');

console.log('Connecting png stream ...');

// The PNG stream emits one complete PNG image per 'data' event.
var pngStream = arDrone.createClient().getPngStream();

var lastPng;
pngStream
  .on('error', console.log)
  .on('data', function(pngBuffer) {
    lastPng = pngBuffer;
  });

// Serve the most recent frame on every HTTP request.
var server = http.createServer(function(req, res) {
  if (!lastPng) {
    res.writeHead(503);
    res.end('Did not receive any png data yet.');
    return;
  }
  res.writeHead(200, {
    'Content-Type': 'image/png'
  });
  res.end(lastPng);
});

server.listen(8080, function() {
  console.log('Serving latest png on port 8080 ...');
});
Save the file as something like 'video.js' (anything that ends in .js). Make sure the JavaScript file is set to run using Node.js (the software you installed to fly the drone). Then turn on your drone and connect to its Wi-Fi. Next, go through the FFMPEG files until you find a file called 'ff-prompt.bat'. In the prompt of that batch file, type or paste the file path of your video.js (or whatever you called it). Press Enter and it should start broadcasting to your localhost (127.0.0.1) on port 8080. So go to your browser, type "http://localhost:8080", and you should see an image from your drone. If you have questions, feel free to leave a comment and I'll address it as soon as possible. Thanks!
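If you would rather write the frames to disk than serve them over HTTP (for example, to stitch them into a video later with the FFMPEG build you extracted), a small variation on the same code works. This is only a sketch, and the file names are placeholders:

var arDrone = require('ar-drone');
var fs = require('fs');

console.log('Connecting png stream ...');

var pngStream = arDrone.createClient().getPngStream();
var frame = 0;

// Write every PNG the drone sends to its own numbered file.
pngStream
  .on('error', console.log)
  .on('data', function(pngBuffer) {
    fs.writeFile('frame-' + (frame++) + '.png', pngBuffer, function(err) {
      if (err) console.log(err);
    });
  });

The numbered frames (frame-0.png, frame-1.png, ...) can then be assembled into a video with FFMPEG's image-sequence input (something like ffmpeg -i frame-%d.png out.mp4), although the frame rate you actually get from the drone will vary.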

Related

7Zip CLI Compress file with LZMA

I am trying to compress a file from the console with LZMA.
7z a -t7z output input
or
7z a -t7z -m0=lzma output input
However, I cannot open it on the client.
How can I compress a file as an LZMA archive from the console?
It is possible that the problem is that the above commands add the file to an archive. However, I want to compress the data in a data file without any file structure.
Is there an option to compress a data file into a compressed data file with LZMA?
Edit
I see downvotes, which means the question is "not correct" in some way.
So I'll try to explain what I want to achieve.
I compress data server-side and use it in a client application. I do this successfully in Node.js like this:
const lzma = require('lzma');

lzma.compress(inputBuffer, 1, callback);

function callback(data, err) {
  // writefile: helper that writes the compressed buffer to outputPath
  writefile(outputPath, Buffer.from(data));
}
However, it is very slow, so I want to call 7Zip for the compression.
My .NET server also compresses it in a similar way:
byte[] barData;
using (var barStream = dukasDataHelper.SerializeLightBars(lightBars.ToArray()))
using (var zippedStream = zipLzma.Zip(barStream))
{
    barData = zippedStream.ToArray();
}
My problem is that I cannot set the correct CLI options so that the file can be read on the client.
My C# client code is:
using (var blobStream = new MemoryStream(blobBytes))
using (var barStream = new ZipLzma().Unzip(blobStream))
{
    SaveDataSet(barStream, localPath);
}
I get this error message when compressing via the CLI:
$exception {"Data Error"}
Data: {System.Collections.ListDictionaryInternal}
at SevenZipLzma.LZMA.Decoder.Code(Stream inStream, Stream outStream, Int64
inSize, Int64 outSize, ICodeProgress progress)
at SevenZipLzma.ZipLzma.Unzip(Stream stream)
Since the code works when I compress with Node.js and doesn't work when I compress via the CLI, something must be wrong.
7zip makes an archive of files and directories, whereas LZMA generates a single stream of compressed data. They are not the same format. LZMA can be used inside a 7zip archive to compress an entry (as can LZMA2, Deflate, and several other compression methods).
You can try the xz command to generate LZMA streams with xz --format=lzma.
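If the goal is simply to get fast LZMA compression out of the Node.js server, one option is to shell out to xz from Node.js instead of using the pure-JavaScript lzma module. This is only a sketch (file names are placeholders, and it has not been verified against the .NET ZipLzma reader):

const { execFile } = require('child_process');
const fs = require('fs');

function compressLzma(inputPath, outputPath, done) {
  // xz --format=lzma writes a raw .lzma stream rather than a 7z archive.
  // -z compresses, -c sends the result to stdout, -k keeps the input file.
  execFile('xz', ['--format=lzma', '-z', '-c', '-k', inputPath],
    { encoding: 'buffer', maxBuffer: 256 * 1024 * 1024 },
    function(err, stdout) {
      if (err) return done(err);
      fs.writeFile(outputPath, stdout, done);
    });
}

compressLzma('bars.dat', 'bars.dat.lzma', function(err) {
  if (err) console.error(err);
});

Since the "Data Error" above suggests the decoder did not get the stream layout it expected, it is worth comparing the output of xz against a file produced by the working Node.js path before relying on it.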

Convert Raw to Wav Streams in NodeJS

I am using a Node.js library, naudio (link), to record sound from 2 microphones (4 channels of audio in total, with each microphone being stereo). This library writes a .raw file with the following specs: 16-bit, 48000 Hz sample rate, channel count 4.
// var portAudio = require('../index.js');
var portAudio = require('naudiodon');
var fs = require('fs');

// Create a new instance of Audio Input, which is a ReadableStream
var ai = new portAudio.AudioInput({
  channelCount: 4,
  sampleFormat: portAudio.SampleFormat16Bit,
  sampleRate: 48000,
  deviceId: 13
});
ai.on('error', console.error);

// Create a write stream to write out to a raw audio file
var ws = fs.createWriteStream('rawAudio_final.raw');

// Start streaming
ai.pipe(ws);
ai.start();

process.once('SIGINT', ai.quit);
Instead of the .raw file, I am trying to produce two individual .wav files. Given the encoding and information above, what would be the best way to do so? I tried to dig around for easy ways to deinterleave the channels and get .wav files, but I seem to be hitting a wall.
The addon is a wrapper around a C++ library called portaudio which, according to its documentation, supports writing to a WAV file.
What you could do is extend the addon and bind a Node.js function to the underlying C++ function that writes to WAV.
This will give you good performance, if that is a concern.
If you want something easier, you could look for utilities that do the conversion and call them from within your script (for example via child_process), as shown below.
This looks similar to this question.
You may also take a look here to learn how to create a WAV file from JavaScript.
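For the "call a utility" route, here is a sketch that splits the 4-channel raw capture into two stereo WAV files with ffmpeg. It assumes ffmpeg is on the PATH, that the capture is interleaved signed 16-bit little-endian PCM (which the 16-bit sample format above suggests), and that channels 0/1 belong to the first microphone and channels 2/3 to the second:

const { execFile } = require('child_process');

const args = [
  // Describe the raw input: signed 16-bit little-endian, 48 kHz, 4 channels.
  '-f', 's16le', '-ar', '48000', '-ac', '4',
  '-i', 'rawAudio_final.raw',
  // Duplicate the input, then pick a channel pair for each microphone.
  '-filter_complex',
  '[0:a]asplit=2[a1][a2];' +
  '[a1]pan=stereo|c0=c0|c1=c1[mic1];' +
  '[a2]pan=stereo|c0=c2|c1=c3[mic2]',
  '-map', '[mic1]', 'mic1.wav',
  '-map', '[mic2]', 'mic2.wav'
];

execFile('ffmpeg', args, function(err, stdout, stderr) {
  if (err) return console.error(stderr || err);
  console.log('Wrote mic1.wav and mic2.wav');
});

If the microphones are interleaved differently, swap the c0..c3 assignments in the pan filters.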

How to extract audio from video using ffmpeg in C++?

I'm developing a C++ app that uses FFmpeg to play audio/video. Now I want to enhance the application to allow the users to extract audio from video. How can FFmpeg be used for this? I searched a lot but was not able to find a tutorial about it.
You need to:
Open the input context [ avformat_open_input ]
Get the stream information [ avformat_find_stream_info ]
Get the audio stream:
if (inputFormatContext->streams[index]->codec->codec_type == AVMEDIA_TYPE_AUDIO) {
    inputAudioStream = inputFormatContext->streams[index];
}
Read each packet (in a loop, until av_read_frame returns a negative value):
AVPacket packet;
int ret = av_read_frame(inputFormatContext, &packet);
if (ret == 0) {
    if (packet.stream_index == inputAudioStream->index) {
        // packet.data will have the encoded audio data.
    }
}
This seems like a simple scripting task... why do you want to use the heavy artillery (C/C++) to swat a fly?
I use AppleScript to build/run an ffmpeg command line via a Bash shell. The only reason I involve AppleScript is so I can invoke it as a droplet (i.e. drag-and-drop the file(s) onto the app and have it run without interaction).
I get that you're probably on Windows, meaning no AppleScript and no Bash. But surely something lighter than C can build/run an ffmpeg command line. It's really as simple as:
ffmpeg -i infile.mp4 -b 160k outfile.mp3
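For example, sticking with the Node.js already used elsewhere on this page, a few lines are enough to build and run that command line. This is only a sketch: the file names are placeholders, ffmpeg must be on the PATH, and -b:a is used as the modern spelling of the -b flag above:

const { execFile } = require('child_process');

// Same idea as the command above: extract the audio track as a 160 kbit/s MP3.
execFile('ffmpeg', ['-i', 'infile.mp4', '-b:a', '160k', 'outfile.mp3'],
  function(err, stdout, stderr) {
    if (err) return console.error(stderr || err);
    console.log('Wrote outfile.mp3');
  });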

Playing mp4 video using Phonon

I'm trying to write a very simple video player using Qt and Phonon, on Windows. My backend is phonon_ds94. First of all, here is the code that runs when I click on "Play":
if (!this->_files.empty()) {
    QString file = this->_files.front();
    this->_files.pop();

    Phonon::MediaSource _src(file);
    this->ui.videoPlayer->play(_src);
}
(Here, _files is a std::queue of the files to read.)
If I want to play an .avi or .wmv, everything works fine. My video plays; it's perfect.
But when I want to play a .mp4 file, nothing happens. The videoPlayer stays black.
I've searched the web and saw that there is a BackendCapabilities::availableMimeTypes, so I tried it to be sure that my backend is compatible with mp4 - it's in the list. Here is the list of available MIME types:
application/vnd.ms-wpl application/x-mplayer2 application/x-ms-wmd
application/x-ms-wmz audio/3gpp audio/3gpp2 audio/aiff audio/basic
audio/mid audio/midi audio/mp3 audio/mp4 audio/mpeg audio/mpegurl
audio/mpg audio/vnd.dlna.adts audio/wav audio/x-aiff audio/x-mid
audio/x-midi audio/x-mp3 audio/x-mpeg audio/x-mpegurl audio/x-mpg
audio/x-ms-wax audio/x-ms-wma audio/x-wav midi/mid unknown video/3gpp
video/3gpp2 video/avi video/mp4 video/mpeg video/mpg video/msvideo
video/quicktime video/vnd.dlna.mpeg-tts video/x-mpeg video/x-mpeg2a
video/x-ms-asf video/x-ms-asf-plugin video/x-ms-wm video/x-ms-wmv
video/x-ms-wmx video/x-ms-wvx video/x-msvideo vnd.ms.wmhtml
I've also connected the stateChanged signal of the mediaObject to a slot, and when I try to play my video, there is an error saying that the file format is not supported.
How can I get Phonon to support it? Should I install a codec pack, even though mp4 is in my list?
I recently had a similar problem, and after trying a number of codec packs, here is the one that worked:
K-Lite Mega Codec Pack
If you go into the advanced install, you can uncheck "Tools", "Program" (Windows Media Player Classic), and "Shell Extension", and later decline the free browser toolbars that come with it, so you end up with just the codecs.
Afterwards I have been able to play anything on Windows using the qmediaplayer example program included in the Demos folder of the QtSDK.

FTPClient in MFC :GetFile(Download) issue

I am using the CFtpConnection class to create my FTP client library with MFC.
I am using GetFile to download a file from the server.
My requirement is: if I am downloading a 100 MB video from the server, and 50-60 MB of it have been downloaded, and in the meantime I play that file, it should play up to the point that has been downloaded so far.
Is there a way I can do this - any additional parameters I need to pass, or something like that?
My FTP library's download method is as follows:
CFtpConnection* m_pConnect;

bool CFTPClient::Download(LPCTSTR pstrRemoteFile, LPCTSTR pstrLocalFile,
                          DWORD dwFlags)
{
    m_pConnect->GetFile(pstrRemoteFile, pstrLocalFile, dwFlags);
    return true;
}
And when calling it in my application, I do this:
CFTPClient m_objftpclient;
m_objftpclient.Download("MVI_2884_1.avi", "D:\\MVI_2884_1.avi", FTP_TRANSFER_TYPE_BINARY);
You can't do that easily, or perhaps at all. The GetFile method of CFtpConnection is blocking, which means it returns only when the file has been downloaded. So even if you run it on a thread, the only way you can monitor the download is to check the size of the file on disk.
If you're about to implement video streaming, you should go down a level and work at the socket level. If you really want to use CFtpConnection, you should use the OpenFile method, which returns a CInternetFile that can be read in chunks, allowing you to monitor the download and share the buffer into which the file is downloaded for playback.