I use GStreamer on Windows with VS 2010 and need a pairing of camera name to camera ID, like "Logitech ..." - "sdjvnskj" (or any other way to get an ID for a camera), where the ID plays the role that /dev/video0, /dev/video1, ... plays on Linux. I am using the approach ylatuya offered in "How to detect the device name for a capture device?", but it returns half of the string readable ("USB") and the other half as unreadable UTF rubbish like "\xe0Y\xd2". I have some Linux examples where a pipeline is then created like
GstElement *pipeline = gst_parse_launch(("dshowvideosrc device-name=" + gst_camera_capturer_enum_video_devices() + " name=source1 ! ffmpegcolorspace ! fakesink").c_str(), NULL);
but there they pass "/dev/video..." as device-name, and I don't know what to write in its place on Windows. Also, how do I append something from a GList to a gchar string? After that I just need to get the supported resolutions and framerates for each camera.
OK. In case it's needed by anybody except me: the Linux-style pair of ID ("/dev/video0") and a normal name is a bit difficult to get. In my case enumeration shows only "USB video device" for all cameras, which, as I understand it, is the ID and the symbolic name at the same time. Just put it into device-name= in
GstElement *pipeline = gst_parse_launch(...)
as above and you get the camera you need. The question that remains is: does Windows distinguish cameras by anything except their path/ID in DirectShow?
Also, I am working on a Russian OS with a Russian VS 2010, so the other question is how to print everything in Russian: in the string "USB video device" the "USB" part prints fine, while the rest, which is in Russian, comes out as a useless line of strange symbols.
When I print it to a file, however, it's fine.
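For the string-splicing half of the question, here is a minimal sketch in plain C++ (the helper name is mine; in real code the device name would come out of the enumeration GList, e.g. via g_list_nth_data()). Quoting the name matters, because DirectShow names like "USB video device" contain spaces:

```cpp
#include <string>

// Build a dshowvideosrc launch string for gst_parse_launch().  The device
// name is wrapped in quotes so names containing spaces (like
// "USB video device") survive parsing.  (Helper name is illustrative only.)
std::string buildLaunchString(const std::string &deviceName)
{
    return "dshowvideosrc device-name=\"" + deviceName +
           "\" name=source1 ! ffmpegcolorspace ! fakesink";
}
```

The resulting string can then be handed to gst_parse_launch() via .c_str().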
There are examples on the Qt website regarding the audio API, but frankly I don't really understand them at all.
What I was imagining is writing an array of values (bytes, integers, ...) into some audio buffer and having the sound card "play" them (actually run them through its DAC).
Pseudocode:
// Square wave?
const int values[] = {255,255,255,255, 0,0,0,0, 255,255,255,255 ...};
// Create an output that will buffer the values and feed them to the digital-to-analog converter
RawAudioOutput output(BIT_RATE_CONSTANT, ... some other parameters ...);
output.start();
output.writeBytes(values, sizeof(values));
Can I accomplish something like that? How would I go about it? I know I can model a square wave in Audacity (it doesn't sound nice), so I guess it's possible. How?
In Qt, if you want to write an array of values into an audio buffer, the class for that is QAudioOutput. The format of the array of values can vary; PCM should be supported on all platforms.
Qt ships with an example that demonstrates the usage of QAudioOutput, have a look at that. In the example, the Generator::generateData() function creates the array of values that are then later sent to the audio device.
Of course playing audio from an array of values is quite low-level. With QMediaPlayer, Qt also provides a high-level class to play sound files (.wav, .mp3), video files and even streams.
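As a concrete illustration of the low-level route, here is a minimal sketch of generating the samples themselves in plain C++ (the helper name makeSquareWave is mine; the Qt plumbing, i.e. wrapping the bytes in a QBuffer and handing them to QAudioOutput::start(), is left out so the block stays self-contained):

```cpp
#include <cstdint>
#include <vector>

// Generate one second of a square wave as signed 16-bit PCM samples,
// which matches a QAudioFormat configured for 16-bit signed integer PCM.
// (Helper name and parameters are illustrative, not a Qt API.)
std::vector<std::int16_t> makeSquareWave(int sampleRate, double frequency,
                                         std::int16_t amplitude)
{
    std::vector<std::int16_t> samples(sampleRate);
    const int period = static_cast<int>(sampleRate / frequency);
    for (int i = 0; i < sampleRate; ++i) {
        // High for the first half of each period, low for the second half.
        samples[i] = (i % period) < period / 2 ? amplitude : -amplitude;
    }
    return samples;
}
```

The resulting vector can be wrapped in a QByteArray/QBuffer and pushed to QAudioOutput; Qt's audiooutput example shows the surrounding plumbing, with Generator::generateData() playing the role of makeSquareWave here.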
I have created a simple waveform generator which is connected to an AUGraph. I have reused some sample code from Apple to set AudioStreamBasicDescription like this
void SetCanonical(UInt32 nChannels, bool interleaved)
// note: leaves sample rate untouched
{
    mFormatID = kAudioFormatLinearPCM;
    int sampleSize = SizeOf32(AudioSampleType);
    mFormatFlags = kAudioFormatFlagsCanonical;
    mBitsPerChannel = 8 * sampleSize;
    mChannelsPerFrame = nChannels;
    mFramesPerPacket = 1;
    if (interleaved)
        mBytesPerPacket = mBytesPerFrame = nChannels * sampleSize;
    else {
        mBytesPerPacket = mBytesPerFrame = sampleSize;
        mFormatFlags |= kAudioFormatFlagIsNonInterleaved;
    }
}
In my class I call this function
mClientFormat.SetCanonical(2, true);
mClientFormat.mSampleRate = kSampleRate;
where the sample rate is
#define kSampleRate 44100.0f;
The other settings are taken from the sample code as well:
// output unit
CAComponentDescription output_desc(kAudioUnitType_Output, kAudioUnitSubType_RemoteIO, kAudioUnitManufacturer_Apple);
// iPodEQ unit
CAComponentDescription eq_desc(kAudioUnitType_Effect, kAudioUnitSubType_AUiPodEQ, kAudioUnitManufacturer_Apple);
// multichannel mixer unit
CAComponentDescription mixer_desc(kAudioUnitType_Mixer, kAudioUnitSubType_MultiChannelMixer, kAudioUnitManufacturer_Apple);
Everything works fine, but the problem is that I am not getting stereo sound, and my callback function crashes (bad access) when I try to reach the second buffer:
Float32 *bufferLeft  = (Float32 *)ioData->mBuffers[0].mData;
Float32 *bufferRight = (Float32 *)ioData->mBuffers[1].mData;
// Generate the samples
for (UInt32 frame = 0; frame < inNumberFrames; frame++)
{
    switch (generator.soundType) {
        case 0: // Sine
            bufferLeft[frame] = sinf(thetaLeft) * amplitude;
            bufferRight[frame] = sinf(thetaRight) * amplitude;
            break;
So it seems I am getting mono instead of stereo. The pointer bufferRight is invalid, but I don't know why.
Any help will be appreciated.
I can see two possible errors. First, as @invalidname pointed out, recording in stereo probably isn't going to work on a mono device such as the iPhone. Well, it might work, but if it does, you're just going to get back dual-mono stereo streams anyway, so why bother? You might as well configure your stream to work in mono and spare yourself the CPU overhead.
The second problem is probably the source of your sound distortion. Your stream description format flags should be:
kAudioFormatFlagIsSignedInteger |
kAudioFormatFlagsNativeEndian |
kAudioFormatFlagIsPacked
Also, don't forget to set the mReserved field to 0. While its value is probably ignored, it doesn't hurt to set it to 0 explicitly, just to make sure.
Edit: Another more general tip for debugging audio on the iPhone -- if you are getting distortion, clipping, or other weird effects, grab the data payload from your phone and look at the recording in a wave editor. Being able to zoom down and look at the individual samples will give you a lot of clues about what's going wrong.
To do this, you need to open up the "Organizer" window, click on your phone, and then expand the little arrow next to your application (in the same place where you would normally uninstall it). Now you will see a little downward pointing arrow, and if you click it, Xcode will copy the data payload from your app to somewhere on your hard drive. If you are dumping your recordings to disk, you'll find the files extracted here.
I'm guessing the problem is that you're specifying an interleaved format, but then accessing the buffers as if they were non-interleaved in your IO callback. ioData->mBuffers[1] is invalid because all the data, both left and right channels, is interleaved in ioData->mBuffers[0].mData. Check ioData->mNumberBuffers. My guess is it is set to 1. Also, verify that ioData->mBuffers[0].mNumberChannels is set to 2, which would indicate interleaved data.
Also check out the Core Audio Public Utility classes to help with things like setting up formats. Makes it so much easier. Your code for setting up format could be reduced to one line, and you'd be more confident it is right (though to me your format looks set up correctly - if what you want is interleaved 16-bit int):
CAStreamBasicDescription myFormat(44100.0, 2, CAStreamBasicDescription::kPCMFormatInt16, true);
Apple used to package these classes up in the SDK that was installed with Xcode, but now you need to download them here: https://developer.apple.com/library/mac/samplecode/CoreAudioUtilityClasses/Introduction/Intro.html
Anyway, it looks like the easiest fix for you is to just change the format to non-interleaved. So in your code: mClientFormat.SetCanonical(2, false);
I have a question concerning Gstreamer and the path for the video (uri).
Indeed, in order to test my code, I used to set the path to my video directly in the C++ source code, like this:
data.pipeline = gst_parse_launch ("playbin2 uri=file:///D:/video", NULL);
But now, I am using a user interface (wxWidgets) in order to get the path to the video that the user wants to play. The path is now in a variable, m_txtVideoPath. And I don't know how I can launch the video using this variable, instead of D:/video.
Thanks in advance for your answer !
You have to construct the pipeline string with the user-defined filename, rather than hardcoding everything.
This is very basic string handling; you might want to consult a beginner's tutorial for your programming language of choice.
e.g.
std::string pipeline = "playbin2";
pipeline += " uri=file:///" + m_txtVideoPath; // note the three slashes before the drive letter, as in file:///D:/video
std::cout << "PIPELINE: " << pipeline << std::endl; // for debugging
data.pipeline = gst_parse_launch(pipeline.c_str(), NULL);
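One wrinkle on Windows is that paths often arrive with backslashes (and, from a wxWidgets text control, as a wxString that needs converting first, e.g. via mb_str()), while GStreamer expects a forward-slash file URI. A small helper can normalize this (the helper name is mine, and percent-encoding of spaces and other special characters is deliberately not handled here):

```cpp
#include <algorithm>
#include <string>

// Build a file:// URI from a Windows-style path, e.g.
// "D:\\videos\\clip.avi" -> "file:///D:/videos/clip.avi".
// (Illustrative sketch; characters needing percent-encoding are ignored.)
std::string pathToUri(std::string path)
{
    std::replace(path.begin(), path.end(), '\\', '/');
    return "file:///" + path;
}
```

The launch string then becomes "playbin2 uri=" + pathToUri(...), passed to gst_parse_launch() via .c_str().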
I would like to write a C++ program that can detect the presence of a USB expansion card, or an SD card reader, without anything necessarily being plugged into it. Is this possible? On Linux?
If you know the exact vendor ID and/or product ID you can search for the device with libusb (0.1 API), like this:
struct usb_bus *bus;
struct usb_device *dev;
for (bus = usb_get_busses(); bus; bus = bus->next)
    for (dev = bus->devices; dev; dev = dev->next)
        if ((dev->descriptor.idVendor == vendor) && (dev->descriptor.idProduct == product))
            return dev;
return NULL; /* not found */
libusb tutorial
Yes. You can get the idVendor and idProduct from a simple dmesg, and then search for them with a condition like the one shown above. If you want to dig deeper and you are on Linux, explore usb.h in <kernel_source>/drivers/usb/core.
There is a structure defined there: struct usb_device{}.
If you are keener still to explore, check out driver.c and hub.c for functions like announce_device(), which prints the USB device details you see in dmesg. Explore it for good! :)
I am using the Win32 API.
I really do not understand how to get the drive letter for the DevicePath of a USB stick; can you please explain it to me?
What I have is the SP_DEVICE_INTERFACE_DETAIL_DATA DevicePath; using this device path I get the VID and PID of the USB device.
My device path looks like this:
"\\?\usb#vid_1a8d&pid_1000#358094020874450#{a5dcbf10-6530-11d2-901f-00c04fb951ed}"
Is there any way to map a DRIVE LETTER to my DEVICE PATH? Please help me map the drive letter to the DevicePath.
Thanks for any help.
The link I provided in your other question gives you all the information you need to do this. In semi-pseudocode:
DiskDevice = CreateFile(DiskDevicePath);
DiskDeviceNumber = DeviceIoControl(DiskDevice, IOCTL_STORAGE_GET_DEVICE_NUMBER);
for each VolumeDevicePath in GetLogicalDriveStrings:
    VolumeDevice = CreateFile(VolumeDevicePath);
    VolumeDeviceNumber = DeviceIoControl(VolumeDevice, IOCTL_STORAGE_GET_DEVICE_NUMBER);
    if (VolumeDeviceNumber == DiskDeviceNumber)
        // the volume (e.g. "G:") corresponding to VolumeDevicePath resides on the
        // disk (e.g. "XYZ USB Storage Device") corresponding to DiskDevicePath
I'm not 100% sure (it's been a while), but I think that the Disk device (GUID_DEVINTERFACE_DISK) is a child of the USB device (GUID_DEVINTERFACE_USB_DEVICE). In any event, I think DiskDevicePath needs to be the path of the Disk device (not the USB device).
Take a look at this, maybe it'll help (I don't think there's an easy way to do it ...)
http://msdn.microsoft.com/en-us/library/cc542456(VS.85).aspx