Data format in EtherCAT: is it only process-oriented?

I want to know whether EtherCAT can be used in scenarios other than process management, like transferring video, for example.
Thanks!

EtherCAT is a communication protocol designed to transfer data at very high speed. At the moment it is used only for high-speed automation and other industrial applications; I have not found any evidence of it being used otherwise.

You can define your own format, the so-called Vendor-specific protocol over EtherCAT (VoE), or use File access over EtherCAT (FoE) in order to transfer individual video frames.
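As a rough sketch of the FoE route, here is what pushing one captured frame to a slave might look like with the open-source SOEM master library. The network interface name, slave index, file name, and frame size are hypothetical placeholders, and the exact SOEM API may differ between versions:

```cpp
// Hypothetical sketch: send one video frame to slave 1 as a file via FoE,
// assuming the SOEM master library. FoE is a mailbox protocol, so the stack
// fragments the buffer into mailbox-sized pieces (i.e. it is slow compared
// with cyclic process data).
#include <ethercat.h>   // SOEM master header (assumed install location)
#include <cstdio>
#include <vector>

int main()
{
    std::vector<char> frame(640 * 480 * 2); // one raw frame, filled elsewhere

    if (ec_init("eth0") <= 0) {             // "eth0" is a placeholder NIC
        std::fprintf(stderr, "cannot open EtherCAT interface\n");
        return 1;
    }
    if (ec_config_init(FALSE) <= 0) {
        std::fprintf(stderr, "no slaves found\n");
        return 1;
    }

    int rc = ec_FOEwrite(1, (char *)"frame.bin", 0,
                         (int)frame.size(), frame.data(), EC_TIMEOUTSTATE);
    std::printf("FoE write returned %d\n", rc);

    ec_close();
    return 0;
}
```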

Related

C++ socket programming, multicast with compression, any good libraries/wrappers?

I am beginning to get into socket programming. Currently I am transferring data between a server and clients using scp, which scales very poorly when dealing with streams of data (each new scp session seems to open a new TCP connection, and this really slows down the transfer).
I would like to transfer text to multiple clients; over the course of a day this text could reach a couple of gigabytes in size, so implementing some sort of compression is key.
Can anybody recommend some good libraries or wrappers that would simplify writing this code? The standard C++ sockets interface is quite cumbersome to work with. So far my only lead is Boost.Asio, but that doesn't seem to have compression capabilities. Any suggestions would be much appreciated.
For the compression part, you can use zlib. There are many C++ interfaces for zlib, or you can use it directly to compress and decompress messages.
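For illustration, a minimal round trip with zlib's one-shot helpers might look like this (the sample payload is made up):

```cpp
// Minimal zlib compress/uncompress round trip.
#include <zlib.h>
#include <cassert>
#include <iostream>
#include <string>
#include <vector>

int main()
{
    std::string msg(100000, 'a'); // highly compressible sample payload

    // compressBound() gives a safe upper bound on the compressed size.
    uLongf clen = compressBound(msg.size());
    std::vector<Bytef> cbuf(clen);
    int rc = compress(cbuf.data(), &clen,
                      reinterpret_cast<const Bytef *>(msg.data()), msg.size());
    assert(rc == Z_OK);

    std::vector<Bytef> dbuf(msg.size());
    uLongf dlen = dbuf.size();
    rc = uncompress(dbuf.data(), &dlen, cbuf.data(), clen);
    assert(rc == Z_OK && dlen == msg.size());

    std::cout << msg.size() << " bytes -> " << clen << " compressed\n";
    return 0;
}
```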
Try UDT.
UDT is a reliable UDP-based application-level data transport protocol for distributed data-intensive applications over wide-area high-speed networks. UDT uses UDP to transfer bulk data with its own reliability and congestion control mechanisms.
I don't really know whether compression is available, though.
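For what it's worth, a bare-bones UDT sender might look roughly like the sketch below, assuming the UDT4 C++ API from udt.h; the server address and port are placeholders, and since UDT does not compress, you would pair it with something like zlib as suggested above:

```cpp
// Hedged sketch: connect and send bulk data over UDT (UDT4 API assumed).
#include <udt.h>
#include <arpa/inet.h>
#include <cstring>

int main()
{
    UDT::startup();

    UDTSOCKET sock = UDT::socket(AF_INET, SOCK_STREAM, 0);

    sockaddr_in serv{};
    serv.sin_family = AF_INET;
    serv.sin_port = htons(9000);                       // placeholder port
    inet_pton(AF_INET, "192.168.0.1", &serv.sin_addr); // placeholder host

    if (UDT::connect(sock, (sockaddr *)&serv, sizeof(serv)) != UDT::ERROR) {
        const char *data = "bulk payload (compress it first with zlib)";
        UDT::send(sock, data, (int)std::strlen(data), 0);
    }

    UDT::close(sock);
    UDT::cleanup();
    return 0;
}
```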
Compression and multicast are two orthogonal issues. As previous posters have said, pick the compression library best suited to your data.
For multicast there are multiple options; OpenPGM and RSP are open source.
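To make the multicast half concrete, here is a minimal sender using plain BSD sockets; the group address and port are arbitrary examples, and there is no reliability layer here (that is what OpenPGM adds on top):

```cpp
// Sketch: send one datagram to an IPv4 multicast group.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstring>

int main()
{
    int fd = socket(AF_INET, SOCK_DGRAM, 0);

    // TTL 1 keeps the datagrams within the local network segment.
    unsigned char ttl = 1;
    setsockopt(fd, IPPROTO_IP, IP_MULTICAST_TTL, &ttl, sizeof(ttl));

    sockaddr_in group{};
    group.sin_family = AF_INET;
    group.sin_port = htons(30001);                      // example port
    inet_pton(AF_INET, "239.255.0.1", &group.sin_addr); // example group

    const char *msg = "compressed chunk would go here";
    sendto(fd, msg, std::strlen(msg), 0,
           reinterpret_cast<sockaddr *>(&group), sizeof(group));

    close(fd);
    return 0;
}
```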

What are the basics of networking for a 3D game in C++?

In a few weeks I'm thinking of helping a project finish a pretty major aspect of a C++ world with 3D characters: networking. I will help with the server's transfer of information to and from clients. I already know C++ well enough; I just need to know what, specifically, I should learn to do this, and resources from which I could find that information. Thanks :)
As RageD said, networking differs greatly between different types of games. An FPS server typically sends the complete game state to all clients at a regular rate (e.g. 60 Hz) over UDP. Other game types can use TCP (tuned a bit, e.g. with TCP_NODELAY and forcing immediate ACK packets) or reliable UDP (the RakNet library or others). The network protocol can grow really wide, so you'll need to think about how to make it easily extensible. I'd recommend you start from here: http://www.gamedev.net/community/forums/showfaq.asp?forum_id=15
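As a small illustration of the TCP tuning mentioned above, setting TCP_NODELAY with BSD sockets disables Nagle's algorithm so small game messages go out immediately instead of being batched:

```cpp
// Sketch: create a TCP socket tuned for low-latency game traffic.
#include <netinet/in.h>
#include <netinet/tcp.h>
#include <sys/socket.h>

int make_game_socket()
{
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    int one = 1;
    // Disable Nagle's algorithm: send small packets right away rather than
    // coalescing them, at the cost of more packets on the wire.
    setsockopt(fd, IPPROTO_TCP, TCP_NODELAY, &one, sizeof(one));
    return fd;
}
```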

How to get started implementing a video streaming server in c/c++?

In my project I need a dedicated server that dispatches streams to multiple clients.
More specifically, I have a callback function that gets called to gather the stream data, but no idea how to stream it over to other applications.
What's the best way to get started on this?
What type of video are you planning to stream?
There's an open-source library called liveMedia available at http://www.live555.com. This C++ library is available under the LGPL and implements the RTSP and RTP/RTCP protocols and payload formats for many different media types. There is a class called DeviceSource, IIRC, that facilitates getting data into the library. There is an active mailing list, and you should be able to find lots of information by searching the archives.
There are also a bunch of example test projects that illustrate how to stream MPEG, MP3, etc.
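To give a feel for that, here is a hedged sketch of a FramedSource subclass in the spirit of the DeviceSource template; getFrameFromCamera() is a hypothetical stand-in for your own capture callback, while the member variables and afterGetting() call follow live555's FramedSource interface:

```cpp
// Sketch: feed your own frames into liveMedia by subclassing FramedSource
// (cf. the DeviceSource example shipped with live555).
#include "FramedSource.hh"
#include <sys/time.h>
#include <cstring>

class CameraSource : public FramedSource {
public:
    static CameraSource *createNew(UsageEnvironment &env) {
        return new CameraSource(env);
    }

protected:
    CameraSource(UsageEnvironment &env) : FramedSource(env) {}

private:
    virtual void doGetNextFrame() {
        unsigned char buf[100000];
        unsigned size = getFrameFromCamera(buf, sizeof buf);

        // Deliver at most fMaxSize bytes into the buffer live555 gave us.
        fFrameSize = (size > fMaxSize) ? fMaxSize : size;
        fNumTruncatedBytes = size - fFrameSize;
        std::memcpy(fTo, buf, fFrameSize);
        gettimeofday(&fPresentationTime, NULL);

        // Tell the downstream object that a frame is ready.
        FramedSource::afterGetting(this);
    }

    // Hypothetical capture callback; here it just returns a dummy frame.
    unsigned getFrameFromCamera(unsigned char *dst, unsigned cap) {
        unsigned n = cap < 1024 ? cap : 1024;
        std::memset(dst, 0, n);
        return n;
    }
};
```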
Should you choose to use standardized protocols, you might want to read up on RTP and RTSP.
I think you should look into communication through network sockets.
There is no networking concept in standard C++, so you have to rely on your system API or on libraries (such as Boost.Asio, for instance).
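As a minimal Boost.Asio example, connecting to a server and writing a few bytes looks like this; the host name and port are placeholders:

```cpp
// Sketch: synchronous TCP client with Boost.Asio (Boost >= 1.66 API).
#include <boost/asio.hpp>
#include <iostream>
#include <string>

int main()
{
    try {
        boost::asio::io_context io;
        boost::asio::ip::tcp::resolver resolver(io);
        auto endpoints = resolver.resolve("localhost", "5000"); // placeholders

        boost::asio::ip::tcp::socket socket(io);
        boost::asio::connect(socket, endpoints);

        std::string msg = "hello stream server\n";
        boost::asio::write(socket, boost::asio::buffer(msg));
    } catch (const std::exception &e) {
        std::cerr << e.what() << '\n';
    }
    return 0;
}
```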

What does "loop" mean in "loop device"?

This may be a linguistic question. I have checked the loop device article on Wikipedia; it is just for mounting files as block devices. But what does "loop" mean here? Its usage here is totally bizarre to me, and I am not a native English speaker, so could someone explain this jargon to me in plain English? :)
Thanks.
It's short for "loopback".
The concept is also known as a disk image. I guess the name arose because driver calls to the image get passed along to the underlying driver of the physical disk. There is no actual loop involved; it is an additional level in a driver stack that is already several layers deep.
(I've written an encrypted disk image driver, and I find the "loopback" terminology incorrect and confusing.)
I guess this term comes from the communications realm, where it is sometimes necessary to test a communication system by simulating a peer with a proxy circuit loop.
The concept also made its way into UNIX networking, where loopback network interfaces do not send traffic out to the medium.
Applied to file systems, "loop" means that the file system driver does not really go through the hard disk I/O stack and instead ends up using a plain disk image file for I/O.

streaming video to and from multiple sources

I wanted to get some ideas on how some of you would approach this problem.
I've got a robot that runs Linux and uses a webcam (with a v4l2 driver) as one of its sensors. I've written a control panel with gtkmm. Both the server and the client are written in C++; the server is the robot, the client is the "control panel". The image analysis happens on the robot, and I'd like to stream the video from the camera back to the control panel for two reasons:
A) for fun
B) to overlay image analysis results
So my question is: what are some good ways to stream video from the webcam to the control panel while also giving priority to the robot code that processes it? I'm not interested in writing my own video compression scheme and pushing it through the existing network port; a new network port dedicated to video data would be best, I think. The second part of the problem is how to display video in gtkmm. The video data arrives asynchronously, and I don't have control over main() in gtkmm, so I think that could be tricky.
I'm open to using things like VLC, GStreamer, or any other general compression libraries I don't know about.
thanks!
EDIT:
The robot has a 1 GHz processor and runs a desktop-like version of Linux, but no X11.
GStreamer solves nearly all of this for you with very little effort, and it also integrates nicely with the GLib event system. GStreamer includes V4L source plugins, GTK+ output widgets, various filters to resize/encode/decode the video, and, best of all, network sinks and sources to move the data between machines.
For prototyping, you can use the 'gst-launch' tool to assemble video pipelines and test them; it's then fairly simple to create the pipelines programmatically in your code. Search for 'GStreamer network streaming' to see examples of people doing this with webcams and the like.
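To sketch the programmatic side, the robot end of such a pipeline could be built with gst_parse_launch(); this assumes GStreamer 1.x with the v4l2, JPEG and RTP plugins installed, and the host, port and element choices are examples rather than a recommendation:

```cpp
// Sketch: webcam -> JPEG -> RTP -> UDP, built from a pipeline description.
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    // Robot side: grab from the webcam, JPEG-encode, packetize as RTP,
    // and push over UDP to the control panel (address/port are examples).
    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "v4l2src ! videoconvert ! jpegenc ! rtpjpegpay "
        "! udpsink host=192.168.0.10 port=5000", &err);
    if (!pipeline) {
        g_printerr("parse error: %s\n", err->message);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Run until interrupted; in the real control program this would be
    // integrated with the existing GLib main loop instead.
    GMainLoop *loop = g_main_loop_new(NULL, FALSE);
    g_main_loop_run(loop);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}
```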
I'm not sure about the actual technologies used, but this can end up being a huge synchronization headache if you want to avoid dropped frames. I was streaming video to a file and to the network at the same time. What I eventually ended up doing was using a big circular buffer with three pointers: one write and two read. There were three control threads (and some additional encoding threads): one writing to the buffer, which would pause if it reached a point in the buffer not yet read by both of the others, and two reader threads that would read from the buffer and write to the file/network (and pause if they got ahead of the producer). Since everything was written and read as frames, synchronization overhead could be kept to a minimum.
My producer was a transcoder (reading from another file source), but in your case you may want the camera to produce whole frames in whatever format it normally does and only do the transcoding (with something like ffmpeg) for the server, while the robot processes the image.
Your problem is a bit more complex, though, since the robot needs real-time feedback and so can't pause and wait for the streaming server to catch up. So you might want to get frames to the control system as fast as possible and buffer some of them in a separate circular buffer for streaming to the "control panel". Certain codecs handle dropped frames better than others, so if the network gets behind you can start overwriting frames at the end of the buffer (taking care that they're not being read).
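A toy version of that one-writer/two-reader buffer, reduced to its essentials, might look like this; locking is simplified to a single mutex, where a real implementation would use condition variables so the threads can sleep instead of polling:

```cpp
// Sketch: fixed ring of frames with one write pointer and two read pointers.
#include <array>
#include <mutex>
#include <optional>
#include <string>

struct FrameRing {
    static const size_t N = 64;
    std::array<std::string, N> frames; // one slot per encoded frame
    size_t wr = 0;                     // write pointer
    size_t rd[2] = {0, 0};             // read pointers: 0 = file, 1 = network
    std::mutex m;

    // Writer pauses (returns false) if it would overwrite an unread slot.
    bool push(std::string f) {
        std::lock_guard<std::mutex> g(m);
        size_t next = (wr + 1) % N;
        if (next == rd[0] || next == rd[1]) return false; // a reader is behind
        frames[wr] = std::move(f);
        wr = next;
        return true;
    }

    // Reader i pauses (gets nullopt) when it has caught up with the writer.
    std::optional<std::string> pop(int i) {
        std::lock_guard<std::mutex> g(m);
        if (rd[i] == wr) return std::nullopt; // caught up with the producer
        std::string f = frames[rd[i]];
        rd[i] = (rd[i] + 1) % N;
        return f;
    }
};
```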
When you say 'a new video port' and then start talking about vlc/gstreamer, I find it hard to work out what you want. Obviously these software packages will assist in streaming and compressing via a number of protocols, but clearly you'll need a 'network port', not a 'video port', to send the stream.
If what you really mean is sending display output via a wireless video/TV feed, that's another matter; however, you'll need advice from hardware experts rather than software experts on that.
Moving on: I've done plenty of streaming over MMS/UDP protocols, and VLC handles it very well (as both server and client). However, it's designed for desktops and may not be as lightweight as you want. Something like GStreamer, MEncoder, or FFmpeg, on the other hand, is going to be better, I think. What kind of CPU does the robot have? You'll need a bit of grunt if you're planning real-time compression.
On the client side, I think you'll find a number of widgets to handle video in GTK. I would look into that before worrying about interface details.
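Regarding the "video data arrives asynchronously" part of the question, the usual gtkmm answer is Glib::Dispatcher, which lets a worker thread safely poke the GUI main loop. The sketch below assumes gtkmm 3 and replaces actual frame drawing with a label update for brevity:

```cpp
// Sketch: a network thread signals the gtkmm main loop via Glib::Dispatcher.
#include <gtkmm.h>
#include <chrono>
#include <string>
#include <thread>

Glib::Dispatcher *g_frame_ready = nullptr; // emitted by the worker thread

void network_thread()
{
    while (true) {
        // ... receive/decode one frame here (omitted) ...
        std::this_thread::sleep_for(std::chrono::milliseconds(33)); // ~30 fps
        g_frame_ready->emit(); // the only gtkmm-side call safe from this thread
    }
}

int main(int argc, char *argv[])
{
    auto app = Gtk::Application::create(argc, argv, "org.example.videopanel");
    Gtk::Window win;
    Gtk::Label label("waiting for video...");
    win.add(label);
    label.show();

    Glib::Dispatcher frame_ready; // must be created in the GUI thread
    g_frame_ready = &frame_ready;

    int frames = 0;
    frame_ready.connect([&] { // runs in the GUI thread
        label.set_text("frames received: " + std::to_string(++frames));
    });

    std::thread(network_thread).detach();
    return app->run(win);
}
```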