I have an iPhone app that lets users broadcast their camera to other users.
I have a self-hosted Wowza Streaming Engine server.
I'm using the GoCoder SDK and have a couple of issues, or maybe things that I did not understand:
1. I configure the connection parameters with a newly generated StreamName, and my expectation is that the SDK will create the stream file in Wowza's Stream Files, but it does not. It reports that everything is OK, but when I go to Wowza's Stream Files, nothing is there.
2. If I manually create the stream file with the stream URI in advance, then I can see the phone camera streaming. But I just don't get it: how can I manage different users streaming if I need to create the stream files in advance?
3. I was able to connect the stream file I created in advance, with the stream URI "wowz://host:port", to the live application, but not to the WebRTC application, which I need in order to present the stream on my website. Is there any limitation here?
I tried hard to find an explanation in the Wowza docs of how things should work, but with no success :-(
I have a WebRTC (native C++ -> node.js) app which allows screen sharing. I've tried to turn it into a Windows service for easy permission elevation as well as simple auto-start. I managed to rewrite it as a service, but to simplify my example I'll skip that. Let's just say I've used the nssm service wrapper to run my app as a service.
Whether I run the service as Local System or as my own user, when it gets to the point where I request the system for video sources, WebRTC returns nothing.
#include <memory>
#include "modules/desktop_capture/desktop_capturer.h"

std::unique_ptr<webrtc::DesktopCapturer> captureScreen = webrtc::DesktopCapturer::CreateScreenCapturer(options);
webrtc::DesktopCapturer::SourceList sources; // std::vector of DesktopCapturer::Source, not a raw pointer
captureScreen->GetSourceList(&sources); // Comes back with no sources when running as a service!
Exception thrown: read access violation.
sources._Mypair._Myval2._Myfirst was nullptr.
When capturing the camera instead of the screen, everything works fine, according to this guy and the only relevant thread I could find on the whole internet.
Is there a bug in WebRTC/Win10? Is there some kind of workaround I can use to make this work?
If you are making a web-based WebRTC app, then I think everything should work fine. I don't think there is a bug in either WebRTC or Windows for screen sharing. A few things to check (see the sketch after the links):
- Make sure you are using Google Chrome for development. (Just advice.)
- Make sure your web app has permission to capture windows.
- If you are using Google Chrome, make sure you are running Chrome with the command-line flag "--enable-usermedia-screen-capturing", e.g. on Windows: Chrome.exe --enable-usermedia-screen-capturing
Screen-sharing web app using WebRTC - https://www.webrtc-experiment.com/Pluginfree-Screen-Sharing/
Check out WebRTC projects on GitHub - https://github.com/muaz-khan/WebRTC-Experiment
:)
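For what it's worth, current browsers expose screen capture to web apps through navigator.mediaDevices.getDisplayMedia, without the legacy Chrome flag. A minimal sketch, assuming the RTCPeerConnection and its signaling already exist elsewhere in your app:

// Ask the browser for a screen/window/tab to share; the user picks the source.
async function startScreenShare(pc: RTCPeerConnection): Promise<void> {
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true });
  // Forward each captured track to the remote peer over the existing connection.
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream);
  }
}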
I want to create a server on one device, and the changes I make on a certain website should be visible on other devices in real time. I don't want to cast the entire screen, just the website.
Can anyone help me with that?
If you really want to use WebRTC to send something that appears in a web browser to another web client, you should look at Canvas to peer connection (a rough sketch follows below).
Anyway, if various clients just need to be informed about an event via the web, I suggest you look at Your First Progressive Web App.
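To illustrate the canvas-to-peer-connection idea: render the part of the page you care about into a canvas, capture that canvas as a MediaStream, and attach it to an RTCPeerConnection. A minimal sketch (the element ID is a placeholder, and signaling is omitted):

// Capture a canvas as a 30 fps video stream and send it to a peer.
const canvas = document.getElementById('site-view') as HTMLCanvasElement; // placeholder ID
const stream = canvas.captureStream(30);
const pc = new RTCPeerConnection();
for (const track of stream.getTracks()) {
  pc.addTrack(track, stream);
}
// The offer/answer exchange (signaling) is app-specific and omitted here.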
I would like to know if it is currently possible to access an AME SDK or API.
I have seen binaries named:
- ame_webservice_console.exe
- AMEWebService.exe (it's a Windows service for media transcoding)
But there isn't any documentation about them. Is this implemented, or on the roadmap for developers?
Let me explain the goal: I would like to automatically schedule a transcode from an external project through the AME web service, without importing the file manually via the menu. That would be really interesting for setting up an external, centralized render-farm cluster.
You can find AMEWinService.exe and AMEWebService.exe inside Adobe Media Encoder CC (inside Program Files).
The things you have to do:
1. Start the AME Windows service with the AMEWinService -install command on your command line (run as administrator).
2. Start the AME web service in the same way.
3. Now you can use the service at localhost:8080; if you want to change this configuration, you can do it in the ame_webservice_config.ini file.
4. For client-side application development you can refer to the document below (a request sketch follows after the link).
https://github.com/sp00x/node-adobe-media-encoder-webservice/blob/master/dist/ame-webservice-client.d.ts
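As a starting point, here is a minimal sketch of talking to the web service over plain HTTP once it is running. The /server status path below is an assumption for illustration, not a documented endpoint; the actual routes and payloads are defined in the linked ame-webservice-client.d.ts:

// Probe the AME web service running on the default port.
// NOTE: the '/server' path is assumed, not documented; check the linked
// type definitions for the real endpoints and request bodies.
async function checkAmeService(): Promise<void> {
  const response = await fetch('http://localhost:8080/server');
  console.log(response.status, await response.text());
}

checkAmeService().catch(console.error);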
Has anyone had any luck using ColdFusion to collect data via streaming APIs?
e.g. https://dev.twitter.com/docs/streaming-api
I know the best option is an app that literally sits on the server monitoring these portals. Just curious whether anyone has done anything with CF yet.
Aaron Longnion built refynr.com using CF9. It's a service that collects users' Twitter streams based on supplied criteria. I imagine he's done something like you describe.
However, I'd look into the new WebSocket functionality built into ColdFusion 10 and see if that makes consuming streaming APIs any easier.
http://labs.adobe.com/technologies/coldfusion10/
If you know a bit of Java, it may not be too difficult to use Twitter4J and build an event gateway for your CF app to consume the stream.
If you want to go the WebSocket route, see: Twitter + HTML5 webSocket API
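Whichever route you take, consuming a streaming API mostly comes down to holding a long-lived HTTP connection open and handling each newline-delimited message as it arrives. A language-neutral sketch of that loop (shown in TypeScript rather than the Java/Twitter4J route the answer suggests; the endpoint URL is a placeholder):

// Read a long-lived streaming HTTP response chunk by chunk,
// invoking the callback once per complete newline-delimited message.
async function consumeStream(url: string, onMessage: (line: string) => void): Promise<void> {
  const response = await fetch(url);
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split('\n');
    buffer = lines.pop()!; // keep any trailing partial line for the next chunk
    for (const line of lines) {
      if (line.trim()) onMessage(line);
    }
  }
}

// Usage with a placeholder endpoint:
consumeStream('https://example.com/stream', (msg) => console.log(msg));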
I've been running a process for a client that involves grabbing their publicly available calendar file from me.com by making an HTTPS GET call in a Ruby script, converting the data in the .ics file to HTML, and then copying it to their website.
They recently upgraded to Lion and iCloud, and it appears that, while the calendar I want is still publicly available, it's only usable by webcal-enabled apps; I can no longer get it over HTTPS.
I've poked around a bit on Google but haven't seen anything that points me in the right direction yet. Does anyone know if there's a way to access public calendars on iCloud via HTTP/HTTPS? Or is it strictly via webcal? The documentation does make it sound like iCloud is designed to only share data among Apple devices. Am I just stuck here?
I'm surprised no one has answered this yet...
If you go to iCloud.com, you should be able to get the URL of the calendar that you are syncing via a public share. It will use the webcal protocol (webcal://). However, if you change that webcal to https, it will download the ics format instead of trying to sync using the Mac's iCal.
I have my website linking to the https .ics file, and it appears to be working just fine (for now at least).
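In script form, that answer is just a scheme swap before the download. A minimal sketch (the share URL below is a placeholder, not a real one):

// Turn the public webcal:// share link into a plain https:// URL
// and download the raw iCalendar (.ics) data.
async function fetchPublicCalendar(webcalUrl: string): Promise<string> {
  const httpsUrl = webcalUrl.replace(/^webcal:/, 'https:');
  const response = await fetch(httpsUrl);
  return response.text();
}

// Placeholder share URL for illustration only:
fetchPublicCalendar('webcal://example.icloud.com/published/calendar.ics')
  .then((ics) => console.log(ics));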
iCloud is going to supply an API for developers soon; at least, that's what they said in the last keynote.