How to set up multiple mount points and two separate playlists on a single Icecast server? - icecast

Below is my sample Icecast configuration. Can you suggest how to create multiple mount points? I want a separate playlist for each mount point. Is that possible?
IP=1.2.4.5
PORT=8000
SERVER=2
MOUNT=/radiostation2
PASSWORD=password
FORMAT=1
MP3PATH=m3u:/usr/local/etc/playlist2.m3u
LOOP=1
SHUFFLE=1
NAME=RadioStation 2: MP3
DESCRIPTION=Test Radio
GENRE=Varios
URL=http://localhost:8000/
LOG=2
LOGPATH=/var/log/icecast/playlist1.log
BITRATE=48000
SOURCE=source
Thanks,
Raja K

It looks like you are mixing server options and source-client options (the playlist) in one configuration.
You need a separate application to supply Icecast with music, such as ices or mpd. This tutorial discusses setting up Icecast as well as setting up an application to play music (a source): https://www.howtoforge.com/how-to-install-a-streaming-audio-server-with-icecast-2.3.3-on-centos-6.3-x86_64-linux
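As a minimal sketch of the usual arrangement (mount names and paths below are illustrative guesses, not taken from your setup): Icecast creates a mount point automatically whenever a source client connects to it, so you simply run one source-client instance per mount, each pointed at its own playlist. Mirroring your key=value format, the two instances would differ only in MOUNT, MP3PATH, and LOGPATH:
Instance 1 (illustrative):
MOUNT=/radiostation1
MP3PATH=m3u:/usr/local/etc/playlist1.m3u
LOGPATH=/var/log/icecast/playlist1.log
Instance 2 (illustrative):
MOUNT=/radiostation2
MP3PATH=m3u:/usr/local/etc/playlist2.m3u
LOGPATH=/var/log/icecast/playlist2.log
Each instance then streams its own playlist to its own mount (http://yourserver:8000/radiostation1 and http://yourserver:8000/radiostation2) on the same Icecast server.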


Sync postgres database via git

We are three students working on a Django project with a PostgreSQL database, and we sync our project with each other via a git repository (GitLab). We have different OSes (Windows 10 and Ubuntu 20.04) and use VS Code as our IDE.
How can we sync our entire database (the data itself) via git, the way you can with SQLite?
Is there any way to handle it by somehow converting our DB to a file?
It's very complicated to sync three local databases. The best approach is to host your database on a cloud platform and have all three of you connect to it.
There are cloud platforms with a free tier you can use for a year, such as Amazon Web Services and Google Cloud Platform. You just need an active debit/credit card; they won't charge you, except that Amazon deducts one dollar for account verification.
I would definitely go with this answer.
If you do not want to do that, use pg_dump to export the data from the database as a text file and sync that file using git. But you might still get a lot of merge conflicts.
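A minimal sketch of that workflow, assuming a database named mydb and a user myuser (both names are illustrative):
pg_dump --no-owner --column-inserts -U myuser mydb > db_dump.sql
git add db_dump.sql
git commit -m "update database dump"
# teammates pull, then restore into a clean local database:
dropdb mydb && createdb mydb
psql -U myuser -d mydb -f db_dump.sql
--column-inserts writes plain INSERT statements, which diff somewhat more predictably under git than the default COPY format, but large dumps will still conflict often.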
I use https://www.elephantsql.com/ for it and it works! Thank you.

Perforce (AWS Lightsail Windows Server Instance) - Unreal Engine Source Control - Move Perforce Depot

I'll give a bit of background on the setup we have and why. Currently a friend and I want to collaborate on an Unreal Engine project. To do this I've set up an Amazon Lightsail instance running Windows Server. I've then installed Perforce onto this server and added two users. Both of us are able to connect to this server from our local machines (great, I thought!). Our goal was to attach two 'virtual' disks of 32 GB to this server via Lightsail's storage option. I've formatted these disks and they are detected as disks D and E on the server. We wanted two depots, one on disk E and one on disk D, the reason being that the C disk is only 20 GB (12 GB free after Windows).
I have tried multiple things (not much hair left after this) to map the created depots to each disk, but have had little success and need your wisdom!
I've followed the process indicated in this support guide (https://community.perforce.com/s/article/2559) via CMD, and I've also changed the depot storage location in P4Admin on the server (via RDP) to the virtual disks D and E respectively.
An example change is from //UE_WIP/... to D:/UE_WIP/... (I have created folders UE_WIP and UE_LIVE on each disk).
When I open P4V on my local machine I'm able to connect happily (as per the screenshot) and set up my workspace on my local machine (it detects both depots). This is where we're getting stuck. I then create a new Unreal Engine project, save it to the local directory E:/DELETE/Perforce/Test/, and open up source control (see image 04). This is great: it detects the workspace and everything connects to the server.
When I click submit to source control I get 'Failed Checking Source Control'. When I instead try adding the new content folder manually in P4V by marking it for add, I get 'file(s) not in client view'.
All we want is the ability to send an Unreal Engine project up to either the WIP depot or the LIVE depot. To resolve this, do we need:
Two different workspaces (one set up for LIVE and one for WIP)?
To add some local folders to our directory, e.g. E:/DELETE/Perforce/UE_WIP & E:/DELETE/Perforce/UE_LIVE?
To tweak something on the Perforce server?
To tweak something in Unreal Engine?
Any and all help would be massively appreciated.
Best,
Ben
https://imgur.com/a/aaMPTvI - Image gallery of issues
Your screenshots don't show how (or if?) you set up your local workspace (i.e. the thing that tells Perforce where the files are on your local workstation).
See: https://www.perforce.com/perforce/r13.1/manuals/p4v/Defining_a_client_view.html
The Perforce server acts as a layer of abstraction between the backend storage (i.e. the depots you've set up) and the client machines where you actually do your work. The location of the depot files doesn't matter at all to the client (any more than, say, the backend filesystem of a web server matters to your web browser); all that matters is how you set up the workspace, which is a simple matter of "here's where my local files are" (the Root) and "here's how my local paths map to depot paths" (the View).
You get the "file not in view" error if you try to add a local file to the depot and it's not in the View you've defined. The fix there is generally to simply fix the Root and/or View to accurately describe where your local files are. One View can easily map to multiple depots (as long as they're on a single server).
(edit)
Specifically, in your case, all of the files you're trying to add are under the path:
E:\DELETE\Perforce\Test\Saved\...
Since you've set up your workspace as:
Client: bsmith
Root: E:\DELETE\Perforce\bsmith
View:
//WIP/... //bsmith/WIP/...
//LIVE/... //bsmith/LIVE/...
then your bsmith workspace consists of these two local paths:
E:\DELETE\Perforce\bsmith\WIP\...
E:\DELETE\Perforce\bsmith\LIVE\...
The files you're trying to add aren't even under your Root, much less under either of the View mappings. That's what the "not in client view" error messages mean.
If you want to add the files where they are, modify your Root and View so that your workspace is defined as being where your files are. If instead you want the files in one of the local directories above that you've already defined as your workspace, you'll have to move them there. If you put your files in bsmith\WIP, then when you add them they'll go to the WIP depot; if you put them in bsmith\LIVE, they'll go to the LIVE depot, per your View.
Either way, once they're in your workspace, you can add them to the depot. Simple as that!
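For example, to keep the files where they are now, a workspace spec along these lines would do it (an illustrative sketch; the exact mapping depends on how you want depot paths to line up):
Client: bsmith
Root: E:\DELETE\Perforce
View:
//WIP/... //bsmith/Test/...
//LIVE/... //bsmith/LIVE/...
With that View, anything under E:\DELETE\Perforce\Test\... maps into the WIP depot and anything under E:\DELETE\Perforce\LIVE\... maps into the LIVE depot, so a single workspace reaches both.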

Can AWS host a HLS link that can change at a whim?

Hi everyone,
Situation: I want AWS to host an HLS link which I can change at a whim (so, not hard-coded) to tell devices like the Roku, Fire TV, Alexa, and Apple TV where the HLS stream is. Currently my programs tell devices to go to Ooyala, and Ooyala then tells the device where to get the HLS link.
So, I want to cut out Ooyala and just use AWS to tell devices where to get the HLS link (at the CDN).
Problem: Does anyone know if this is possible, or is there another solution? If so, what do I need and what should I research? I was thinking something along the lines of writing a script and a static IP.
If you have an idea, please outline a few steps for me so I can get an idea of the possibilities!
Thank you,
Jackson
I've built many apps on the devices you've listed. Some clients have decided to use an external config (JSON or XML) that each of the apps loads on startup. Ultimately it is the equivalent of an API response. You can employ some cache busting during the request for the file to make sure you always get the latest version. You can choose to host that file anywhere you want...
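As a minimal sketch of that approach (every name and URL below is an illustrative assumption): host a small JSON file in an S3 bucket, optionally fronted by CloudFront, and have each app fetch it on startup:
{
  "hlsUrl": "https://cdn.example.com/live/channel1/master.m3u8"
}
The apps request something like https://my-config-bucket.s3.amazonaws.com/channel.json?cb=1528000000, where the cb value is a timestamp or random number used for cache busting, and then play whatever hlsUrl currently points at. Changing the stream for every device then comes down to overwriting that one file.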

How to use SaltStack to send files from a VM datastore to a minion in that datastore

I use SaltStack and the vSphere Web Client to manage a vCenter server. Many virtual machines run on it, and I use one VM as the salt-master.
There are some files in the vCenter's datastore, and I want to send these files to a minion.
Image of files
I have tried different approaches many times. For example, I tried changing the datastore in cloud.profiles.d/.conf. Nothing has worked so far.
It's almost as if I can't use Salt to control the datastore any more.
I also tried using mount.mounted. Nothing has worked so far.
So, I used the "virtual hard disk" option in the vSphere Web Client instead, and it works.
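For reference, a mount.mounted state normally looks something like this (the device, mount point, and filesystem type are illustrative assumptions, not taken from the setup above):
/mnt/datastore:
  mount.mounted:
    - device: /dev/sdb1
    - fstype: ext4
    - mkmnt: True
    - persist: True
Once the volume is visible to the minion and mounted, files can be distributed with the usual mechanisms such as file.managed or cp.get_file.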

Update wowza StreamPublisher schedule via REST API (or alternative)

Just getting started with Wowza Streaming Engine.
Objective:
Set up a streaming server which live streams existing video (from S3) at a pre-defined schedule (think of a tv channel that linearly streams - you're unable to seek through).
Create a separate admin app that manages that schedule and updates the streaming app accordingly.
Accomplish this with as little custom Java as possible.
Questions:
Is it possible to fetch / update streamingschedule.smil with the Wowza Streaming Engine REST API?
There are methods to retrieve and update specific SMIL files via the REST API, but they only seem to apply to files created through the manager; streamingschedule.smil, after all, has to be created by hand.
Alternatively, is it possible to reference a streamingschedule.smil that exists on an S3 bucket? (In a similar way footage can be linked from S3 buckets with the use of the MediaCache module)
A comment here (search for '3a') seems to indicate it's possible, but there's a lot of noise in that thread.
What I've done:
Set up Wowza Streaming Engine 4.4.1 on EC2
Enabled REST API documentation
Created a separate S3 bucket and filled it with pre-recorded footage
Enabled MediaCache on the server which points to the above S3 bucket
Created a customised VOD edge application, with AppType set to Live and StreamType set to live in order to be able to point to the above (as suggested here)
Created a StreamPublisher module with a streamingschedule.smil file
The above all works, and I have a working schedule with linearly streamed content pulled from an S3 bucket. I just need to be able to manipulate that schedule easily, without manually editing the file over SSH.
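For context, the streamingschedule.smil that StreamPublisher reads is roughly of this shape (the stream name, file name, and times are illustrative):
<smil>
  <head></head>
  <body>
    <stream name="Stream1"></stream>
    <playlist name="pl1" playOnStream="Stream1" repeat="true" scheduled="2018-01-01 00:01:00">
      <video src="mp4:sample.mp4" start="0" length="60"/>
    </playlist>
  </body>
</smil>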
So close! TIA
To answer your questions:
No. However, you can update it by creating an HTTP provider and having it handle the modifications to that schedule (a sketch of the idea is below). Should you want more flexibility here, you can even extend the scheduler module to not require that file at all.
Yes. You would have to modify the ServerListenerStreamPublisher solution to accomplish it. Currently it solely looks at the local filesystem to read the streamingschedule.smil file.
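To illustrate the HTTP-provider idea, a rough sketch (not a drop-in implementation: the class name and schedule path are assumptions, the exact IHTTPRequest body-reading API should be checked against your Wowza version, and you would want authentication in front of this):
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import com.wowza.wms.http.HTTProvider2Base;
import com.wowza.wms.http.IHTTPRequest;
import com.wowza.wms.http.IHTTPResponse;
import com.wowza.wms.vhost.IVHost;

// Hypothetical provider: POST a new SMIL document to its endpoint and it
// overwrites streamingschedule.smil on disk, so an admin app can manage
// the schedule without SSH access.
public class ScheduleUpdateProvider extends HTTProvider2Base {

    // Illustrative path; point this at your actual content directory.
    private static final String SCHEDULE_PATH =
            "/usr/local/WowzaStreamingEngine/content/streamingschedule.smil";

    public void onHTTPRequest(IVHost vhost, IHTTPRequest req, IHTTPResponse resp) {
        try {
            if (!"POST".equalsIgnoreCase(req.getMethod())) {
                resp.setResponseCode(405); // only accept POST
                return;
            }
            // Read the request body (the new SMIL document) and write it
            // over the existing schedule file.
            try (InputStream in = req.getInputStream()) {
                Files.copy(in, Paths.get(SCHEDULE_PATH),
                        StandardCopyOption.REPLACE_EXISTING);
            }
            resp.setResponseCode(200);
            try (OutputStream out = resp.getOutputStream()) {
                out.write("schedule updated\n".getBytes("UTF-8"));
            }
        } catch (Exception e) {
            resp.setResponseCode(500);
        }
    }
}
The provider would then be registered under HTTPProviders in conf/VHost.xml. Note that StreamPublisher reads the schedule on startup, so after updating the file you may also need to trigger a reload (for example, by restarting the application).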
Thanks,
Matt