Icecast: send output parameters in a mountpoint

I have defined a mount point on my Icecast server, for example data.ogg. I would like to send some parameters to that mount point, for example /data.ogg?param1=x1&param2=x2&param3=x3. I have tried to parse these parameters in a new script, but I could only parse the mount point /data.ogg. I have the following question:
How do I parse the other parameters (param1, param2 and param3) in a new script? I have written a script and tried to read the parameters as $1, $2 and $3,
but didn't find any result.
An example of the script would be :
echo $1 > /tmp/viewls.txt
echo $2 > /tmp/viewls2.txt
echo $3 > /tmp/viewls3.txt
Any idea on how to do it?
Thanks
GOrka
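One way to attack the parsing side, as a minimal sketch: assuming the full request string (mount point plus query string) reaches the script as a single argument -- which depends entirely on how Icecast invokes the script and is not guaranteed -- it could be split in plain bash:
#!/bin/bash
# Hypothetical sketch: assumes the whole request, e.g.
# "/data.ogg?param1=x1&param2=x2&param3=x3", arrives as $1.
request="$1"
mount="${request%%\?*}"     # everything before the first '?'
query="${request#*\?}"      # everything after the first '?'

echo "$mount" > /tmp/viewls.txt

# split the query string on '&' and write each name=value pair to its own file
IFS='&' read -r -a pairs <<< "$query"
for i in "${!pairs[@]}"; do
    echo "${pairs[$i]}" > "/tmp/viewls_param$((i + 1)).txt"
done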

Related

Converting part of read line from file to a variable in tcl

I have a file which has something like this:
IPLIST: 10.10.10.1 10.10.10.2 # A bunch of IPs
#CMDS:
ping $ip
I want to use tcl to read the file and run the commands.
I've been successful in reading the file, creating a list of IPs, and also creating a list of commands.
And I want to run the commands, for which I did:
# iplist is the list of IPs formed from IPLIST in the file
foreach ip $iplist {
    # cmdlist is the list of commands read from the file
    foreach cmd $cmdlist {
        echo "$cmd\n"
    }
}
I was expecting the $ip in the command to be replaced by the ip variable from the outer foreach loop, but that is not happening. What I get is:
ping $ip
Is there a way I can get the $ip in the file converted to the IP from iplist as I run the foreach loop?
I did look at a whole bunch of examples here, but none that can be used in this situation.
Thank you for the help!
Try using subst:
echo [subst -nobackslashes -nocommands $cmd]
It will perform variable substitution. I'm also using -nobackslashes and -nocommands there in case there are square brackets that you might not want executed; if you do want them executed, you can omit those flags.

Bash, Netcat, Pipes, Perl

Background: I have a fairly simple bash script that I'm using to generate a CSV log file. As part of that bash script I poll other devices on my network using netcat. The netcat command returns a stream of information that I can pipe into a grep command to get the specific values I need for the CSV file. I save that return value from grep into a bash variable, and at the end of the script I write out all the saved bash variables to a CSV file. (Simple enough.)
The change I'd like to make is to reduce the number of netcat commands I have to issue for each piece of information I want to save. Each issued netcat command returns ALL possible values (so each call returns the same data and is burdensome on the network). So, I'd like to use netcat only once and parse the return value as many times as needed to create the bash variables that can later be concatenated into a single record in the CSV file I'm creating.
Specific Question: Using bash syntax, if I send the output of the netcat command to a file using > (versus the current grep method), I get a file with each entry on its own line (presumably separated with \n as the EOL record separator -- easy for a perl regex). However, if I save the output of netcat directly to a bash variable and echo that variable, all of the data is jumbled together, so it is cumbersome to parse (not so easy).
I have played with two options. First, I think a perl one-liner may be a good solution here, but I'm not sure how to best execute it. Pseudo code might be to save the netcat output to a bash variable and then somehow figure out how to parse it with perl (not straightforward, though).
The second option would be to use bash's > and send netcat's output to a file. This would be easy to process with perl and Regex given the \n EOL, but that would require opening an external file and passing it to a perl script for processing AND then somehow passing its return value back into the bash script as a bash variable for entry into the CSV file.
I know I'm missing something simple here. Is there a way I can force newlines into the bash variable from netcat and then repeatedly run a perl one-liner against that variable to create each of the CSV variables I need -- all within the same bash script? Sorry for the long question.
The second option would be to use bash's > and send netcat's output to a file. This would be easy to process with perl and Regex given the \n EOL, but that would require opening an external file and passing it to a perl script for processing AND then somehow passing its return value back into the bash script as a bash variable for entry into the CSV file.
This is actually a fairly common idiom: save the output from netcat in a temporary file, then use grep or awk or perl or what-have-you as many times as necessary to extract data from that file:
# create a temporary file and arrange to have it
# deleted when the script exits
tmpfile=$(mktemp tmpXXXXXX)
trap "rm -f $tmpfile" EXIT

# dump data from netcat into the temporary file
nc somehost someport > "$tmpfile"

# extract some information into variable `myvar`
myvar=$(awk '/something/ {print $4}' "$tmpfile")
That last line demonstrates how to get the output of something (in this case, an awk script) into a variable. If you were using perl to extract some information you could do the same thing.
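If perl is the preferred tool for the extraction, the same pattern applies. For example (a sketch; the /something:/ pattern is just a placeholder for whatever field you are actually after):
# same idea with a perl one-liner instead of awk
myvar=$(perl -ne 'print $1 if /something:\s*(\S+)/' "$tmpfile")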
You could also just write the whole script in perl, which might make your life easier.

Detecting errors of a command that print nothing if the command was successful using Perl and Expect

I am trying to automate the configuration of a server using perl and the Expect module. I have been using the Expect module for three days, but now I have encountered a problem that I can't solve.
My problem is when I'm executing a command that prints no output if it is successful but prints an error message if something went wrong. An example of such a command is cd:
$ cd .
$
$ cd sadjksajdlaskd
sadjksajdlaskd: No such file or directory.
$
What I would like to do is to send the command to the server, and then perform an expect call to check if something other than the prompt sign was printed. Something like this:
$com->send("cd $dir");
$com->expect(2,
    [ "^[^$#]*", sub {
        my $self = shift;
        my $error = $self->match();
        die "ERROR: $error";
    } ],
    "-re", "^[$#]"
);
The problem I have is that when I perform the expect call it matches against all previous text and not against text received after the send call, so it will always match and report an error. How do I make expect match only against the text received after the send call? Is it possible to clear the buffer of the expect module, or is it possible to achieve this kind of error detection in some other way?
I also wonder how the expect module handles regular expressions. If I, for example, use "^[$#]\$" as the regular expression to match the prompt of the terminal, will the \$ part of the regular expression match the end of the line or an actual dollar sign? If I remove the \, perl complains.
Thanks in advance!
/Haso
EDIT: I have found a solution:
The solution was to use $com->clear_accum(), which clears the accumulator. I had tried using it before, but it seemed like this function only worked at random, or maybe I didn't understand what clear_accum() is supposed to do.
EDIT: A final note about clear_accum():
The reason the clear_accum() function seems to work at random is that the text generated by the previous send is not read into the accumulator until an expect() call is made. So in order to truly clear all previous data, first perform an expect() call and then clear the accumulator:
# To clear all previous data
$com->expect(0);
$com->clear_accum();
akarageo@Pal4op:~> cd banana
bash: cd: banana: No such file or directory
akarageo@Pal4op:~:( > echo $?
1
I.e., check the error code that cd returns: 0 means OK, anything else is an error. There is no need to check the prompt. And by the way, the cd command does not generate the prompt; the shell does, so that must be part of your confusion as well.
Try $object->exitstatus() if it is of any help.
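The same idea expressed in plain shell (a sketch, outside of Expect): run the command and branch on its exit status rather than trying to read the prompt:
# check the exit status of cd instead of parsing prompt text
cd "$dir" 2>/dev/null
status=$?
if [ "$status" -ne 0 ]; then
    echo "cd to $dir failed with exit status $status"
fi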

Bash script - Need help getting match and substitution working

I am trying to get parameter substitution working in my bash script ... I know I have gotten this all wrong ... I am trying to create a script that will rename PART of a file name.
#!/bin/bash
for i in *.hpp; do mv -v "$3 ${$3/$1/$2}" ; done
The error I am getting is:
line 2: $3 ${$3/$1/$2}: bad substitution
${$3} will attempt to interpolate ${"CONTENTS OF $3"} into a variable. It is more likely that you want ${3}. It is even more likely that you want ${i}.
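Putting that together, a corrected version of the loop might look like this (a sketch; it assumes the script is called with the old and new name fragments as its first two arguments):
#!/bin/bash
# hypothetical usage: ./rename.sh OLDPART NEWPART
for i in *.hpp; do
    mv -v "$i" "${i/$1/$2}"
done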

shell script pattern matching?

I think I've written maybe one shell script in my entire life, and I'm not even sure if it's possible to do this, but I'm trying to write a script that will ftp the contents of a directory, one file at a time. That is, it'll ftp one and then close the connection, then ftp the second and close that connection, etc. This is because there may be up to five files in a directory, all of which are a minimum of 2GB each. FTPing them all at once always results in a reset connection. I thought that if I could match by partial filename, then perhaps that would help, as they are all named the same way.
So, in a directory, it'll have:
SampleFileA_20100322_1.txt
SampleFileA_20100322_2.txt
SampleFileB_20100322_1.txt
SampleFileC_20100322_1.txt
I'd like to ftp SampleFileA_xxxx_1 first, then SampleFileA_xxxx_2, etc. This is the current ftp script, which tries to download everything all at once...
#!/bin/bash
REMOTE='ftp.EXAMPLE.com'
USER='USERNAME'
PASSWORD='PASSWORD'
FTPLOG='/tmp/ftplog'
date >> $FTPLOG
ftp -in $REMOTE <<_FTP >> $FTPLOG
quote USER $USER
quote PASS $PASSWORD
bin
cd download
mget *
quit
_FTP
:wq!
Based on your question, I think you need something like:
files=`ls Sample*txt`
for file in $files
do
    run_ftp_function $file
done
You'll need to set up run_ftp_function to do the send (like you already have), using $1 as the file to send.
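A sketch of what run_ftp_function might look like, reusing the settings from the original script and opening one connection per file (assuming the same download direction as the mget version):
run_ftp_function () {
    # $1 is the file to transfer; one ftp session per call
    ftp -in $REMOTE <<_FTP >> $FTPLOG
quote USER $USER
quote PASS $PASSWORD
bin
cd download
get $1
quit
_FTP
}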