rrdtool update expected 2 data sources

I set up a simple rrdtool database to graph Wi-Fi signal strength and modulation. The signal-strength updates work, but when I try to update the database with MCS information, I get:
ERROR: ./somefile.rrd: expected 2 data source readings (got 1) from mcsul15
Here's my update code:
rssi=`snmpget -v 2c -c communityname 1.2.3.4 .1.3.6.1.4.1.17713.21.1.2.3.0 | awk -v x=4 '{print $x}' | tr -d -`
noisefloor=`snmpget -v 2c -c communityname 1.2.3.4 .1.3.6.1.4.1.17713.21.1.2.20.1.9.1 | awk -v x=4 '{print $x}' | tr -d -`
ulmcs14=`snmpget -v 2c -c communityname 1.2.3.4 CAMBIUM-PMP80211-MIB::ulWLanMCS14Packets.0 | awk -v x=4 '{print $x}'`
ulmcs15=`snmpget -v 2c -c communityname 1.2.3.4 CAMBIUM-PMP80211-MIB::ulWLanMCS15Packets.0 | awk -v x=4 '{print $x}'`
echo $rssi
echo $noisefloor
echo $ulmcs14
echo $ulmcs15
rrdtool update ./somefile.rrd --template \
rssi:noisefloor N:$rssi:$noisefloor \
mcsul15:mcsul14 N:$ulmcs15:$ulmcs14
Which gives me:
68
94
143679
17602658
ERROR: ./somefile.rrd: expected 2 data source readings (got 1) from mcsul15
What am I missing?

Assuming that somefile.rrd has 4 DSes defined with those 4 names, you should supply all four together in a single update. You can only specify one --template per update, and every subsequent parameter is treated as an update vector in that format.
Also, check that your DS names are correct: your variable is called $ulmcs15, but the DS is being addressed as mcsul15.
rrdtool update ./somefile.rrd --template \
rssi:noisefloor:mcsul15:mcsul14 \
N:$rssi:$noisefloor:$ulmcs15:$ulmcs14
The error message arises because, on your original command line, mcsul15:mcsul14 is taken as an update vector rather than a second template. It is therefore parsed as one timestamp and one value, where two values were expected. A better error message would have been something like "timestamp not recognised in 'mcsul15'", but that's a different issue...
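As an aside: if any of the snmpget calls fails, its variable expands to nothing and the update breaks in a similarly confusing way. A minimal defensive sketch, assuming the same variable and DS names as above:
# abort before updating if any SNMP reading came back empty
for v in "$rssi" "$noisefloor" "$ulmcs14" "$ulmcs15"; do
    [ -n "$v" ] || { echo "missing SNMP reading, skipping update" >&2; exit 1; }
done
rrdtool update ./somefile.rrd --template \
    rssi:noisefloor:mcsul15:mcsul14 \
    "N:$rssi:$noisefloor:$ulmcs15:$ulmcs14"
Quoting the update vector also protects against stray whitespace in the SNMP output.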

Related

Airflow to copy most recent file from GCS bucket to local

I want to copy the latest file from a GCS bucket to local using Airflow (Cloud Composer).
I was trying to use gsutil cp to get the latest file and load it into local Airflow, but got this issue: CommandException: No URLs matched. If I check the XCom, I am getting value='Objects'. Any suggestions?
download_file = BashOperator(
task_id='download_file',
bash_command="gsutil cp $(gsutil ls -l gs://<bucket_name> | sort -k 2 | tail -1 | awk '''{print $3}''') /home/airflow/gcs/dags",
xcom_push=True
)
Executing the command gsutil ls -l gs://<bucket_name> | sort -k 2 | tail -1 | awk '''{print $3}''' also picks up the summary row with the total size, object count, etc. The pipeline sorts by date, takes the last row, and prints its third column, which on that summary row is the word 'objects'. That's why you get 'Objects' as the value, as in the sample output below:
TOTAL: 6 objects, 28227013 bytes (26.92 MiB)
Try this code to get the second-to-last row instead:
download_file = BashOperator(
task_id='download_file',
bash_command="gsutil cp $(gsutil ls -l gs://bucket_name | sort -k 2 | tail -2 | head -n1 | awk '''{print $3}''') /home/airflow/gcs/dags",
xcom_push=True
)
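To sanity-check the pipeline outside Airflow, you can run it by hand (bucket_name is a placeholder):
# second-to-last sorted row is the newest object; column 3 is its gs:// URL
gsutil ls -l gs://bucket_name | sort -k 2 | tail -2 | head -n1 | awk '{print $3}'
Here the TOTAL summary row sorts after the dated rows, so skipping the last row leaves the most recently dated object.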

What's the command line for rrdtool to create a graph using the last update time as the end time?

Going off of this question: Print time of recording for LAST value
It appears possible to have rrdtool compute the timestamp of the last update in an RRD. How do you use this in a command as the "end" time?
i.e. I want to do something like this:
rrdtool graph img.png -a PNG -s e-600 -e LASTUPDATETIME -v "CPU Usage" \
--title "CPU Utilization" DEF:ds0a=node.rrd:ds0:AVERAGE \
DEF:ds1a=node.rrd:ds1:AVERAGE AREA:ds0a#35b73d:"User" \
LINE1:ds1a#0400ff:"System"
I tried mucking about with DEF, CDEF and VDEF to no avail:
rrdtool graph img.png -a PNG -v "CPU Usage" --title "CPU Utilization" \
DEF:data=node.rrd:x:AVERAGE CDEF:count=data,UN,UNKN,COUNT,IF \
VDEF:last=count,MAXIMUM \
DEF:ds0a=node.rrd:ds0:AVERAGE:start=end-600:end=last \
DEF:ds1a=node.rrd:ds1:AVERAGE:start=end-600:end=last \
AREA:ds0a#35b73d:"User" LINE1:ds1a#0400ff:"System"
This results in:
ERROR: end time: unparsable time: last
Any ideas?
On the command line, you could do:
rrdtool graph img.png -a PNG -s e-600 -e `rrdtool last node.rrd` -v "CPU Usage" \
--title "CPU Utilization" DEF:ds0a=node.rrd:ds0:AVERAGE \
DEF:ds1a=node.rrd:ds1:AVERAGE AREA:ds0a#35b73d:User \
LINE1:ds1a#0400ff:System
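For reference, rrdtool last simply prints the Unix timestamp of the most recent update, which the backticks substitute as the graph's end time; together with -s e-600 you get the final 600 seconds of recorded data. For example (timestamp illustrative):
rrdtool last node.rrd
1352716860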

/bin/sh + "grep -o" + regular expression = single line output?

After some investigation, I have managed to get my desired regular expression working on /bin/sh (BusyBox):
INPUT:
Mar 8 09:58:29 mysuperhost kern.alert kernel: Rejected OUT -- IN=br0 OUT=vlan2 SRC=192.168.1.8 DST=3.26.211.8 LEN=95 TOS=0x00 PREC=0x00 TTL=127 ID=648 PROTO=UDP SPT=22008 DPT=51413 LEN=75
REGEXP:
grep -o -E '((^.{0,16})|(IN=.\S*)|(OUT=.\S*)|(SRC=.\S*)|(DST=.\S*)|(PROTO=.\S*)|(SPT=.\S*)|(DPT=.\S*))'
Which gives me:
Mar 8 09:58:29
IN=br0
OUT=vlan2
SRC=192.168.1.8
DST=3.26.211.8
PROTO=UDP
SPT=22008
DPT=51413
The problem:
I can't seem to get grep to give me the result on a single line.
The wanted result:
Mar 8 09:58:29 IN=br0 OUT=vlan2 SRC=192.168.1.8 DST=3.26.211.8 PROTO=UDP SPT=22008 DPT=51413
Here is an awk solution:
awk '{printf "%s %s %s ",$1,$2,$3;for (i=4;i<=NF;i++) if ($i~/(IN|OUT|SRC|DST|PROTO|SPT|DPT)=/) printf "%s ",$i;print ""}' file
Mar 8 09:58:29 IN=br0 OUT=vlan2 SRC=192.168.1.8 DST=3.26.211.8 PROTO=UDP SPT=22008 DPT=51413
Just change what's in the if test to select the fields you want.
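If you would rather keep the original grep -o, you can also just join its output lines afterwards. A sketch, assuming one log record per run (tr would merge the matches of multiple records into one line):
grep -o -E '((^.{0,16})|(IN=.\S*)|(OUT=.\S*)|(SRC=.\S*)|(DST=.\S*)|(PROTO=.\S*)|(SPT=.\S*)|(DPT=.\S*))' file | tr '\n' ' '; echo
BusyBox ships both grep and tr, so this stays within /bin/sh territory.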

Linux - Sort a File based on key position with header and trailer

Below is the list in a file (unsorted-file) that needs to be sorted in Linux, preferably with a single-line Linux command.
03123456789abcd
02987654321pqrs
02123456789mnop
03987654321stuv
04123456789ghjk
01000000000
99000000000
97000000000
98000000000
Required sorted file output:
01000000000
02123456789mnop
03123456789abcd
04123456789ghjk
02987654321pqrs
03987654321stuv
97000000000
98000000000
99000000000
Requirement:
1. If the first two characters are 01, it is the header.
2. If the first two characters are greater than 90, they are trailers.
3. Sort order: positions 3-11, then positions 1-2.
I tried a simple sort command like
$ sort unsorted-file > sorted-file
Requirement 3 failed. Then I tried
$ sort -k 1.3,1.11 -k 1.2 unsorted-file > sorted-file
The trailer records made it to the top of the file because of all zeros from position 3.
The other option I know of is to strip out the header and trailers, sort the rest, and merge the header and trailer files back. Is there a way to do it in one (complex) Linux command?
Thanks for your time.
-R-
( grep '^01' unsorted-file                                       # header first
grep -E -v '^(01|9)' unsorted-file | sort -k 1.3,1.11 -k 1.1     # body: positions 3-11, then from position 1
grep '^9' unsorted-file ) > sorted-file                          # trailers last
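The parentheses run the three greps in a subshell, so a single redirection collects their output in order: header, sorted body, trailers. Written as the single line the question asked for:
( grep '^01' unsorted-file; grep -E -v '^(01|9)' unsorted-file | sort -k 1.3,1.11 -k 1.1; grep '^9' unsorted-file ) > sorted-file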

List workspaces of a user on a specific machine in Perforce

How can I get all Perforce workspaces of a specific user on a specific machine?
This command lists all workspaces of a specific user on all machines:
p4 clients -u username
Here's a cmd one-liner that does more or less the same thing as pitseeker's:
for /f "tokens=2" %x in ('p4 clients -u username') do @(echo %x & p4 client -o %x | findstr /r /c:"^Host:")
A somewhat more robust batch file that seems to fit what you're looking for is:
@echo off
set USER=%1
set HOST=%2
REM Don't forget to double your for-loop percents in batch files,
REM unlike the one-liner above...
for /f "tokens=2" %%x in ('p4 clients -u %USER%') do call :CheckClient %%x
goto :EOF
:CheckClient
p4 client -o %1 | findstr /r /c:"^Host:" | findstr /i /r /c:"%HOST%$">nul && echo %1
goto :EOF
Save that and run it with the username as the first parameter and the desired host name as the second. That is, something like showclient elady elady_pc
Not exactly what you're asking for, but it's easy and perhaps sufficient:
p4 clients -u username | cut -f2 -d' ' | xargs -n 1 p4 client -o | egrep -e '^Client|^Host'
This lists all your clients and their host-restrictions (if any).
In the resulting list you can find the specific machines very easily.
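If you want the list pre-filtered to one machine, a small awk pass over the same output should also work; elady_pc is a placeholder host name, and only clients whose Host: field is populated will match:
p4 clients -u username | cut -f2 -d' ' | xargs -n 1 p4 client -o \
  | awk '/^Client:/ {c = $2} /^Host:/ && $2 == "elady_pc" {print c}'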