AWS EC2: cron seems to be creating random empty files - amazon-web-services

I assume cron keeps creating random empty files in my root directory named users-recall?type=cron.*** - * (some random numbers, possibly with dots, like 111.01). The crontab has only 2 jobs in it:
*/1 * * * * cd /var/www/html/*** && php script.php > /dev/null 2>&1
0 */1 * * * wget -q -t 1 https://domain/users-recall?type=cron > /dev/null 2>&1
I tried to search for the cause but couldn't find it. The files appear roughly daily, not every minute. I am not sure what else could create these files. I have just an AWS EC2 Linux server with nothing additional installed except standard tools.

Sending the STDOUT/STDERR output of wget to /dev/null is not sufficient if you don't want a file to be created.
Modify your command by adding -O -.
... type=cron -O - > /dev/null 2>&1
The -O option (uppercase letter O) is the short form for --output-document, and the next - means to use STDOUT instead of creating a file on disk from the response. By default, wget creates a file, and appends a counter to the end if the file already exists. Since you are discarding STDOUT with > /dev/null, this will do what you intend -- make a web request, and discard the output.
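Putting it together, the fixed crontab entry would look something like this (keeping your original schedule and flags; the single quotes around the URL are the precaution suggested in another answer below):
0 */1 * * * wget -q -t 1 -O - 'https://domain/users-recall?type=cron' > /dev/null 2>&1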

These files are getting created by your second cron job for sure. You can comment that job out and confirm that the files stop getting created.
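For example, open the crontab with crontab -e and prefix the job with # to disable it temporarily:
# 0 */1 * * * wget -q -t 1 https://domain/users-recall?type=cron > /dev/null 2>&1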

Can you try enclosing the URL in single quotes?
0 */1 * * * wget -q -t 1 'https://domain/users-recall?type=cron' > /dev/null 2>&1
The ? character may get interpreted by the shell, depending on the SHELL variable of the crontab.
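As a quick illustration of how an unquoted ? behaves in a shell (a throwaway directory, purely to show the globbing):
$ touch users-recallXtype=cron
$ echo users-recall?type=cron
users-recallXtype=cron
If nothing matches, the word is normally left untouched, so an unquoted URL may work most of the time; quoting it removes the uncertainty.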

Related

How to pass a command which contains special characters through SSH?

I would like to run the following command from Jenkins:
ssh -i ~/.ssh/company.pem -o StrictHostKeyChecking=no user@$hostname "supervisorctl start company-$app ; awk -v app=$app '$0 ~ "program:company-"app {p=NR} p && NR==p+6 && /^autostart/ {$0="autostart=true" ; p=0} 1' /etc/supervisord.conf > $$.tmp && sudo mv $$.tmp /etc/supervisord.conf"
This is one of the last steps of a job which creates a CloudFormation stack.
Running the command from the target server's terminal works properly.
In this step, I'd like to ssh to each one of the servers (members of ASG's within the new stack) and search and replace a specific line as shown above in the /etc/supervisord.conf, basically setting one specific service to autostart.
When I run the command I get the following error:
Usage: awk [POSIX or GNU style options] -f progfile [--] file ...
Usage: awk [POSIX or GNU style options] [--] 'program' file ...
I've tried escaping the double quotes but got the same error, any idea what I'm doing wrong?
You are running into this issue due to the way the shell handles nested quotes. This is a use case for a here document (heredoc), which lets you pass multi-line commands through bash without worrying about nested quoting. The structure is as follows:
$ ssh -t user@server.com <<'END'
command1 |
command2
END
The -t flag is important: it tells ssh to allocate a pseudo-terminal, so the remote shell behaves as if it were being used interactively, which avoids warnings and unexpected results.
In your specific case, you should try something like:
$ ssh -t -i ~/.ssh/company.pem -o StrictHostKeyChecking=no user@$hostname <<'END'
supervisorctl start company-$app
awk -v app=$app '$0 ~ "program:company-"app {p=NR} p && NR==p+6 \
&& /^autostart/ {$0="autostart=true" ; p=0} 1' \
/etc/supervisord.conf > $$.tmp && sudo mv $$.tmp /etc/supervisord.conf
END
Just a note: since I can't be sure about the desired output of the command you are running, keep track of your own " and ' marks and escape them in your awk command as you would at an interactive terminal. The double quotes around "program:company-" are awk string syntax, concatenated with the app variable; inside the single-quoted program they need no backslash escaping, but if they were meant to be literal characters in the searched text they would have to be escaped accordingly.
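One detail worth knowing: quoting the delimiter (<<'END') stops the local shell from expanding anything inside the heredoc, while an unquoted delimiter (<<END) expands it locally before the text is sent. A minimal illustration (the output shown assumes a machine where HOME is /home/user):
$ cat <<'END'
$HOME
END
$HOME
$ cat <<END
$HOME
END
/home/user
In this case that cuts both ways: with <<'END', $app must be known on the remote host; with <<END, $app is filled in by Jenkins locally, but awk's $0 would then also expand locally and would need to be escaped as \$0.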

Library compiled for x64 architecture fails with errors on ARM architecture

I'm developing a C++ library that has a piece of shell script code that returns the name of a specific serial port. When I run this script in a console, on either an x64 desktop or an ARM environment, the script returns the right answer. My problem occurs when I execute the same script inside the library: the return shows a malformed string like ÈÛT¶ÈÛT¶¨a, but the expected value is /dev/ttyACM0.
The script that runs inside the library:
bash -c 'for sysdevpath in $(find /sys/bus/usb/devices/usb*/ -name dev); do
    (
        syspath="${sysdevpath%/dev}"
        devname="$(udevadm info -q name -p $syspath)"
        [[ "$devname" == "bus/"* ]] && continue
        teste="$(udevadm info -q property --export -p $syspath | grep -i "company_name")"
        if [[ ! -z "${teste// }" && $devname == *"ttyACM"* ]]; then
            echo "/dev/$devname"
        fi
    )
done' 2> /dev/null
The following piece of code is used to save the content returned by the script into a file.
pfFile = fopen(CONFIG_FILE, "w+");
if (pfFile != NULL) {  /* make sure the file actually opened */
    fwrite(result, strlen(result), 1, pfFile);  /* result: the buffer holding the script output (not shown here) */
    fclose(pfFile);
}
return 0;
Besides the fact that you didn't include what result is and where it comes from in your C++ code, you have selected the hardest way to do this. Running shell scripts from inside a library will most likely cause nothing but headaches.
Basically, you can create a udev rule for your device so that a unique, stable file appears in /dev to access it. You can create one like this example from the ArchWiki:
KERNEL=="video[0-9]*", SUBSYSTEM=="video4linux", SUBSYSTEMS=="usb", ATTRS{idVendor}=="05a9", ATTRS{idProduct}=="4519", SYMLINK+="video-cam1"
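Adapted to a USB serial device like yours, a rule could look something like this (the idVendor/idProduct values and the company-serial symlink name are placeholders to fill in):
KERNEL=="ttyACM[0-9]*", SUBSYSTEMS=="usb", ATTRS{idVendor}=="xxxx", ATTRS{idProduct}=="yyyy", SYMLINK+="company-serial"
The real vendor and product IDs can be read off the device while it is connected as /dev/ttyACM0:
udevadm info -a -n /dev/ttyACM0 | grep -E 'idVendor|idProduct'
The library can then simply open /dev/company-serial instead of shelling out to discover the port name.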

Using wildcards and quotation marks in ssh

I have a file from which I want to grep out lines with strings like "Nov 30" or "Nov  30" (basically, I don't want to have to specify the number of spaces in the middle).
In the terminal this would be fine, I'd just do:
grep 'Nov * 30' file
However, it'd be nice to keep it general for my purposes, so in fact I'd like to do something like:
grep "$(date +%b) * $(date +%d)" file
That works fine BUT I actually want to do this through ssh so it'd be like
ssh -t host "grep "$(date +%b) * $(date +%d)" file"
At this point I run into problems. Instead of grepping only for Nov 30 type date strings, it returns all sorts of different November dates. I feel like the problem has something to do with the double quotation usage (perhaps the outer set of double quotes around the remote command is interfering with the inner set, but I don't see how to get around that), and I see from this answer that "bash will evaluate and substitute variables inside double-quotes on the local machine but will do it on the target machine inside single quotes". So I tried replacing the above with:
ssh -t host "grep '$(date +%b) * $(date +%d)' file"
But now the grep returns no results at all! I assume this is because I'm grepping for the literal '$(date +%b)...' rather than the substituted 'Nov...', but then I don't understand why the first attempt with double quotes didn't work.
Any help would be welcome.
Escape your quotes:
ssh -t host "grep \"$(date +%b) * $(date +%d)\" file"
Alternatively, single-quote the command line you wish to execute on the remote machine. (In this case the date commands will execute on the remote end.)
ssh -t host 'grep "$(date +%b) * $(date +%d)" file'
In this version the date command will be executed locally:
ssh -t host "grep '$(date +%b) * $(date +%d)' file"
In this version the date command will be executed on the remote host:
ssh -t host 'grep "$(date +%b) * $(date +%d)" file'
This can make a difference when your local PC and the server are in different time zones. Right now, for example, it's Dec 1 in France but Nov 30 on my server in the US.
In the 1st version the $() within the double quotes are evaluated before sending to the server. So the command sent to the server is grep 'Dec * 1' file in my timezone.
In the 2nd version the $() within single quotes are NOT evaluated before sending to the server. So the command sent to the server is grep "$(date +%b) * $(date +%d)" file, and so the server will evaluate the $() within double quotes.
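A handy way to check which side will expand what is to put echo in front of the command locally (a debugging trick, not part of the original answers), so you can see the exact string that would be sent:
$ echo "grep "$(date +%b) * $(date +%d)" file"
grep Nov <whatever * matched in the local directory> 30 file
$ echo "grep \"$(date +%b) * $(date +%d)\" file"
grep "Nov * 30" file
The first line also explains the original symptom: the inner quotes close the outer pair, leaving * unquoted, so it globs locally and the remote grep receives just Nov as its pattern, matching every November date in file.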

mysqldump not backing up to specified file

I have the following problem: I tried to back up a database using the mysqldump command. This is what I entered in the terminal:
c:/wamp/bin/mysql/mysql5.5.24/bin>mysqldump -uroot -ptest store > store_bak.sql
/* here it jumps to the next line after pressing ENTER */
c:/wamp/bin/mysql/mysql5.5.24/bin>
I think I'm entering the correct command in the right place, but the only thing that happens is that it jumps to the next line and no backup is made.
I also cannot find any file with the backup name I specified (store_bak.sql in this example).
Does anybody know what could be causing this?
PS: I'm using WAMP on Windows 7, MySQL 5.5.24.
You can separate the options with spaces, but note that with -p the password must stay attached to the flag (or use --password=test); written as -p test, mysqldump would prompt for a password and treat test as the database name. Try:
mysqldump -u root -ptest -h localhost store > store_bak.sql
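Also note that the redirection writes the dump into the current working directory, here the MySQL bin folder, and that the file is named store_bak.sql, not store.sql. Giving an absolute output path (the path below is just an example) makes it easier to find:
mysqldump -u root -ptest store > C:\backups\store_bak.sql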

Automator Finder Service keeps on running, should run only once

I've created a simple service using Automator that takes a *.tiff, creates a *.jpg out of it and then deletes the original.
However, when I run this on a *.tiff file, it keeps on running, meaning it keeps converting the (by then jpg) file over and over again. At least I believe it does, since the file disappears and reappears about twice a minute and the timestamp changes. How do I tell the service (i.e. the shell commands) to run just once?
The service in Automator is just this one action of type "Run Shell Script". The shell script is:
newName=${@%.tiff}.jpg
echo "$newName"
sips -s format jpeg "$@" --out "${newName}"
rm "$@"
Thanks!
(Would have posted a picture of the Automator window, but was not allowed to)
This behavior suggests a Folder Action is attached to that folder.
If you created a Folder Action:
1- You must filter for TIFF files so that the folder action doesn't process the created JPEG file.
2- You must use a loop, in case you drop more than one file into the folder.
3- Use "&&" to delete the TIFF file only when the sips command finishes successfully.
Here's the script:
for f in "$@"; do
    if [[ "$f" = *.tiff ]]; then
        sips -s format jpeg "$f" --out "${f%.tiff}.jpg" && rm "$f"
    fi
done
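One caveat: the *.tiff pattern is case-sensitive, so files named .TIFF or .tif would slip through unconverted. A small variation (my addition, not part of the original answer) that catches the common spellings:
for f in "$@"; do
    if [[ "$f" = *.tiff || "$f" = *.TIFF || "$f" = *.tif ]]; then
        sips -s format jpeg "$f" --out "${f%.*}.jpg" && rm "$f"
    fi
done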