Is it not possible to pipe output in Fabric tasks? - python-2.7

I'm using Fabric 1.10 for my project. For one of the tasks, I need to display a list of files present locally but not yet uploaded to the remote server. For this I use rsync.
rsync -avun <local-directory> <remote-server>
This works fine but it also displays a few summary lines and unwanted output, so I try to grep the results. However, this causes an error.
rsync -avun <local-directory> <remote-server> | egrep "(\.png|\.jpg|\.jpeg|\.ico|\.gif)"
Fatal error: local() encountered an error (return code 1)...
Is it not possible to pipe output in Fabric commands?

My guess is that you've mixed up your quotes, or Fabric is conflicting with them. Try using triple-quoted strings (""") to surround your command, and perhaps single quotes around the egrep pattern.
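A minimal sketch of that suggestion, assuming Fabric 1.x's fabric.api.local; the task name is made up for illustration, and the <local-directory>/<remote-server> placeholders from the question still need real values:

from fabric.api import local, task

@task
def show_unsynced_images():
    # Triple-quoted string around the whole command, single quotes around the
    # egrep pattern, so the shell quoting and Python quoting don't collide.
    local("""rsync -avun <local-directory> <remote-server> | egrep '(\.png|\.jpg|\.jpeg|\.ico|\.gif)'""")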

GetFileInfo: could not refer to file (-43)

Some lines in my script:
full_path=$(sed -n "${a}p" tmp/archive_file_header_path )
modified_date=$(GetFileInfo -m "$full_path" | awk '{print $1}')
My script works fine (I think) and I get the output as expected.
But one thing bugs me: if I run my script on a file inside RAID storage, it produces the error GetFileInfo: could not refer to file (-43).
Even though it throws this error, the output still shows the correct date.
The error only happens when the file is inside RAID storage; there is no error if it is on a single external drive.
Also, if I run these lines (with the variables set) outside the script, the error does not occur, even when the file is inside RAID storage.
In case it matters, I tried both zsh and bash, with the same result.
My knowledge is very limited; if anyone can shed some light on this, I would be very grateful.

rsyslog Variables Not Working

I'm running CentOS 7.5. I've currently got rsyslog setup and working. The only problem is that it's not recognizing the %HOSTNAME% variable in my file path. Instead of writing the logs to a separate file for each host, it's creating a file called '%HOSTNAME%' and writing the logs there.
This is what I currently have in my /etc/rsyslog.conf file.
if $fromhost-ip startswith "10." then /var/log/Client_Logs/%HOSTNAME%.log
& ~
Everything with this is working, except for creating a separate file for each device. I'm not sure if the client device is failing to provide this information or not. I've got several client devices so far and they are all Cisco switches.
I've also tried using other variables like '%FROMHOST-IP%', and they do not work either.
Any help is appreciated. Thanks.
For dynamic file names, you must first define them separately in a template, e.g.:
$template DynFile,"/var/log/Client_Logs/%HOSTNAME%.log"
if $fromhost-ip startswith "10." then ?DynFile
See the rsyslog documentation on actions.

Python Jupyter nbconvert limiting printed pages

I am running Jupyter with a Python 2.7 kernel, and I was able to get nbconvert to run both on the command line (ipython nbconvert --to pdf file.ipynb) and through the Jupyter browser interface.
For some reason, however, my file only prints 4 out of the 8 total pages I see in the browser. I can convert an HTML preview file (generated by Jupyter) and then use an online converter at pdfcrowd.com to get the full PDF printed.
However, the online converter lacks LaTeX capability.
When I installed nbconvert, I did not manually install MiKTeX (LaTeX for Windows), as I already had it successfully installed for RStudio and Sweave.
I didn't want to mess with that installation.
I printed a few longer files (more than 4 pages) without LaTeX, and a few that had LaTeX, and the first couple of pages printed fine. But the last few pages always seem to get cut off from the PDF output.
Does anyone know whether that might be what is limiting the print output, or have any other ideas why?
Also, I notice that the command prompt where Jupyter was launched shows a few messages...
1.599 [\text{feature_matrix}
] =
?
! Emergency stop.
<inserted text>
1.599 [\text{feature_matrix}
] =
Output written on notebook.pdf (r pages)
[NbConvertApp] PDF successfully created
I ran another file and the output showed
! You can't use `macro parameter character #' in restricted horizontal mode.
...ata points }}{\mbox{# totatl data points}}
?
! Emergency stop
It looks like it is somehow choking on the LaTeX character # and then halting publishing. But it at least publishes all the pages prior to the error and emergency stop.
While not a solution, I manually changed the LaTeX symbol # to the word "number" and the entire file printed. I wonder if this is perhaps a problem with my version of LaTeX?
Halted PDF job:
$$\mbox{accuracy} = \frac{\mbox{# correctly classified data points}}{\mbox{# total data points}}$$
Passed PDF job:
$$\mbox{accuracy} = \frac{\mbox{number correctly classified data points}}{\mbox{number total data points}}$$
Edit: in case it helps anyone, I used the suggestion here to put a backslash escape character before the #, which solved the issue; so it does look like a LaTeX issue.
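For reference, this is roughly what the escaped version looks like; \# is the standard LaTeX escape for a literal # character:

$$\mbox{accuracy} = \frac{\mbox{\# correctly classified data points}}{\mbox{\# total data points}}$$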

How can I use a doctrine connection to import a SQL file?

I have an app that needs to import a .sql file. I can import the file from the command line with mysql -u my_user -pMyPassword db_name < import.sql, but I'd like to move this into my app. I have some things that need to be done before the import and others after; right now I have to break it into 3 steps. The closest I've found to a solution was to get the connection (Doctrine\DBAL\Connection) and use exec(), but it throws syntax errors even though my source file is correct. I'm guessing it's trying to escape things and double-escaping the SQL. The file was generated with mysqldump.
With Symfony using Doctrine, you can do it with:
php app/dev_console doctrine:database:import import.sql
This uses the DBAL "import" command to execute the SQL. It is, however, less performant than using the mysql command directly, since it loads the entire file into memory.
Otherwise, I'd suggest writing your own Symfony console command.
In my case it was:
php bin/console doctrine:database:import my_sql_file.sql
Status September 2021:
I would rather not trust the code in Doctrine\Bundle\DoctrineBundle\Command\Proxy\ImportDoctrineCommand, which triggers a deprecation warning in its execute function; it is not good programming practice to ignore deprecation warnings. Calling dbal:run-sql would not be efficient enough because of the overhead.
Alternatively, you can call e.g. mysql at the operating-system level. This causes problems in multi-user environments, for example because the database password has to be specified on the command line. In addition, the server must allow it: exec() is switched off for security reasons in many environments, especially with low-cost providers. Furthermore, this approach would not be database-agnostic, and that abstraction is one of the most outstanding features of Doctrine.
Therefore, I recommend reading in the SQL yourself and executing it line by line (see the entityManager->createNativeQuery() function). That also gives you better options for reacting to possible errors.

Print result of Fabric command

I would like to redirect the output of the Fabric command
print local("git add .")
so that I can make sure everything is working properly. Is there a way to pipe the results of this command to the console? Thanks!
You should already see the output of the command in the console since that's the default behaviour. If you need to customize it, please have a look at the managing output documentation section.
Besides this, there's an ongoing effort to use the logging module (issue #57) that should provide more options like logging to a file, but that hasn't been merged into the trunk branch yet.
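For completeness, a minimal sketch assuming Fabric 1.x, where local() accepts a capture argument and (when capture=True) returns the captured stdout with extra attributes such as return_code; the task name here is made up for illustration:

from fabric.api import local, task

@task
def add_all():
    # capture=True asks local() to return the command's stdout as a string
    # instead of letting it stream straight to the terminal.
    result = local("git add .", capture=True)
    print(result)              # the captured stdout
    print(result.return_code)  # exit status of the command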