curl syntax for PostgREST query - postgrest

Using PostgREST (http://postgrest.org/en/v5.2/api.html) version 5.2 against a Postgres 11 database, the following curl command:
curl -X GET http://192.18.11.13:5741/workorder?acctid=eq.SunnySide&workorderid=eq.0001
generates the following trace in the Postgres log:
SELECT "public"."workorder".* FROM "public"."workorder" WHERE "public"."workorder"."acctid" = 'SunnySide'::unknown
The "restriction" of workorderid=eq.0001 in the original query is dropped so the data returned includes all 'SunnySide' matches and not the single 0001 workorderid as desired.
What is the correct syntax of this command in curl so that workorderid is also passed to the Postgres server?

You need to quote ("") the URL, like this:
curl "http://192.18.11.13:5741/workorder?acctid=eq.SunnySide&workorderid=eq.0001"
Otherwise your shell treats everything after the & symbol as a background process, which is why workorderid=eq.0001 never gets included in the curl call: it is interpreted as a separate command (a variable assignment) instead.
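The double quotes stop the shell from splitting the command at the &; single quotes or a backslash escape work just as well, for example:
curl 'http://192.18.11.13:5741/workorder?acctid=eq.SunnySide&workorderid=eq.0001'
curl http://192.18.11.13:5741/workorder?acctid=eq.SunnySide\&workorderid=eq.0001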

Related

execution of salt cmd.run has different behaviors if it's CLI vs through state files. what's the difference?

I'm trying to set a variable TOKEN to the value of an IMDSv2 (AWS instance metadata) token in order to get an instance's IP address.
Executing the commands through the salt CLI yields the desired result. Here's the command:
salt-ssh 'name.of.host' cmd.run cmd='TOKEN=$(curl -X PUT "http://169.254.169.254/latest/api/token" -s -H "X-aws-ec2-metadata-token-ttl-seconds: 21600") && curl -H "X-aws-ec2-metadata-token: $TOKEN" -s http://169.254.169.254/latest/meta-data/public-ipv4'
However, I'm trying to execute the same command through an evaluated expression like so:
{{ salt["cmd.run"](cmd='TOKEN=$(curl -X PUT "http://169.254.169.254/latest/api/token" -s -H "X-aws-ec2-metadata-token-ttl-seconds: 21600") && curl -H "X-aws-ec2-metadata-token: $TOKEN" -s http://169.254.169.254/latest/meta-data/public-ipv4')}}
and the following error shows up:
FileNotFoundError: [Errno 2] No such file or directory: 'TOKEN=$(curl': 'TOKEN=$(curl'
My thinking is that the way Salt evaluates command expressions through the CLI differs from the Jinja-style evaluation. I also understand that spaces are a factor in shell expressions for Salt. I tried various combinations of escape characters before the equals sign, dollar sign, and parentheses, but no luck. I still think it has to do with special characters, but I can't pinpoint which.
Any thoughts? Much appreciated.
As per the documentation, cmd.run does not process commands through a shell by default. Quoting a "Warning" from the documentation:
This function does not process commands through a shell unless the python_shell flag is set to True. This means that any shell-specific functionality such as 'echo' or the use of pipes, redirection or &&, should either be migrated to cmd.shell or have the python_shell=True flag set here.
So there are two options:
Instead of using cmd.run, use cmd.shell. E.g.:
{{ salt.cmd.shell('TOKEN=$(curl -X PUT ...)') }}
Or add python_shell=True to cmd.run. E.g.:
{{ salt.cmd.run('TOKEN=$(curl -X PUT ...)', python_shell=True) }}
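Applied to the full command from the question, the state-file expression would then look something like this (a sketch of the cmd.shell variant; only the function name changes from the original):
{{ salt["cmd.shell"]('TOKEN=$(curl -X PUT "http://169.254.169.254/latest/api/token" -s -H "X-aws-ec2-metadata-token-ttl-seconds: 21600") && curl -H "X-aws-ec2-metadata-token: $TOKEN" -s http://169.254.169.254/latest/meta-data/public-ipv4') }}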

Setting default host for fab 2

I use fabric2 in the terminal and I don't want to type -H 'hosts' every time.
How can I do it?
e.g.
// actual
fab2 -H web1 upload_and_unpack
// expected
fab2 upload_and_unpack
I've read the main doc and the configuration doc but found nothing.
from fabric import task

@task(hosts=['web1'])
def upload_and_unpack(c):
    c.run('uname -a')
If you define your fabfile as above, then you can simply run your fab command without providing any host parameter (assuming that web1 is already defined in your ssh config file).
$ fab upload_and_unpack
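For reference, a minimal ~/.ssh/config entry for that host could look like the following (the HostName and User values are placeholders, not taken from the question):
Host web1
    HostName web1.example.com
    User deploy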

request.META does not contain header passed from curl -H

"It works on my machine."
I have a Django app. I followed this tutorial. OAuth2 works great on my dev box like this:
$ curl -v -H "Authorization: OAuth c52676b24a63b79a564b4ed38db3ac5439e51d47" http://localhost:8000/api/v1/my-model/?format=json
My local dev app finds the header with this line of code:
auth_header_value = request.META.get('HTTP_AUTHORIZATION')
But when I deploy it to my Ubuntu box running Apache, it doesn't.
I added the following to my authentication.py file so I could inspect the values in the log on the remote machine.
logging.error(request.GET)
logging.error(request.POST)
logging.error(request.META)
The header value is mysteriously missing from the output. So I just get 401s.
Did you turn on WSGIPassAuthorization?
http://modwsgi.readthedocs.org/en/latest/configuration-directives/WSGIPassAuthorization.html
Authorisation headers are not passed through by default as doing so could leak information about passwords through to a WSGI application which should not be able to see them when Apache is performing authorisation.
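With mod_wsgi this is a single directive in the Apache configuration for the site, for example inside the relevant VirtualHost block:
WSGIPassAuthorization On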

Exporting .csv data locally from a distant MySQL server

I have a MySQL server running on one PC (WinXP).
On another PC (WinXP), I'd like to back up tables to CSV files, so I have a C++ program connect to the MySQL database and then issue a command like this:
SELECT data FROM table WHERE something=ABC
INTO OUTFILE 'c:\\tmp.txt'
FIELDS TERMINATED BY ';'
OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY ";";
The data seems to get exported, but onto the server's C:\ drive, not my PC's C:\ drive.
I can't just fetch the data over the LAN either, as my program is running as a Windows service.
I have seen this post but it seems I can't use "FIELDS TERMINATED BY" etc. with that solution.
Can I export csv data locally from a distant server or do I have to migrate the data locally first?
Problem "solved" : As it seems you can't do it in a 'simple' way, I run the service on the PC with MySQL and have a DCom server periodically move the data to the other PC.
Intended to be a comment (but I don't have enough rep points to comment). Not sure if you are able to install things, but you could try using Cygwin + sqsh (http://www.sqsh.org/sqsh_home.html). I'm a Linux user and sqsh is a great tool for grabbing data from databases.
SELECT ... INTO OUTFILE obviously writes the file on the local filesystem of the MySQL daemon. One option might be to share a directory of your client PC, make it reachable from the server, and use its path for the OUTFILE option. If this is not an option, you might have to select the data in your C++ program and write it (more or less manually) to a local CSV file.
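To illustrate the shared-directory idea (an untested sketch; \\MYPC\share is a placeholder, and the MySQL service account would need write access to that share):
SELECT data FROM table WHERE something=ABC
INTO OUTFILE '\\\\MYPC\\share\\tmp.txt'
FIELDS TERMINATED BY ';'
OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY ";";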
Use the following command if you don't want to install anything extra.
mysql -h remotedb.db -u ident -p -B -e "your query ;" | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > localfile.csv
-B puts mysql in batch mode, which makes the output tab-separated; the sed expression then wraps each field in double quotes and joins fields with commas.
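For example, a tab-separated result row of the form
value1<TAB>value2<TAB>value3
leaves the pipeline as
"value1","value2","value3"
since each tab becomes ",", and a double quote is added at the start and end of each line (the trailing s/\n//g is effectively a no-op, because sed handles one newline-stripped line at a time).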

PHP Command Line & Browser Returning Different Results

I'm having an issue when running a script in a browser versus the command line. The script echoes the date 1/20/2012 when run in the browser; however, when run from the command prompt it echoes tomorrow's date, 1/21/2012. I have set my timezone to the following:
date.timezone = America/New_York
I'm running this script in the command line:
"c:\wamp\bin\php\php5.3.4\php.exe" -f "c:\wamp\www\site.com\cron.php"
Any ideas on why I'm getting two different dates from the same script?
Check that Apache and the command line are running the same PHP, and that both are using the same php.ini file.
php-cli, php-cgi, & php-fpm all use different php.ini files by default.
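To see which file each side actually loads, compare the "Loaded Configuration File" that each reports, for example:
"c:\wamp\bin\php\php5.3.4\php.exe" --ini
on the command line, versus the value shown by a page calling phpinfo() in the browser.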