newman environment variables - avoid using --env-var after each variable pass - postman

I want to pass multiple environment variables using the newman CLI.
newman run myCollection.json --env-var baseurl="fancy/url" --env-var user="admin" --env-var password="admin123"
I hate that I have to write --env-var for each of the variables I want to set.
Is there a nicer way to do it?
I want to avoid passing the environment variables to newman as JSON.

If (as you said) using environment/global JSON files is not an option, I would suggest using a data file. You can add the credentials in a CSV file as username,password pairs. E.g.
user,password
"admin","admin123"
You only have to add -d "datafile.csv" to your newman command.
You can access the values in your test code via data.user and data.password. If you need them in the request body, URL, or headers, just use {{user}} and {{password}} as usual.
Another benefit: with data files, the collection is executed once per row (a.k.a. iterations).
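To make the iteration behaviour concrete, here is a minimal sketch in plain Node of how a data file's rows become per-iteration values (this is an illustration only, not newman's actual CSV parser):

```javascript
// Illustration of how a CSV data file maps to iterations (hypothetical
// parser; newman has its own CSV handling with proper quoting support).
function parseDataFile(csvText) {
  const [headerLine, ...rows] = csvText.trim().split('\n');
  const headers = headerLine.split(',');
  return rows.map((row) => {
    // Naive split: assumes no commas inside values.
    const values = row.split(',').map((v) => v.replace(/^"|"$/g, ''));
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}

const iterations = parseDataFile('user,password\n"admin","admin123"\n"guest","guest456"');
console.log(iterations.length);  // one iteration per data row
console.log(iterations[0].user); // usable as {{user}} in the request
```

newman then runs the collection once per row, substituting each column into the matching {{variable}}.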

Related

Postman - Update CSV Separator in Runner

Is there a way to change the separator used to read a CSV file in the Postman Runner? By default it is ",", but I would like to change it to ";".
Maybe there is a Postman JavaScript reference that I could use in the pre-request script of my endpoint?
In short, no you can't.
It wouldn't work in a pre-request script because the file data has already been read and parsed by that point.
You can access the file data using pm.iterationData.get(), but the Runner uses the comma separator when parsing the file to establish the variable names and the iteration values.
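One practical workaround is to convert the file before importing it into the Runner. The helper below is a hypothetical sketch that naively swaps separators (it assumes no quoting and no separator characters inside values; real CSVs may need a proper parser):

```javascript
// Convert a semicolon-separated file to the comma-separated format the
// Postman Runner expects. Naive: values must not contain ';' or ','.
function convertSeparator(text, from = ';', to = ',') {
  return text
    .split('\n')
    .map((line) => line.split(from).join(to))
    .join('\n');
}

console.log(convertSeparator('user;password\nadmin;admin123'));
// user,password
// admin,admin123
```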

How to modify updateSql script at liquibase?

I am trying to modify the updateSql script to let us change column data types, but I have not found a way. How can I modify this script?
updateSQL is a helper command that allows you to inspect the SQL Liquibase will run when you use the update command. This helps you correct any issues before actually running the command; see the Liquibase documentation for details.
If, after running updateSQL, you find the output incorrect, you can directly make changes to your original changelog (that is the actual use of updateSQL).
If you have already run liquibase update and now you want to modify column data type (of already created table) as you mentioned in your question, you may want to check out modifyDataType support of liquibase.
Below is an example:
<changeSet author="liquibase-docs" id="modifyDataType-example">
    <modifyDataType catalogName="cat"
        columnName="id"
        newDataType="int"
        schemaName="public"
        tableName="person"/>
</changeSet>
Or you can do it using SQL tag:
<changeSet author="liquibase-sql" id="example-sql" context="migrate">
    <sql dbms="mysql">
        ALTER TABLE tablename MODIFY COLUMN password NVARCHAR(64)
    </sql>
</changeSet>
I hope I got your question right; it is unclear to me, and this is all I could summarize from my understanding of it.
You cannot modify the updateSQL output itself. The main idea when using Liquibase is to prepare your changelogs in such a way that the output SQL is exactly what you need.

How to pass command line arguments into the implementation js file in gauge project?

I am using gauge-js for running puppeteer scripts and I am trying to pass in custom argument from command lines.
While running gauge run specs to execute the test cases, I want to pass a custom argument such as gauge run specs --username=test and read that value inside my implementation files.
You cannot pass custom arguments to Gauge. However, you can use environment variables to pass any additional information you need in your implementation files.
For example, you can run gauge as
(on mac/*nix)
username=test gauge run spec
or
(on windows)
set username=test
gauge run specs
and use the environment variable in your implementation file using process.env.username.
You can additionally set the variable in the .properties files in the env folder. These get picked up as environment variables as well.
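For example, an implementation file could pick the value up like this (the fallback value and the commented-out step are illustrative assumptions, not part of the gauge-js API):

```javascript
// Read the value passed via the environment (or via a .properties file in env/).
const username = process.env.username || 'default-user'; // fallback is an assumption

// In a real gauge-js implementation you would then use it inside a step, e.g.:
// step("Login as the configured user", async () => {
//   await page.type('#username', username);
// });

console.log(`Running as ${username}`);
```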

How to test if a URL is a phishing in command line using Google Safe Browsing?

In Google Safe Browsing, there are two ways to test if a URL is a phishing URL:
lookup-based and
hash-based.
In this question, I focus on the hash-based solution, better for privacy, as used by browsers such as Firefox.
For this, the browser downloads a hash database goog-phish-shavar which is saved as ~/.cache/mozilla/firefox/<profile_folder>/safebrowsing/goog-phish-shavar.sbstore.
Now, I want to test a URL in command line as follows
test-safebrowsing-url goog-phish-shavar.sbstore http://example-phishing.com
How to do this?
The files that you are looking at are Firefox-specific and so you'll need something like sbdbdump to extract the hash prefixes from it:
cd ~/.cache/mozilla/firefox/<profile_folder>/safebrowsing/
~/sbdbdump/dump.py -v --name goog-phish-shavar . > ~/goog-phish-shavar.hashes
and then you'll have to convert a URL to its possible hashes following the hashing rules. regexp-lookup.py can help with that.
Finally, you'll have to check all of the URL hashes against the list of prefixes. If you find any matches, you need to make a request for the full hashes that start with that prefix.
For Google Safe Browsing v3, there is https://github.com/Stefan-Code/gglsbl3.
For Google Safe Browsing v4, there is https://github.com/afilipovich/gglsbl.
They both support command line usage of hash-based analysis.

How to implement my product resource into a Pods structure?

Reading http://www.ember-cli.com/#pod-structure
Let's say I have a product resource, which currently has the following directory structure:
app/controllers/products/base.js
app/controllers/products/edit.js
app/controllers/products/new.js
app/controllers/products/index.js
With pods, is all the logic in these files put into a single file app/products/controller.js?
At the same time, my routes and templates for these resources currently look like:
app/routes/products/base.js
app/routes/products/edit.js
app/routes/products/new.js
app/routes/products/index.js
app/templates/products/-form.hbs
app/templates/products/edit.hbs
app/templates/products/index.hbs
app/templates/products/new.hbs
app/templates/products/show.hbs
How should this be converted to Pods?
You can use ember generate --pod --dry-run to help with that:
$ ember g -p -d route products/base
version: 0.1.6
The option '--dryRun' is not supported by the generate command. Run `ember generate --help` for a list of supported options.
installing
You specified the dry-run flag, so no changes will be written.
create app/products/base/route.js
create app/products/base/template.hbs
installing
You specified the dry-run flag, so no changes will be written.
create tests/unit/products/base/route-test.js
$
(I don't know why it complains yet honours the option; it might be a bug.)
So you'd end up with a structure like:
app/products/base/route.js
app/products/edit/route.js
etc.