What is the ParaState test network chain ID?

I want to add the ParaState testnet to my MetaMask. I input this:
Network name: Parastate testnet
New RPC URL: https://rpc.parastate.io:8545/
But it also requires a "Chain ID". Where can I find it, or which one is correct?

Network Name: ParaState
New RPC URL: https://rpc.parastate.io:8545
Chain ID: 123
Currency Symbol (Optional): STATE

RPC URL: https://rpc.parastate.io:8545
Chain ID: 123
Currency Symbol: STATE
try it
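If you want to double-check the value before adding the network, the chain ID can be read straight from the RPC endpoint with the standard eth_chainId JSON-RPC call. A minimal Node.js sketch (Node 18+ for the built-in fetch; the URL is the one from the question):

// Ask the RPC endpoint for its chain ID; the result comes back as hex,
// so convert it to decimal, which is the form MetaMask expects.
const rpcUrl = "https://rpc.parastate.io:8545";

fetch(rpcUrl, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "eth_chainId", params: [] }),
})
  .then((res) => res.json())
  .then(({ result }) => console.log("Chain ID:", parseInt(result, 16)));

Whatever number this prints is the chain ID MetaMask will accept; if it doesn't match 123, use the printed value instead.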


DynamoDB stream to elasticsearch for multi-api

Guys, I need your help to figure out the best practice for indexing data into Elasticsearch in a multi-API scenario.
I have, for example:
1. users API
2. products API
Since any of the APIs can trigger the stream, my question is: how can I know which API the request came from and which HTTP method was used? For example, if I only need to update the username and email rather than the complete user item, the stream knows the event is an update and knows the PK/SK, but it doesn't know the HTTP method or which fields to update.
Example:
New user:
{
  index: db,
  id: user#19201,
  name: username,
  email: user@email.com,
  age: 18,
  country: US,
  ...
}
Update:
{
  index: db,
  id: user#19201,
  name: username,
  email: user@email.com
}
The stream will overwrite the complete record for the update.
On the other hand, the products API can only be indexed into the same index as the users API ("db") because of the same issue: the stream does not know which endpoint the event came from, so it cannot choose an index. That's why a fixed index works, but multiple indices won't.
Thanks for any helpful comments.
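For illustration, a minimal Node.js Lambda sketch of the situation described above: a DynamoDB stream record carries the event name and the item images, but not the originating API or HTTP method, so the handler has to derive both the target index and the changed fields itself. The PK-prefix routing, the ES_ENDPOINT variable, and the index names are assumptions for the sketch, and it presumes the stream is configured with NEW_AND_OLD_IMAGES.

// Hypothetical stream handler: routes records to an index by key prefix and
// computes a partial update by diffing the old and new images.
const { unmarshall } = require("@aws-sdk/util-dynamodb");
const { Client } = require("@elastic/elasticsearch");

const esClient = new Client({ node: process.env.ES_ENDPOINT }); // assumed env var

exports.handler = async (event) => {
  for (const record of event.Records) {
    const keys = unmarshall(record.dynamodb.Keys);
    // Assumption: the PK prefix ("user#...", "product#...") identifies the entity,
    // so the handler can pick the index without knowing which API wrote the item.
    const index = keys.id.startsWith("user#") ? "users" : "products";

    if (record.eventName === "MODIFY") {
      const oldItem = unmarshall(record.dynamodb.OldImage);
      const newItem = unmarshall(record.dynamodb.NewImage);
      // The stream never carries the HTTP method, so recover "which fields changed"
      // by comparing the two images.
      const changed = {};
      for (const [field, value] of Object.entries(newItem)) {
        if (JSON.stringify(oldItem[field]) !== JSON.stringify(value)) changed[field] = value;
      }
      // Partial update (Elasticsearch JS client v8 style) instead of overwriting the document.
      await esClient.update({ index, id: keys.id, doc: changed });
    }
  }
};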

How to trigger Email notification to Group of users from BigQuery on threshold?

I have written the below query in GCP BigQuery, where I am using the ERROR function to raise an error message when the quantity column exceeds the threshold of 1000.
SELECT ERROR(CONCAT("Over threshold: ", CAST(quantity AS STRING)))
FROM `proj.dataset.table`
WHERE quantity > 1000
I get the email notification when I schedule this query in BigQuery, but I want to send that notification to a group of users through BigQuery.
How to achieve this?
You could achieve this and a lot more with the Cloud Workflows serverless product and an external email provider such as SendGrid, Mailchimp, or Mailgun that offers a REST API.
You basically set up a Workflow that handles the steps for you:
run the BigQuery query
on error, trigger an email step
you could even combine them: if the results returned match some condition, execute another step
The main workflow would be like this:
#workflow entrypoint
main:
  steps:
    - getList:
        try:
          call: BQ_Query
          args:
            query: SELECT ERROR('demo') from (select 1) where 1>0
          result: result
        except:
          as: e
          steps:
            - sendEmail:
                call: sendGridSend
                args:
                  secret: sendgrid_email_dev_apikey
                  from: from@domain.com
                  to:
                    - email: email1@domain.com
                    - email: email2@domain.com
                  subject: "This is a test"
                  content: ${"Error message from BigQuery" + e.body.error.message}
                  contentType: "text/plain"
                result: callResult
            - final:
                return: ${callResult}
sendgrid_email_dev_apikey is the secret label; I've used Secret Manager to store SendGrid's API key. If you want to use Mailchimp, there are examples in this GitHub repo.
The Workflow invoker could be a Cloud Scheduler entry, so instead of launching the scheduled queries from the BigQuery interface, you set them up in a scheduled Workflow. You must give the invoker service account permission to read secrets and to run BigQuery jobs.
The rest of the Workflow is here:
BQ_Query:
  params: [query]
  steps:
    - runBQquery:
        call: googleapis.bigquery.v2.jobs.query
        args:
          projectId: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          body:
            useLegacySql: false
            query: ${query}
        result: queryResult
    - documentFound:
        return: ${queryResult}

sendGridSend:
  params: [secret, from, to, subject, content, contentType]
  steps:
    - getSecret:
        call: http.get
        args:
          url: ${"https://secretmanager.googleapis.com/v1/projects/" + sys.get_env("GOOGLE_CLOUD_PROJECT_NUMBER") + "/secrets/" + secret + "/versions/latest:access"}
          auth:
            type: OAuth2
        result: sendGridKey
    - decodeSecrets:
        assign:
          - decodedKey: ${text.decode(base64.decode(sendGridKey.body.payload.data))}
    - sendMessage:
        call: http.post
        args:
          url: https://api.sendgrid.com/v3/mail/send
          headers:
            Content-Type: "application/json"
            Authorization: ${"Bearer " + decodedKey}
          body:
            personalizations:
              - to: ${to}
            from:
              email: ${from}
            subject: ${subject}
            content:
              - type: ${contentType}
                value: ${content}
        result: sendGridResult
    - returnValue:
        return: ${sendGridResult}
Since you receive a mail notification, I guess you are using the BigQuery Data Transfer service.
According to this paragraph, only the person who set up the transfer will receive the mail notification. However, if you're using Gmail you can automatically forward these messages to a list of users.
This link should guide you through it.

Connect to Kubernetes apiserver created on Google Container Engine (node.js)

I've successfully connected to Container Engine through the googleapis Node.js client and got the cluster object back (according to the doc here), and saved the masterAuth object to a JSON file on disk. However, I still cannot figure out how to make an authenticated request to the API server:
var request = require("request");
var key = require("path/to/key/json");
var options = {
  url: "https://IPofKubernetesCluster/api/v1/endpoints",
  cert: key.clientCertificate,
  ca: key.clusterCaCertificate,
  key: key.clientKey,
  passphrase: null
};
request.get(options, function(e, r, body) {});
The code failed with the following error:
crypto.js:131
c.context.setKey(options.key);
^
Error: error:0906D06C:PEM routines:PEM_read_bio:no start line
at Object.exports.createCredentials (crypto.js:131:17)
at Object.exports.connect (tls.js:1345:27)
at Agent.createConnection (https.js:79:14)
at Agent.createSocket (http.js:1294:16)
at Agent.addRequest (http.js:1270:23)
at new ClientRequest (http.js:1417:16)
at Object.exports.request (https.js:123:10)
at Request.start (node_modules/request/request.js:793:30)
at Request.end (node_modules/request/request.js:1400:10)
at end (node_modules/request/request.js:564:14)
Any help would be much appreciated.
The MasterAuth structure includes base64-encoded client and cluster certificates. You will need to decode them back into the PEM format before passing the string into your http client library.
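A minimal sketch of that decoding step, reusing the key file and field names from the question (the JSON is assumed to hold the raw base64 strings copied from MasterAuth):

var request = require("request");
var key = require("path/to/key/json");

// MasterAuth values are base64-encoded PEM, so decode them before handing them
// to the http client.
var options = {
  url: "https://IPofKubernetesCluster/api/v1/endpoints",
  cert: Buffer.from(key.clientCertificate, "base64").toString("utf8"),
  ca: Buffer.from(key.clusterCaCertificate, "base64").toString("utf8"),
  key: Buffer.from(key.clientKey, "base64").toString("utf8")
};

request.get(options, function(e, r, body) {
  // body should now be the endpoints list (or an authorization error from the API server)
});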

What is the correct set up for Google Analytics Multiple domain tracking?

I'm trying to set up multiple domain tracking with GA, with the end result being that traffic to siteB is attributed to the initial referrer to siteA.
Site A's code:
_gaq.push(["_setAccount", "UA-XXX-X"]);
_gaq.push(["_setDomainName", "sitea.com"]);
_gaq.push(["_setAllowLinker", true]);
_gaq.push(["_trackPageview"]);
With a link:
<a href="http://www.siteb.com" onclick="_gaq.push(['_link', this.href]); return false;">Link to site b</a>
Site B's code:
_gaq.push(['_setAccount', 'UA-XXX-X']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_addIgnoredRef', 'sitea.com']);
_gaq.push(['_trackPageview']);
Expected behavior: we want referrals tracked on siteb from the link on sitea to show the original referrer to sitea:
referrer to sitea -> link -> siteb
in analytics as:
referrer to sitea -> siteb
Actual behavior: we're seeing traffic on siteb with sitea as the referrer. Additionally, we're seeing the _utma cookie values appended to the siteb URL when you click the link on sitea, but the _utma cookies being created on siteb contain different values.
I've read various articles about setting _setDomainName to 'none' or removing it altogether, but nothing seems to work as expected.
Thanks for any help
UPDATE: Additional info via the ga debugger:
Coming into sitea from external:
_gaq.push processing "_setDomainName" for args: "[sitea.com]": dc_debug.js:24
_gaq.push processing "_setAllowLinker" for args: "[true]": dc_debug.js:24
_gaq.push processing "_trackPageview" for args: "[]":
Referring URL : http://www.external.com/
Hit ID : 1460027114
Visitor ID : 1908962602
Click through from sitea to siteb:
_gaq.push processing "_setAllowLinker" for args: "[true]": ga_debug.js:24
_gaq.push processing "_addIgnoredRef" for args: "[sitea.com]": ga_debug.js:24
_gaq.push processing "_trackPageview" for args: "[]":
Referring URL : http://www.sitea.com/
Hit ID : 73647255
Visitor ID : 1908962602
So the visitor ID is staying consistent, but the referring URL tracked on siteb is not the referrer from sitea.
Also unclear if it's relevant, but we're using the DoubleClick integration on sitea (dc.js) and Classic GA on siteb (ga.js).
Thx!
I believe the issue is with setting the cookie domain (again) in the tag; could you try removing it?
I suggest keeping it only together with all the other commands in the tracker definition block.
Also, add domain definition to SiteB:
_gaq.push(["_setDomainName", "siteb.com"]);
Ignoring the referral shouldn't be an issue, as you pass UTM parameters that will override that information anyway.
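Put together, site B's block with the suggested domain definition would look something like this (a sketch based on the advice above; substitute your real property ID and domain):

_gaq.push(['_setAccount', 'UA-XXX-X']);
_gaq.push(['_setDomainName', 'siteb.com']); // the suggested addition
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_addIgnoredRef', 'sitea.com']);
_gaq.push(['_trackPageview']);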
Check out how to set up Views (formerly Profiles) in GA correctly and according to documentation.
Hope this helps.

Regular expressions for long phrases in perl

I'm looking to extract the "Account Name" and "Source Network Address" from the following text using regular expressions in a Perl script. Writing a regular expression for such a long phrase seems to take a lot of effort.
I need your help finding the best regex for this; any ideas would help. Keep in mind that these are just 3 examples out of possibly 50 similar phrases (of different lengths).
Example phrase 1:
WinEvtLog: Security: AUDIT_SUCCESS(4624): Microsoft-Windows-Security-Auditing: admin: DOMAIN: hostname.domain.com: An account was successfully logged on. Subject: Security ID: S-1-0-0 Account Name: - Account Domain: - Logon ID: 0x0 Logon Type: 3 New Logon: Security ID: S-1-5-21-1130994204-1932287720-1813960501-1239 Account Name: admin Account Domain: DOMAIN Logon ID: 0x1d12cfff5 Logon GUID: {AF5E2CF5-1A54-2121-D281-13381F397F41} Process Information: Process ID: 0x0 Process Name: - Network Information: Workstation Name: Source Network Address: 101.101.101.101 Source Port: 52616 Detailed Authentication Information: Logon Process: Kerberos Authentication Package: Kerberos Transited Services: - Package Name (NTLM only): - Key Length: 0 This event is generated when a logon session is created. It is generated on the computer that was accessed.
Example phrase 2:
WinEvtLog: Security: AUDIT_SUCCESS(4634): Microsoft-Windows-Security-Auditing: admin: DOMAIN: hostname.domain.com: An account was logged off. Subject: Security ID: S-1-5-21-1130554204-1932287720-1813960501-4444 Account Name: admin Account Domain: DOMAIN Logon ID: 0x1d12d000a Logon Type: 3 This event is generated when a logon session is destroyed. It may be positively correlated with a logon event using the Logon ID value. Logon IDs are only unique between reboots on the same computer." 4646,1
Example phrase 3:
WinEvtLog: Security: AUDIT_SUCCESS(540): Security: Administrator: HOST88: HOST88: Successful Network Logon: User Name: Administrator Domain: HOST88 Logon ID: (0x14,0x6E6FB948) Logon Type: 3 Logon Process: NtLmSsp Authentication Package: NTLM Workstation Name: DESKHOST88 Logon GUID: - Caller User Name: - Caller Domain: - Caller Logon ID: - Caller Process ID: - Transited Services: - Source Network Address: 10.10.10.10 Source Port: 43221
The following regex will handle your posted cases:
if ( $string =~ /(?<=Account Name:)\s+([^-\s]+).+(?:Source Network Address:)\s+([\d.]+)\s+/ ) {
    $account_name = $1;
    $source_addr  = $2;
}
How rigorous do you want to be with your solution?
If you have log lines and want to extract the word that follows "Account Name:" and the address that follows "Source Network Address:" then you can do it with a very naive regex like this:
my ($account_name) = /Account Name:\s+(\S+)/;
my ($source_network_addr) = /Source Network Address:\s+(\S+)/;
That doesn't attempt to validate that anything else in the line is as you expect it to be, but if the application is only parsing lines that are generated by IIS or whatever, it may not need to be really precise.