Hyperledger chaincode does not get current user metadata

I'm currently working with Hyperledger chaincode and trying to get any information at all about the user who invokes or queries the chaincode. For some reason the chaincode example asset_management.go fails with the error "ERRO 031 Got error: Invalid admin certificate. Empty." I have security.enabled and security.privacy set to true and Membership Services running, and I've enrolled "admin".
Here are the lines in the code where it happens:
// Set the admin
// The metadata will contain the certificate of the administrator
adminCert, err := stub.GetCallerMetadata()
if err != nil {
    myLogger.Debug("Failed getting metadata")
    return nil, errors.New("Failed getting metadata.")
}
if len(adminCert) == 0 {
    myLogger.Debug("Invalid admin certificate. Empty.")
    return nil, errors.New("Invalid admin certificate. Empty.")
}
Do you have any ideas how to make stub.GetCallerMetadata() return any data?

"Metadata" should be provided in your deploy command, an example of "deploy" for asset_management_with_roles:
curl -XPOST -d '{"jsonrpc": "2.0", "method": "deploy", "params": {"type": 1,"chaincodeID": {"path": "github.com/hyperledger/fabric/examples/chaincode/go/asset_management_with_roles","language": "GOLANG"}, "ctorMsg": { "args": ["init"] }, "metadata":[97, 115, 115, 105, 103, 110, 101, 114] ,"secureContext": "assigner"} ,"id": 0}' http://localhost:7050/chaincode
In this command "metadata" contains the UTF-8 encoded string "assigner". This string will be saved in the ledger, and only a user with that role will be able to execute the "assign" function in the smart contract.
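If you want to double-check the byte values, here is a quick sketch in Python (plain UTF-8 encoding, matching the curl example above):

# Encode the role name the same way the deploy request's metadata field does
role = "assigner"
print(list(role.encode("utf-8")))  # [97, 115, 115, 105, 103, 110, 101, 114]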
"asset_management" example expects that you will provide certificate in metadata field. In order to obtain certificate you can use step 9 described in related question: How is running the asset_management.go different from running a simple chaincode like chaincode_example02.go

Creating Connection for RedshiftDataOperator

I went to the Airflow documentation for AWS Redshift; there are two operators that can execute a SQL query, RedshiftSQLOperator and RedshiftDataOperator. I have already implemented my job using RedshiftSQLOperator, but I want to do it using RedshiftDataOperator instead, because I don't want to use a Postgres connection in RedshiftSQLOperator but the AWS API.
RedshiftDataOperator Documentation
I have read this documentation and there is an aws_conn_id parameter. But when I try to use the same connection ID, there is an error:
[2023-01-11, 04:55:56 UTC] {base.py:68} INFO - Using connection ID 'redshift_default' for task execution.
[2023-01-11, 04:55:56 UTC] {base_aws.py:206} INFO - Credentials retrieved from login
[2023-01-11, 04:55:56 UTC] {taskinstance.py:1889} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/operators/redshift_data.py", line 146, in execute
self.statement_id = self.execute_query()
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/operators/redshift_data.py", line 124, in execute_query
resp = self.hook.conn.execute_statement(**filter_values)
File "/home/airflow/.local/lib/python3.7/site-packages/botocore/client.py", line 415, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/botocore/client.py", line 745, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (UnrecognizedClientException) when calling the ExecuteStatement operation: The security token included in the request is invalid.
From the task definition:
redshift_data_task = RedshiftDataOperator(
    task_id='redshift_data_task',
    database='rds',
    region='ap-southeast-1',
    aws_conn_id='redshift_default',
    sql="""
    call some_procedure();
    """
)
What should I fill in for the Airflow connection? In the documentation there is no example of the values I should provide. Thanks
Have you tried using the Amazon Redshift connection? There is both an option for authenticating using your Redshift credentials:
Connection ID: redshift_default
Connection Type: Amazon Redshift
Host: <your-redshift-endpoint> (for example, redshift-cluster-1.123456789.us-west-1.redshift.amazonaws.com)
Schema: <your-redshift-database> (for example, dev, test, prod, etc.)
Login: <your-redshift-username> (for example, awsuser)
Password: <your-redshift-password>
Port: <your-redshift-port> (for example, 5439)
(source)
and an option for using an IAM role (there is an example in the first link).
Disclaimer: I work at Astronomer :)
EDIT: Tested the following with Airflow 2.5.0 and Amazon provider 6.2.0:
Added the IP of my Airflow instance to the VPC security group with "All traffic" access.
Airflow connection with the connection ID aws_default, connection type "Amazon Web Services", extra: { "aws_access_key_id": "<your-access-key-id>", "aws_secret_access_key": "<your-secret-access-key>", "region_name": "<your-region-name>" }. All other fields blank. I used a root key for my toy AWS account. If you use other credentials, you need to make sure the IAM role has access and the right permissions to the Redshift cluster (there is a list in the link above).
Operator code:
red = RedshiftDataOperator(
    task_id="red",
    database="dev",
    sql="SELECT * FROM dev.public.users LIMIT 5;",
    cluster_identifier="redshift-cluster-1",
    db_user="awsuser",
    aws_conn_id="aws_default"
)
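For context, a minimal DAG wrapping this operator might look like the following sketch (the dag_id and schedule are assumptions; the import path matches the traceback in the question):

import pendulum
from airflow import DAG
from airflow.providers.amazon.aws.operators.redshift_data import RedshiftDataOperator

with DAG(
    dag_id="redshift_data_example",  # hypothetical DAG id
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule=None,
    catchup=False,
) as dag:
    red = RedshiftDataOperator(
        task_id="red",
        database="dev",
        sql="SELECT * FROM dev.public.users LIMIT 5;",
        cluster_identifier="redshift-cluster-1",
        db_user="awsuser",
        aws_conn_id="aws_default",
    )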

AWS sagemaker endpoint received client (400) error

I've deployed a tensorflow multi-label classification model using a sagemaker endpoint as follows:
predictor = sagemaker_model.deploy(initial_instance_count=1, instance_type="ml.m5.2xlarge", endpoint_name='testing-2')
It gets deployed and works fine when I invoke it from the Sagemaker Jupyter instance:
sample = ['this movie was extremely good']
output = predictor.predict(sample)
output:
{'predictions': [[0.00370046496,
4.32942124e-06,
0.00080883503,
9.25126587e-05,
0.00023958087,
0.000130862]]}
However, I am unable to send a request to the deployed endpoint from other notebooks or SageMaker Studio. I'm unsure of the request format.
I've tried several variations of the input format and still failed. The error message is below:
Request:
{
    "body": {
        "text": "Testing model's prediction on this text"
    },
    "contentType": "application/json",
    "endpointName": "testing-2",
    "customURL": "",
    "customHeaders": [
        {
            "Key": "sm_endpoint_name",
            "Value": "testing-2"
        }
    ]
}
Error:
Error invoking endpoint: Received client error (400) from primary with message "{ "error": "Failed to process element:
0 key: text of 'instances' list. Error: INVALID_ARGUMENT: JSON object: does not have named input: text" }".
See https://us-west-2.console.aws.amazon.com/cloudwatch/home?region=us-west-2#logEventViewer:group=/aws/sagemaker/Endpoints/testing-2
in account 793433463428 for more information.
Is there any way to find out exactly what request format the model expects?
Earlier I had the same model on my local system and the way I tested it was using this curl request:
curl -s -H 'Content-Type: application/json' -d '{"text": "what ugly posts"}' http://localhost:7070/sentiment
And it worked fine without any issues.
I've tried different formats and replaced the "text" key inside the body with other words like "input", "body", nothing, etc.
Based on your description above, I assume you are deploying the TensorFlow model using the SageMaker TensorFlow container.
If you want to view what your model expects as input, you can use the saved_model CLI. Assuming your model artifacts live under a version directory (here 1):
1
├── keras_metadata.pb
├── saved_model.pb
└── variables
├── variables.data-00000-of-00001
└── variables.index
!saved_model_cli show --all --dir {"1"}
After you have confirmed the input name above you can invoke the endpoint as follows:
import json
import boto3

client = boto3.client('runtime.sagemaker')
data = {"instances": ['this movie was extremely good']}
response = client.invoke_endpoint(EndpointName=<EndpointName>,
                                  ContentType='application/json',  # tell the container the payload is JSON
                                  Body=json.dumps(data))
response_body = response['Body']
print(response_body.read())
The same payload can then also be used in Studio when invoking the endpoint.
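If you prefer the SageMaker Python SDK over raw boto3, a sketch along these lines should also work from another notebook (assuming SDK v2; the endpoint name is the one from the question):

from sagemaker.predictor import Predictor
from sagemaker.serializers import JSONSerializer
from sagemaker.deserializers import JSONDeserializer

# Attach to the already-deployed endpoint by name
predictor = Predictor(
    endpoint_name="testing-2",
    serializer=JSONSerializer(),
    deserializer=JSONDeserializer(),
)
print(predictor.predict({"instances": ["this movie was extremely good"]}))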

Decrypting SAML2 response using pysaml2 Python module

I am integrating my app with Okta to have single sign-on. Okta will be passing some user information in the SAML response, which I need to use in my application.
Hence, we decided to encrypt the SAML response (XML) at the IdP using my server's (Apache) public key.
Now I am trying to decrypt the SAML2 response so that I can get the attributes.
My application uses
Python 3.5
Django 1.11
pysaml2 python module
I am using the library below to validate/parse the SAML2 response coming from Okta:
https://github.com/fangli/django-saml2-auth
If the SAML response is not encrypted, I am able to process the response and get the user identity and user attributes from it.
However, once it is encrypted at the Okta end with my server's public key, I am not able to decrypt it with my private key.
The SAML settings I have in my application are below:
saml_settings = {
    'metadata': {
        "local": [metadat_xml],
    },
    'service': {
        'sp': {
            'endpoints': {
                'assertion_consumer_service': [
                    (acs_url, BINDING_HTTP_REDIRECT),
                    (acs_url, BINDING_HTTP_POST),
                    (https_acs_url, BINDING_HTTP_REDIRECT),
                    (https_acs_url, BINDING_HTTP_POST)
                ],
            },
            'allow_unsolicited': True,
            'authn_requests_signed': False,
            'logout_requests_signed': True,
            'want_assertions_signed': True,
            'want_response_signed': False,
        },
    },
    'key_file': "mykey.key",    # private part
    'cert_file': "mykey.crt",   # public part
    'xmlsec_binary': '/usr/bin/xmlsec1',
    'encryption_keypairs': [{
        'key_file': 'mykey.key',
        'cert_file': 'mykey.crt',
    }]
}
if 'ENTITY_ID' in settings.SAML2_AUTH:
    saml_settings['entityid'] = settings.SAML2_AUTH['ENTITY_ID']
    # print('entity id ', settings.SAML2_AUTH['ENTITY_ID'])
if 'NAME_ID_FORMAT' in settings.SAML2_AUTH:
    saml_settings['service']['sp']['name_id_format'] = settings.SAML2_AUTH['NAME_ID_FORMAT']
    # NOTE: 'NAME_ID_FORMAT' is set to None above
spConfig = Saml2Config()
spConfig.load(saml_settings)
spConfig.allow_unknown_attributes = True
saml_client = Saml2Client(config=spConfig)
return saml_client
Then I have
saml_client = _get_saml_client(get_current_domain(r))
resp = r.POST.get('SAMLResponse', None)
authn_response = saml_client.parse_authn_request_response(resp, entity.BINDING_HTTP_POST)
This authn_response object does not return anything when the message is encrypted.
In the logs I see the error below:
GbHvkJJM0WIsPYFGtiQ/0n+ux0tV/z/OKpT1AqEE74iRVHEHD7omP41iY/c4=
</ns3:CipherValue></ns3:CipherData><ns3:ReferenceList><ns3:DataReference
URI="#_648cdbd139564492f0bdfe4fbbda92f6" /></ns3:ReferenceList>
</ns3:EncryptedKey></ns1:EncryptedAssertion></ns0:Response>
2018-04-30 18:21:09,232 [DEBUG] sigver saml2.sigver decrypt(): Decrypt input
len: 15187
2018-04-30 18:21:09,233 [DEBUG] sigver saml2.sigver _run_xmlsec(): xmlsec
command: /usr/bin/xmlsec1 --decrypt --privkey-pem
/private.pem --id-attr:ID EncryptedKey --output /tmp/tmp7rt7g95u.xml
/tmp/tmpkhxwo8s4
2018-04-30 18:21:09,247 [DEBUG] sigver saml2.sigver _run_xmlsec(): xmlsec
p_out:
2018-04-30 18:21:09,247 [DEBUG] sigver saml2.sigver _run_xmlsec(): xmlsec
p_erryy:
func=xmlSecXPathDataExecute:file=xpath.c:line=273:obj=unknown:
subj=xmlXPtrEval:error=5:libxml2 library function
failed:expr=xpointer(id('_841612fffac65343e73f8913eeecfb30'))
func=xmlSecXPathDataListExecute:file=xpath.c:line=373:obj=unknown:
subj=xmlSecXPathDataExecute:error=1:xmlsec library function failed:
func=xmlSecTransformXPathExecute:file=xpath.c:line=483:
obj=xpointer:subj=xmlSecXPathDataExecute:error=1:xmlsec library function
failed:
func=xmlSecTransformDefaultPushXml:file=transforms.c:
line=2411:obj=xpointer:subj=xmlSecTransformExecute:error=1:xmlsec library
function failed:
func=xmlSecTransformCtxExecute:file=transforms.c:line=1302:
obj=unknown:subj=xmlSecTransformCtxXmlExecute:error=1:xmlsec library
function failed:
func=xmlSecKeyDataRetrievalMethodXmlRead:file=keyinfo.c:line=1178:
obj=retrieval-method:subj=xmlSecTransformCtxExecute:error=1:xmlsec library
function failed:
func=xmlSecKeyInfoNodeRead:file=keyinfo.c:line=114:obj=retrieval-method:
subj=xmlSecKeyDataXmlRead:error=1:xmlsec library function
failed:node=RetrievalMethod
func=xmlSecKeysMngrGetKey:file=keys.c:line=1349:obj=unknown:
subj=xmlSecKeyInfoNodeRead:error=1:xmlsec library function
failed:node=KeyInfo
func=xmlSecEncCtxEncDataNodeRead:file=xmlenc.c:line=957:
obj=unknown:subj=unknown:error=45:key is not found:
func=xmlSecEncCtxDecryptToBuffer:file=xmlenc.c:line=715:
obj=unknown:subj=xmlSecEncCtxEncDataNodeRead:error=1:xmlsec library function
failed:
func=xmlSecEncCtxDecrypt:file=xmlenc.c:line=623:
obj=unknown:subj=xmlSecEncCtxDecryptToBuffer:error=1:xmlsec library function
failed:
Error: failed to decrypt file
Error: failed to decrypt file "/tmp/tmpkhxwo8s4"
I am not sure why the xmlsec1 command is failing and what I am missing here.
I have tried decrypting with my private key (self-signed) at
https://www.samltool.com/decrypt.php
and it works.
Could you please help me here and let me know what I am not doing correctly?
You need to add
saml_settings['id_attr_name'] = 'Id'
The default id attr is ID, but Okta uses Id. See the xmlsec FAQ for more details.
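A minimal sketch of where this line fits in the configuration code from the question (everything else stays exactly as defined above; saml_settings, Saml2Config, and Saml2Client are from the question's code):

# ... build saml_settings exactly as in the question ...
saml_settings['id_attr_name'] = 'Id'  # Okta marks assertions with 'Id', not the default 'ID'

spConfig = Saml2Config()
spConfig.load(saml_settings)
spConfig.allow_unknown_attributes = True
saml_client = Saml2Client(config=spConfig)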

Could not connect to the endpoint URL in Cost Explorer

I'm trying to use Cost Explorer to estimate charges, filtered by tag name.
time_period = {'Start': '2017-12-18', 'End': '2017-12-19'}
filters = {
    "And": [{
        "Tags": {
            "Key": "TagName",
            "Values": ["Test1"]
        }
    }]
}
print aws.get_cost_and_usage(TimePeriod=time_period, Granularity='DAILY', Metrics=['BlendedCost'], Filter=filters)
When I request the cost of any of my machines (Ireland), it shows an error that it is not possible to connect to ce.eu-west-1.amazonaws.com:
Traceback (most recent call last):
File "test.py", line 22, in <module>
print aws.service.cloudwatch.client.get_cost_and_usage(TimePeriod=time_period, Granularity='DAILY', Metrics=['BlendedCost'], Filter=filters)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
raise caught_exception
botocore.exceptions.EndpointConnectionError: Could not connect to the endpoint URL: "https://ce.eu-west-1.amazonaws.com/"
Maybe this service is not available in Ireland yet?
I cannot find "Cost explorer" / "Billing" / "Cost management" here:
http://docs.aws.amazon.com/general/latest/gr/rande.html#awssupport_region
I'm using:
boto3==1.5.2
botocore==1.8.16
The Cost Explorer service is deployed in us-east-1.
All of your queries must be directed to that region, i.e.:
client = boto3.client('ce', region_name='us-east-1')
client.get_cost_and_usage(....)
The response will include all your regions.
Notice that the AWS UI also mentions 'Global' when you navigate to the billing console.
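Putting the region fix together with the query from the question, a minimal sketch (the single-element "And" wrapper is dropped on the assumption that a lone tag filter can be passed directly; print is written as a function for Python 3):

import boto3

# Cost Explorer is a global service; its endpoint lives in us-east-1
client = boto3.client('ce', region_name='us-east-1')

response = client.get_cost_and_usage(
    TimePeriod={'Start': '2017-12-18', 'End': '2017-12-19'},
    Granularity='DAILY',
    Metrics=['BlendedCost'],
    Filter={'Tags': {'Key': 'TagName', 'Values': ['Test1']}},
)
print(response)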

"ValueError: Invalid control character" using Python OAuth client

I am trying to load data from a google spreadsheet into a postgres database.
The problem is when I am trying to authenticate my credentials, I get the following error:
File "/usr/local/lib/python2.7/dist-packages/oauth2client/_openssl_crypt.py", line 117, in from_string
pkey = crypto.load_privatekey(crypto.FILETYPE_PEM, parsed_pem_key)
OpenSSL.crypto.Error: [('PEM routines', 'PEM_read_bio', 'no start line')]
I have followed all steps in Using OAuth2 for Authorization, enabled the API and created a service account from which I got a .json file containing the necessary keys and authentication elements.
The way I am trying to authenticate is the following:
json_key = json.load(open('gdoc.json'), strict=False)
creds = SignedJwtAssertionCredentials(json_key['client_email'], json_key['private_key'], scope)
login = gspread.authorize(creds)
Perhaps the issue comes from the strict=False parameter in the json.load call; the problem is that if I remove it, I get the error:
File "/usr/lib/python2.7/json/decoder.py", line 382, in raw_decode
obj, end = self.scan_once(s, idx)
ValueError: Invalid control character at: line 5 column 38 (char 174)
I saw a couple of other forums and they suggest using the strict=False parameter, since there are unescaped \n characters inside the JSON key file.
This is a copy of the .json key file:
{
    "type": "service_account",
    "project_id": "geometric-shine-118101",
    "private_key_id": "xxx",
    "private_key": "-----BEGIN PRIVATE KEY-----\nxxx\n-----END PRIVATE KEY-----\n",
    "client_email": "dataload@geometric-shine-118101.iam.gserviceaccount.com",
    "client_id": "117076930343404252458",
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
    "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/dataload%40geometric-shine-118101.iam.gserviceaccount.com"
}
"private_key": "-----BEGIN PRIVATE KEY-----\nxxx\n-----END PRIVATE KEY-----\n",
"client_email": "dataload#geometric-shine-118101.iam.gserviceaccount.com",
Copy and paste the actual values into single quotes (.e.g. ''). It worked for me.
'dataload#geometric-shine-118101.iam.gserviceaccount.com'
'-----BEGIN PRIVATE KEY-----\nxxx\n-----END PRIVATE KEY-----\n'
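A minimal sketch of the question's authentication code with the values pasted in directly (the scope list is an assumption; replace the xxx key material with the real contents of your key file):

import gspread
from oauth2client.client import SignedJwtAssertionCredentials

# Values pasted as string literals instead of json.load()-ing the key file
client_email = 'dataload@geometric-shine-118101.iam.gserviceaccount.com'
private_key = '-----BEGIN PRIVATE KEY-----\nxxx\n-----END PRIVATE KEY-----\n'
scope = ['https://spreadsheets.google.com/feeds']  # assumed scope

creds = SignedJwtAssertionCredentials(client_email, private_key, scope)
login = gspread.authorize(creds)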