I registered the Dataflow type provider with this command: gcloud deployment-manager type-providers create dataflow --descriptor-url='https://dataflow.googleapis.com/$discovery/rest?version=v1b3'
When I run this deployment config:
- name: "my-topic-to-avro"
type: 'project_id/dataflow:dataflow.projects.locations.templates.launch'
properties:
projectId: project_id
gcsPath: "gs://test-create-123"
jobName: "my-topic-to-avro"
location: "europe-west1"
parameters:
inputTopic: "projects/project_id/topics/test"
outputDirectory: "gs://test-create-123/"
avroTempDirectory: "gs://test-create-123/"
The output I get is this:
ERROR: (gcloud.beta.deployment-manager.deployments.update) Error in Operation [operation-1598980583c2a0ec69]: errors:
- code: RESOURCE_ERROR
location: /deployments/quick-deployment/resources/my-topic-to-avro
message: '{"ResourceType":"project_id/dataflow:dataflow.projects.locations.templates.launch","ResourceErrorCode":"401","ResourceErrorMessage":{"code":401,"message":"Request
is missing required authentication credential. Expected OAuth 2 access token,
login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.","status":"UNAUTHENTICATED","statusMessage":"Unauthorized","requestPath":"https://dataflow.googleapis.com/v1b3/projects/project_id/locations/europe-west1/templates:launch","httpMethod":"POST"}}'
I can get my token by running gcloud auth print-access-token, but I don't know where to insert that value, or the schema my YAML needs so that all the values required to create the Dataflow job are passed through.
Any help appreciated.
The "401 - Request is missing required authentication credential" error message that is triggered when doing a POST request to the Dataflow API is due to a missing credential. The following public reference explains in detail how to use OAuth 2.0 to access Google APIs. Please read it carefully and make sure to follow the steps as mentioned to avoid any errors.
Another thing to check is that the Dataflow API is enabled; also try setting "GOOGLE_APPLICATION_CREDENTIALS" to point to the JSON key of the service account you are using.
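For example, a minimal sketch (the project ID and key path are placeholders for your own values):

gcloud services enable dataflow.googleapis.com --project=PROJECT_ID
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json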
I found this Authentication documentation, which mentions that you can use the Authorization header to supply an access token from the project's service account. You can also try that method, but it is important to clarify that the example there is for a GKE cluster.
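As a quick manual check of the Authorization header approach, you can replay the same templates:launch request with curl, supplying a token from gcloud auth print-access-token (a sketch only; the project, bucket, and parameters are copied from your config and must match your real values):

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"jobName": "my-topic-to-avro", "parameters": {"inputTopic": "projects/project_id/topics/test", "outputDirectory": "gs://test-create-123/", "avroTempDirectory": "gs://test-create-123/"}}' \
  "https://dataflow.googleapis.com/v1b3/projects/project_id/locations/europe-west1/templates:launch?gcsPath=gs://test-create-123"

If this curl succeeds while the deployment fails, the problem is specific to the credentials Deployment Manager itself is using.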
I noticed this question was addressed in this post.
Related
I am using this link: https://cloud.google.com/composer/docs/composer-2/triggering-with-gcf in order to trigger a DAG using Airflow 2.
However, I am getting this error:
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://myairflow/api/v1/dags/lmbr-agentgroup-gcs-to-raw/dagRuns
There isn't any place to put any kind of authentication in the code provided in the linked Google material.
Has anyone faced this problem and knows how to fix it?
I was using the default user in the Cloud Function, which is the App Engine default service account. After that I found more information and used this command to list all the users who have permission:
gcloud composer environments run myAirflowEnv --location=southamerica-east1 users -- list
I used one of those users in the Cloud Function, but it didn't work either.
Same error: requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://myAirflow/api/v1/dags/mydag/dagRuns
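For reference, the approach in the linked guide amounts to attaching a Google-signed ID token to the Airflow REST API call. A minimal curl sketch of that idea, using the URL and DAG from the error above (using gcloud auth print-identity-token here is my assumption, standing in for the token the Cloud Function code fetches):

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-identity-token)" \
  -H "Content-Type: application/json" \
  -d '{"conf": {}}' \
  "https://myAirflow/api/v1/dags/mydag/dagRuns"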
I'm trying to deploy an Amplify project using a Bitbucket repo containing the frontend.
At the deployment stage, the Amplify CLI asks for an OAuthToken (which cannot be found anywhere in Bitbucket). The OAuth consumer has a key and a secret, and neither is accepted by the Amplify CLI.
The error I'm getting is:
There was an issue setting up your repository. Please try again later.
({"type": "error", "error": {"message": "Access token expired."}})
(Service: Amplify; Status Code: 400; Error Code: BadRequestException; Request ID: ceced480-e38d-4c44-a1db-ad2eb50e34a8; Proxy: null)
I might be doing something wrong here, but hours of googling didn't help me.
Please advise. Big thanks in advance!
The solution is to set up an OAuth consumer (key and secret) in your Bitbucket account, but there are some specific steps you have to follow.
Click on your account profile name -> Settings. Under OAuth consumers, click Add consumer.
Set the callback URL to http://localhost:3000. Note: this is never actually used, but I couldn't get it to work without setting something.
Check 'This is a private consumer'.
Permissions:
Repositories -> Admin
Webhooks -> Read and Write
Pull Requests -> Read
Once created, make a note of the key and secret, as they will be used in the next step.
Open a terminal and run (replacing KEY and SECRET):
curl -X POST -u "KEY:SECRET" https://bitbucket.org/site/oauth2/access_token -d grant_type=client_credentials
This will return an access_token which you should copy.
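The response is a small JSON document, roughly this shape (abbreviated; field names other than access_token are from memory and may differ slightly):

{"access_token": "...", "token_type": "bearer", "expires_in": 7200, "refresh_token": "..."}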
Now, this threw me: in AWS you can supply either an access token or an OAuth token. Since we are using OAuth, even though the returned token is called access_token, in AWS we need to put it in the OAuthToken field.
In CloudFormation, it looks as follows (the AWS CLI uses similarly named fields, so this should also work there).
Type: AWS::Amplify::App
Properties:
  Name: MyApp
  Repository: https://bitbucket.org/USERNAME/REPO_NAME
  OauthToken: ACCESS_TOKEN # token returned from the terminal command
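For the AWS CLI route, the equivalent call would look something like this (a sketch; the app name, repository URL, and token are placeholders):

aws amplify create-app --name MyApp --repository https://bitbucket.org/USERNAME/REPO_NAME --oauth-token ACCESS_TOKEN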
I am a novice developer who wants to learn how to use artificial intelligence.
I created a model, and it responds correctly according to the inputs.
Now I want to test the API with a call from Postman to verify that everything works, but I get an error in my call:
"message": "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
"status": "UNAUTHENTICATED"
I don't know how to authenticate myself to retrieve the access token. Could you help me find a solution please?
You have two solutions:
As John commented:
Install the gcloud SDK on your computer.
Authenticate yourself with the command gcloud auth login or gcloud init (proposed at the end of the installation).
Generate an access token with gcloud auth print-access-token and copy it.
Tip: you can skip the first two steps if you use Cloud Shell.
Add the access token to the header of your Postman request like this:
Key is "Authorization"
Value is "Bearer <ACCESS_TOKEN>"
The access token is valid for 1 hour; after that you have to generate a new one.
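If you'd rather test from a terminal than Postman, the same header works with curl against the online prediction endpoint (a sketch, assuming an AI Platform model; PROJECT_ID, MODEL_NAME, and the instances payload are placeholders for your own values):

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"instances": [{"input": 1.0}]}' \
  "https://ml.googleapis.com/v1/projects/PROJECT_ID/models/MODEL_NAME:predict"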
(Not recommended) Make your model public (i.e. unauthenticated) like this (command line only; it doesn't work in the GUI):
gcloud ai-platform models add-iam-policy-binding MODEL_NAME --member="allUsers" --role="roles/ml.modelOwner"
Replace MODEL_NAME with the name of your deployed model. However, this is only for test purposes: there is no filter at all, anyone can use the model, and you will pay for the processing it incurs.
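When you're done testing, you can revoke the public access with the mirror command (MODEL_NAME again a placeholder):

gcloud ai-platform models remove-iam-policy-binding MODEL_NAME --member="allUsers" --role="roles/ml.modelOwner"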
I am using https://dataproc.googleapis.com/v1/projects/{projectId}/regions/{region}/clusters to create GCP Dataproc clusters as described at https://cloud.google.com/dataproc/docs/reference/rest/v1/projects.regions.clusters/create.
I am using service account credentials that have been exported into a JSON keyfile. That service account (myserviceaccount@projectA.iam.gserviceaccount.com) exists in projectA, and I have been able to use it to successfully create Dataproc clusters in projectA.
I now need to use the same service account to create Dataproc clusters in projectB. I'm running exactly the same code with exactly the same credentials; the only difference is the project I'm creating the cluster in. I have granted myserviceaccount@projectA.iam.gserviceaccount.com exactly the same permissions in projectB as it has in projectA, but when I try to create the cluster it fails:
2019-03-22 10:58:47 INFO: _retrieve_discovery_doc():272: URL being requested: GET https://www.googleapis.com/discovery/v1/apis/dataproc/v1/rest
2019-03-22 10:58:54 INFO: method():873: URL being requested: GET https://dataproc.googleapis.com/v1/projects/dh-coop-no-test-35889/regions/europe-west1/clusters?alt=json
2019-03-22 10:58:54 INFO: new_request():157: Attempting refresh to obtain initial access_token
2019-03-22 10:58:54 DEBUG: make_signed_jwt():100: [b'blahblahblah', b'blahblahblah']
2019-03-22 10:58:54 INFO: _do_refresh_request():777: Refreshing access_token
2019-03-22 10:58:55 WARNING: _should_retry_response():121: Encountered 403 Forbidden with reason "forbidden"
So, that service account is forbidden from creating clusters in projectB, but I don't get any information about why. I am hoping there are some audit logs that explain more about why the request was forbidden but I've looked in https://console.cloud.google.com/logs/viewer?project=projectB and can't find any.
Can someone tell me where I can get more information to diagnose why this request is failing?
As mentioned in the comments, one way to get more information on the failed request is to set up gcloud to use the service account. Running gcloud commands with --log-http may also give additional information.
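A minimal sketch of that diagnostic path (the key file path and cluster name are placeholders; --log-http prints the full HTTP exchange, so the 403 response body, which usually names the missing permission, becomes visible):

gcloud auth activate-service-account myserviceaccount@projectA.iam.gserviceaccount.com --key-file=/path/to/keyfile.json
gcloud dataproc clusters create test-cluster --project=projectB --region=europe-west1 --log-http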
Re-pasting here for easier readability/visibility.
I'm trying to log in to a Cloud Foundry endpoint.
But when I connect with the Cloud Foundry CLI, I get the error message below:
C:\Users\abc>cf login -a https://xxx.predix-uaa.run.aws-usw02-pr.ice.predix.io
API endpoint: https://xxx.predix-uaa.run.aws-usw02-pr.ice.predix.io
Not logged in. Use 'cf login' to log in.
FAILED
Error performing request: Get /login: unsupported protocol scheme ""
Please help!
The issue is likely that you are not specifying your CF API endpoint URL. Please contact your platform operator to confirm what it should be.
We'll improve the error message, but what seems to be happening is that the cf CLI tries to retrieve a JSON configuration from [api-endpoint]/v2/info but does not get the response it expects.
It then builds a URL to the login endpoint from the "authorization_endpoint" that should be advertised in that JSON configuration. As that field is not in your response, it tries to access "/login" instead of e.g. "https://xxx.predix-uaa.run.aws-usw02-pr.ice.predix.io/login", causing the error.
CF API endpoint URLs generally start with "api."; in fact, I've never seen one starting differently.
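A quick way to verify is to fetch that configuration document yourself and check that "authorization_endpoint" is present (a sketch; the "api." hostname is a guess at what your operator will give you):

curl https://api.xxx.run.aws-usw02-pr.ice.predix.io/v2/info
cf login -a https://api.xxx.run.aws-usw02-pr.ice.predix.io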