Following the quickstart for GCP Dataflow here, I run into the following error when executing the example script here, using this command:
declare -r PROJECT="beam-test"
declare -r BUCKET="gs://my-beam-test-bucket"
echo
set -v -e
python -m apache_beam.examples.wordcount \
--project $PROJECT \
--job_name $PROJECT-wordcount \
--runner DataflowRunner \
--staging_location $BUCKET/staging \
--temp_location $BUCKET/temp \
--output $BUCKET/output
which results in this error:
http_response.request_url, method_config, request)
apitools.base.py.exceptions.HttpError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/beam-test/locations/us-central1/jobs?alt=json>: response: <{'status': '403', 'content-length': '284', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Fri, 31 Mar 2017 15:52:54 GMT', 'x-frame-options': 'SAMEORIGIN', 'alt-svc': 'quic=":443"; ma=2592000; v="37,36,35"', 'content-type': 'application/json; charset=UTF-8'}>, content <{
"error": {
"code": 403,
"message": "(f010d95b3e221bbf): Could not create workflow; user does not have write access to project: beam-test Causes: (f010d95b3e221432): Permission 'dataflow.jobs.create' denied on project: 'beam-test'",
"status": "PERMISSION_DENIED"
I have already enabled the Dataflow API for the project, and I have authorized the gcloud CLI with the owner account of the project (which I assume has full access).
How & where do I enable write permissions?
Change $PROJECT=project-name to $PROJECT=project-id; Dataflow expects the project ID, not the project name.
Have you tried running gcloud auth login to make sure you have a valid credential?
If yes, your default cloud project might be different from the one you're running Dataflow with. To change the default project, you can run gcloud init.
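Something like this minimal sketch can confirm which account and project are active (the project ID shown is an illustrative placeholder):

gcloud auth list                             # which account is currently active
gcloud config get-value project              # currently active project
gcloud projects list                         # compare the PROJECT_ID and NAME columns
gcloud config set project my-project-123456  # switch to the project ID, not the display name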
Let me know if that doesn't solve it.
Related
When trying to add high availability on an existing Cloud SQL instance using:
gcloud sql instances patch $INSTANCE --project $PROJECT --availability-type regional
the process fails with this message:
The following message will be used for the patch API method.
{"project": "$PROJECT", "name": "$INSTANCE", "settings": {"availabilityType": "REGIONAL", "databaseFlags": [{"name": "sql_mode", "value": "TRADITIONAL"}, {"name": "default_time_zone", "value": "+01:00"}]}}
ERROR: (gcloud.sql.instances.patch) HTTPError 400: The incoming request contained invalid data.
It also fails using the web interface.
gcloud version: Google Cloud SDK [280.0.0]
This is the output of the log (not much help that I can see):
2020-02-14 11:01:34,476 DEBUG root Loaded Command Group: [u'gcloud', u'sql', u'instances']
2020-02-14 11:01:34,510 DEBUG root Loaded Command Group: [u'gcloud', u'sql', u'instances', u'patch']
2020-02-14 11:01:34,517 DEBUG root Running [gcloud.sql.instances.patch] with arguments: [--availability-type: "regional", --project: "$PROJECT", INSTANCE: "$INSTANCE"]
2020-02-14 11:01:35,388 INFO ___FILE_ONLY___ The following message will be used for the patch API method.
2020-02-14 11:01:35,398 INFO ___FILE_ONLY___ {"project": "$PROJECT", "name": "$INSTANCE", "settings": {"availabilityType": "REGIONAL", "databaseFlags": [{"name": "sql_mode", "value": "TRADITIONAL"}, {"name": "default_time_zone", "value": "+01:00"}]}}
2020-02-14 11:01:35,865 DEBUG root (gcloud.sql.instances.patch) HTTPError 400: The incoming request contained invalid data.
Traceback (most recent call last):
File "C:\Users\udAL\AppData\Local\Google\Cloud SDK\google-cloud-sdk\lib\googlecloudsdk\calliope\cli.py", line 981, in Execute
resources = calliope_command.Run(cli=self, args=args)
File "C:\Users\udAL\AppData\Local\Google\Cloud SDK\google-cloud-sdk\lib\googlecloudsdk\calliope\backend.py", line 807, in Run
resources = command_instance.Run(args)
File "C:\Users\udAL\AppData\Local\Google\Cloud SDK\google-cloud-sdk\lib\surface\sql\instances\patch.py", line 306, in Run
return RunBasePatchCommand(args, self.ReleaseTrack())
File "C:\Users\udAL\AppData\Local\Google\Cloud SDK\google-cloud-sdk\lib\surface\sql\instances\patch.py", line 278, in RunBasePatchCommand
instance=instance_ref.instance))
File "C:\Users\udAL\AppData\Local\Google\Cloud SDK\google-cloud-sdk\lib\googlecloudsdk\third_party\apis\sql\v1beta4\sql_v1beta4_client.py", line 697, in Patch
config, request, global_params=global_params)
File "C:\Users\udAL\AppData\Local\Google\Cloud SDK\google-cloud-sdk\bin\..\lib\third_party\apitools\base\py\base_api.py", line 731, in _RunMethod
return self.ProcessHttpResponse(method_config, http_response, request)
File "C:\Users\udAL\AppData\Local\Google\Cloud SDK\google-cloud-sdk\bin\..\lib\third_party\apitools\base\py\base_api.py", line 737, in ProcessHttpResponse
self.__ProcessHttpResponse(method_config, http_response, request))
File "C:\Users\udAL\AppData\Local\Google\Cloud SDK\google-cloud-sdk\bin\..\lib\third_party\apitools\base\py\base_api.py", line 604, in __ProcessHttpResponse
http_response, method_config=method_config, request=request)
HttpBadRequestError: HttpError accessing <https://sqladmin.googleapis.com/sql/v1beta4/projects/$PROJECT/instances/$INSTANCE?alt=json>: response: <{'status': '400', 'content-length': '269', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Fri, 14 Feb 2020 10:01:35 GMT', 'x-frame-options': 'SAMEORIGIN', 'alt-svc': 'quic=":443"; ma=2592000; v="46,43",h3-Q050=":443"; ma=2592000,h3-Q049=":443"; ma=2592000,h3-Q048=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000', 'content-type': 'application/json; charset=UTF-8'}>, content <{
"error": {
"code": 400,
"message": "The incoming request contained invalid data.",
"errors": [
{
"message": "The incoming request contained invalid data.",
"domain": "global",
"reason": "invalidRequest"
}
]
}
}
>
2020-02-14 11:01:35,868 ERROR root (gcloud.sql.instances.patch) HTTPError 400: The incoming request contained invalid data.
2020-02-14 11:01:35,898 DEBUG root Metrics reporting process started...
Edit:
When using the gcloud CLI command (gcloud sql instances patch with 3 input parameters):
Both $PROJECT and $INSTANCE exist, since gcloud sql databases list --instance $INSTANCE --project $PROJECT works fine.
availability-type=regional is documented, so it should work.
I'm not constructing the request manually; I'm using the gcloud CLI.
When using the console.cloud.google.com web interface:
Main menu -> SQL -> select instance -> Enable High Availability.
It's a button, no parameters added by myself.
Both return the same error, "The incoming request contained invalid data."
I can't see what I'm doing wrong.
Please check the data in your incoming request. I used the method instances.patch and it worked as expected for me, with these parameters:
project
instance-name
request body:
"settings": {
"availabilityType": "REGIONAL",
"databaseFlags": [
{
"name": "sql_mode",
"value": "TRADITIONAL"
},
{
"name": "default_time_zone",
"value": "+01:00"
}
]
}
}
Curl command:
curl --request PATCH \
  'https://sqladmin.googleapis.com/sql/v1beta4/projects/your-project/instances/your-instance?key=[YOUR_API_KEY]' \
  --header 'Authorization: Bearer [YOUR_ACCESS_TOKEN]' \
  --header 'Accept: application/json' \
  --header 'Content-Type: application/json' \
  --data '{"settings":{"availabilityType":"REGIONAL","databaseFlags":[{"name":"sql_mode","value":"TRADITIONAL"},{"name":"default_time_zone","value":"+01:00"}]}}' \
  --compressed
Response 200:
{
"kind": "sql#operation",
"targetLink": "https://content-sqladmin.googleapis.com/sql/v1beta4/projects/your-project/instances/your-instance",
"status": "PENDING",
"user": "#cloud.com",
"insertTime": "2020-02-14T12:35:37.615Z",
"operationType": "UPDATE",
"name": "3f55c1be-97b5-4d37-8d1f-15cb61b4c6cc",
"targetId": "your-instance",
"selfLink": "https://content-sqladmin.googleapis.com/sql/v1beta4/projects/wave25-vladoi/operations/3f55c1be-97b5-4d37-8d1f-15cb61b4c6cc",
"targetProject": "your-project"
}
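If useful, a small follow-up sketch (assuming gcloud is authenticated against the same project; the operation name is the "name" value from the 200 response above) that waits for the patch operation to finish:

gcloud sql operations wait 3f55c1be-97b5-4d37-8d1f-15cb61b4c6cc --project your-project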
I want to deploy my application to the production environment using a Bitbucket pipeline.
I followed the instructions given in https://cloud.google.com/solutions/continuous-delivery-bitbucket-app-engine, but this is deploying my application to the staging environment.
My pipeline file is:
image: python:2.7
pipelines:
  branches:
    master:
      - step:
          script: # Modify the commands below to build your repository.
            # Downloading the Google Cloud SDK
            - curl -o /tmp/google-cloud-sdk.tar.gz https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-155.0.0-linux-x86_64.tar.gz
            - tar -xvf /tmp/google-cloud-sdk.tar.gz -C /tmp/
            - /tmp/google-cloud-sdk/install.sh -q
            - source /tmp/google-cloud-sdk/path.bash.inc
            # Authenticating with the service account key file
            - echo ${GOOGLE_CLIENT_SECRET} > client-secret.json
            - gcloud auth activate-service-account --key-file client-secret.json
            # Linking to the Google Cloud project
            - gcloud config set project $CLOUDSDK_CORE_PROJECT
            - gcloud -q app deploy app.yaml
This shows me the following error:
You are about to deploy the following services:
- my-app/default/232326565655 (from [/opt/atlassian/pipelines/agent/build/app.yaml])
Deploying to URL: [https://my-app.appspot.com]
Beginning deployment of service [default]...
Some files were skipped. Pass `--verbosity=info` to see which ones.
You may also view the gcloud log file, found at
[/root/.config/gcloud/logs/2018.02.05/05.25.49.374053.log].
ERROR: gcloud crashed (UploadError): Error uploading files: HttpError accessing <https://www.googleapis.com/storage/v1/b/staging.my-app.appspot.com/o?alt=json&maxResults=1000>: response: <{'status': '403', 'content-length': '410', 'expires': 'Mon, 05 Feb 2018 05:25:52 GMT', 'vary': 'Origin, X-Origin', 'server': 'UploadServer', 'x-guploader-uploadid': 'UPLOADER_ID', 'cache-control': 'private, max-age=0', 'date': 'Mon, 05 Feb 2018 05:25:52 GMT', 'alt-svc': 'hq=":443"; ma=2592000; quic=51303431; quic=51303339; quic=51303338; quic=51303337; quic=51303335,quic=":443"; ma=2592000; v="41,39,38,37,35"', 'content-type': 'application/json; charset=UTF-8'}>, content <{
"error": {
"errors": [
{
"domain": "global",
"reason": "forbidden",
"message": "bitbucket-authorization#my-app.iam.gserviceaccount.com does not have storage.objects.list access to staging.my-app.appspot.com."
}
],
"code": 403,
"message": "bitbucket-authorization#my-app.iam.gserviceaccount.com does not have storage.objects.list access to staging.my-app.appspot.com."
}
}
I followed the same tutorial, and based on the errors you presented, the service account you used, "bitbucket-authorization@my-app.iam.gserviceaccount.com", didn't have the privileges to access the staging bucket.
Make sure both the App Engine > App Engine Admin and Storage > Storage Object Admin roles are granted to this service account from the Cloud Console (see the sketch below).
One last thing I noticed when using a new project for that tutorial was that I had to manually enable the Google App Engine Admin API too.
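A minimal sketch of granting those roles from the command line, assuming the project ID is my-app (both the project ID and the service account address are taken from the error above):

# Grant the App Engine Admin role to the pipeline's service account
gcloud projects add-iam-policy-binding my-app \
  --member serviceAccount:bitbucket-authorization@my-app.iam.gserviceaccount.com \
  --role roles/appengine.appAdmin

# Grant the Storage Object Admin role so it can list/upload staging objects
gcloud projects add-iam-policy-binding my-app \
  --member serviceAccount:bitbucket-authorization@my-app.iam.gserviceaccount.com \
  --role roles/storage.objectAdmin

# Enable the App Engine Admin API for the project
gcloud services enable appengine.googleapis.com --project my-app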
EDIT:
You could use the flag --bucket=gs://BUCKETNAME in the script used by Bitbucket to deploy, e.g.:
gcloud app deploy --bucket="gs://BUCKETNAME"
In Postman, I send credentials to set up the login in the body section as x-www-form-urlencoded. This works absolutely fine.
After exporting this collection and running it in the Newman collection runner, it does not work.
Following is the output:
-------------------------------------------------------------------------------------------
401 3361ms Login Call [POST] https://<url>/login
------------------------------------------------------------
Request headers:
{
"content-type": "application/x-www-form-urlencoded",
"host": "<url>",
"accept-encoding": "gzip, deflate",
"content-length": 0
}
Request data:
{
"userid": "user",
"password": "pswd"
}
------------------------------------------------------------
Response headers:
{
"x-frame-options": "SAMEORIGIN",
"content-type": "application/json",
"x-content-type-options": "nosniff",
"date": "...",
"cache-control": "no-cache",
"content-length": "120",
"x-xss-protection": "1; mode=block",
"connection": "close",
"accept-ranges": "bytes"
}
Response body:
{"errorCode":"401","errorMessage":"Login failed, please check the credentials","errorDescription":"API Request Failure"}
Also, when I generate curl code from Postman, I don't see the user/pswd data attached:
curl -X POST \
https://<url>/login \
-H 'cache-control: no-cache' \
-H 'content-type: application/x-www-form-urlencoded' \
-H 'postman-token: 5b04e538-498c-9e43-2be9-8523073260f9'
Using Postman Chrome app:
Postman for Chrome
Version 5.0.2
win / x86-64
Chrome 59.0.3071.115
The issue here was the version of Newman installed. So I got the latest version:
# to install
npm install -g newman
# needed to use newman run instead of newman execute
newman run c:\s -e c:\env --reporters cli,json --reporter-json-export c:\out.json -k
# -k was required to bypass SSL security issues
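As a quick sanity check, confirming which Newman version the shell resolves to can save a round trip (the run command is only available from Newman 3.x onward):

newman --version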
The easy-and-famous datalab create instance-name command is no longer working. We did not make any change to the project, APIs, keys, or any other Google options.
The same command was OK yesterday, and now:
user-used-yeserday#pruebaalexborrar:~$ datalab create alexborrarpurbea
ERROR: gcloud crashed (BadStatusCodeError): HttpError accessing <https://sourcerepo.googleapis.com/v1/projects/pruebaalexborrar/repos?alt=json>: response: <{'status': '500', 'content-length': '109', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 19 Apr 2017 09:08:43 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 500,
    "message": "Internal error encountered.",
    "status": "INTERNAL"
  }
}
>
When I use the same URL in my browser, I get a different error:
{
"error": {
"code": 401,
"message": "Request is missing required authentication credential. Expected
OAuth 2 access token, login cookie or other valid authentication credential.
See https://developers.google.com/identity/sign-in/web/devconsole-project.",
"status": "UNAUTHENTICATED"
}
}
I guess the 401 error code is not related to the 500 above from the `datalab create` command...
I know Google is now deploying a new cloud release...
Does anyone know what's happening?
There is a reported issue (#37242989) in the issue tracker regarding this, so I suggest you add more details and star the issue there to get further updates from the team working on it.
I'm running a simple Dataflow pipeline with the Python SDK for counting keywords. The job runs fine for pre-processing the input data, but it fails in the grouping/output steps with the following error.
I guess the logs say the worker is having an issue accessing the temp folder, but the storage bucket in our project exists with proper permissions. What could be a possible cause of this?
"/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcsio.py", line
606, in write raise self.upload_thread.last_error # pylint:
disable=raising-bad-type HttpError: HttpError accessing
<https://www.googleapis.com/resumable/upload/storage/v1/b/[PROJECT-NAME-REDACTED]-temp-2016-08-07_04-42-52/o?uploadType=resumable&alt=json&name=0015bf8d-fa87-4c9a-82d6-8ffcd742d770>:
response: <{'status': '404', 'alternate-protocol': '443:quic',
'content-length': '165', 'vary': 'Origin, X-Origin', 'server':
'UploadServer', 'x-guploader-uploadid':
'AEnB2UoYRPUwhz-OXlJ437k0J8Uxd1lJvTsFbfVJF_YMP2GQEvmdDpo7e-3DVhuqNd9b1A_RFPbfIcK6hCsFcar-hdI94rqJZUvATcDmGRRIvHecAt5CTrg',
'date': 'Sun, 07 Aug 2016 04:43:23 GMT', 'alt-svc': 'quic=":443";
ma=2592000; v="36,35,34,33,32,31,30"', 'content-type':
'application/json; charset=UTF-8'}>, content <{ "error": { "errors": [
{ "domain": "global", "reason": "notFound", "message": "Not Found" }
], "code": 404, "message": "Not Found" } } >
This is https://issues.apache.org/jira/browse/BEAM-539: TextFileSink doesn't allow root buckets (e.g. gs://foo) as output locations. As a workaround, please use a subdirectory path (e.g. gs://foo/bar) as the output location.
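For example, a minimal sketch of the workaround applied to a wordcount invocation like the one in the first question (project and bucket names are placeholders); the key detail is that --output points inside the bucket rather than at its root:

python -m apache_beam.examples.wordcount \
  --project my-project-id \
  --runner DataflowRunner \
  --staging_location gs://my-bucket/staging \
  --temp_location gs://my-bucket/temp \
  --output gs://my-bucket/counts/output   # a subdirectory path, not just gs://my-bucket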