Unable to trigger an Airflow DAG using the REST API in Google Cloud Composer

I tried running the 'Call Airflow REST API using client_id' sample from https://cloud.google.com/composer/docs/access-airflow-api#curl. I invoked the code as follows:
data_1 = "{'key':'value'}"
trigger_dag(data_1)
and got the following error:
> Exception: Bad response from application: 400 / {'Date': 'Wed, 20 Apr
> 2022 18:58:00 GMT', 'Content-Type': 'application/problem+json',
> 'Content-Length': '231', 'Server': 'gunicorn',
> 'Access-Control-Allow-Headers': '', 'Access-Control-Allow-Methods':
> '', 'Access-Control-Allow-Origin': '', 'X-Frame-Options': 'DENY',
> 'Via': '1.1 google'} / '{\n "detail": "\\"{\'key\':\'value\'}\\" is
> not of type \'object\' - \'conf\'",\n "status": 400,\n "title": "Bad
> Request",\n "type":
> "https://airflow.apache.org/docs/apache-airflow/2.1.4/stable-rest-api-ref.html#section/Errors/BadRequest"\n}\n'
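The error message itself points at the cause: the stable REST API requires conf to be a JSON object, but data_1 is a Python string, so it gets serialized as a JSON string. A minimal sketch of the difference (assuming, as in the Google sample, that trigger_dag forwards its argument as the conf field of the dagRuns request body):

```python
import json

# A string that merely *looks* like a dict is serialized as a JSON string,
# which the API rejects with '... is not of type \'object\' - \'conf\''.
bad_conf = "{'key':'value'}"

# A real dict is serialized as a JSON object, which the API accepts.
good_conf = {"key": "value"}

# Body of the POST to .../api/v1/dags/<dag_id>/dagRuns
payload = json.dumps({"conf": good_conf})
print(payload)  # {"conf": {"key": "value"}}
```

So the call should be trigger_dag({'key': 'value'}) rather than trigger_dag("{'key':'value'}").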


How to download the current version of a file from an S3 versioned bucket

I have objects with multiple versions and I am trying to decide which versions I can delete. Essentially, I want to delete any version that has the same size as the current version.
The problem is that I can't tell which of the returned versions is the latest/current one.
The AWS CLI returns a field called 'IsLatest', but the boto3 response apparently doesn't.
The AWS CLI also always returns the StorageClass, while boto3 apparently omits it in some scenarios.
Return from boto3:
{'ResponseMetadata': {'RequestId': 'PHQFMDCF3AHQM6R1', 'HostId': 'b7PmgsVm6y30wfA9GExS+Rc659cu1DI4YFec3i7tvDBew8ob5tY0Mtz6q+yC9nTwdmAoykdV7Lo=', 'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amz-id-2': 'b7PmgsVm6y30wfA9GExS+Rc659cu1DI4YFeR3i7tVDBeu8ab5tY0Mtz6X+yC9nTwdmAoykdV7Lo=', 'x-amz-request-id': 'PHQFMDTB32HQM6R1', 'date': 'Sat, 19 Feb 2022 22:42:14 GMT', 'last-modified': 'Thu, 17 Feb 2022 17:02:54 GMT', 'etag': '"55f146382684970d4970ae31b3d4b310"', 'x-amz-server-side-encryption': 'AES256', 'x-amz-version-id': 'gHm2D2uuosJQS6GpmuySU9uNSXN84cq9', 'accept-ranges': 'bytes', 'content-type': 'text/plain', 'server': 'AmazonS3', 'content-length': '969'}, 'RetryAttempts': 0}, 'AcceptRanges': 'bytes', 'LastModified': datetime.datetime(2022, 2, 17, 17, 2, 54, tzinfo=tzutc()), 'ContentLength': 969, 'ETag': '"55f141382684970d4970ae31b3d4b310"', 'VersionId': 'gHa2D2uuosJQS6GpmuySU9uNSXN84cR9', 'ContentType': 'text/plain', 'ServerSideEncryption': 'AES256', 'Metadata': {}, 'Body': <botocore.response.StreamingBody object at 0x10f29e1c0>}
Versioning_Test/file1.txt
Response from aws cli:
{
    "ETag": "\"55f141382684970d4970ae31b3d4b310\"",
    "Size": 969,
    "StorageClass": "STANDARD",
    "Key": "Versioning_Test/file1.txt",
    "VersionId": "gHa2D2uuosJQS6GpmuySU9uNSXN84cR9",
    "IsLatest": true,
    "LastModified": "2022-02-17T17:02:54+00:00",
    "Owner": {
        "ID": "1e5bc34834bec07ae1bc55a5d07adab10d7d58da04ae761769339a914d1ab472"
    }
},
Here is my Python script:
bucket_name = 'bucket-name'
profile_name = 'aws-profile-name'
key = ''

session = boto3.session.Session(profile_name=profile_name)
s3 = session.resource('s3')
versions = s3.Bucket(bucket_name).object_versions.filter()
for version in versions:
    print(version.object_key)
    obj = version.get()
    print(obj)
    # print("\t" + obj.get('VersionId'), obj.get('ContentLength'), obj.get('LastModified'), obj.get('IsLatest'), obj.get('StorageClass'))
Am I missing something?
You can list the object versions in a bucket using the list_object_versions API:
import boto3

bucket_name = 'bucket-name'
profile_name = 'aws-profile-name'

if __name__ == "__main__":
    session = boto3.Session(profile_name=profile_name)
    client = session.client('s3')
    response = client.list_object_versions(Bucket=bucket_name)
    for version in response['Versions']:
        print(f'Key: {version["Key"]}, Size: {version["Size"]} bytes, Latest: {version["IsLatest"]},'
              f' LastModified: {version["LastModified"]}, StorageClass: {version["StorageClass"]}')
Note that this response includes the IsLatest property you were looking for.
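One caveat: list_object_versions returns at most 1,000 entries per call, so for larger buckets you need to paginate. A sketch that keeps only the latest version of each key; the latest_versions helper is my own, and the bucket/profile names in the commented usage are placeholders:

```python
def latest_versions(pages):
    """Yield only the version entries flagged IsLatest from
    list_object_versions response pages."""
    for page in pages:
        for version in page.get("Versions", []):
            if version["IsLatest"]:
                yield version

# Hypothetical usage with boto3's built-in paginator:
# import boto3
# client = boto3.Session(profile_name="aws-profile-name").client("s3")
# paginator = client.get_paginator("list_object_versions")
# pages = paginator.paginate(Bucket="bucket-name")
# for v in latest_versions(pages):
#     print(v["Key"], v["VersionId"], v["Size"])
```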

Fail to deploy Django application with standard environment to Google App Engine

I deployed my Django app last Wednesday and everything was fine. The next day I tried to deploy changes, and App Engine ran into an error: INTERNAL: Internal error encountered. I ran the deployment command again today using the same app.yaml config, and nothing changed; the same error occurred.
My app.yaml:
runtime: python37
instance_class: F1
env: standard
entrypoint: gunicorn -b :$PORT MyProject.wsgi --timeout 600
beta_settings:
  cloud_sql_instances: "<project-name>:<region>:<database-name>"
handlers:
- url: /.*
  script: auto
  secure: always
  redirect_http_response_code: 301
Here are the logs:
DEBUG: Running [gcloud.app.deploy] with arguments: [--quiet: "True", --verbosity: "debug", DEPLOYABLES:1: "[u'app.yaml']"]
DEBUG: Loading runtimes experiment config from [gs://runtime-builders/experiments.yaml]
INFO: Reading [<googlecloudsdk.api_lib.storage.storage_util.ObjectReference object at 0x7f463bc2cd10>]
DEBUG:
Traceback (most recent call last):
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/googlecloudsdk/api_lib/app/runtime_builders.py", line 269, in _Read
    with contextlib.closing(storage_client.ReadObject(object_)) as f:
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/googlecloudsdk/api_lib/storage/storage_api.py", line 303, in ReadObject
    object_=object_ref, err=http_exc.HttpException(err)))
BadFileException: Could not read [<googlecloudsdk.api_lib.storage.storage_util.ObjectReference object at 0x7f463bc2cd10>]. Please retry: HTTPError 404: No such object: runtime-builders/experiments.yaml
DEBUG: Experiment config file could not be read. This error is informational, and does not cause a deployment to fail. Reason: Unable to read the runtimes experiment config: [gs://runtime-builders/experiments.yaml], error: Could not read [<googlecloudsdk.api_lib.storage.storage_util.ObjectReference object at 0x7f463bc2cd10>]. Please retry: HTTPError 404: No such object: runtime-builders/experiments.yaml
Traceback (most recent call last):
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/surface/app/deploy.py", line 133, in _ServerSideExperimentEnabled
    runtimes_builder_root)
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/googlecloudsdk/api_lib/app/runtime_builders.py", line 524, in LoadFromURI
    .format(uri, e))
ExperimentsError: Unable to read the runtimes experiment config: [gs://runtime-builders/experiments.yaml], error: Could not read [<googlecloudsdk.api_lib.storage.storage_util.ObjectReference object at 0x7f463bc2cd10>]. Please retry: HTTPError 404: No such object: runtime-builders/experiments.yaml
DEBUG: API endpoint: [https://appengine.googleapis.com/], API version: [v1]
Services to deploy:
descriptor: [/home/runner/work/MyProject/MyProject/app.yaml]
source: [/home/runner/work/MyProject/MyProject]
target project: [<project-name>]
target service: [default]
target version: [20201018t154518]
target url: [https://<site-url>]
DEBUG: No bucket specified, retrieving default bucket.
DEBUG: Using bucket [gs://staging.<bucket-name>.appspot.com].
Beginning deployment of service [default]...
INFO: Using ignore file at [/home/runner/work/MyProject/MyProject/.gcloudignore].
DEBUG: Skipping file [README.md]
DEBUG: Skipping file [static]
DEBUG: Skipping file [.github]
DEBUG: Skipping file [tests]
DEBUG: Skipping file [.git]
DEBUG: Skipping file [misc]
DEBUG: Skipping file [static-root]
DEBUG: Skipping upload of [my_file_1.py]
...
DEBUG: Skipping upload of [my_file_n.py]
INFO: Incremental upload skipped 85.92% of data
DEBUG: Uploading 164 files to Google Cloud Storage
DEBUG: Using [16] threads
#============================================================#
#= Uploading 164 files to Google Cloud Storage =#
#INFO: Uploading [/home/runner/work/MyProject/MyProject/my_file_1.py] to [staging.<project-name>.appspot.com/1373be4811b24b04f2c1f0aaef730bdc76346e4a]
...
#INFO: Uploading [/home/runner/work/MyProject/MyProject/my_file_n.py] to [staging.<project-name>.appspot.com/1373be4811b24b04f2c1f0aaef730bdc76346e4a]
File upload done.
INFO: Manifest: [{...}]
DEBUG: Converted YAML to JSON: "{
  "betaSettings": {
    "cloud_sql_instances": "<project-name>:<region>:<database-name>"
  },
  "entrypoint": {
    "shell": "gunicorn -b :$PORT MyProject.wsgi --timeout 600"
  },
  "env": "standard",
  "handlers": [
    {
      "redirectHttpResponseCode": "REDIRECT_HTTP_RESPONSE_CODE_301",
      "script": {
        "scriptPath": "auto"
      },
      "securityLevel": "SECURE_ALWAYS",
      "urlRegex": "/.*"
    }
  ],
  "instanceClass": "F1",
  "runtime": "python37"
}"
DEBUG: Response returned status 500, retrying
DEBUG: Retrying request to url https://appengine.googleapis.com/v1/apps/<project-name>/services/default/versions?alt=json after exception HttpError accessing <https://appengine.googleapis.com/v1/apps/<project-name>/services/default/versions?alt=json>: response: <{'status': '500', 'content-length': '109', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Sun, 18 Oct 2020 15:45:29 GMT', 'x-frame-options': 'SAMEORIGIN', 'alt-svc': 'h3-Q050=":443"; ma=2592000,h3-29=":443"; ma=2592000,h3-27=":443"; ma=2592000,h3-T051=":443"; ma=2592000,h3-T050=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000,quic=":443"; ma=2592000; v="46,43"', 'content-type': 'application/json; charset=UTF-8'}>, content <{
"error": {
"code": 500,
"message": "Internal error encountered.",
"status": "INTERNAL"
}
}
>
... (the same "Response returned status 500, retrying" entry and identical 500 response repeat three more times)
DEBUG: (gcloud.app.deploy) INTERNAL: Internal error encountered.
Traceback (most recent call last):
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/googlecloudsdk/calliope/cli.py", line 983, in Execute
    resources = calliope_command.Run(cli=self, args=args)
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/googlecloudsdk/calliope/backend.py", line 807, in Run
    resources = command_instance.Run(args)
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/surface/app/deploy.py", line 117, in Run
    default_strategy=flex_image_build_option_default))
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/googlecloudsdk/command_lib/app/deploy_util.py", line 643, in RunDeploy
    ignore_file=args.ignore_file)
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/googlecloudsdk/command_lib/app/deploy_util.py", line 433, in Deploy
    extra_config_settings)
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/googlecloudsdk/api_lib/app/appengine_api_client.py", line 172, in DeployService
    extra_config_settings)
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/googlecloudsdk/api_lib/app/appengine_api_client.py", line 249, in _CreateVersion
    return self.client.apps_services_versions.Create(create_request)
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/googlecloudsdk/third_party/apis/appengine/v1/appengine_v1_client.py", line 827, in Create
    config, request, global_params=global_params)
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/third_party/apitools/base/py/base_api.py", line 729, in _RunMethod
    http, http_request, **opts)
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/third_party/apitools/base/py/http_wrapper.py", line 346, in MakeRequest
    check_response_func=check_response_func)
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/third_party/apitools/base/py/http_wrapper.py", line 402, in _MakeRequestNoRetry
    check_response_func(response)
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/googlecloudsdk/api_lib/util/apis.py", line 281, in _CheckResponseForApiEnablement
    http_wrapper.CheckResponse(response)
  File "/opt/hostedtoolcache/gcloud/290.0.1/x64/lib/third_party/apitools/base/py/http_wrapper.py", line 223, in CheckResponse
    raise exceptions.BadStatusCodeError.FromResponse(response)
BadStatusCodeError: HttpError accessing <https://appengine.googleapis.com/v1/apps/<project-name>/services/default/versions?alt=json>: response: <{'status': '500', 'content-length': '109', 'x-xss-protection': '0', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Sun, 18 Oct 2020 15:46:08 GMT', 'x-frame-options': 'SAMEORIGIN', 'alt-svc': 'h3-Q050=":443"; ma=2592000,h3-29=":443"; ma=2592000,h3-27=":443"; ma=2592000,h3-T051=":443"; ma=2592000,h3-T050=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000,quic=":443"; ma=2592000; v="46,43"', 'content-type': 'application/json; charset=UTF-8'}>, content <{
"error": {
"code": 500,
"message": "Internal error encountered.",
"status": "INTERNAL"
}
}
>
ERROR: (gcloud.app.deploy) INTERNAL: Internal error encountered.
How can I fix this problem?

aws boto3 paginator of list_images

Can someone please help me understand and write paginator code for this list_images example?
When I run this to get the list of images for EC2 Image Builder, the response includes a nextToken. How do I use it to list all the images on subsequent pages, until the end?
client = boto3.client('imagebuilder')
response = client.list_images(owner='Amazon')
print(response)
Response (truncated):
{'ResponseMetadata': {'RequestId': 'f4b9e178-b959-4e23-be57-0c234fbec69d', 'HTTPStatusCode': 200, 'HTTPHeaders': {'date': 'Sun, 10 May 2020 21:53:55 GMT', 'content-type': 'application/json', 'content-length': '8709', 'connection': 'keep-alive', 'x-amzn-requestid': 'f4b9e178-b959-4e23-be57-0c234fbec69d', 'x-amz-apigw-id': 'MVet5Hp0PHcFeaw=', 'x-amzn-trace-id': 'Root=1-5eb877f2-9bec8ecc200f1394f6b0d340;Sampled=1'}, 'RetryAttempts': 0}, 'requestId': 'f4b9e178-b959-4e23-be57-0c234fbec69d', 'imageVersionList': [{'arn': 'arn:aws:imagebuilder:us-west-2:aws:image/amazon-linux-2-x86/2019.11.21', 'name': 'Amazon Linux 2 x86', 'version': '2019.11.21', 'platform': 'Linux', 'owner': 'Amazon', 'dateCreated': '2019-11-30T07:37:51.495Z'}, {'arn': 'arn:aws:imagebuilder:us-west-2:aws:image/windows-server-2012-r2-rtm-english-core-x86/2019.11.19', 'name': 'Windows Server 2012 R2 RTM English Core x86', 'version': '2019.11.19', 'platform': 'Windows', 'owner': 'Amazon', 'dateCreated': '2019-11-30T07:38:07.177Z'}], 'nextToken': 'eyxxxMS4xOSIsICJBY2NvdW50SWQiOiAiNTgwMDg3NjIzMDA1In0sICJtYXhfcmVzdWx0cyI6IDI1LCAia2V5X2NvbmRpdGlvbnMiOiB7IkFjY291bnRJZCI6IHsiQXR0cmlidXRlVmFsdWVMaXN0IjogWyI1ODAwODc2MjMwdddiOiBmYWxzZSwgInNjYW5faW5kZXhfZm9yd2FyZCI6IHRydWUsICJleHBpcmF0aW9uX2RhdGUiOiAxNTg5MjM0MDM1fQ=='}
Based on the documentation, you can use the following code snippet to list all images owned by Amazon.
client = boto3.client('imagebuilder')
response = client.list_images(owner='Amazon')
print(response['imageVersionList'])
while 'nextToken' in response:
    response = client.list_images(owner='Amazon', nextToken=response['nextToken'])
    print(response['imageVersionList'])
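boto3 can also follow nextToken for you: clients expose get_paginator for paginated operations such as list_images. To keep the loop itself easy to verify, here is the same logic wrapped in a small helper; the function name is my own, and the commented usage assumes an imagebuilder client as in the question:

```python
def list_all_images(client, owner="Amazon"):
    """Collect image versions across all pages by following nextToken."""
    images = []
    kwargs = {"owner": owner}
    while True:
        response = client.list_images(**kwargs)
        images.extend(response.get("imageVersionList", []))
        token = response.get("nextToken")
        if not token:
            break  # no more pages
        kwargs["nextToken"] = token
    return images

# Hypothetical usage:
# import boto3
# client = boto3.client("imagebuilder")
# print(list_all_images(client))
#
# Or let boto3 paginate for you:
# paginator = boto3.client("imagebuilder").get_paginator("list_images")
# for page in paginator.paginate(owner="Amazon"):
#     print(page["imageVersionList"])
```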

Google Datalab is not working now

The easy-and-famous datalab create instance-name command is no longer working. We did not make any changes to the project, APIs, keys, or any other Google settings.
The same command was fine yesterday, but now:
user-used-yeserday#pruebaalexborrar:~$ datalab create alexborrarpurbea
ERROR: gcloud crashed (BadStatusCodeError): HttpError accessing
<https://sourcerepo.googleapis.com/v1/projects/pruebaalexborrar/repos?alt=json>:
response: <{'status': '500', 'content-length': '109', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Wed, 19 Apr 2017 09:08:43 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 500,
    "message": "Internal error encountered.",
    "status": "INTERNAL"
  }
}
>
When I open the same URL in my browser, I get a different error:
{
  "error": {
    "code": 401,
    "message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
    "status": "UNAUTHENTICATED"
  }
}
I guess the 401 error code is not related to the 500 from the `datalab create` command...
I know Google is deploying a new Cloud release right now...
Does anyone know what's happening?
This has been reported as issue #37242989 in the issue tracker, so I suggest adding more details and starring the issue there to get further updates from the team working on it.

Google DataFlow python pipeline write failure

I'm running a simple DataFlow pipeline with the Python SDK to count keywords. The job runs fine while pre-processing the input data, but it fails in the grouping/output steps with the following error.
I gather from the logs that the worker is having an issue accessing the temp folder, but the storage bucket in our project exists and has the proper permissions. What could be causing this?
"/usr/local/lib/python2.7/dist-packages/apache_beam/io/gcsio.py", line
606, in write raise self.upload_thread.last_error # pylint:
disable=raising-bad-type HttpError: HttpError accessing
<https://www.googleapis.com/resumable/upload/storage/v1/b/[PROJECT-NAME-REDACTED]-temp-2016-08-07_04-42-52/o?uploadType=resumable&alt=json&name=0015bf8d-fa87-4c9a-82d6-8ffcd742d770>:
response: <{'status': '404', 'alternate-protocol': '443:quic',
'content-length': '165', 'vary': 'Origin, X-Origin', 'server':
'UploadServer', 'x-guploader-uploadid':
'AEnB2UoYRPUwhz-OXlJ437k0J8Uxd1lJvTsFbfVJF_YMP2GQEvmdDpo7e-3DVhuqNd9b1A_RFPbfIcK6hCsFcar-hdI94rqJZUvATcDmGRRIvHecAt5CTrg',
'date': 'Sun, 07 Aug 2016 04:43:23 GMT', 'alt-svc': 'quic=":443";
ma=2592000; v="36,35,34,33,32,31,30"', 'content-type':
'application/json; charset=UTF-8'}>, content <{ "error": { "errors": [
{ "domain": "global", "reason": "notFound", "message": "Not Found" }
], "code": 404, "message": "Not Found" } } >
This is https://issues.apache.org/jira/browse/BEAM-539: TextFileSink does not support bucket-root paths (e.g. gs://foo) as outputs. As a workaround, please use a subdirectory path (e.g. gs://foo/bar) as the output location.
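The workaround is easy to sanity-check before submitting a job. A small illustrative helper (my own sketch, not part of Beam) that flags bucket-root output paths:

```python
def is_bucket_root(gcs_path):
    """Return True if a gs:// path points at a bucket root, i.e. has no
    object prefix after the bucket name (e.g. gs://foo or gs://foo/)."""
    prefix = "gs://"
    if not gcs_path.startswith(prefix):
        raise ValueError("not a GCS path: %r" % gcs_path)
    return "/" not in gcs_path[len(prefix):].rstrip("/")

# is_bucket_root("gs://foo")      -> True  (rejected, per BEAM-539)
# is_bucket_root("gs://foo/bar")  -> False (fine as an output location)
```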