Upload data from Cloud Storage to BigQuery - google-cloud-platform

[The issue is resolved; Stack Overflow does not allow deleting the question. The issue was a mismatch in the schema.]
I have a piece of code which uploads data from Cloud Storage into BigQuery correctly. I run the code locally and it works. Now I have moved the code to Cloud Functions, and it fails. Could you help me fix it?
The logs say: "Function cannot be initialized. Error: function terminated…"

Check Cloud Logging for more information on the error; the Cloud Functions troubleshooting guide is also worth a look for this class of failure:
Function cannot be initialized. Error: function terminated.
Recommended action: inspect logs for termination reason. Additional troubleshooting information can be found in Logging.
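For reference, a Cloud Storage to BigQuery load inside a Cloud Function usually looks something like the sketch below, with the schema declared explicitly so a mismatch (the root cause here) fails loudly at load time. The bucket, file, table, and field names are hypothetical, and the event/context signature assumes a background-triggered function.

from google.cloud import bigquery

def load_gcs_to_bq(event, context):
    # Hypothetical GCS object and destination table; the schema below is only
    # an illustration and must match the columns actually present in the file.
    uri = "gs://my-bucket/data.csv"
    table_id = "my-project.my_dataset.my_table"

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        schema=[
            bigquery.SchemaField("id", "INTEGER"),
            bigquery.SchemaField("name", "STRING"),
        ],
    )

    load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    load_job.result()  # raises here if the file does not match the declared schema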

Related

Serverless VPC access connector is in a bad shape

Our project uses a Serverless VPC Access connector to allow access to the DB over private IP from Cloud Functions and Cloud Run services. It worked flawlessly for a few months, but today I tried to deploy one of the functions that uses the connector and got this message:
VPC connector
projects/xxxx/locations/us-central1/connectors/vpc-connector is not
ready yet or does not exist. Please visit
https://cloud.google.com/functions/docs/troubleshooting for in-depth
troubleshooting documentation.
I went to the Serverless VPC Access view and found that the connector indeed has a red marking on it. When I hover over it, it says
Connector is in a bad state, manual deletion recommended
but I don't know why; the link to the logs doesn't show anything for the past 3 months.
I tried googling the error, but without success.
I also searched through the logs, but didn't find anything relevant there either.
I'm looking for any hints:
Why did it happen?
How can I fix it? I don't want to recreate the connector; it is used by many functions and Cloud Run services.
As the issue was blocking us from deploying Cloud Functions, I was forced to recreate the connector.
But this time the API returned an error:
Error: Error waiting to create Connector: Error waiting for Creating Connector: Error code 7, message: Operation failed: Google APIs Service Agent (<PROJECT_NUMBER>@cloudservices.gserviceaccount.com) needs editor role in the project.
After adding that permission, the old connector started to work again...
There was no such requirement before, but it apparently changed in the meantime.
Spooky: one time something works, another time it doesn't.
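For completeness, granting that role programmatically could look roughly like the sketch below, assuming the google-cloud-resource-manager client library; the project ID and project number are placeholders.

from google.cloud import resourcemanager_v3
from google.iam.v1 import policy_pb2

# Placeholders; substitute your own project ID and project number.
PROJECT_ID = "my-project"
SERVICE_AGENT = "serviceAccount:123456789012@cloudservices.gserviceaccount.com"

client = resourcemanager_v3.ProjectsClient()
resource = f"projects/{PROJECT_ID}"

# Read-modify-write of the project IAM policy: add the Editor role for the
# Google APIs Service Agent, as the connector creation error demanded.
policy = client.get_iam_policy(request={"resource": resource})
policy.bindings.append(
    policy_pb2.Binding(role="roles/editor", members=[SERVICE_AGENT])
)
client.set_iam_policy(request={"resource": resource, "policy": policy})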

New to GCP: Deploying basic Python cloud function fails

This may be a very simple question, but I'm getting persistent errors when trying to deploy a basic Python Cloud Function. I created this in the main.py file:
def hello_http(request):
    return 'Success!'
Deployment failure:
Function failed on loading user code. This is likely due to a bug in the user code. Error message: Error: please examine your function logs to see the error cause: https://cloud.google.com/functions/docs/monitoring/logging#viewing_logs. Additional troubleshooting documentation can be found at https://cloud.google.com/functions/docs/troubleshooting#logging. Please visit https://cloud.google.com/functions/docs/troubleshooting for in-depth troubleshooting documentation.
It turned out that the Entry Point had to have the same name as the defined function.
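As a concrete sketch (assuming a Functions Framework deployment), a minimal main.py could look like this; the entry point configured at deploy time (the console field or gcloud's --entry-point flag) must match the function name hello_http exactly:

# main.py
import functions_framework

# Deploy with the entry point set to "hello_http"; it must match this
# function name exactly, otherwise loading the user code fails.
@functions_framework.http
def hello_http(request):
    return 'Success!'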

unable to see any logs after updating cloud function

Suddenly I am not getting any logs except deployment logs for Google Cloud Functions.
Until now it worked fine, but since updating the function I haven't seen any logs. I did some research, deleted the Cloud Functions logs as well as the Cloud Function itself, and created a new function. Even then I am not able to see any logs related to the project except audit logs (i.e. whenever the function gets updated).
Any clues as to what's wrong? I can't work out what the exact problem is.
Any help is appreciated.
I have viewed the Issue Tracker issuetracker.google.com/issues/155215191 and have found that work is still being done to address the scenario.
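While that issue is being worked on, one way to double-check whether the function is emitting anything at all is to query Cloud Logging directly. A rough sketch with the google-cloud-logging client, using hypothetical project, region, and function names:

from google.cloud import logging

# Hypothetical identifiers; replace with your own project, region, and function.
client = logging.Client(project="my-project")
log_filter = (
    'resource.type="cloud_function" '
    'resource.labels.function_name="my-function" '
    'resource.labels.region="us-central1"'
)

# Print the ten most recent matching entries, if any exist.
for entry in client.list_entries(
    filter_=log_filter, order_by=logging.DESCENDING, max_results=10
):
    print(entry.timestamp, entry.payload)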

GCP Cloud Run deploy: "Failed to move user code into storage". What does this mean?

Deploying a new service into Google Cloud Run fails with the message:
Failed to move user code into storage, please verify the pod
configuration and try it again.
What does this mean, and how can one go about debugging it?
Just for the sake of giving this question an answer: all credit should go to AhmetB and his insight that this is a known issue to Google, in which a missing or invalid entrypoint causes this error to surface.
I have found a Public Issue Tracker entry where this issue has also been forwarded to Google through that channel. Google will be providing further information on that PIT.
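To illustrate the entrypoint angle: a Cloud Run service only deploys cleanly when the container has a valid entrypoint that starts a server listening on the port from the PORT environment variable. A sketch in Python with Flask (hypothetical file name; the image's Dockerfile would need a CMD such as running python app.py or gunicorn that actually launches it):

# app.py (hypothetical); the Dockerfile must end with an ENTRYPOINT/CMD that
# runs this file, otherwise Cloud Run has nothing valid to start.
import os
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "OK"

if __name__ == "__main__":
    # Cloud Run injects the port to listen on via the PORT environment variable.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))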

Error when trying to access Lambda logs on CloudWatch?

I created some Lambda@Edge functions, but I'm unable to set up the logs for them. When trying to access them I see the error message:
There was an error loading Log Streams. Please try again by refreshing
this page.
I have gone through everything I could find on Google, but as far as I can see my permissions are set up fine; I created a custom role for the functions with the relevant permissions.
I can't figure out what else could cause this error. It has been around 2 hours since I set up the functions and permissions.
For anyone experiencing the same problem: there is a weird quirk to Lambda@Edge.
The logs are stored in the AWS region closest to the user who executes the function.
Even if you've deployed your functions in us-east-1, switch the console to the region closest to the viewer who triggered them.
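If you would rather not click through regions by hand, a small boto3 sketch can locate where the log groups actually ended up; Lambda@Edge writes them under the prefix /aws/lambda/us-east-1.<function-name> in whichever regions served the requests (the function name below is hypothetical):

import boto3

FUNCTION_NAME = "my-edge-function"  # hypothetical

# Enumerate the enabled regions, then look for the Lambda@Edge log group
# prefix in each one.
ec2 = boto3.client("ec2", region_name="us-east-1")
regions = [r["RegionName"] for r in ec2.describe_regions()["Regions"]]

for region in regions:
    logs = boto3.client("logs", region_name=region)
    prefix = f"/aws/lambda/us-east-1.{FUNCTION_NAME}"
    groups = logs.describe_log_groups(logGroupNamePrefix=prefix).get("logGroups", [])
    for group in groups:
        print(region, group["logGroupName"])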