I got a deletion notice for my Google Cloud Shell home directory. Does that mean that my data will also be deleted?
This is documented here:
If you do not access Cloud Shell for 120 days, we will delete your
home disk. You will receive an email notification before we do so and
simply starting a session will prevent its removal.
This only applies to the home directory of your Cloud Shell instance (you may want to back it up to Cloud Storage anyway if you want to keep it). Any other Google services you use will be unaffected.
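As a minimal sketch of such a backup from inside Cloud Shell (the bucket name is a placeholder; gsutil comes preinstalled there):

    gsutil mb gs://my-backup-bucket                      # create the bucket (one-time; placeholder name)
    gsutil -m cp -r "$HOME" gs://my-backup-bucket/home   # copy the home directory in parallel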
Considering I'm paying for their services, this is extremely annoying. I lost a lot of important documents and feel like taking my business somewhere else.
Related
I am trying to use Google Cloud Shell as my development platform, as it's free and comes with a decent code editor. But at the same time, I struggle because there is only 5 GB of disk storage, which only fits 2 of my projects.
Is there an option to buy storage for cloud shell?
I know I have the option to spin up another VM inside GCP, but that doesn't suit me, since there isn't any cool IDE like the one I get in Cloud Shell. All I can work with there is vi.
No, you cannot increase the disk size of Cloud Shell. In fact, if you do not use Cloud Shell for 120 days, your home directory will be deleted.
See more limitations here.
Your second point is an insult to the open source community :)
Here are some alternatives I can think of:
Set up the Cloud SDK on your local system (see the sketch after this list)
The Google Cloud Shell editor is based on Eclipse's Orion text editor. You can use the Eclipse IDE locally; it will have the same shortcut keys and code validation features.
Alternatively, you can use Orion itself in case you're doing web development.
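For the first option, here is a minimal sketch of the local setup; the install command comes from Google's Cloud SDK documentation, and gcloud init interactively walks you through login and project selection:

    curl https://sdk.cloud.google.com | bash   # download and run the SDK installer
    exec -l $SHELL                             # restart the shell so PATH changes take effect
    gcloud init                                # authenticate and pick a default project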
I hope this helped.
There is more disk storage on the root filesystem that you can use, but keep in mind that only the home directory persists across Cloud Shell sessions.
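A quick way to check this from a Cloud Shell session:

    df -h /        # space on the (ephemeral) root filesystem
    df -h "$HOME"  # space on the persistent 5 GB home disk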
Suddenly I can't access Cloud Shell.
Steps I've done:
Tried to access a different project - can't activate Cloud Shell
Tried to re-login in incognito - can't activate Cloud Shell
Tried to use a different Gmail account and a different project - can't activate Cloud Shell
Tried to use a different network - can't activate Cloud Shell
Note: On previous days I could access it, but now I can't. The common thing between the two accounts is my name. Is it possible that Google blocked any emails that have my name?
I don't know what happened, but after some time (for me, exactly 24 hours) I could magically access it again. I did not do anything to solve it. For the time I didn't have access, I used the Google Cloud SDK to manage my project's resources. Hopefully this is helpful.
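As a rough sketch of that workaround from a local terminal (the project ID is a placeholder):

    gcloud auth login                         # authenticate in a browser window
    gcloud config set project my-project-id  # point the SDK at your project
    gcloud compute instances list             # then manage resources as usual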
My development team has been sparingly trying out Google Cloud Platform for about 10 months. Every member was using the same account to access GCP, say team@example.com. We created three projects under this account.
Starting in about July, we cannot see these projects in the GCP console anymore. Instead, there is one project named My First Project, which we never created.
However, our original GCP projects still seem to exist, as we can still access for example some of the Google Cloud Functions via HTTP.
Therefore, I have the impression that the connection between our account and the projects has been lost.
Or has a second account with the same name been accidentally created?
Additional curiosities:
Yesterday I tried to create a Google Cloud Identity account using team@example.com. It did not work; when entering that address, the input field showed an error like "Please use another email address. This is a private Google account." (It was actually in German, so I'm guessing at the translation.)
When I go to accounts.google.com, the account selection screen offers team@example.com twice. No matter which entry I choose, I always end up in the GCP console with My First Project.
How can I recover my team's GCP projects?
Which Google support site may I consult to check on the account(s)?
Usually, there is a 1:1 mapping between a certain email address and a Google Account. However, this can be broken under certain situations - for example when creating / deleting / migrating G Suite or Cloud Identity accounts under the domain the email address uses.
If you hit such an edge case, there's not much you can do yourself. Reach out to GCP Support who should be able to resolve the issue for you.
Keep in mind that orphaned resources have a timer on them before they are deleted - so act quickly, and do not take the fact that your apps still respond as a sign that they will continue to work indefinitely.
I accidentally messed up the file system permissions, so attempting to use sudo (e.g. to read protected files) shows the message: sudo: /usr/local/bin/sudo must be owned by uid 0 and have the setuid bit set.
This answer (https://askubuntu.com/a/471503) suggests logging in as root to fix it; however, I never set up a root password, and this answer (https://stackoverflow.com/a/35017164/4343317) suggests using sudo passwd to do that. Obviously, I am stuck in a circular dependency between the two answers.
How can I read/get the files from a Google Cloud Compute Engine disk without logging into the VM (I have full control of the VM instance and the disk)? Is there another, "higher" way to log in as root (such as from the gcloud tool or the Google Cloud console) to access the VM disk externally?
Thanks.
It looks like the following recipe may be of value:
https://cloud.google.com/compute/docs/disks/detach-reattach-boot-disk
What this article says is that you can shut down your VM, detach its boot disk, and then attach it as a data disk to a second VM. In that second VM you will have the ability to make changes. However, if you don't know what changes are needed to restore the system to sanity, then, as @John Hanley says, you may want to use this mounting technique to copy off your work, destroy the tainted VM, recreate a fresh one, copy your work back in, and start from there.
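A rough sketch of that recipe with gcloud; the instance names, disk name, and zone are placeholders (a boot disk often shares its instance's name, but check with gcloud compute disks list):

    gcloud compute instances stop broken-vm --zone=us-central1-a                          # stop the broken VM
    gcloud compute instances detach-disk broken-vm --disk=broken-vm --zone=us-central1-a  # detach its boot disk
    gcloud compute instances attach-disk rescue-vm --disk=broken-vm --zone=us-central1-a  # attach to a rescue VM
    # inside rescue-vm: find the partition with lsblk, then e.g.
    # sudo mount /dev/sdb1 /mnt and copy your files out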
I have some software that processes files. What I need is to:
start a default image on Google Cloud (I think Docker would be a good solution) using an API or a run command
download files from Google Cloud Storage
process them, i.e. run my software on those downloaded files
upload the result to Google Cloud Storage
shut the image down, expecting not to be billed anymore
What I do know is how to create my image, hehe. But I can't find any info telling me which Google Cloud service I should use, or even whether I can do it the way I'm imagining. I think I'm not using the right keywords to find what I need.
I was looking at Kubernetes, but I couldn't figure out how to drive those instances to execute a one-time processing job.
[EDIT]
To explain the process better: I have an app that receives images and sends them to Google Cloud Storage. After that, I need to process those images: apply filters, georeference them, split them, etc. So I want to start a Docker image to process them and upload the results to Google Cloud Storage again.
If you are using any of the runtimes supported by Google Cloud Functions, they are the easiest way to do those kinds of operations (i.e. fetch something from Google Cloud Storage, perform some actions on those files and upload them again). A Cloud Function is triggered by an event of your choice, and after the job is done, it dies.
The next option in terms of complexity would be to deploy a Google App Engine application in the standard environment. It allows you to deploy your own application written in any of the languages supported by this environment. While there is traffic to your application, you will have instances serving it, but the number of running instances can go down to 0 when they are not serving, which means lower cost.
Another option would be Google App Engine in the flexible environment. This product allows you to deploy your application in any custom runtime. This option always has at least one instance running, so it would never shut down.
Lastly, you can use Google Compute Engine to "create and run virtual machines on Google infrastructure". Unlike GAE, this is not as managed by Google, which means most of the configuration is up to you. In this case, you would need to programmatically tell your VM to shut down after you have finished your operations.
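As a minimal sketch of that last step, run inside the VM once processing finishes (this assumes the VM's service account is allowed to stop instances; stopping ends compute billing, though attached disks still incur storage charges):

    # look up this instance's own name and zone from the metadata server
    NAME=$(curl -s -H "Metadata-Flavor: Google" "http://metadata.google.internal/computeMetadata/v1/instance/name")
    ZONE=$(curl -s -H "Metadata-Flavor: Google" "http://metadata.google.internal/computeMetadata/v1/instance/zone" | awk -F/ '{print $NF}')
    gcloud compute instances stop "$NAME" --zone="$ZONE"   # stop (not delete) this VM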
Based on your edit where you stated that you already have an app that is inserting images into Google Cloud Storage, your easiest option would be to use Cloud Functions that are triggered by additions, changes, or deletions to objects in Cloud Storage buckets.
You can follow the Cloud Functions tutorial for Cloud Storage to get an idea of the generic process and then implement your own code that handles your specific tasks. There are other tutorials, like the ImageMagick tutorial for Cloud Functions, that might also be relevant to the type of processing you intend to do.
Cloud Functions is probably your lightest-weight approach. You could of course build more full-scale applications, but that is likely overkill, more expensive, and more complex. You can write your processing code in Node.js, Python, or Go.
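As a sketch, deploying such a function could look like the following; the function name, runtime, and bucket name are placeholders, and google.storage.object.finalize fires when an upload to the bucket completes:

    gcloud functions deploy process_image \
        --runtime=python39 \
        --trigger-resource=my-upload-bucket \
        --trigger-event=google.storage.object.finalize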