Can I upload any content to GCP storage? [closed] - google-cloud-platform

A friend suggested an idea to me for an adult entertainment site, and I liked it, so I started the project and chose GCP for cloud services.
The site should allow users to upload their own content, so it has to be stored somewhere. That in itself is no big deal, but given the nature of the content, I wonder whether GCP has some kind of limitation or prohibition on storing certain types of files.
Does anyone know?

Google Cloud Storage is a great option for storing files like images and videos. The service does not have restrictions on that kind of content (unless, of course, the site promotes illegal activities), as you can see in its Acceptable Use Policy. You should also make sure you are not violating any intellectual property rights, or the content may be removed or disabled, as described in its DMCA Policy.
As an extra, you might want to take a look at some of the technical restrictions of the service itself, which you may want to enforce as validation in your site (see the sketch after this list):
Don't upload files larger than 5 TB.
Use valid names for your buckets.
Use valid object (file) names.
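For illustration, a minimal sketch of a server-side upload that checks those limits before writing to Cloud Storage. It assumes the google-cloud-storage Python library and default credentials; the bucket name, object name, and file path are placeholders, not anything from the question:

    # Minimal sketch, not production code. "my-uploads-bucket" and the
    # paths below are hypothetical placeholders.
    import os
    from google.cloud import storage

    MAX_OBJECT_SIZE = 5 * 1024**4  # Cloud Storage's 5 TiB per-object limit
    MAX_NAME_BYTES = 1024          # object names: valid UTF-8, at most 1024 bytes

    def upload_user_file(local_path: str, object_name: str) -> None:
        if os.path.getsize(local_path) > MAX_OBJECT_SIZE:
            raise ValueError("File exceeds the 5 TiB object size limit")
        if len(object_name.encode("utf-8")) > MAX_NAME_BYTES:
            raise ValueError("Object name is longer than 1024 bytes")

        client = storage.Client()
        bucket = client.bucket("my-uploads-bucket")
        bucket.blob(object_name).upload_from_filename(local_path)

    upload_user_file("/tmp/clip.mp4", "uploads/user123/clip.mp4")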

Related

Not able to launch any application in GCP, billing not working [closed]

To practice Kafka, I created a free GCP account.
I entered my account information, and money was deducted from my credit card.
When I try to launch my Kafka application page, I am redirected back to the account information page and asked for payment again. I have made the payment almost 10 times, and the Kafka application is never launched.
The same happens when I try to link a billing account to my project or do anything else from the GCP console.
I don't understand what went wrong. Can someone please guide me, as I am new to GCP?
You can try to change your payment method (the available options vary depending on the country you live in), and if that doesn't help (or you are unable to for some reason), I recommend contacting Google Support; for billing issues it's free of charge.
You can contact GCP Billing Support via chat and explain the issue.
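If the console flow keeps looping, it may also be worth trying to attach the billing account programmatically; this won't bypass a payment-verification hold, but the API often returns a more specific error than the redirect. A hedged sketch using the google-cloud-billing client library, where the project ID and billing-account ID are placeholders:

    # Hypothetical IDs; requires the google-cloud-billing library and
    # credentials with billing permissions on both resources.
    from google.cloud import billing_v1

    client = billing_v1.CloudBillingClient()
    info = client.update_project_billing_info(
        name="projects/my-kafka-project",
        project_billing_info=billing_v1.ProjectBillingInfo(
            billing_account_name="billingAccounts/000000-000000-000000",
        ),
    )
    print("Linked billing account:", info.billing_account_name)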

Managed Service Provider on Google Cloud Platform [closed]

If a Managed Service Provider (MSP) wants to monitor a client's existing GCP projects that are associated with the client's own billing account (meaning the client pays Google for them directly), how can the MSP start monitoring those projects? What IAM strategies enable the MSP to start monitoring the client's projects?
Using Cloud IAM (Identity and Access Management), any member can be granted access to, or control over, GCP resources and projects. You can check this documentation for the various roles that can be given to a member.
For your specific concern about monitoring billing, a member (in your case, the MSP) can be granted a role under "Cloud Billing Roles", which lets that member view or manage the various billing aspects of the GCP projects and resources.
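As a concrete illustration, here is a hedged sketch of granting an MSP read-only Monitoring access on a client project via the Cloud Resource Manager API. The project ID and group address are placeholders, and billing roles (which are granted on the billing account itself, not the project) are not shown:

    # Hypothetical names; requires google-api-python-client and credentials
    # with permission to set IAM policy on the project.
    from googleapiclient import discovery

    PROJECT_ID = "client-project"             # placeholder
    MSP_MEMBER = "group:msp-ops@example.com"  # placeholder
    ROLE = "roles/monitoring.viewer"          # read-only Monitoring access

    crm = discovery.build("cloudresourcemanager", "v1")

    # Standard read-modify-write cycle on the project's IAM policy.
    policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()
    bindings = policy.setdefault("bindings", [])
    binding = next((b for b in bindings if b["role"] == ROLE), None)
    if binding is None:
        binding = {"role": ROLE, "members": []}
        bindings.append(binding)
    if MSP_MEMBER not in binding["members"]:
        binding["members"].append(MSP_MEMBER)

    crm.projects().setIamPolicy(
        resource=PROJECT_ID, body={"policy": policy}
    ).execute()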
Beyond your specific use case, you can also check this blog for updates on MSP partners; you can leave suggestions in the comments there as well.

How to use one website with several Amazon S3 buckets as path suffixes? [closed]

Could you please advise: I would like to host multiple Amazon S3 buckets under one site, like this:
Access to bucket1 via http://example.com/bucket1 (bucket2, and so on).
What Amazon currently allows is access via links like http://s3-ap-northeast-1.amazonaws.com/bucket1, but that is not acceptable in my case (a production server). Note that the site's DNS is also hosted at Amazon Route 53.
Thanks in advance.

Accidentally deleted Amazon Linux EC2 home directory, my whole site blew up. How to get back up and running? [closed]

I was using sudo rm -rf to delete some folders and accidentally deleted the /Users/ec2-user home directory. I had EC2 and AWS Route 53 running my site, http://thinklikegeek.com. Now it has just disappeared. I have all the files for my website but don't know how to get it back up and running; visiting the site just returns "No Data Received". Help!
This isn't an answer for how to get your files back. It's about how to make this a non-event the next time it inevitably happens.
You should treat EC2 instances as being disposable. That means that you fully expect an EC2 instance to go down, so you design your application and deployment strategy around that fundamental concept.
Your deployments should come from the latest copy of your source code in source control (e.g., Git, SVN) and should be 100% automated, apart from kicking off the process.
Next, don't store important data on the instance itself. Keep the data either in S3 or on a persistent EBS volume: enable versioning in S3, and take regular snapshots of your EBS volumes.
Since there is a 100% chance of something bad happening to an instance in the future, plan ahead, and make sure you're treating your instances as disposable compute units.
This really has nothing to do with AWS.
I'm not sure why you mentioned Route 53: it's not lost, and you can still control it from the AWS console (including pointing DNS at a new box if needed).
In theory, there are ways to restore deleted files, but it depends heavily on your filesystem and takes an expert, because the tools are complex. (Don't create a lot of files in the meantime, or the deleted data will be overwritten.)
Your best defense is regular snapshots of your EBS volumes. (Write a small script and run it on a schedule; see the sketch below.)
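For example, a minimal sketch of such a snapshot job with boto3; the volume ID and region are placeholders for your own values. Run it from cron or another scheduler to keep regular restore points:

    # Hypothetical volume ID and region; requires boto3 and AWS credentials
    # allowed to call ec2:CreateSnapshot.
    import datetime
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")
    stamp = datetime.datetime.utcnow().strftime("%Y-%m-%d-%H%M")
    resp = ec2.create_snapshot(
        VolumeId="vol-0123456789abcdef0",
        Description=f"nightly-backup-{stamp}",
    )
    print("Started snapshot:", resp["SnapshotId"])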

Amazon EC2 - US East vs US West... Does it matter? [closed]

My question is not so much about Amazon's specific differences between these two regions, but more about how the distance could affect an application's performance.
I have an application that will only be accessed by US West users (I know this for sure). However, my EC2 instances are located in US East. Should I worry about that?
Will my applications performance really be that affected?
Retrieving web pages vs streaming media, does this make a big difference?
From personal experience, it doesn't make a whole lot of difference. But if all your users are in the Western US, I would still run the servers in the us-west-1 or us-west-2 region. Latency will probably be slightly lower, so it also depends on your customers' latency requirements. For example, we have a strict 70 ms latency requirement from our customers, so we want our servers located close to them.
Furthermore, if you want higher performance, you might want to consider upgrading the sizes of your instances (EC2), databases (RDS), etc.
If you want to migrate your app from us-east-1 to one of the us-west regions, you can, for example, create AMIs of your instances and then copy them over using the AWS AMI Copy functionality (see the sketch below).
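For instance, a hedged sketch of that copy with boto3; the AMI ID and name are placeholders. Note that the copy is initiated from the client for the destination region:

    # Hypothetical AMI ID; requires boto3 and permission to call ec2:CopyImage.
    import boto3

    # The copy is initiated from the destination region's client.
    ec2_west = boto3.client("ec2", region_name="us-west-2")
    resp = ec2_west.copy_image(
        Name="my-app-server-us-west-2",
        SourceImageId="ami-0123456789abcdef0",
        SourceRegion="us-east-1",
    )
    print("New AMI in us-west-2:", resp["ImageId"])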