MySQL Server + phpMyAdmin on Ubuntu Server 20.04 from Google Cloud Platform, where to start? - google-cloud-platform

I am new to Google Cloud Platform. I deployed a Marketplace solution that I think will help me install Craft CMS, but how do I get started? Where can I find the password of the MySQL root user, and where is the phpMyAdmin password? I don't know where to start with the error I am facing.
deployment

You have to finish the installation of MySQL after deploying the solution. You can't log in to phpMyAdmin because you have not set a password for root yet.
I assume you deployed the solution MySQL Server + phpMyAdmin on Ubuntu Server 20.04 and now have a VM that you can SSH into.
SSH into the VM.
Run the command sudo mysql_secure_installation to start the MySQL configuration.
Follow the on-screen prompts and reply 'y' to them.
Run the following commands to set a password for root. Be sure to replace 'your_pass_here' with your own password.
sudo mysql
ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY 'your_pass_here';
FLUSH PRIVILEGES;
Now you can log off your SSH session and log in to phpMyAdmin with your new password.
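To confirm the new password works before leaving the SSH session, a quick check could look like this (a minimal sketch; it just logs in as root with the new password and prints the current user):
mysql -u root -p -e "SELECT CURRENT_USER();"
# Enter the password you set above when prompted; if it prints root@localhost,
# phpMyAdmin should accept the same credentials.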

If you deployed one of the Marketplace solutions, you will have a deployment manifest in the Deployment Manager section of the Google Cloud console.
Go to the Google Cloud Platform console: https://console.cloud.google.com/
Ensure that you have the correct project selected (the one in which you deployed the Marketplace solution in the first place) in the dropdown menu at the top of the screen.
In the search bar, type Deployment Manager and select "Google Cloud Deployment Manager".
Press the button "Go to Cloud Deployment Manager".
You will see all the deployments for that project listed there; you should be able to find your deployment.
Click on the deployment name, and on the next screen you will find the deployment specifications. Usually the names and passwords for the deployment are listed there.
(Screenshot: an example of a deployment as seen in the Deployment Manager.)
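If you prefer the command line, roughly the same details can be pulled with the gcloud CLI (a sketch; DEPLOYMENT_NAME is a placeholder for the name shown in the console):
gcloud deployment-manager deployments list
gcloud deployment-manager deployments describe DEPLOYMENT_NAME
# "describe" prints the deployment's resources and outputs, which is usually
# where generated usernames and passwords for Marketplace solutions show up.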

Related

Deployment script to Apache, permissions problem

I'm developing a Django application that is being deployed to an Apache server.
I'm using two servers right now. On one server I have the Development and Staging instances, which I deploy with a bash shell script, and on the other server the Production instance, which I deploy with a mina-deploy script.
The problem comes after deploying, since the permissions on the /var/www/... folder are not correct after the deployment, and this won't allow Apache to serve the website.
I was wondering if there is any way I can deploy this code without making any changes to permissions. On both servers I'm not using the root user but a user with sudo permissions.
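The thread has no accepted answer, but a common pattern (just a sketch, assuming Apache runs under the www-data group and the site lives under a hypothetical /var/www/myapp) is to have the deploy script hand the files to the web server's group at the end of each deploy:
# Paths and group name are placeholders; adjust for your distribution.
sudo chgrp -R www-data /var/www/myapp
sudo chmod -R g+rX /var/www/myapp
# The setgid bit keeps newly deployed files in the same group,
# so later deploys don't break permissions again.
sudo find /var/www/myapp -type d -exec chmod g+s {} +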

Launch Jupyter Notebooks in AWS SageMaker from a Custom Web Application

We are building a web portal/platform that will use services from AWS and Git, as both will host content that allows users to search for certain artifacts.
We also want to allow a user, after they have searched for certain artifacts (let's say certain Jupyter notebooks), to launch these notebooks from our web application. Note that the notebooks are in a different domain, i.e. the AWS Console application hosts them.
Now, when a user clicks on a notebook link from the web portal search, it should open the Jupyter notebook in a notebook instance in a new tab.
We understand there is an integration between AWS SageMaker and Git, so repos that store notebooks can be configured. When a user performs a search in the web app, it picks up the results from a GitHub API call.
The same repos can also be added to the SageMaker-GitHub integration through the AWS Console, so when a user launches the notebook they will see the GitHub repos as well.
I understand we call the SageMaker API either through the SDK or a REST API (not sure there is a REST API interface, still exploring that). See a CLI call example:
aws sagemaker create-presigned-notebook-instance-url --notebook-instance-name notebook-sagemaker-git
this gives me a response url "AuthorizedUrl": "https://notebook-sagemaker-git.notebook.us-east-2.sagemaker.aws?authToken=eyJhbGciOiJIUzI1NiJ9.eyJmYXNDcmVkZW50aWFscyI6IkFZQURlQlR1NHBnZ2dlZGc3VTJNcjZKSmN3UUFYd0FCQUJWaGQzTXRZM0o1Y0hSdkxYQjFZbXhwWXkxclpYa0FSRUZvUVZadGMxSjFSVzV6V1hGVGJFWmphRXhWUTNwcVlucDZaR2x5ZDNGQ1RsZFplV1YyUkRoTGJubHRWRzVQT1dWM1RTdDBTR0p6TjJoYVdXeDJabnBrUVQwOUFBRUFCMkYzY3kxcmJYTUFTMkZ5YmpwaGQzTTZhMjF6T25WekxXVmhjM1F0TWpvMk5qZzJOek15TXpJMk5UUTZhMlY1THpObFlUaGxNMk14TFRSaU56a3RORGd4T0
However, when I open this URL it again asks me for the AWS Console username and password. In the web app, a logged-in user would already have authenticated through the AWS API as well as the Git API.
So there should be no need for them to re-authenticate when they connect to the AWS Console to access their notebooks.
Is this something that can be circumvented using single sign-on, etc.?
thanks,
Aakash
The URL that you get from a call to CreatePresignedNotebookInstanceUrl is valid only for 5 minutes. If you try to use the URL after the 5-minute limit expires, you are directed to the AWS console sign-in page. See https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_CreatePresignedNotebookInstanceUrl.html
Jun
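One way to work within the 5-minute window (a sketch, not something stated in the answer) is to generate the presigned URL server-side at the moment the user clicks and redirect immediately; the instance name is the one from the question and the session duration is an assumed value:
# Generate a fresh presigned URL on each click and print only the URL.
aws sagemaker create-presigned-notebook-instance-url \
    --notebook-instance-name notebook-sagemaker-git \
    --session-expiration-duration-in-seconds 1800 \
    --query AuthorizedUrl --output text
# Redirect the browser to the printed URL right away; if it sits unused for more
# than 5 minutes, the user lands on the AWS sign-in page instead.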

Unable to Deploy from GCP Marketplace - Missing Valid Default Service Account

I receive an error message while attempting to deploy anything from the marketplace into a specific GCP project.
You must have a valid default service account in order to create a deployment, but this account could not be detected. Contact support for help restoring the account.
Things I've Tried:
Every VM from the marketplace shows the same error message
I can deploy a regular VM instance
I can see there is an enabled service account for the project with the name "Compute Engine default service account"
I am able to deploy VMs from the marketplace into other projects under the same organization
I've contacted GCP billing support and they cannot find anything wrong from a billing perspective
Researching online shows that others who have had this issue have just rebuilt the project. It appears that the service account is created by default when the project is spun up.
I'm hoping there is another way around it, as this project is the host for a shared VPC deployment. There are already other projects with deployed VMs that are using the host project's networks.
Thank you!
Looks like you deleted a default service account.
As mentioned in one comment, some of them can be recreated by disabling and re-enabling the corresponding API.
Below are the default service accounts I have in my project; I hope it helps you find the root cause. (These service accounts let me deploy a WordPress solution; depending on what you are trying to deploy, you might need more service accounts.)
PROJECT-NUMBER-compute@developer.gserviceaccount.com - Compute Engine default service account
PROJECT-NUMBER@cloudservices.gserviceaccount.com - Google APIs Service Agent
PROJECT-ID@appspot.gserviceaccount.com - App Engine default service account
service-ORG-ID3@gcp-sa-cloudasset.iam.gserviceaccount.com - Cloud Asset Service Agent
service-PROJECT-NUMBER@cloud-ml.google.com.iam.gserviceaccount.com - Google Cloud ML Engine Service Agent
service-PROJECT-NUMBER@compute-system.iam.gserviceaccount.com - Compute Engine Service Agent
service-PROJECT-NUMBER@container-engine-robot.iam.gserviceaccount.com - Kubernetes Engine Service Agent
service-PROJECT-NUMBER@containerregistry.iam.gserviceaccount.com - Google Container Registry Service Agent
service-PROJECT-NUMBER@dataflow-service-producer-prod.iam.gserviceaccount.com - Cloud Dataflow Service Account
service-PROJECT-NUMBER@service-networking.iam.gserviceaccount.com - Service Networking Service Agent
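To compare against the broken project, the project-level service accounts can be listed from the CLI (a sketch; PROJECT_ID is a placeholder, and Google-managed service agents will not appear in this list):
gcloud iam service-accounts list --project PROJECT_ID
# The Compute Engine default account should show up as
# PROJECT-NUMBER-compute@developer.gserviceaccount.com and not be disabled.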
The service account was intact and had the same permissions as other service accounts for working projects.
We purchased and opened a case with GCP technical support. After a little more than a week of troubleshooting, they determined there was no way to correct the problem. Their root cause was that something happened during the initial project deployment that caused some backend configuration issues. For what it's worth, the project was deployed using Terraform, but it's uncertain whether that was a factor.
After recreating the host project, we were able to deploy from the marketplace again successfully.
If you run into this problem, save yourself the hassle and time and just recreate the project.

Google Cloud Datalab and Cloud Shell access issues

I recently made a Google Cloud account which was migrated from a .com.au email address to a .com email address. Now when I log in to the Google Cloud Console, I correctly see my .com account and my IAM permissions are Owner; however, this migration does not seem to have propagated to my Google Cloud Shell and pre-existing Datalab instances.
When I try to do:
datalab connect test1 --no-user-checking
I get the cloud shell to connect and state that I can:
select *Change port > Port 8081*, and start using Datalab
However, when I go to port 8081 I get the error:
Error: Unauthorized
You are currently logged in as xxx.com.au which does not have access to Cloud Shell 3456864.
This is odd, because the Google Cloud Platform console clearly shows I am logged in as xxx.com.
The same error occurs if I do a 'datalab create newbook': the Compute Engine instance is created, but when I go to connect to port 8081 it will not allow me access (same error as above).
The only exception is if I authenticate a local shell SDK with my xxx.com address and have done:
gcloud components install datalab
Then I can run datalab connect test1 without any user checking. So it is only the Google Cloud Platform that is not allowing the connection.
The Cloud Shell and Code Editor Beta both show the same error as above, i.e. somehow Cloud Shell is not seeing that I am logged in with my new .com profile rather than my old .com.au profile, even though the platform can clearly see the difference. I'd rather not delete my entire profile and start again, so any ideas are appreciated.
Get credentials for your user account via a web-based authorization flow with the following command:
gcloud auth login
This command will provide you the link to get the verification code.
Once verified, your cloud shell configuration will be updated to the new account.
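If the old account still shows as active after that, a quick check and switch might look like this (a sketch; the account address is a placeholder):
gcloud auth list
# The active account is marked with an asterisk; switch if it is still the old one.
gcloud config set account xxx@example.com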
Since this seems to be unsolved, I'll post the one solution I came across which has worked (as of now):
1) gcloud init
2) complete clearing of all browser caches
3) logout and remove browser from profile
4) restart computer
5) restart browser with new login
6) works!!!
I still got the error:
"Permission denied (publickey).
ERROR: (gcloud.compute.ssh) [/usr/bin/ssh] exited with return code [255]."
.... but Datalab is now accessible. I then tried again with a login from an incognito window, and now it works without the permission error (but only in incognito mode). Not ideal, but it will work in a pinch. This may be a cousin of some of the errors that can occur on GCP, as seen in Qwiklabs.

How can I log in to phpMyAdmin on AWS Lightsail?

I just set up an AWS Lightsail account and chose the LAMP installation, provided by Bitnami.
According to the Bitnami documentation, to connect to phpMyAdmin, you create an SSH tunnel and browse to the path given. Great, I can see the login page, but where do I get the credentials?
To log in, use username root for MySQL and the application password from the detail page for your cloud server.
What password are they talking about?
It is not the password to log in to my AWS Console. Nor is it, as some threads on the Bitnami site suggest, bitnami or bitnami1.
Does anyone know what password the service is looking for?
The password is found by connecting to the server via SSH and running this command in your home directory:
cat bitnami_application_password
Or this command in any directory:
cat $HOME/bitnami_application_password
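For reference, the SSH tunnel mentioned in the question usually looks something like this (a sketch based on the Bitnami docs pattern; the local port, key path, and server IP are placeholders, not values from this thread):
# Forward local port 8888 to Apache on the instance, then browse to
# http://127.0.0.1:8888/phpmyadmin and log in as root with the password above.
ssh -N -L 8888:127.0.0.1:80 -i ~/LightsailDefaultKey.pem bitnami@SERVER-IP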
Username: root (it's the default).
Password: something like MnQERJ8gcHkQ.
Connect to the Bitnami Lightsail instance through SSH from the Lightsail dashboard to access the command line, then use cat bitnami_application_password to get the password. This is the same password you use to access WordPress (on Lightsail) for the first time.
Everything is explained in the Bitnami Documentation:
https://docs.bitnami.com/aws/faq/#using-amazon-lightsail
You can find information there regarding how to find your credentials depending on the AWS service you used to launch your server.