When using a Google Cloud SQL instance, SSL can be enabled using the server certificate server-ca.pem downloaded from the instance's Connections section. The server certificate appears to be valid for only a single instance, so if I have multiple instances I will need to download and use multiple server certificates.
Is there a way to upload or customize the server certificate of the Google Cloud SQL instances that I am using? My goal is to use a single root CA certificate that can connect to all of my Google Cloud SQL instances. I read through the Google documentation and am still not able to clearly understand whether this is possible. I'd appreciate any input from the community.
For example, Amazon RDS supports a root certificate that works for all AWS Regions. I would like to understand whether Google Cloud SQL offers something similar.
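To make the pain point concrete, here is a minimal sketch of what client code ends up doing when each instance has its own server CA. The instance names and file paths are made up for illustration:

```python
# Hypothetical per-instance CA bookkeeping: because each Cloud SQL
# instance ships its own server-ca.pem, the client must track one
# CA file per instance instead of a single shared root CA.
SERVER_CA_BY_INSTANCE = {
    "orders-db": "/etc/ssl/cloudsql/orders-db/server-ca.pem",
    "billing-db": "/etc/ssl/cloudsql/billing-db/server-ca.pem",
}

def ca_file_for(instance: str) -> str:
    """Return the server CA bundle to verify the given instance against."""
    try:
        return SERVER_CA_BY_INSTANCE[instance]
    except KeyError:
        raise ValueError(f"no server-ca.pem downloaded for {instance!r}")
```

With a driver such as PyMySQL you would then pass the per-instance file, e.g. `ssl={"ca": ca_file_for("orders-db")}`, whereas a single root CA (as with RDS) would let every connection use one fixed path.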
Currently this is not possible; you can only manage client certificates, as you mentioned. I found this Feature Request (FR) in the Public Issue Tracker. I would recommend that you "star" it to ensure that you receive updates about it. You can also adjust notification settings by clicking the gear icon in the top right corner and selecting settings.
FRs are evaluated based on the number of users they affect: the more stars one has, the more likely it is to be developed.
In the end, I don't think that having all the instances share the same certificate is the best path to follow. I understand that this could reduce the amount of sensitive data to manage, but you should "never put all your eggs in one basket". This could be risky.
I'm building a simple analytics service that needs to work for multiple countries. It's likely that someone from a restricted jurisdiction (e.g. Iran) will hit the endpoint. I am not offering any service that would fall under sanctions-related restrictions, but it seems that Cloud Run endpoints do not allow traffic from places like Iran. I tried various configurations (adding a domain mapping, an external HTTPS LB, calling from Firebase, etc.) and it doesn't work.
Is there a way to let read-only traffic through from these territories? Or is there another Google product that would allow this? It seems that the Google Maps prohibited-territory list applies to some services but not others (e.g. Firebase doesn't have this issue).
You should serve traffic through a load balancer with a Cloud Armor policy attached. Cloud Armor provides a feature for filtering traffic based on location.
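A rough sketch of that setup with gcloud might look like the following. The policy and backend names here are placeholders, and whether an allow rule at the load balancer actually lifts restrictions applied further upstream is something you would need to verify:

```shell
# Create a Cloud Armor security policy (name is a placeholder).
gcloud compute security-policies create allow-restricted-regions \
    --description "Let read-only traffic through"

# Match requests originating from Iran (ISO region code IR) and allow them.
gcloud compute security-policies rules create 1000 \
    --security-policy allow-restricted-regions \
    --expression "origin.region_code == 'IR'" \
    --action allow

# Attach the policy to the backend service sitting behind the external
# HTTPS load balancer (for Cloud Run, a serverless NEG backend).
gcloud compute backend-services update my-cloud-run-backend \
    --security-policy allow-restricted-regions --global
```

`origin.region_code` is Cloud Armor's geo attribute in its custom rules language; rules are evaluated in priority order, so the allow rule must come before any broader deny rule.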
I am exploring Google Dialogflow and creating a chatbot for learning purposes. I would like to store/fetch details captured from the Dialogflow chat session into a SQL Server DB on a local machine. Tools such as Workato and Celonis provide an on-prem agent that is installed on the respective machine and creates a tunnel for access without changing the machine's firewall.
I looked through the Google documentation but was unable to find a proper answer. It would be great to get guidance/support on how to connect to a SQL DB hosted on a local machine from the Dialogflow inline editor using an on-prem agent.
Please let me know if I need to add any other details on the mentioned scenario.
NOTE: Based on my Google search, I came to know that we can achieve this by writing Node.js code and creating a webhook call hosted with ngrok, or by storing the data in GCP Cloud SQL instances. But I wanted to know how to save/fetch data in the local machine's SQL Server from Dialogflow.
Thanks in advance.
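For what it's worth, the webhook-plus-tunnel approach mentioned in the note can be sketched as follows (shown in Python with only the standard library rather than Node.js; the endpoint path and field handling are illustrative assumptions, and the actual SQL Server write via a driver such as pyodbc is left as a comment):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def extract_parameters(body: dict) -> dict:
    """Pull the intent name and parameters out of a Dialogflow ES
    webhook request body."""
    result = body.get("queryResult", {})
    return {
        "intent": result.get("intent", {}).get("displayName", ""),
        "parameters": result.get("parameters", {}),
    }

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length))
        details = extract_parameters(body)
        # ...INSERT `details` into the local SQL Server here,
        # e.g. with pyodbc against the on-prem database...
        reply = json.dumps(
            {"fulfillmentText": f"Saved intent {details['intent']}"}
        )
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply.encode())

# To run locally, then expose with e.g. `ngrok http 8080` and set the
# Dialogflow fulfillment URL to the resulting https address:
# HTTPServer(("", 8080), WebhookHandler).serve_forever()
```

This only reaches the local machine because the tunnel does; there is no built-in Dialogflow on-prem agent comparable to Workato's or Celonis's, as far as I know.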
I am working on a project where a user clicks a link/button labeled "Access VM" on a webpage; this should internally spin up a Linux-based VM (using GCP, AWS or Azure) and provide the VM terminal in a new browser tab for the user to play around in.
How can I achieve this using GCP/AWS/Azure? Which type of VM should I create so that the user can access the VM terminal over a browser without using an SSH client?
I tried creating a VM on Azure and explored the Bastion option, but a Bastion session must always be initiated from within the Azure portal.
Do we have any other option within GCP, AWS or Azure to achieve this?
I am looking for something similar to the Katacoda website.
There's no built-in feature in GCP that makes such a thing possible. There is an "SSH" button in the VM list, but you have to be able to view the list and also have permission to connect to the instance. That requires actually logging into GCP, which I believe is not what you want.
You could try to build a solution where, after clicking a "Connect" button, your website sends a series of commands to GCP's API to create and connect to a new instance. It's possible, but rather complicated.
Have a look at the documentation on how to connect to a VM using the browser; maybe it will give you some ideas.
You could ultimately use one of many third-party tools, but you still need to provide an address and credentials, and you would be relying on a service you don't control, so you have to take security (and reliability) into consideration.
Finally, you may also consider going through the general information on how to connect to GCP instances.
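As a starting point for the "series of commands to GCP's API" idea, here is a hedged sketch of the request body a backend might POST to the Compute Engine `instances.insert` endpoint after the user clicks "Connect". The machine type, image and network values are placeholder assumptions:

```python
def build_instance_body(name: str, zone: str = "us-central1-a") -> dict:
    """Build a Compute Engine instances.insert request body for a small
    throwaway Linux VM (values here are illustrative defaults)."""
    return {
        "name": name,
        "machineType": f"zones/{zone}/machineTypes/e2-small",
        "disks": [{
            "boot": True,
            "autoDelete": True,  # clean up the disk with the instance
            "initializeParams": {
                "sourceImage":
                    "projects/debian-cloud/global/images/family/debian-12",
            },
        }],
        "networkInterfaces": [{
            "network": "global/networks/default",
            # An access config gives the VM an ephemeral external IP.
            "accessConfigs": [
                {"type": "ONE_TO_ONE_NAT", "name": "External NAT"}
            ],
        }],
    }
```

You would POST this (with an OAuth 2.0 token) to `https://compute.googleapis.com/compute/v1/projects/PROJECT/zones/ZONE/instances`, then point a web-terminal layer such as ttyd, Wetty or Apache Guacamole at the new VM to get the browser tab experience; GCP itself does not hand you an embeddable terminal.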
I am planning to use Amazon WorkSpaces to run communication software that is restricted in a country I am about to visit in a few days. I was wondering if anyone can tell me whether it is safe to keep running communication software with personal credentials on Amazon WorkSpaces?
I am also confused about one thing: if I run a WorkSpace, will I get the same desktop each time? Or, if I log out from the client, will the existing desktop end, and will signing in again get me a new desktop with everything the same as before?
Amazon WorkSpaces provisions a virtual desktop that is always "yours". Just keep it running and connect to it whenever you want to use it. It will continue exactly where you left off, such as mid-sentence in a word processor.
Clients are available for Windows, Mac, iOS, Android and even via Web Browser so it should be easy to connect.
The only potential problem is if the country has blocked access to the AWS IP address range, which might happen if they want to block people from using VPN services.
I think it is standard procedure to be cautious whenever you are using an internet connection away from trusted connection points. However, it is quite secure to keep running your communication software on Amazon WorkSpaces; their security measures are advanced. You should also change your credentials regularly.
You will always get the same desktop any time you log in, so that shouldn't be a problem.
Alternatively, you can check out V2 Cloud's workspaces; they have an enterprise-grade security strategy to protect both your data and credentials. They use multi-factor authentication to ensure that even if your credentials are stolen, the thief can't sign into your workspaces.
As for having access to the same desktop: their desktops are very consistent, and you will always have access to the same desktop. They not only host your communication software, they render it to you via your web browser, so you don't have to install any client application as you do with AWS WorkSpaces.
I hope that helps.
I tried to publish my report to a dashboard and I get this error:
This data source cannot be accessed by the data gateway.
I used DirectQuery against my SQL Server, and I downloaded and installed the on-premises data gateway.
When I go to Manage Gateways in the Power BI service, it says I do not have any gateways.
I'm not really sure what to do. Can someone please guide me?
When using DirectQuery, you must configure the gateway's data sources under Manage Gateways.
If you cannot see any gateways under Manage Gateways, then one of four scenarios has occurred:
You have not finished the install process for the on-premises data gateway (OpDG).
You have not signed into the OpDG (locally, where it was installed).
You used a different account when publishing the report than when signing in to the OpDG.
(Very outlandish) You installed the OpDG somewhere that it simply cannot see an outside network. Perhaps a bad region? This can be changed in the local config page for the OpDG.
To fix this:
Reinstall the OpDG.
Sign in locally to the OpDG using the same account you published with.
Try changing the region (though I don't think it's this).
Next:
Under Manage Gateways, once you can see the gateway, you need to add a data source. The source must satisfy two conditions: it must use the FQDN of the server/database, AND it must be the exact same connection string used in the local PBIX file you published.
For example, if the gateway data source points at MyDomain.ServerOne.ProductionDatabase but in Power BI Desktop you connected to Localhost:ProdData, you'll have an issue. Make sure the PBIX file and the data source under your online gateway both use the FQDN.
Then you should be able to assign the gateway under the dataset settings (Dataset -> Schedule refresh -> Choose Data Gateway).
Good Luck!