I am working on an Arduino project that makes data available publicly, but I want to keep it accessible only to family members and a few guests. So I want to set up a login page where a user can sign in and view the data on a PHP website hosted locally (reachable over the internet via port forwarding). Since the login requires a database to store usernames and passwords, I would like to store those in DynamoDB. Storing them on AWS seems like a good idea since the database will grow over time, while the PHP pages can be stored and moved around easily. Another reason I would like to try this is that I will get to learn how to use NoSQL on DynamoDB along the way!
Could you guide me down the right path to host PHP locally while using Amazon DynamoDB to store the logins?
You can certainly connect to DynamoDB from outside of the Amazon environment as long as you have your credentials. The PHP Getting Started Guide should give you most of what you need.
When you're ready to move to an EC2 instance, a t2.nano machine is about USD 4.32 per month. That would let you set up a full PHP server that could also talk to the database, so you wouldn't have to host it locally.
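For reference, the credential-based connection from local PHP looks roughly like the sketch below, using the AWS SDK for PHP v3 (the Users table and its attribute names are placeholders I made up for this example, not anything the SDK prescribes):

```php
<?php
// A minimal sketch, assuming a "Users" table keyed on "username"
// with a "password_hash" attribute written with PHP's password_hash().
require 'vendor/autoload.php';   // composer require aws/aws-sdk-php

use Aws\DynamoDb\DynamoDbClient;

$client = new DynamoDbClient([
    'region'  => 'us-east-1',    // whichever region holds your table
    'version' => 'latest',
    'credentials' => [           // ideally an IAM user scoped to just this table
        'key'    => getenv('AWS_ACCESS_KEY_ID'),
        'secret' => getenv('AWS_SECRET_ACCESS_KEY'),
    ],
]);

// Fetch the stored hash for the login attempt.
$result = $client->getItem([
    'TableName' => 'Users',
    'Key'       => ['username' => ['S' => $username]],
]);

$item = $result['Item'] ?? null;   // null when the user does not exist
if ($item && password_verify($password, $item['password_hash']['S'])) {
    // Credentials check out - start the session here.
}
```

Storing only a password_hash() result (never the plain-text password) keeps the table reasonably safe even if it is ever exposed.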
Related
I am exploring Google Dialogflow and building a chatbot for learning purposes. I would like to store and fetch details captured from the Dialogflow chat session in a SQL Server database on a local machine. Tools like Workato and Celonis provide an on-prem agent that you install on the machine, which creates a tunnel so the database can be reached without changing the machine's firewall.
I looked through the Google documentation but could not find a clear answer. It would be great to get guidance on how to connect to a SQL Server database hosted on a local machine from the Dialogflow inline editor using such an on-prem agent.
Please let me know if I need to add any other details about this scenario.
NOTE: Based on my Google search, I learned that this can be achieved by writing Node.js code and exposing a webhook with ngrok, or by storing the data in GCP Cloud SQL instances. But I want to know how to save and fetch data in the local machine's SQL Server from Dialogflow.
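For illustration, the kind of webhook I have in mind would look roughly like this (sketched here in PHP with the pdo_sqlsrv driver rather than Node.js; the database, table, and column names are made up), exposed to Dialogflow through an ngrok tunnel such as `ngrok http 80`:

```php
<?php
// webhook.php - a rough sketch of a Dialogflow ES fulfillment webhook
// writing chat-session details into a local SQL Server database.
$request = json_decode(file_get_contents('php://input'), true);

// Intent name and collected parameters from the Dialogflow v2 request body.
$intent     = $request['queryResult']['intent']['displayName'] ?? '';
$parameters = $request['queryResult']['parameters'] ?? [];

// Local SQL Server connection (requires the pdo_sqlsrv extension).
$db = new PDO('sqlsrv:Server=localhost;Database=ChatbotDb', 'chatbot_user', 'secret');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Store the captured details from the chat session.
$stmt = $db->prepare('INSERT INTO ChatSessions (intent, payload) VALUES (?, ?)');
$stmt->execute([$intent, json_encode($parameters)]);

// Reply so Dialogflow can confirm to the user.
header('Content-Type: application/json');
echo json_encode(['fulfillmentText' => 'Saved your details.']);
```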
Thanks in advance.
I have a web app that connects to a Cloud Function (not Firebase, but GCP). The Cloud Function is in Python; it is large, and it also connects to Cloud SQL on Google.
Now I have bought a domain from Google, and I need to host a simple static website that will call this Cloud Function through its function URL and show the data to the client.
It needs to be fast and serve many users (it is a sort of search engine).
I would like to avoid Firebase Hosting for multiple reasons, mainly because I want to stay inside GCP, where I deploy and monitor everything.
As far as I can tell, my options for hosting this static(?) website with my custom domain in GCP are:
Load Balancer - an expensive, overkill solution.
Cloud Storage - which (I might be wrong) will be very limiting later if I need to manage paying users (or can I just pass the user ID to the Function as a parameter?).
Cloud Run - which I am not yet sure exactly what it does.
What is a solution that fits a light web app (HTML/JS) that can authenticate users and connect to a large Cloud Function through its URL with simple REST?
Also - can I change the URL of that Cloud Function to use my own domain without a Load Balancer? Currently it is something like project-348324
When using a Google Cloud SQL instance, SSL can be enabled using the server certificate server-ca.pem downloaded from the instance's Connections section. That server certificate appears to be valid for a single instance only, so if I have multiple instances I will need to download and use a separate server certificate for each one.
Is there a way to upload or customize the server certificate of the Google Cloud SQL instances I am using? My goal is to use a single root CA certificate that can be used to connect to all of my Google Cloud SQL instances. I read through the Google documentation and still cannot tell whether this is possible. I would appreciate any input from the community.
For example, Amazon RDS supports a root certificate that works across all AWS Regions. I would like to understand whether Google Cloud SQL offers something similar.
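For context, each connection today is configured roughly like the sketch below (PHP/PDO is just used as an example client; hosts, paths, and credentials are placeholders), which is where the per-instance server-ca.pem becomes awkward:

```php
<?php
// One connection per instance, each with its own downloaded server-ca.pem.
$options = [
    PDO::MYSQL_ATTR_SSL_CA   => '/certs/instance-1/server-ca.pem',   // per-instance CA
    PDO::MYSQL_ATTR_SSL_CERT => '/certs/instance-1/client-cert.pem',
    PDO::MYSQL_ATTR_SSL_KEY  => '/certs/instance-1/client-key.pem',
];
$db = new PDO('mysql:host=INSTANCE_1_IP;dbname=mydb', 'dbuser', 'dbpassword', $options);
```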
Currently this is not possible; you can only manage client certificates, as you mentioned. I found this Feature Request (FR) in the Public Issue Tracker. I would recommend that you "star" it to make sure you receive updates about it. You can also adjust the notification settings by clicking the gear icon in the top right corner and selecting Settings.
FRs are evaluated by the number of users affected: the more stars one has, the more likely it is to be developed.
In the end, I don't think having all the instances share the same certificate is the best path to follow. I understand that it could reduce the amount of sensitive data to manage, but you "should never put all your eggs in one basket"; it could be risky.
I want to build an application using Amazon Web Services (AWS).
The way the application should work is this:
I make a program that lets the user import a large file in an external format and send it to AWS (S3?) in my own format.
Then many users can access the data from web and desktop applications.
I want to charge per user accessing the data.
The problem is that the data on AWS must be in an unintelligible format, or the users could simply copy it over to another AWS account where I cannot charge them. In other words, the user needs to do some "decrypting" of the data before it can be used. On the web that would have to be done in JavaScript, which is shipped as plain text and would let users figure out my unintelligible format.
How can I fix this problem?
Is there for instance a built in encryption/decryption mechanism?
Alternatively, is there some easy way in AWS to set up a server that decrypts the data using precompiled code that I upload to AWS?
In general, when you don't want your users to access your application's raw data, you simply don't make that data public. You should build some sort of server-side process that reads the raw data and serves up only what the user is requesting. You can store the data in a database or in files on S3 or wherever you like; just don't make it publicly accessible. Then you can require a user to log in to your application in order to access the data.
You could host such a service on AWS using EC2, Elastic Beanstalk, or possibly Lambda. You could also use API Gateway to manage access to the services you build.
Regarding your specific question about a service on AWS that encrypts your public data and then decrypts it on the fly: there isn't anything that does that out of the box. You would have to build such a service and host it on AWS yourself, but I don't think that is the right way to go about this at all. Just don't make your data publicly accessible in the first place, and make all requests for data go through a service that verifies the user is allowed to access it. In your case that means verifying that the user has paid for the data they are requesting.
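To make that concrete, a gated download endpoint might look something like this sketch (AWS SDK for PHP v3; the bucket name, key layout, and session fields are placeholders, and the subscription check stands in for whatever billing logic you actually use):

```php
<?php
// download.php - serve private S3 data only to authenticated, paying users.
require 'vendor/autoload.php';   // composer require aws/aws-sdk-php

use Aws\S3\S3Client;

session_start();

// 1. Verify the user is logged in and has paid (placeholder session fields).
if (empty($_SESSION['user_id']) || empty($_SESSION['has_active_subscription'])) {
    http_response_code(403);
    exit('Access denied');
}

// 2. Only the server holds credentials for the private bucket; here they are
//    picked up from the environment or an attached instance role.
$s3 = new S3Client(['region' => 'us-east-1', 'version' => 'latest']);

$object = $s3->getObject([
    'Bucket' => 'my-private-data-bucket',
    'Key'    => 'datasets/' . basename($_GET['file'] ?? ''),  // basename() blocks path tricks
]);

// 3. Stream the data back to the authenticated user only.
header('Content-Type: application/octet-stream');
echo $object['Body'];
```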
I have been trying to get my website whitelisted for Twitter cards. Every time I apply for whitelisting I get an email saying "The URL you provided for indiaoutside.org to use the summary_image_large card is inaccessible."
I tried searching their community and found this. It appears that Twitter's crawler is unable to access my website. I am running this website on an AWS EC2 instance. How do I find out whether AWS is denying access to the crawler, and how do I change that?
I do not believe that AWS will block a crawler. Generally it is up to you, as the owner of the instance, to control which ports are open and what traffic you accept or reject - I suspect you are having a different problem.
Have you used the Twitter Card validator?
https://cards-dev.twitter.com/validator
If you can't get it to pass that test, it's not going to get approved by Twitter.