I am trying to join a Riak node to a cluster programmatically, but I can't manage to do it. I have tried this operation through the web interface and Riak Control, but I get a 403 Forbidden HTTP error code.
Is there any other way to join a cluster programmatically?
There is no out-of-the-box way of doing it. You have to wrap the console commands for building a cluster in a web service (e.g. a RESTful API) so that you can call them remotely.
You may also try the experimental Riak Explorer; see Explorer API endpoints, #3.
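As a rough sketch of the "wrap the console commands" idea, the service could simply shell out to riak-admin. Everything here (the port, the endpoint path, the node name, and the complete absence of authentication) is an illustrative assumption, not a Riak API:

```typescript
// Minimal sketch: expose `riak-admin cluster join` over HTTP (add authentication before using anything like this).
import express from "express";
import { execFile } from "node:child_process";

const app = express();
app.use(express.json());

// POST /cluster/join  { "node": "riak@10.0.0.2" }  -- hypothetical node name
app.post("/cluster/join", (req, res) => {
  const node = req.body.node;
  execFile("riak-admin", ["cluster", "join", node], (err, stdout, stderr) => {
    if (err) return res.status(500).json({ error: stderr || err.message });
    // A real service would also expose `cluster plan` and `cluster commit`,
    // since a join only takes effect once the plan is committed.
    res.json({ output: stdout });
  });
});

app.listen(3000); // arbitrary port for the wrapper service
```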
I am trying to set up an app on AWS that ...
Deploys a React app to an S3 bucket
Deploys a Node backend that interacts with an AWS RDS database
Connects the React front end to the Node backend to do CRUD operations
Doing part 1 is easy and there are plenty of tutorials. However, parts 2 and 3 seem totally foreign to me. I have found nothing that explains how to tie the front end to the database or how to tie the front end to the back end.
Do I need an API Gateway?
Does the node backend have to be hosted on an EC2 instance?
If so, how do I do this?
Where does cloudformation come into play?
I have found nothing that explains how to tie the front end to the database or how to tie the front end to the back end.
The frontend connects to the backend by making HTTP API calls (via fetch or a library like axios) to the URL of the backend server.
The backend would connect to the database via NodeJS database connections.
The frontend should never connect directly to the database.
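For example, the shape is roughly like this (a minimal sketch with hypothetical endpoint and table names, assuming a MySQL-flavoured RDS instance and an Express backend):

```typescript
// Backend (Node/Express) -- the only tier that talks to the RDS database.
import express from "express";
import mysql from "mysql2/promise";

const app = express();

app.get("/api/items", async (_req, res) => {
  // Connection details would come from environment variables in practice.
  const db = await mysql.createConnection({
    host: process.env.RDS_HOST,
    user: process.env.RDS_USER,
    password: process.env.RDS_PASSWORD,
    database: "appdb", // hypothetical database name
  });
  const [rows] = await db.query("SELECT id, name FROM items"); // hypothetical table
  await db.end();
  res.json(rows);
});

app.listen(3000);

// Frontend (React) -- calls the backend's URL, never the database:
//   const res = await fetch("https://api.example.com/api/items");
//   const items = await res.json();
```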
Do I need an API Gateway?
Using API Gateway is entirely optional.
Does the node backend have to be hosted on an EC2 instance?
The Node backend needs to be deployed on a compute service that can run NodeJS code, such as AWS EC2, ECS, EKS, Lambda...
If so, how do I do this?
This part of your question is so broad it is off-topic for this site. Given your level of experience I suggest looking at AWS Elastic Beanstalk for deploying your backend.
Where does cloudformation come into play?
CloudFormation is a tool for defining your AWS infrastructure as code: instead of clicking around in the AWS console to create everything, and then not being able to reproduce that reliably when you need to, everything is defined in template files that can be tracked in source control.
It "comes into play" only if you decide to use an Infrastructure-as-Code tool, in which case CloudFormation is one option. It is entirely optional.
I'm testing the waters for running Apache Airflow on AWS through Managed Workflows for Apache Airflow (MWAA). The version of Airflow that AWS has deployed and is managing for me is 1.10.12.
When I try to access the v1 REST API at /api/experimental/test I get back status code 403 Forbidden.
Is it possible to enable the experimental API in MWAA? How?
I think MWAA provides a REST endpoint for using the CLI:
https://$WEB_SERVER_HOSTNAME/aws_mwaa/cli
It's quite confusing because you first need to create a CLI token using the AWS CLI, and then hit the endpoint using that token. You will also need an IAM policy that allows your AWS CLI credentials to request that token.
Lastly, not all Airflow CLI commands are supported, only a subset.
Anyway, it's all explained in the user guide:
https://docs.aws.amazon.com/mwaa/latest/userguide/amazon-mwaa-user-guide.pdf
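As a rough sketch of that flow (assuming the @aws-sdk/client-mwaa package, a hypothetical environment name, and the base64-encoded stdout/stderr response format described in the user guide):

```typescript
// Sketch: get a short-lived CLI token, then POST an Airflow CLI command to the MWAA endpoint.
import { MWAAClient, CreateCliTokenCommand } from "@aws-sdk/client-mwaa";

async function runAirflowCli(command: string) {
  const client = new MWAAClient({ region: "eu-west-1" }); // assumed region
  const { CliToken, WebServerHostname } = await client.send(
    new CreateCliTokenCommand({ Name: "MyAirflowEnvironment" }) // hypothetical environment name
  );

  const res = await fetch(`https://${WebServerHostname}/aws_mwaa/cli`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${CliToken}`,
      "Content-Type": "text/plain",
    },
    body: command, // e.g. "version" or "list_dags" (Airflow 1.10.x CLI syntax)
  });

  // stdout/stderr come back base64-encoded per the user guide.
  const { stdout, stderr } = await res.json();
  console.log(Buffer.from(stdout, "base64").toString());
  console.error(Buffer.from(stderr, "base64").toString());
}

runAirflowCli("version");
```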
By default, the api.auth_backend configuration option is set to airflow.api.auth.backend.deny_all in MWAA environments. You need to override it to one of the authentication backends mentioned in the Airflow documentation.
Note: it is highly discouraged to use airflow.api.auth.backend.default, as it will leave your environment publicly accessible.
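If you would rather set the override from code than through the console's Airflow configuration options section, a sketch using the MWAA UpdateEnvironment API could look like this (the environment name is hypothetical and the value shown is only a placeholder; pick one of the backends from the Airflow documentation):

```typescript
// Sketch: override api.auth_backend as an MWAA "Airflow configuration option".
import { MWAAClient, UpdateEnvironmentCommand } from "@aws-sdk/client-mwaa";

async function overrideAuthBackend() {
  const client = new MWAAClient({ region: "eu-west-1" }); // assumed region
  await client.send(
    new UpdateEnvironmentCommand({
      Name: "MyAirflowEnvironment", // hypothetical environment name
      AirflowConfigurationOptions: {
        // Placeholder: choose one of the auth backends from the Airflow docs;
        // per the note above, avoid airflow.api.auth.backend.default.
        "api.auth_backend": "<auth backend module path>",
      },
    })
  );
}

overrideAuthBackend();
```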
[2021/07/29] Edit:
Based on this comment, AWS blocked access to the REST API.
I am currently working on a web portal for a foundation. Applicants for a grant will receive access credentials in advance, independently of this portal. New applications will then be created and processed in the portal itself. Once an application is complete, it is submitted. Later, the application will be approved or rejected.
There are a number of technical requirements over which I have no influence. The frontend should be implemented using HTML + JavaScript. The backend should use Amazon Web Services (AWS). If anything needs to be programmed for the backend, C# should be used.
I know how to implement a classic client-server solution. At the moment, however, AWS offers me an unmanageable number of services, and I'm hoping for suggestions as to which of them I should take a closer look at. Ideally, no complete 'server solution' should run on a virtual server; instead, Lambda functions are mentioned again and again. So would Amazon RDS and AWS Lambda be a sensible and sufficient combination? Did I miss something?
Thank you very much for your suggestions.
One solution would be to use AWS S3 to serve the HTML, CSS, JS, images and other static content. You could use AWS Lambda behind AWS API Gateway as a backend. AWS Lambda would then connect to AWS RDS, or to AWS DynamoDB if you prefer a NoSQL solution.
(Architecture diagram taken from the AWS GitHub repo.)
You can get a more detailed description of how to set this up at
https://github.com/aws-samples/aws-serverless-workshops/tree/master/WebApplication/
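To give a rough idea of the Lambda piece of that picture, here is a minimal sketch of a handler behind API Gateway reading from a hypothetical DynamoDB table (shown in TypeScript for brevity; the same shape applies to a C# Lambda, which the question calls for):

```typescript
// Sketch: Lambda handler behind API Gateway, reading one item from DynamoDB.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, GetCommand } from "@aws-sdk/lib-dynamodb";

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

export const handler = async (event: { pathParameters?: { id?: string } }) => {
  const id = event.pathParameters?.id;
  if (!id) {
    return { statusCode: 400, body: JSON.stringify({ error: "missing id" }) };
  }

  // "Applications" table and "id" key are hypothetical names.
  const result = await ddb.send(
    new GetCommand({ TableName: "Applications", Key: { id } })
  );

  return {
    statusCode: result.Item ? 200 : 404,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(result.Item ?? { error: "not found" }),
  };
};
```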
Just wondering if this is possible. In NiFi, it is possible to connect to S3 buckets.
Can you call Comprehend? Or is that capability totally beyond the pale? Thanks
There are no out-of-the-box Apache NiFi processors to communicate with AWS Comprehend at the moment, but there are multiple ways you can achieve this:
ExecuteStreamCommand using the AWS CLI -- execute shell commands that use the CLI tool to communicate with AWS (a script-based variant of this approach is sketched after this list)
ExecuteScript with the AWS SDK -- execute custom code in Groovy/Python/Ruby using the relevant AWS SDK
InvokeHTTP with the Comprehend API -- execute HTTP requests sending and receiving JSON content
CustomProcessor with the AWS SDK -- write a custom processor using the AWS Java SDK
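A minimal sketch of that script-based variant, assuming a small Node/TypeScript program that ExecuteStreamCommand invokes (e.g. `node detect-sentiment.js`) with the flowfile text piped to stdin:

```typescript
// Sketch: read flowfile text from stdin, call Comprehend, write the sentiment result to stdout.
// Intended to be invoked by NiFi's ExecuteStreamCommand; NiFi captures stdout as the new flowfile content.
import { ComprehendClient, DetectSentimentCommand } from "@aws-sdk/client-comprehend";

async function main() {
  const chunks: Buffer[] = [];
  for await (const chunk of process.stdin) chunks.push(chunk as Buffer);
  const text = Buffer.concat(chunks).toString("utf8");

  const client = new ComprehendClient({ region: "us-east-1" }); // assumed region
  const result = await client.send(
    new DetectSentimentCommand({ Text: text, LanguageCode: "en" })
  );

  process.stdout.write(
    JSON.stringify({ sentiment: result.Sentiment, scores: result.SentimentScore })
  );
}

main().catch((err) => {
  process.stderr.write(String(err));
  process.exit(1);
});
```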
You can also open a feature request on the NiFi Jira for this capability.
I have a requirement to develop a REST API with a database on AWS, using our custom JAR. The JAR will process the data coming in with each request; once it is processed, we will return the result from the JAR in the response.
We have:
Our Java application that will process the data.
A need to develop an authorisation platform for various clients using the REST API.
A need to log every transaction that is requested, and how many are rejected or processed successfully.
We are thinking of deploying the complete application on AWS, so I am looking for the best free study material on developing for and deploying on AWS (budget issue).
Please suggest where I should start, as I am a newbie on the cloud platform.
Thanks in advance for the help.
To save on costs with AWS, try a serverless architecture.
Use:
S3: to host your frontend code by enabling static website hosting on your bucket
Lambda: to host your backend code that inserts into and retrieves from the database. You get 1 million requests free per month (a minimal handler sketch follows this list)
API Gateway: provides an interface to access the Lambda function, and detailed logging can be sent to CloudWatch. It also provides authorization with API keys and Cognito user pools
DynamoDB: an AWS-managed NoSQL database that gives you 25 read and 25 write capacity units of provisioned throughput for free
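As a minimal sketch of the Lambda piece (the table name, fields and processing step are all placeholders), a handler behind API Gateway could process each request and log the transaction to DynamoDB:

```typescript
// Sketch: API Gateway -> Lambda handler that processes a request and logs the transaction to DynamoDB.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, PutCommand } from "@aws-sdk/lib-dynamodb";
import { randomUUID } from "node:crypto";

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

export const handler = async (event: { body?: string }) => {
  const payload = JSON.parse(event.body ?? "{}");

  // Placeholder for the real processing step (e.g. calling into your own library).
  const processed = { ...payload, processedAt: new Date().toISOString() };

  // Log the transaction -- "Transactions" is a hypothetical table name.
  await ddb.send(
    new PutCommand({
      TableName: "Transactions",
      Item: { id: randomUUID(), status: "PROCESSED", request: payload },
    })
  );

  return { statusCode: 200, body: JSON.stringify(processed) };
};
```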
You can start with this
https://medium.com/byteagenten/serverless-architecture-with-aws-adcaa3415acd?source=linkShare-22ecbac0bdc-1526628767