I wish to implement the answer that is outlined here: https://stackoverflow.com/a/50397276/1980516
However, I find that I keep running into the error "Unable to import module 'index'" at exactly this line:
const _archiver = require('archiver');
So, I'm guessing that I cannot do this via the online console. Instead, I probably have to create a deployment package.
How do I go about this? I apparently need the AWS CLI, Node.js, and npm, and I'm new to all of it. In the Amazon docs I can't find a practical list of how to set up my local development environment.
What tools do I install, which versions and in what order exactly?
Edit: Windows :)
My guess is that you need to npm install archiver and package the node_modules dependencies along with your index.js (the handler file for your Lambda entry point). You can then zip it up and deploy/upload it to your Lambda.
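For reference, a rough sketch of those steps from a terminal (the function name is a placeholder, and this assumes Node.js, npm and the AWS CLI are already installed and configured):

# in the folder containing your index.js handler
npm init -y                       # create a package.json if you don't have one
npm install archiver              # pulls archiver and its dependencies into node_modules
zip -r function.zip index.js node_modules package.json
# on Windows you can instead build the zip via Explorer's "Send to > Compressed folder"
aws lambda update-function-code --function-name my-function --zip-file fileb://function.zip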
Also have a look at the https://github.com/serverless/serverless framework, which makes these kinds of things easier.
Have a look at AWS SAM, the Serverless Application Model. It provides a local development setup for things like Lambda functions and API Gateway endpoints, and a way to easily package and deploy things. The exact steps you need are:
Create an AWS account and an IAM user with admin privileges
Install node.js
Install the AWS CLI (and configure it with aws configure)
Install the SAM CLI and Docker (the local instances run in Docker containers)
Initialize a new SAM project with sam init --runtime nodejs (or another runtime if needed)
Run through the quickstart to get an idea of how to define a SAM template, build a SAM app, and deploy; a minimal command sketch follows below.
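For orientation, the basic command sequence looks roughly like this (the project name is just an example, and the exact flags vary a bit between SAM CLI versions):

sam init --runtime nodejs --name sam-app    # scaffold a new project from a template
cd sam-app
sam build                                   # build the function and its dependencies
sam local invoke                            # run the function once in a local Docker container
sam deploy --guided                         # package and deploy, prompting for stack settings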
If you don't want to use the framework or a local development environment and just want to create the source bundle, there are docs for that. The gist is:
Install Node.js (e.g. using Homebrew or an installer)
npm install the modules you need
Zip up your code including the node_modules folder
Upload the zip via the AWS Console
I am working on an AWS SAM serverless project, on a Lambda function written in Node.js.
The Lambda execution environment already provides the AWS SDK, so it is not necessary to push this dependency into the deployment.
The problem arises when aws-sdk comes up as a nested dependency of another package.
For instance, I need aws-appsync, which depends in turn on aws-sdk.
Because of that, the deployment size is too big. The entire aws-appsync package with its dependencies weighs about 140 MB, a notable portion of it being the AWS SDK. In this situation, the maximum deployment size is exceeded and the deployment procedure fails.
Can I make npm install a package with all its dependencies except a specific one? I would exclude aws-sdk from the dependencies in this case.
An easy solution for this is to add aws-sdk as a devDependency instead of a normal dependency. You can then run npm i --production in your deployment pipeline before bundling the code and uploading it. This will ensure that the devDependencies are not present in node_modules. (I don't think it will remove them if they are already there, so if you're doing it locally, you might have to delete the node_modules folder before running the npm command.)
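A minimal sketch of that workflow as the answer describes it, assuming a plain npm setup (adapt the bundling step to whatever packaging you use):

npm install --save-dev aws-sdk    # records aws-sdk under devDependencies in package.json
rm -rf node_modules               # start clean so any previously installed copy is gone
npm i --production                # installs only the regular (non-dev) dependencies
zip -r function.zip .             # bundle the code without the dev-only packages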
I am using webpack and serverless to deploy to AWS Lambda. So far I have been able to configure it to bundle all dependencies into one ts file, but AWS complains there is no package.json. So, I found a way to upload the node_modules folder as well, which also brought in the package.json, but since I am on Windows the AWS instances don't like the libraries.
How do I include package.json when I run the serverless package or deploy commands so that aws lambda can run the install?
include:
  - package.json
Doesn't work.
If you're using the Serverless Webpack plugin, you should be able to get whatever native modules you need installed by using the packagerOptions config for the plugin and specifying the linux platform and x64 architecture along with the list of npm modules to package; that way the Linux builds end up in the bundle instead of the Windows ones.
See the Custom scripts section of the plugin's documentation for more info.
For example, if your Lambda function depends on the sharp npm package, you'd add something like the following to your serverless.yml file:
custom:
  webpack:
    includeModules: true
    packagerOptions:
      scripts:
        - npm_config_platform=linux npm_config_arch=x64 yarn add sharp
As far as I am aware, Google Cloud Functions only allows you to deploy Node.js or Python scripts.
Question: How would I be able to deploy a simple Hello_World.cpp on Google Cloud Functions? For example, writing a hello world HTTP function.
What are alternate methods to do this? I want to use a serverless approach, since it's the cheapest method; that is why I'm going with Google Cloud Functions. Would I have to use Docker in order to run C++ files? I've been stuck on this for a while and any guidance or help would be appreciated.
You can compile your C++ function into a WebAssembly module using Emscripten. Then you can call it from a small piece of Node.js glue code.
I built an example for you here:
https://github.com/ArthurSonzogni/gcloud-cpp-starter
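A rough sketch of the compile step, assuming the Emscripten SDK (emcc) is installed and hello_world.cpp is your source file:

emcc hello_world.cpp -O3 -o hello.js    # emits hello.js (the JS glue) plus hello.wasm
node hello.js                           # sanity check: runs main() locally and prints its output
# require('./hello.js') from your Cloud Functions index.js to call into the compiled module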
You can run C++ code via Node.js on Google Cloud Functions (tested with Node.js 10).
How to use C++ and N-API (node-addon-api): https://medium.com/@atulanand94/beginners-guide-to-writing-nodejs-addons-using-c-and-n-api-node-addon-api-9b3b718a9a7f
Use https://console.cloud.google.com/functions and click CREATE FUNCTION to upload a .zip, or use gcloud functions deploy --runtime nodejs10 --trigger-http
The trick is that when you create the zip you need to leave out the /build and /node_modules folders: on the command line, cd to the folder containing index.js and run 'zip your_name.zip -r *'.
P.S. When I use firebase deploy --only functions it errors because it doesn't understand the addon.node file format (in fact it shouldn't read this file, because it needs to be recompiled anyway), but I think that using the gcloud functions command line with a .gcloudignore for /build and /node_modules will succeed: https://cloud.google.com/functions/docs/deploying/filesystem
HOW IT WORKS
I think that when you deploy Node.js source code to Cloud Functions, it runs npm install, and your C++ code gets compiled during that step too (much like an npm run build that is run automatically after npm install).
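A sketch of that flow (the function name is a placeholder): when a binding.gyp file is present, npm install triggers node-gyp rebuild on the build machine, so the addon is compiled there rather than on your Windows/macOS laptop.

# locally: keep addon.cc, binding.gyp, index.js and package.json, but exclude build/ and node_modules/
zip your_name.zip -r * -x "build/*" -x "node_modules/*"
# or deploy straight from the source folder with gcloud, which runs npm install remotely
gcloud functions deploy helloCpp --runtime nodejs10 --trigger-http --source .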
You can't use C++ on Cloud Functions, period. You can only use Node.js 6.14, Node.js 8.11.1 (beta) and Python 3.7 (also beta).
If you wish to use C++ in GCP with a serverless approach, my best suggestion would be to run your own custom runtime in App Engine. You would still need to configure some instance options, but you don't have to manage servers and so on.
You can only use App Engine Flexible Environment (or, of course, standard VM architecture, Compute Engine). Extract from the docs (https://cloud.google.com/appengine/docs/flexible/):
Runtimes - The flexible environment includes native support for Java 8
(with no web-serving framework), Eclipse Jetty 9, Python 2.7 and Python 3.6,
Node.js, Ruby, PHP, .NET core, and Go. Developers can customize these
runtimes or provide their own runtime by supplying a custom Docker image
or Dockerfile from the open source community.
As an interesting side note, Google Serverless Containers will give you the chance to deploy your dockerized application but in a serverless flavour (in fact it's built on top of Google Cloud Functions technology). It's currently in Alpha stage.
I am used to developing in IntelliJ IDEA and pushing to GitLab, which runs an automatic build and pushes everything where it is needed. In the end I have a self-contained git repository which holds all the code and even the scripts for building and deploying. If anything happens to my dev/prod server, I'll get everything back in minutes. If I want another dev to help me out, I can just send them the git link. Everything they change can easily be deployed to the dev server.
However, I have no clue how to achieve this kind of setup with AWS. Should I go for Eclipse with the AWS plugin? I really don't want to switch my favorite IDE just to work with AWS. Maybe write scripts which use the AWS CLI to upload/update all the Lambdas/policies/etc.? I would really appreciate any good practices listed.
If you have AWS Lambda functions, there is a great CLI tool called autolambda, see here. It commits to git and publishes your changes to Lambda at the same time, assuming you have git and the AWS CLI set up in your terminal. It uses some bash scripts under the hood.
You can get it from NPM
npm install -g autolambda
Use it like this:
autolambda init --name myFunctions --desc "These functions are related"
cd myFunctions
To make an AWS Lambda function:
autolambda create --name HelloNode --runtime node --role "arn:aws:iam::abcdefghijk" --desc "Hello World Function in Node"
To publish any changes:
autolambda publish --name HelloNode --desc "changed main.js text"
I found it really useful. Cheers.
Please vote for implementing Intellij IDEA support for Amazon Lambda here:
https://youtrack.jetbrains.com/issue/IDEA-180070
Previously, we were limited to the AWS Toolkit for Eclipse if we wanted to perform local testing of Amazon Lambda functions and build serverless applications on AWS.
But since Aug 11, 2017, Amazon has provided AWS SAM Local, a CLI tool that allows us to locally test and debug our AWS Lambda functions. SAM Local supports Lambda functions written in Node.js, Java, and Python.
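For example, a local test run looks roughly like this (the function logical ID is a placeholder, and Docker must be running):

sam local generate-event apigateway aws-proxy > event.json   # fabricate a sample API Gateway event
sam local invoke HelloWorldFunction --event event.json       # execute the handler in a local container
sam local start-api                                          # serve the API on localhost for debugging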
I'm migrating a Play! application from Heroku to AWS Elastic Beanstalk.
Heroku is really straightforward when it comes to deploying: just push changes to a remote git repository on Heroku and the build occurs on the server side.
This is very convenient because it is not necessary to upload the whole project for each tiny change (Including all libraries!).
Basically for each change we are generating a huge 140 MB Docker zipped file that takes at least 10 minutes to upload.
Surely there must be a better way, but a long search on Google only returned options to automate the file generation with scripts and alternatives like Jenkins. That does not solve the problem, it just automates it.
Does anyone have a better solution?
You can set up an AWS CodeCommit repository and use it as a remote for your local git repository. Next, you can set up AWS CodePipeline to build your application and deploy it to Elastic Beanstalk whenever there is a new commit to the AWS CodeCommit repository.
This way you don't have to upload everything every time. Whenever you do git push, only the changed files are uploaded to the AWS CodeCommit repository, and then AWS CodePipeline takes care of building your application and deploying it to Elastic Beanstalk.
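A minimal sketch of the git side of that setup (the repository name and region are placeholders, and the AWS CLI must already be configured with CodeCommit access):

aws codecommit create-repository --repository-name my-play-app
git remote add aws https://git-codecommit.eu-west-1.amazonaws.com/v1/repos/my-play-app
git push aws master   # CodePipeline picks up the commit, builds the app, and deploys it to Elastic Beanstalk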
So I got curious about this question too and had a conversation with an AWS specialist about the different options here. Each option has its downsides, though.
The first option is to bake your application code, create an AMI out of it and carry out the deployment using the baked AMI. More on that
You have to test this approach before adopting it. The downside is that you would have to maintain the AMI regularly. You might also miss out on critical patches from Beanstalk since the AMI has been locked down.
A good read on this topic
The next approach would be to move away from Beanstalk and use CloudFormation, where you can just upload your application folder to S3. Your CloudFormation template has to take care of spinning up all the required resources, and using AWS::CloudFormation::Init and cfn-signal it is possible to install and set up software. Changes within the resource metadata can be detected by making use of the proper CloudFormation signal, and we can also run user-specified actions when a change is detected in the template specification.
(AWS::CloudFormation::Init)
http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/cfn-helper-scripts-reference.html (set of helper scripts that can be used with CloudFormation)
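As an illustration of those helper scripts, an instance's UserData typically runs something like the following (stack and resource names are placeholders):

# apply the configuration declared in the resource's AWS::CloudFormation::Init metadata
/opt/aws/bin/cfn-init -v --stack MyStack --resource WebServer --region us-east-1
# report success or failure back to CloudFormation so it can complete or roll back the update
/opt/aws/bin/cfn-signal -e $? --stack MyStack --resource WebServer --region us-east-1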
Although these are not exactly a solution to what you asked for, they can be a good alternative. At least I made sure that you are not missing out on any available options with Beanstalk.
Also, one piece of advice I got from them was to consider splitting the application up into multiple components and sub-components. This would reduce your application size considerably.
Hope this helped.
Short answer: No.
Long Answer: I ended up packaging the app with activator and not using Docker.
Create a folder named "dist" in the root of the project.
Include a file named Procfile with the following line:
web: ./bin/YOUR_APP_NAME -Dhttp.port=5000 -Dconfig.file=conf/application.conf
Make sure to replace YOUR_APP_NAME with the name of your app as configured in build.sbt.
Package the Play app with the following command:
activator clean dist
That will generate a zip file inside the target/universal/ folder of the project.
Deploy that zip file to AWS Elastic Beanstalk.
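If you want to script that last step instead of using the console, a rough sketch with the EB CLI (the environment name and artifact path are placeholders):

activator clean dist
# in .elasticbeanstalk/config.yml, point the CLI at the generated artifact:
#   deploy:
#     artifact: target/universal/your_app_name-1.0.zip
eb deploy my-play-env   # uploads just that zip and deploys it to the environment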