CasperJS and AWS Lambda - amazon-web-services

I'm trying to get casperjs to work with my AWS Lambda function.
{
    "errorMessage": "Cannot find module 'casper'",
    "errorType": "Error",
    "stackTrace": [
        "Function.Module._load (module.js:276:25)",
        "Module.require (module.js:353:17)",
        "require (internal/module.js:12:17)",
        "Object.<anonymous> (/var/task/index.js:3:14)",
        "Module._compile (module.js:409:26)",
        "Object.Module._extensions..js (module.js:416:10)",
        "Module.load (module.js:343:32)",
        "Function.Module._load (module.js:300:12)",
        "Module.require (module.js:353:17)"
    ]
}
I keep getting this error where Lambda can't detect casperjs. I uploaded my zip file into Lambda, and installed the casperjs modules into my directory before I zipped the files up.
My package.json file says I have casperjs installed.
{
    "name": "lambda",
    "version": "1.0.0",
    "description": "",
    "main": "index.js",
    "scripts": {
        "test": "echo \"Error: no test specified\" && exit 1"
    },
    "keywords": [],
    "author": "",
    "license": "ISC",
    "dependencies": {
        "casperjs": "^1.1.3"
    }
}
Would anyone know what I'm doing wrong? Thanks.

Since CasperJS relies on PhantomJS, you can set it up very similarly to this repo: https://github.com/TylerPachal/lambda-node-phantom.
The main difference is that you need to add and target CasperJS, and you need to make sure that CasperJS can find and load PhantomJS.
Create a node_modules directory in your package directory.
Add a dependency for CasperJS to the package.json file:
"dependencies": {
    "casperjs": "latest"
}
In Terminal, navigate to your package directory and run 'npm update' to add the CasperJS package to the node_modules directory.
Assuming that you want to run CasperJs with the 'test' argument, the index.js file will need to be changed to look like this:
var childProcess = require('child_process');
var path = require('path');

exports.handler = function(event, context) {

    // Set the path as described here: https://aws.amazon.com/blogs/compute/running-executables-in-aws-lambda/
    process.env['PATH'] = process.env['PATH'] + ':' + process.env['LAMBDA_TASK_ROOT'];

    // Set the path to casperjs
    var casperPath = path.join(__dirname, 'node_modules/casperjs/bin/casperjs');

    // Arguments for the casper script
    var processArgs = [
        'test',
        path.join(__dirname, 'casper_test_file.js')
    ];

    // Launch the child process
    childProcess.execFile(casperPath, processArgs, function(error, stdout, stderr) {
        if (error) {
            context.fail(error);
            return;
        }
        if (stderr) {
            context.fail(stderr);
            return;
        }
        context.succeed(stdout);
    });
};
If you don't want to run CasperJs with the 'test' argument, just remove it from the arguments list.
The PhantomJs binary in the root directory of your package needs to be renamed to phantomjs, so that CasperJs can find it. If you would like to get a new version of PhantomJs, you can get one here: https://bitbucket.org/ariya/phantomjs/downloads. Make sure to download a linux-x86_64.tar.bz2 type so that it can run in Lambda. Once downloaded, just pull a new binary out of the bin directory and place it in your root package directory.
In order for Lambda to have permission to access all the files, it's easiest to zip the package in a Unix-like operating system. Make sure that all the files in the package have read and execute permissions. From within the package directory: chmod -R o+rx *. Then zip it up with: zip -r my_package.zip *.
Upload the zipped package to your Lambda function.

According to the CasperJS docs, it is not actually a Node module. So you cannot require it in package.json and zip it up with node modules. You will need to find out how to install it on the Lambda instance, or find an actual Node module that does what you want. I suspect installing Casper on Lambda might not be possible, but that's just my gut.
Warning
While CasperJS is installable via npm, it is not a NodeJS module and will not work with NodeJS out of the box. You cannot load casper by using require('casperjs') in node. Note that CasperJS is not capable of using a vast majority of NodeJS modules out there. Experiment and use your best judgement.
http://docs.casperjs.org/en/latest/installation.html

Related

How to add chrome to AWS Lambda Api?

I'm making an API that returns a PDF via Puppeteer. I installed the packages with npm install chrome-aws-lambda and npm install puppeteer --save-dev, but when I run the API I get this exception.
I tried running npm install again but it doesn't work. How can I install Chromium or make Puppeteer work?
this is my code and package.json
let browser = await chromium.puppeteer.launch({ headless: true });
let page = await browser.newPage();
await page.goto("https://www.google.com");
const pdf = await page.pdf({
    format: "A4",
    printBackground: false,
    preferCSSPageSize: true,
    displayHeaderFooter: false,
    headerTemplate: `<div class="header" style="font-size:20px; padding-left:15px;"><h1>Main Heading</h1></div> `,
    footerTemplate: '<footer><h5>Page <span class="pageNumber"></span> of <span class="totalPages"></span></h5></footer>',
    margin: { top: "200px", bottom: "150px", right: "20px", left: "20px" },
    height: "200px",
    width: "200px",
});
return {
    statusCode: 200,
    // Uncomment below to enable CORS requests
    headers: {
        "Access-Control-Allow-Origin": "*",
        "Access-Control-Allow-Credentials": true
    },
    body: pdf
};
Package:
{
    "name": "amplifysandboxpdf",
    "version": "2.0.0",
    "description": "Lambda function generated by Amplify",
    "main": "index.js",
    "license": "Apache-2.0",
    "devDependencies": {
        "@types/aws-lambda": "^8.10.92",
        "@types/puppeteer": "^5.4.6",
        "puppeteer": "^17.1.2"
    },
    "dependencies": {
        "chrome-aws-lambda": "^10.1.0",
        "puppeteer-core": "^10.0.0"
    }
}
Usually, when you upload a zip to Lambda, you need to provide both your source code and node_modules (modules can also be added as a Lambda layer). In your case, the error is caused by a missing package, so the first place I'd look is the zip you provide to Lambda and whether it contains all the necessary packages.
I see in your package.json you have both puppeteer and puppeteer-core. Puppeteer downloads Chromium on installation (that's the bit that errors for you). First you need to decide if puppeteer is even necessary; maybe the core package is enough. But if it is necessary, once again, it should be in the zip that you provide.
I'm not sure how Puppeteer does this, but if it downloads Chromium into node_modules, it should work once your zip file is correctly packaged. Otherwise, you might need to create your own Docker container using the Puppeteer base image.
If this is the case, these are the steps you need to take:
Create a dockerfile with puppeteer base image, build your own image and make sure your application is installed in there. Example of what this file could look like:
FROM ghcr.io/puppeteer/puppeteer:16.1.0
WORKDIR /home/pptruser/app
ADD ./package-lock.json /home/pptruser/app/package-lock.json
ADD ./package.json /home/pptruser/app/package.json
RUN npm ci
# Customize the line below - copy files that your application requires
ADD ./src/. /home/pptruser/app/src/
# Remove development dependencies (in your case puppeteer is not a devDependency)
RUN npm prune --production
CMD [ "index.js" ]
Test locally if it works in the container and later publish this image to ECR (AWS registry for docker containers)
Launch lambda by using the image instead of using NodeJS environment

How to set env variable in local development for google cloud functions?

I'm trying to run cloud function on my local system for which I need to set some env variables. I'm following docs for env and for local development docs.
I'm trying to run my project via the following command:
node node_modules/@google-cloud/functions-framework --target=syncingredients --env-vars-file=.env.yaml
Where my .env.yaml looks like:
API_KEY: key
AUTH_DOMAIN: project.firebaseapp.com
It seems --env-vars-file isn't supported by the Functions Framework (https://github.com/GoogleCloudPlatform/functions-framework-nodejs/issues/38).
I would recommend the workaround suggested by relymd-djk:
pre-req:
npm install env-cmd
npm install yaml2json
Modify the package.json scripts section:
"scripts": {
    "start": "yaml2json .env.yaml >.env.json && env-cmd -r ./.env.json functions-framework --target=syncingredients",
    "deploy": "gcloud functions deploy myFunction --entry-point syncingredients --trigger-http --runtime nodejs16 --env-vars-file ./.env.yaml"
}
to run the function:
npm start
Thanks to ClumsyPuffin for highlighting that it isn't an available feature, so I went with dotenv.
Changed the file to .env:
API_KEY="key"
AUTH_DOMAIN="project.firebaseapp.com"
And used the following command to run the function locally
node -r dotenv/config node_modules/@google-cloud/functions-framework --target=syncingredients
For reference, this link helped me get it working; I leveraged env-cmd on top of the scripts in it:
npm install env-cmd --save-dev
Then, in the script, I fed the .env file in before the functions-framework portion:
env-cmd -r ./.env
Full working command:
env-cmd -r ./.env node --inspect node_modules/.bin/functions-framework --source=dist/ --target=testFunction
Install env-cmd as a dev dependency
npm i env-cmd --save-dev
Update package.json
{
    "name": "google-cloud-function",
    "version": "0.0.1",
    "dependencies": {
        "@google-cloud/functions-framework": "^3.0.0"
    },
    "scripts": {
        "start": "env-cmd functions-framework --target=googleCloudFunction"
    },
    "devDependencies": {
        "env-cmd": "^10.1.0"
    }
}
Create .env in root directory
ENV_VARIABLE_HAHA="hahahah"
ENV_VARIABLE_FOO="fooo"
In index.js
'use strict';

console.log(process.env);

exports.googleCloudFunction = async (req, res) => {
    console.info('googleCloudFunction started...');
    try {
        const {ENV_VARIABLE_HAHA, ENV_VARIABLE_FOO} = process.env;
        console.log(ENV_VARIABLE_HAHA, ENV_VARIABLE_FOO);
        console.info('googleCloudFunction finished.');
        res.status(200).send(process.env);
    } catch (err) {
        console.error(err.message);
        console.error('googleCloudFunction failed.');
        res.status(500).send(err.message);
    }
};

How to execute .jar file from AWS Lambda Serverless?

I have tried with following code.
var exec = require('child_process').execFile;
var runCmd = 'java -jar ' + process.env.LAMBDA_TASK_ROOT + '/src/' + 'myjar.jar';

exec(runCmd,
    function (err, resp) {
        if (err) {
            cb(null, { err: err });
        } else {
            cb(null, { resp: resp });
        }
    }
);
Here, I have put my jar file in both the root folder and the src folder, and I have included the .jar file with the code, but I get the following error.
"err": {
    "code": "ENOENT",
    "errno": "ENOENT",
    "syscall": "spawn java -jar /var/task/src/myjar.jar",
    "path": "java -jar /var/task/src/myjar.jar",
    "spawnargs": [],
    "cmd": "java -jar /var/task/src/myjar.jar"
}
So how can I execute this .jar file in the AWS Lambda environment? Please help me.
With Lambda Layers you can now bring in multiple runtimes.
https://github.com/lambci/yumda and https://github.com/mthenw/awesome-layers both have a lot of prebuilt packages that you can use to create a layer so you have a second runtime available in your environment.
For instance, I'm currently working on a project that uses the Ruby 2.5 runtime on top of a custom layer built from lambci/yumda to provide Java.
mkdir dependencies
docker run --rm -v "$PWD"/dependencies:/lambda/opt lambci/yumda:1 yum install -y java-1.8.0-openjdk-devel.x86_64
cd dependencies
zip -yr ../javaLayer .
Upload javaLayer.zip to AWS Lambda as a layer.
Add the layer to your function.
Within your function, java will be located at /opt/lib/jvm/{YOUR_SPECIFIC_JAVA_VERSION}/jre/bin/java
AWS Lambda lets you select a runtime when you create a Lambda function, and you can change it later.
So, as you are running the Lambda function with the NodeJS runtime, the container will not have a Java runtime available to it.
You can only have one runtime per container in AWS Lambda.
So, create a separate Lambda with the jar file that you want to run, with Java as its runtime, and then trigger that Lambda function from your current NodeJS Lambda function, if that's what you ultimately want.
Following is an example of how you can call another Lambda function using NodeJS
var aws = require('aws-sdk');

var lambda = new aws.Lambda({
    region: 'put_your_region_here'
});

lambda.invoke({
    FunctionName: 'lambda_function_name',
    Payload: JSON.stringify(event, null, 2)
}, function(error, data) {
    if (error) {
        context.done('error', error);
    }
    if (data.Payload) {
        context.succeed(data.Payload);
    }
});
You can refer to the official documentation for more details.
In addition to the other answers: since December 2020, Lambda supports container images: https://aws.amazon.com/blogs/aws/new-for-aws-lambda-container-image-support/
For example, I created a container image using AWS's open-source base image for Python, adding a line to install Java. One thing my Python code did was execute a .jar file using a sys call.

AWS Create New Application Failed

I am new to AWS. I am exploring hosting my rest server on AWS but have not been able to solve the following problem:
Failed to find package.json. Node.js may have issues starting. Verify package.json is valid or place code in a file named server.js or app.js.
There were suggestions that I should zip only the files and sub-directories, not the root folder. I tried that, but it still doesn't work.
The following are the files I zip:
bin
models
node_modules
public
routes
views
app.js
authenticate.js
ca.crt
ca.key
config.js
db.json
ia.crt
ia.csr
ia.key
ia.p12
package.json
The following are the content of my package.json file:
{
    "name": "rest-server",
    "version": "0.0.0",
    "private": true,
    "scripts": {
        "start": "node ./bin/www"
    },
    "dependencies": {
        "body-parser": "~1.15.2",
        "cookie-parser": "~1.4.3",
        "debug": "~2.2.0",
        "express": "~4.14.0",
        "jade": "~1.11.0",
        "jsonwebtoken": "^7.1.9",
        "mongoose": "^4.7.0",
        "mongoose-currency": "^0.2.0",
        "morgan": "~1.7.0",
        "passport": "^0.3.2",
        "passport-facebook": "^2.1.1",
        "passport-local": "^1.0.0",
        "passport-local-mongoose": "^4.0.0",
        "serve-favicon": "~2.3.0"
    }
}
I am using the Elastic Beanstalk Dashboard not the eb command line method.
What am I doing wrongly?
There is a possible solution on the AWS developer forum regarding this NodeJS deploy error in Elastic Beanstalk. It might be due to the package.json not being zipped in the correct location.
Node.js deploy fail in Elasticbeanstalk
https://forums.aws.amazon.com/thread.jspa?threadID=130140&tstart=0
You may also try downloading the sample application from the AWS website and following the folder structure used there:
http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/tutorials.html

Deploy Alexa skill to AWS Lambda with "alexa-app" dependency

I wrote a simple Alexa skill. It uses "alexa-app" as dependency.
var alexa = require('alexa-app');
When I save and test my skill I get the following response
{
    "errorMessage": "Cannot find module 'alexa-app'",
    "errorType": "Error",
    "stackTrace": [
        "Function.Module._load (module.js:276:25)",
        "Module.require (module.js:353:17)",
        "require (internal/module.js:12:17)",
        "Object.<anonymous> (/var/task/index.js:4:13)",
        "Module._compile (module.js:409:26)",
        "Object.Module._extensions..js (module.js:416:10)",
        "Module.load (module.js:343:32)",
        "Function.Module._load (module.js:300:12)",
        "Module.require (module.js:353:17)"
    ]
}
Is it possible to use this "alexa-app" dependency without baking it into a zip file. To make development quicker I'd prefer working with just one file in the online Lambda code editor. Is this possible?
No, you will need to include it in a zip along with any other files. It really isn't difficult to do though. You can use the AWS CLI to simplify this.
Here is a bash script that I use on my Mac for doing this:
# Create archive if it doesn't already exist
# Generally not needed, and just a refresh is performed
if [ ! -f ./Lambda.zip ];
then
    echo "Creating Lambda.zip"
else
    echo "Updating existing Lambda.zip"
fi

# Update and upload new archive
zip -u -r Lambda.zip index.js src node_modules

echo "Uploading Lambda.zip to AWS Lambda";
aws lambda update-function-code --function-name ronsSkill --zip-file fileb://Lambda.zip
In the above script, it packages up an index.js file along with all the files in the ./src and ./node_modules directories, and uploads them to my 'ronsSkill' Lambda function.
I use alexa-app also, and it is included in the node_modules directory by npm.