AWS Lambda function throws ClassNotFoundException - amazon-web-services

I have a Spring Boot demo project. I am trying to deploy it to AWS Lambda, but I get a ClassNotFoundException even though the jar I upload contains all of the necessary dependencies.
Here is my code:
pom.xml
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-core</artifactId>
<version>1.2.1</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-events</artifactId>
<version>3.6.0</version>
</dependency>
<dependency>
<groupId>com.amazonaws.serverless</groupId>
<artifactId>aws-serverless-java-container-spring</artifactId>
<version>[0.1,)</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.4</version>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
Application Class
@SpringBootApplication
public class DemoApplication extends SpringBootServletInitializer {
public static void main(String[] args) {
SpringApplication.run(DemoApplication.class, args);
System.out.println("Welcome to demo project");
}
}
Controller Class
@RestController
public class DemoController {
@GetMapping(value="/getValue")
public String getId() {
return " Call from controller";
}
}
LambdaHandler Class
public class DemoLambdaHandler implements RequestStreamHandler {
public static SpringBootLambdaContainerHandler<AwsProxyRequest, AwsProxyResponse> handler;
static {
try {
handler = SpringBootLambdaContainerHandler.getAwsProxyHandler(DemoApplication.class);
} catch (ContainerInitializationException e) {
e.printStackTrace();
throw new RuntimeException("Container not initialized", e);
}
}
@Override
public void handleRequest(InputStream input, OutputStream output, Context context) throws IOException {
// TODO Auto-generated method stub
handler.proxyStream(input, output, context);
}
}
Not sure what I am missing here. Kindly help. Below is my inspected jar:

Did you find a solution to this? I have been getting the same exact error.
What I can tell so far is that, unlike other languages such as Node.js, the Docker image for Java has the command specified like CMD [ "com.example.LambdaHandler::handleRequest" ], whereas the Node.js one is CMD [ "app.handler" ]. The difference is that the Node.js one specifies which file it is, e.g. app.js with a function called handler.
But the Java one only has the class path, with no way to specify the path to the fat JAR file, even when the jar file is located in "/var/task". How would Lambda know the JAR file name?

I had the same issue; I fixed it by marking Spring Boot as a dependency.
By default, the Spring Boot plugin packs your classes into the "BOOT-INF/classes" directory, but AWS looks for the handler class from the root of the jar, so it can't find it.
You can check this by extracting your .jar file and seeing whether your handler class is in:
BOOT-INF/classes/com/example/demo/handler/LambdaHandler.class
instead of:
com/example/demo/handler/LambdaHandler.class
To solve it, mark your package as exec in your pom.xml file:
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>2.6.1</version>
<configuration>
<classifier>exec</classifier>
</configuration>
</plugin>
More info here: Spring Boot as a Dependency
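If you would rather check programmatically than unzip the jar by hand, here is a minimal Java sketch (the jar path target/demo-0.0.1-SNAPSHOT.jar and the handler class name are assumptions for illustration) that prints where the handler class actually sits inside the archive:

import java.util.jar.JarFile;

public class JarLayoutCheck {
    public static void main(String[] args) throws Exception {
        // Path to the jar you upload to Lambda; adjust to your build output.
        String jarPath = args.length > 0 ? args[0] : "target/demo-0.0.1-SNAPSHOT.jar";
        try (JarFile jar = new JarFile(jarPath)) {
            jar.stream()
               .map(entry -> entry.getName())
               .filter(name -> name.endsWith("DemoLambdaHandler.class"))
               // BOOT-INF/classes/... means Lambda will not find the handler;
               // it needs to appear as com/... relative to the jar root.
               .forEach(System.out::println);
        }
    }
}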

Try building the project with Maven and ensure you have the required maven-shade-plugin in your POM, so that the JAR is built with the dependencies when you run mvn package. If you are missing this plugin, you will create a JAR that does not contain the dependencies and you will encounter a ClassNotFoundException.
To learn how to build and deploy a Lambda function by using the Lambda runtime Java API, see this AWS tutorial. It will walk you step by step through the process of building Lambda functions that work:
https://github.com/awsdocs/aws-doc-sdk-examples/tree/master/javav2/usecases/creating_workflows_stepfunctions
This tutorial then uses the Lambda functions within Step Functions to create a workflow.
Lambda functions are created using the Lambda runtime Java API -- com.amazonaws.services.lambda.runtime.RequestHandler. Please do not use Spring Boot APIs to create a Lambda function.
However, if you want to invoke a Lambda function from a Spring Boot app and then, for example, display the result in a view, you can use the Lambda client Java API - which is software.amazon.awssdk.services.lambda.LambdaClient.
So in summary:
com.amazonaws.services.lambda.runtime.RequestHandler is used to create a Lambda function by using the Java Lambda runtime API.
software.amazon.awssdk.services.lambda.LambdaClient is the Lambda client Java API V2 that lets you interact with deployed Lambda functions. For example, you can use this client to invoke a Lambda function from a Java app - such as a Spring Boot web app. To see a working example of how to use the Lambda client API to invoke a Lambda function, see https://github.com/awsdocs/aws-doc-sdk-examples/blob/master/javav2/example_code/lambda/src/main/java/com/example/lambda/LambdaInvoke.java.
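To make the second point concrete, here is a minimal sketch of invoking an already deployed function with the SDK v2 LambdaClient (the function name "my-function" and the JSON payload are assumptions, not from this answer; the handler side would implement com.amazonaws.services.lambda.runtime.RequestHandler as described above):

import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.services.lambda.LambdaClient;
import software.amazon.awssdk.services.lambda.model.InvokeRequest;
import software.amazon.awssdk.services.lambda.model.InvokeResponse;

public class LambdaInvokeSketch {
    public static void main(String[] args) {
        // The client picks up credentials and region from the default provider chain.
        try (LambdaClient lambda = LambdaClient.create()) {
            InvokeRequest request = InvokeRequest.builder()
                    .functionName("my-function")                    // assumed function name
                    .payload(SdkBytes.fromUtf8String("{\"id\":1}")) // assumed JSON payload
                    .build();
            InvokeResponse response = lambda.invoke(request);
            System.out.println(response.payload().asUtf8String());
        }
    }
}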

Please add these two dependencies to your pom file:
1) aws-lambda-java-core
2) aws-lambda-java-events
This should fix it
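For orientation, here is a minimal sketch of what each artifact contributes (the SNS handler below is an illustrative example, not the asker's code): aws-lambda-java-core provides the handler contracts and Context, while aws-lambda-java-events provides ready-made event models such as SNSEvent.

// From aws-lambda-java-core: Context and the RequestHandler contract.
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
// From aws-lambda-java-events: the event model for SNS triggers.
import com.amazonaws.services.lambda.runtime.events.SNSEvent;

public class SnsLoggingHandler implements RequestHandler<SNSEvent, Void> {
    @Override
    public Void handleRequest(SNSEvent event, Context context) {
        // Log every SNS message carried in the event.
        event.getRecords().forEach(record ->
                context.getLogger().log(record.getSNS().getMessage()));
        return null;
    }
}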

Related

SpringBoot fails to deploy after adding .ebextensions for nginx SSL - [An error occurred during execution of command [app-deploy]]

I have a SpringBoot app that deploys just fine to AWS Beanstalk, and the default nginx proxy works, allowing me to connect via port 80.
Following the instructions here: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/https-singleinstance.html, and verifying with another of my projects that works with this exact config, Beanstalk fails to deploy the app with error:
2020/05/29 01:27:56.418780 [ERROR] An error occurred during execution of command [app-deploy] - [CheckProcfileForJavaApplication]. Stop running the command. Error: there is no Procfile and no .jar file at root level of your source bundle
The contents of my war file are as such:
app.war
-.ebextensions
-nginx/conf.d/https.conf
-https-instance-single.config
-https-instance.config
-web-inf/
My config files pass as valid yaml files. (These files are identical to those in the AWS doc, and to those that work in another project of mine.)
I am using a single instance, with port 443 set open.
These are the errors reported throughout the various log files:
----------------------------------------
/var/log/eb-engine.log
----------------------------------------
2020/05/29 01:37:53.054366 [ERROR] /usr/bin/id: healthd: no such user
...
2020/05/29 01:37:53.254965 [ERROR] Created symlink from /etc/systemd/system/multi-user.target.wants/healthd.service to /etc/systemd/system/healthd.service.
...
2020/05/29 01:37:53.732794 [ERROR] Created symlink from /etc/systemd/system/multi-user.target.wants/cfn-hup.service to /etc/systemd/system/cfn-hup.service.
----------------------------------------
/var/log/cfn-hup.log
----------------------------------------
ReadTimeout: HTTPSConnectionPool(host='sqs.us-east-1.amazonaws.com', port=443): Read timed out. (read timeout=23)
Taking into account @Dean Wookey's answer for Java 11, I have successfully deployed a Spring Boot application jar along with the .ebextensions folder. I just added the maven-antrun-plugin to my Maven build configuration, and as output I receive a .zip file that contains the .ebextensions folder and the Spring Boot .jar file at the same level. I then deploy this final zip file through the AWS console.
The following is the maven-antrun-plugin configuration:
....
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.8</version>
<executions>
<execution>
<id>prepare</id>
<phase>package</phase>
<configuration>
<tasks>
<copy todir="${project.build.directory}/${project.build.finalName}/" overwrite="false">
<fileset dir="./" includes=".ebextensions/**"/>
<fileset dir="${project.build.directory}" includes="*.jar"/>
</copy>
<zip destfile="${project.build.directory}/${project.build.finalName}.zip" basedir="${project.build.directory}/${project.build.finalName}"/>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
....
Issue with Java and Linux version
If you are using Java 8 and Linux 2.10.9, the code will work and override the nginx configuration, but if you choose Corretto 11 and Linux 2.2.3 you get the following error:
Error: there is no Procfile and no .jar file at root level of your
source bundle
Creating a new environment with Java 8 and deploying the app again will resolve the issue.
Instead of changing to java 8 as described in vaquar khan's answer, an alternative is to package your source jar inside a zip that also contains the .ebextensions folder.
In other words:
source.zip
-.ebextensions
-nginx/conf.d/https.conf
-https-instance-single.config
-https-instance.config
-web-inf/
-app.war
If you look at the latest documentation https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/platforms-linux-extend.html, you'll see that the nginx config now goes in the .platform folder instead, so your structure would be:
source.zip
-.ebextensions
-https-instance-single.config
-https-instance.config
-.platform
-nginx/conf.d/https.conf
-web-inf/
-app.war
After following vaquar's answer above, also change the buildspec.yml file to use the correct Java version, e.g.:
runtime-versions:
java: corretto8 # previously this was openjdk8
Should work.
It is still possible to use .ebextensions within your war file.
Add the following to your pom.xml in the <build><plugins> section:
<build>
<plugins>
...
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<executions>
<execution>
<id>add-resource-ebextensions</id>
<phase>generate-resources</phase>
<goals>
<goal>add-resource</goal>
</goals>
<configuration>
<resources>
<resource>
<directory>${basedir}/.ebextensions</directory>
<targetPath>.ebextensions</targetPath>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
...
</plugins>
</build>
This will copy the .ebextensions folder into the WEB-INF/classes folder. AWS picks it up at startup and applies the scripts from there.

How can I utilize AWS::Serverless::LayerVersion in order to use external libraries in my AWS Lambda functions

I need to use an external library that is located on my local file system in order to successfully execute my Lambda function. Using the AWS SAM framework, I found out that this can be done by specifying an AWS::Serverless::LayerVersion resource.
What I am not sure about is how exactly this works and how I specify the path to my external library. Do I first need to deploy my external library to an S3 bucket?
You need to deploy the jar as a layer in the AWS Lambda Layers section.
AWS Lambda Layers :
You can configure your Lambda function to pull in
additional code and content in the form of layers. A layer is a ZIP
archive that contains libraries, a custom runtime, or other
dependencies. With layers, you can use libraries in your function
without needing to include them in your deployment package.
https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html
The following are the steps to use AWS Lambda layers:
Write the Lambda layer code
Package the Lambda layer
Deploy the Lambda layer
Attach the layer to a function and call a method
Verify the results
Once you finish writing your function, make sure the pom.xml contains the artifact coordinates and the maven-shade-plugin:
<groupId>java-lambda-layer</groupId>
<artifactId>java-lambda-layer</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<configuration> <createDependencyReducedPom>false</createDependencyReducedPom> </configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
Run Maven:
mvn clean install and package
Please read further at the following link:
https://medium.com/@zeebaig/working-with-aws-lambda-layers-ddf5c91674d3
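As a rough sketch of what ends up where (class and package names here are made up for illustration): the layer zip carries the shared jar under java/lib/, and the function's own deployment package only contains the handler that uses it. Both are compiled against the same classes; at runtime Lambda puts the layer's jars on the classpath.

// Utility class shipped inside the layer jar (placed under java/lib/ in the layer zip).
package com.example.layer;

public final class GreetingUtil {
    private GreetingUtil() {}

    public static String greet(String name) {
        return "Hello, " + name;
    }
}

// Handler shipped in the function's own, much smaller, deployment package.
package com.example.function;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.example.layer.GreetingUtil;

public class GreetingHandler implements RequestHandler<String, String> {
    @Override
    public String handleRequest(String name, Context context) {
        // GreetingUtil is resolved from the layer at runtime, not from this jar.
        return GreetingUtil.greet(name);
    }
}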

Arquillian tomee remote jacoco code coverage

I am doing integration testing using Arquillian on TomEE-Plus 7.0.4 remote and trying to get code coverage using JaCoCo 0.8.2. My code is not covered because I am using arquillian-tomee-remote, and since the code is not covered I am not able to pass the build. I need sample code that has TomEE-Plus with Arquillian remote and code coverage using JaCoCo. I would appreciate any working sample code or sample project.
I used the prepare-agent goal, which generates surefireArgLine (the -javaagent argument), and passed it to the Surefire plugin. The issue is that I am using a remote TomEE and don't know how to generate the correct Java agent. surefireArgLine is set to -javaagent:/home/user/.m2/repository/org/jacoco/org.jacoco.agent/0.8.2/org.jacoco.agent-0.8.2-runtime.jar=destfile=/home/user/project/target/coverage-reports/jacoco-ut.exec,append=true,excludes=/config/*.class:/util/*Constants.class
What is the correct javaagent option for my configuration that will connect to arquillian-tomee-remote?
Jacoco plugin
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<version>${plugin.maven.jacoco.version}</version>
<configuration>
<propertyName>coverageAgent</propertyName>
<append>true</append>
<excludes>
<exclude>**/config/*.class</exclude>
<exclude>**/util/*Constants.class</exclude>
</excludes>
</configuration>
<executions>
<execution>
<id>pre-unit-test</id>
<goals>
<goal>prepare-agent</goal>
</goals>
<configuration>
<destFile>${sonar.jacoco.reportPath}</destFile>
<propertyName>surefireArgLine</propertyName>
<append>true</append>
</configuration>
</execution>
<execution>
<id>post-unit-test</id>
<phase>test</phase>
<goals>
<goal>report</goal>
</goals>
<configuration>
<dataFile>${sonar.jacoco.reportPath}</dataFile>
<outputDirectory>${project.reporting.outputDirectory}/jacoco-ut</outputDirectory>
<append>true</append>
</configuration>
</execution>
<execution>
<id>check</id>
<goals>
<goal>check</goal>
</goals>
<configuration>
<dataFile>${sonar.jacoco.reportPath}</dataFile>
<haltOnFailure>true</haltOnFailure>
<rules>
<rule>
<element>BUNDLE</element>
<limits>
<limit>
<counter>LINE</counter>
<value>COVEREDRATIO</value>
<minimum>0.99</minimum>
</limit>
<limit>
<counter>BRANCH</counter>
<value>COVEREDRATIO</value>
<minimum>0.99</minimum>
</limit>
<limit>
<counter>CLASS</counter>
<value>MISSEDCOUNT</value>
<maximum>0</maximum>
</limit>
</limits>
</rule>
</rules>
</configuration>
</execution>
</executions>
</plugin>
Dependency
<dependency>
<groupId>org.jboss.arquillian.testng</groupId>
<artifactId>arquillian-testng-container</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.config</groupId>
<artifactId>arquillian-config-api</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.extension</groupId>
<artifactId>arquillian-jacoco</artifactId>
<version>1.0.0.Alpha10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jacoco</groupId>
<artifactId>org.jacoco.agent</artifactId>
<classifier>runtime</classifier>
<scope>test</scope>
<version>${plugin.maven.jacoco.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.jacoco/org.jacoco.core -->
<dependency>
<groupId>org.jacoco</groupId>
<artifactId>org.jacoco.core</artifactId>
<version>${plugin.maven.jacoco.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.tomee</groupId>
<artifactId>arquillian-tomee-remote</artifactId>
<version>${tomee.version}</version>
<scope>test</scope>
</dependency>
Arquillian.xml
<extension qualifier="jacoco">
<property name="includes">com.demo.*</property>
</extension>
You can set catalina_opts in arquillian.xml for the TomEE container. Filter it with Maven to pass the JaCoCo javaagent and you are done :).
I added the proper Java agent (surefireArgLine) to the remote TomEE server via catalina opts in the Surefire plugin; it works.
surefireArgLine - will be populated by the prepare-agent goal at runtime.
<tomee.catalina_opts> ${surefireArgLine}</tomee.catalina_opts>
Disclaimer: I'm not an expert in either Arquillian or TomEE, so you might need to adjust the answer for your purposes.
Anyway, in a nutshell, JaCoCo instruments bytecode in order to provide a coverage report.
When Arquillian is used, the actual test execution happens in the TomEE JVM and not in the JVM that runs the test suite (probably a CI server or just a build script that runs the tests), so configuring JaCoCo on the test machine won't do much; you'll have to configure the server itself.
JaCoCo has a -javaagent option for doing this, and this Java agent will "intercept" the loading of classes by the server and instrument them.
Now, when JaCoCo works, it produces a jacoco.exec file that contains the coverage data, which can be shown later in various ways (a Jenkins plugin to show coverage, Sonar integration, whatever).
This is by far the most used option AFAIK, so if you go with this approach, and the instrumentation really works, then after the tests are done you'll have to find the exec file on the server machine, download it to the build machine, and integrate it with CI/Sonar/whatever.
However, there are alternative solutions:
JaCoCo Documentation states that there are three modes of running an instrumenting Java Agent:
File System: At JVM termination execution data is written to a local file.
TCP Socket Server: External tools can connect to the JVM and retrieve execution data over the socket connection. Optional execution data reset and execution data dump on VM exit is possible.
TCP Socket Client: At startup, the JaCoCo agent connects to a given TCP endpoint. Execution data is written to the socket connection on request. Optional execution data reset and execution data dump on VM exit is possible.
Technically you can just give different parameters to that javaagent so that it will run JaCoCo in one of these modes.
Anyway, we've discussed the first option, but you can also work with TCP configurations if it's required. Of course, here you'll have to handle security concerns (like permission to expose/access the port, etc).
If you work with TCP mode, there is a Maven plugin that can come in handy. I haven't used it myself, just googled it, so I can't say whether it's any good; it has only 2 stars on GitHub, so it's probably not production ready, but maybe you can get some ideas from its source code.
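If you do run the agent in tcpserver mode, you can also pull the execution data yourself with JaCoCo's own org.jacoco.core.tools classes instead of a third-party plugin. A minimal sketch, assuming the agent listens on localhost:6300 and the output path suits your build (verify the API against the JaCoCo version you use):

import java.io.File;

import org.jacoco.core.tools.ExecDumpClient;
import org.jacoco.core.tools.ExecFileLoader;

public class CoverageDump {
    public static void main(String[] args) throws Exception {
        // Connects to a JVM started with
        // -javaagent:...jacocoagent.jar=output=tcpserver,address=*,port=6300
        ExecDumpClient client = new ExecDumpClient();
        client.setDump(true);   // request a dump of the collected data
        client.setReset(false); // keep the data on the server after dumping

        ExecFileLoader loader = client.dump("localhost", 6300);
        // Write the execution data where the report/check goals can pick it up.
        loader.save(new File("target/coverage-reports/jacoco-it.exec"), true);
    }
}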

I'm getting error 404 while trying to access my spring boot app on Amazon Elastic Bean Stalk

I developed a spring boot application and I've put the following entries in src/main/resources/application.properties:
spring.mvc.view.prefix: /
spring.mvc.view.suffix: .jsp
server.port=5000
Now when I start it (mvn clean spring-boot:run) locally, I'm getting the output Tomcat started on port(s): 5000 (http) and the app is accessible in the browser under http://localhost:5000/welcome .
I created a Java instance in Amazon Elastic Beanstalk, I uploaded the war, and I even opened port 5000 in the corresponding Security Group on the EC2 instance:
but when I now go to http://my-aws-ebs-instance.com/welcome:5000, I'm getting the following message:
Whitelabel Error Page This application has no explicit mapping for
/error, so you are seeing this as a fallback.
Thu Dec 20 16:30:33 UTC 2018 There was an unexpected error (type=Not
Found, status=404). /welcome.jsp
Why oh why does it happen like this? What did I forget to configure?
----EDIT
as requested, here's the root java class:
package com.hellokoding.auth;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.context.web.SpringBootServletInitializer;
@SpringBootApplication
public class WebApplication extends SpringBootServletInitializer {
@Override
protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
return application.sources(WebApplication.class);
}
public static void main(String[] args) throws Exception {
SpringApplication.run(WebApplication.class, args);
}
}
Here is also the structure of my project with highlighted welcome.jsp page:
When I unzip the generated war file, this is the file structure on my hard drive:
My pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>auth</artifactId>
<name>auth</name>
<description>my descr</description>
<packaging>war</packaging>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.3.5.RELEASE</version>
</parent>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.version>1.7</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>org.hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.tomcat.embed</groupId>
<artifactId>tomcat-embed-jasper</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>8</source>
<target>8</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
and the UserController class contains:
...
@Controller
@Scope("session")
public class UserController {
@RequestMapping(value = {"/", "/welcome"}, method = RequestMethod.GET)
public String welcome(Model model) {
return "welcome";
}
...
I added some logs inside the welcome method and I see it is running correctly. Also, in log files I can see the following entry:
Mapped "{[/ || /welcome],methods=[GET]}" onto public java.lang.String com.hellokoding.auth.web.UserController.welcome(org.springframework.ui.Model)
so I have no idea why this thing does not work. After trying for 11 hours straight to make it work I'm questioning my life choices, and also I'm wondering why anyone would ever use such a stupid framework since it doesn't work ootb.
--- edit:
I've uploaded a simplified code to github https://github.com/nalogowiec/springbootProblem
Solution 1:
If you want Spring Boot with JSPs in executable jars:
Keep in mind that you will ultimately place the JSP templates under src/main/resources/META-INF/resources/WEB-INF/jsp/
Note:
Define the template prefix and suffix for your JSP files in application.properties:
spring.mvc.view.prefix=/WEB-INF/jsp/
spring.mvc.view.suffix=.jsp
Then you can run the jar file using the command below:
java -jar <your jar name>
For your project, you can use the command below:
java -jar auth-1.3.5.RELEASE.jar
For more reference: https://dzone.com/articles/spring-boot-with-jsps-in-executable-jars-1
Solution 2:
JSP Limitations
When running a Spring Boot application that uses an embedded servlet container (and is packaged as an executable archive), there are some limitations in the JSP support.
With Jetty and Tomcat, it should work if you use war packaging. An executable war will work when launched with java -jar, and will also be deployable to any standard container. JSPs are not supported when using an executable jar.
Undertow does not support JSPs.
Creating a custom error.jsp page does not override the default view for error handling. Custom error pages should be used instead.
I have cloned your GitHub project and was able to run it (if you follow the steps below, your problem will definitely be solved).
Steps to run your project:
Step 1: Create a war package of your project
Step 2: Run your war package using the command below:
java -jar <your war file name>
i.e. for your project the command should be:
java -jar auth-1.3.5.RELEASE.war
Step 3: Hit the URL http://localhost:5000/
You can see the result in the browser.
More reference : https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-developing-web-applications.html#boot-features-jsp-limitations
Nice explanation @dipak-thoke.
Just to add: if anyone is automating the deployment process (in my case it was through CodeBuild and CodeDeploy), you can create a Procfile and deploy the war. I added the Procfile to the root directory of the project and added it as an artifact.
Hope this helps someone with the same use case :)
Procfile:
web: java -jar <your_war_file>.war
This is what my CodeBuild buildspec looks like:
version: 0.2
phases:
  build:
    commands:
      # - command
      - ./gradlew bootWar
  post_build:
    commands:
      # - command
      - echo Build must be completed
      - mv build/libs/*.war <WarFileName>.war
artifacts:
  files:
    # - location
    - <WarFileName>.war
    - Procfile
  #name: $(date +%Y-%m-%d)
  #discard-paths: yes
  #base-directory: location
#cache:
  #paths:
    # - paths
If you check the Spring Boot docs, it's clear that you are using the wrong directory structure.
https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#boot-features-spring-mvc
By default, Spring Boot serves static content from a directory called /static (or /public or /resources or /META-INF/resources) in the classpath or from the root of the ServletContext ... Do not use the src/main/webapp directory if your application is packaged as a jar. Although this directory is a common standard, it works only with war packaging, and it is silently ignored by most build tools if you generate a jar.
Since you have your app on port 5000, it is accessible on that port, not the default HTTP port 80.
Either access it with
http://my-aws-ebs-instance.com:5000/welcome
or create a port forwarding rule in AWS so that traffic going to port 80 is pushed to your application server's port 5000.

AWS Lambda NoClassDefFoundError

I am having difficulty with a Java-based Lambda function set up to receive messages from SNS. My function looks like the below:
package com.mycompany;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.LambdaLogger;
import com.amazonaws.services.lambda.runtime.events.SNSEvent;
public class LambdaHandler {
public void Handler(SNSEvent event, Context context) {
//Process the event
}
}
It compiles just fine and I don't have any problems uploading the jar file to Lambda (via the web console).
However, when I publish to it (via SNS through to the subscribed Lambda function) with JSON representing the SNSEvent model, the Lambda function throws the following exception:
Error loading method handler on class com.mycompany.LambdaHandler:
class java.lang.NoClassDefFoundError java.lang.NoClassDefFoundError:
com/amazonaws/services/lambda/runtime/events/SNSEvent at
java.lang.Class.getDeclaredMethods0(Native Method) at
java.lang.Class.privateGetDeclaredMethods(Class.java:2701) at
java.lang.Class.privateGetPublicMethods(Class.java:2902) at
java.lang.Class.getMethods(Class.java:1615) Caused by:
java.lang.ClassNotFoundException:
com.amazonaws.services.lambda.runtime.events.SNSEvent at
java.net.URLClassLoader.findClass(URLClassLoader.java:381) at
java.lang.ClassLoader.loadClass(ClassLoader.java:424) at
java.lang.ClassLoader.loadClass(ClassLoader.java:357)
I use Maven + NetBeans and it's a Maven Java Application project. I downloaded the function from the Lambda console and confirmed that the jar has a lib/ directory with all of the jars for the imports, including aws-lambda-java-events-1.1.0.jar, which itself includes the /com/amazonaws/services/lambda/runtime/events/SNSEvent.class file.
Why is the runtime unable to find the class when it's definitely in the jar file? Is there anything else I need to do, any environment variables to set, etc.?
Any help would be appreciated!
EDIT 1
I tried downgrading to aws-lambda-java-events 1.0.0 and it's still reporting the same exception. As requested, below is my POM file (with just project name changed). I don't know how to tell Maven to put the libraries in a tree structure.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.app</groupId>
<artifactId>Handler</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-lambda</artifactId>
<version>1.10.6</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-core</artifactId>
<version>1.0.0</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-events</artifactId>
<version>1.0.0</version>
</dependency>
</dependencies>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
</project>
Use the maven-shade-plugin so that the JAR contains the dependencies as an uber-jar.
So, add this to your pom.xml:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
Source: http://docs.aws.amazon.com/lambda/latest/dg/java-create-jar-pkg-maven-no-ide.html
Potentially you may have this issue https://github.com/aws/aws-lambda-java-libs/issues/2 which requires a downgrade to aws-lambda-java-events-1.0.0.jar
=== If this issue exists even after including the shaded jar ===
If you have this issue even after building the shaded jar, then the issue is probably related to the aws-lambda-java-events package version (some incompatibility between the AWS Lambda runtime and newer aws-lambda-java-events versions). I had this issue with the latest version (2.0.2) of the aws-lambda-java-events package and had to downgrade to version 1.3.0.
It seems the newer aws-lambda-java-events versions don't have many dependencies.
Sometimes you have to upload your Lambda again. I also got the same issue, which I fixed with this pom.xml:
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-bom</artifactId>
<version>1.11.83</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
In the plugins section of your pom.xml, add the Apache Maven Shade Plugin. It is used during the build process to package a standalone .jar. The maven-shade-plugin takes the artifact (jar) produced by the package goal and creates a standalone .jar that contains the compiled code and the resolved dependencies from the pom.xml:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.0.0</version>
</plugin>
If you have <scope>provided</scope> on the aws-lambda-java-events artifact, remove it.
Whenever we upload a Java-based jar or zip to the AWS Lambda console, we have to take care of some basic things, like:
The Code URI, which is present in the SAM template or the template.yml file.
E.g.:
runwayDetails - package name
App - class name
handleRequest - Lambda handler method
The syntax should be like this - packageName.className::methodName
It will solve the error.
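For example, with a handler string of runwayDetails.App::handleRequest, Lambda expects a class shaped roughly like the sketch below (the request and response types are assumptions; they just have to match how the function is invoked):

package runwayDetails;

import java.util.Map;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

// Handler string in the console / SAM template: runwayDetails.App::handleRequest
public class App implements RequestHandler<Map<String, String>, String> {
    @Override
    public String handleRequest(Map<String, String> event, Context context) {
        return "Received " + event.size() + " fields";
    }
}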
Upload the function.zip instead of the jar to Lambda.