I am building a Spark application that uses the AWS SDK to access an S3 source. I am getting the error below:
java.lang.NoSuchMethodError:
org.apache.http.conn.ssl.SSLConnectionSocketFactory.&lt;init&gt;(Ljavax/net/ssl/SSLContext;Ljavax/net/ssl/HostnameVerifier;)V
I looked online for a solution, and it appears my Spark application is picking up the wrong httpclient version. A related thread seems to offer a solution, but I am not sure how to override the default httpclient.
What version of httpclient is compatible with the AWS SDK v1.11.5?
Here are the different httpclient jars I have on my system.
./Applications/IBM Notes.app/Contents/MacOS/shared/eclipse/plugins/org.apache.wink_1.1.2.20150826-0855/lib/httpclient-4.0.1.jar
./Users/XXXXX/.ivy2/cache/org.apache.httpcomponents/httpclient/jars/httpclient-4.1.2.jar
./Users/XXXXX/.ivy2/cache/org.apache.httpcomponents/httpclient/jars/httpclient-4.5.1.jar
./Users/XXXXX/.m2/repository/org/apache/httpcomponents/httpclient/4.0.2/httpclient-4.0.2.jar
./Users/XXXXX/.m2/repository/org/apache/httpcomponents/httpclient/4.3.6/httpclient-4.3.6.jar
./Users/XXXXX/Downloads/aws-java-sdk-1.11.110/third-party/lib/httpclient-4.5.2.jar
./usr/local/aws-java/aws-java-sdk-1.11.109/third-party/lib/httpclient-4.5.2.jar
./usr/local/spark/spark-2.1.0-bin-hadoop2.7/jars/httpclient-4.5.2.jar
./usr/local/zeppelin/interpreter/alluxio/httpclient-4.3.6.jar
./usr/local/zeppelin/interpreter/bqsql/httpclient-4.3.6.jar
./usr/local/zeppelin/interpreter/elasticsearch/httpclient-4.3.6.jar
./usr/local/zeppelin/interpreter/hbase/httpclient-4.3.6.jar
./usr/local/zeppelin/interpreter/kylin/httpclient-4.3.6.jar
./usr/local/zeppelin/interpreter/lens/httpclient-4.3.6.jar
./usr/local/zeppelin/interpreter/livy/httpclient-4.3.4.jar
./usr/local/zeppelin/interpreter/pig/httpclient-4.3.6.jar
./usr/local/zeppelin/lib/httpclient-4.3.6.jar
./usr/local/zeppelin/lib/interpreter/httpclient-4.3.6.jar
I do not have a classpath specified, so I am not sure which httpclient it is picking up. How do I override this so that it always picks ./usr/local/aws-java/aws-java-sdk-1.11.109/third-party/lib/httpclient-4.5.2.jar?
Copying httpclient-4.5.2.jar and httpcore-4.4.4.jar into the zeppelin/interpreter/spark folder got rid of this error.
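For a less manual fix, Spark can also be told to prefer jars you supply over its own bundled copies. A sketch using spark-submit (the jar path is from the listing above; the application jar name is illustrative, and the userClassPathFirst settings are marked experimental in Spark's docs):

```shell
# Ship the desired httpclient with the application and prefer it over
# the httpclient-4.5.2.jar already under $SPARK_HOME/jars.
# spark.{driver,executor}.userClassPathFirst are experimental options.
spark-submit \
  --jars /usr/local/aws-java/aws-java-sdk-1.11.109/third-party/lib/httpclient-4.5.2.jar \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  my-spark-app.jar
```

When the conflicting jar lives inside an interpreter directory (as with Zeppelin here), replacing the jar in that directory, as the answer above did, is often the more reliable approach.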
I am trying to create a Lambda S3 listener, with the Lambda deployed as a native image. The point is to get the S3 event and then do some work, such as pulling the file. To get the file I am using the AWS SDK 2.x S3 client, as below:
S3Client.builder().build();
This code results in
2020-03-12 19:45:06,205 ERROR [io.qua.ama.lam.run.AmazonLambdaRecorder] (Lambda Thread) Failed to run lambda: software.amazon.awssdk.core.exception.SdkClientException: Unable to load an HTTP implementation from any provider in the chain. You must declare a dependency on an appropriate HTTP implementation or pass in an SdkHttpClient explicitly to the client builder.
To resolve this I added the AWS SDK's Apache HTTP client dependency and updated the code to the following:
SdkHttpClient httpClient = ApacheHttpClient.builder()
        .maxConnections(50)
        .build();
S3Client.builder().httpClient(httpClient).build();
I also had to add the following dynamic-proxy configuration for the native image:
[
  [
    "org.apache.http.conn.HttpClientConnectionManager",
    "org.apache.http.pool.ConnPoolControl",
    "software.amazon.awssdk.http.apache.internal.conn.Wrapped"
  ]
]
After this I am now getting the following stack trace:
Caused by: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty
at java.security.cert.PKIXParameters.setTrustAnchors(PKIXParameters.java:200)
at java.security.cert.PKIXParameters.<init>(PKIXParameters.java:120)
at java.security.cert.PKIXBuilderParameters.<init>(PKIXBuilderParameters.java:104)
at sun.security.validator.PKIXValidator.<init>(PKIXValidator.java:86)
... 76 more
I am running Quarkus 1.2.0 on GraalVM 19.3.1. I am building this via Maven and the provided Docker container for Quarkus. I thought the trust store was added by default (in the build command it looks to be present), but am I missing something? Is there another way to get this to run without explicitly setting the HTTP client on the S3 client?
There is a PR, under review at the moment, that introduces an AWS S3 extension for both JVM and native mode. The AWS clients are fully "Quarkified", meaning they are configured via application.properties and enabled for dependency injection. So stay tuned, as it will most probably be available in Quarkus 1.5.0.
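In the meantime, the trustAnchors error usually means the native binary cannot locate a truststore at runtime. One commonly reported workaround on GraalVM 19.3 is to point the binary at an existing JVM truststore via system properties (the binary name and paths below are illustrative, not from the question):

```shell
# Illustrative: native-image binaries accept -D runtime system properties,
# so an explicit truststore can be supplied when the embedded one is missing.
./target/my-lambda-runner \
  -Djavax.net.ssl.trustStore=/usr/lib/jvm/java-11/lib/security/cacerts \
  -Djavax.net.ssl.trustStorePassword=changeit
```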
I am trying to get push notifications working between Amazon AWS Simple Notification Service and Unity, using the Amazon AWS SDK. I've been following the setup guide linked here, but when I try to build the sample scene provided with the SDK I get this error on my phone.
I did put google-play-services.jar and android-support.jar inside the Assets\plugins\android folder, but for some reason it's not able to find the GCM class. Could you please tell me what I might be doing wrong?
Error :
AndroidJavaException: java.lang.NoClassDefFoundError: Failed resolution of: Lcom/google/android/gms/gcm/GoogleCloudMessaging;
Caused by: java.lang.ClassNotFoundException: com.google.android.gms.gcm.GoogleCloudMessaging
(I transcribed it from a screenshot of the log screen on my phone; there is no way to copy the whole error message.)
After a bit of digging I found another jar file of Play Services; it turns out the one I was using didn't contain the GCM class.
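A quick way to confirm which Play Services jar actually contains the class is to list the jar's entries and search for it (the jar path below is illustrative):

```shell
# A .jar is a zip archive; list its entries and look for the GCM class.
unzip -l Assets/Plugins/Android/google-play-services.jar \
  | grep -i "gcm/GoogleCloudMessaging"
```

If the grep finds no match, that jar cannot resolve com.google.android.gms.gcm.GoogleCloudMessaging, no matter where it is placed.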
I have two python projects running locally:
A cloud endpoints python project using the latest App Engine version.
A client project which consumes the endpoint functions using the latest google-api-python-client (v 1.5.1).
Everything was fine until I renamed one endpoint's function from:
@endpoints.method(MyRequest, MyResponse, path="save_ocupation", http_method='POST', name="save_ocupation")
def save_ocupation(self, request):
    [code here]
To:
@endpoints.method(MyRequest, MyResponse, path="save_occupation", http_method='POST', name="save_occupation")
def save_occupation(self, request):
    [code here]
Looking at the local console (http://localhost:8080/_ah/api/explorer) I see the correct function name.
However, when executing the client project that invokes the endpoint, it keeps saying that the new endpoint function does not exist. I verified this in the IPython shell: the dynamically generated Python code for invoking the Resource still has the old function name, despite restarting both the server and the client dozens of times.
How can I force the api client to get always the latest endpoint api document?
Help is appreciated.
Just after posting the question, I resumed my Ubuntu PC and started Eclipse and the Python projects from scratch, and now everything works as expected. This sounds like some kind of HTTP client cache, or a stale Python process, that prevented the client from getting the latest discovery document and generating the corresponding resource code.
This is odd as I have tested running these projects outside and inside Eclipse without success. But I prefer documenting this just in case someone else has this issue.
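If the stale copy turns out to be the client library's discovery-document cache rather than a process, google-api-python-client can be told to skip that cache when building the service. A sketch, assuming a hypothetical API named my_api served by the local dev server (the API name and URL are illustrative, not from the question):

```python
from googleapiclient.discovery import build

# cache_discovery=False forces a fresh discovery document on every build(),
# so a renamed endpoint method shows up without any cached copy interfering.
service = build(
    "my_api", "v1",
    discoveryServiceUrl=("http://localhost:8080/_ah/api/discovery/v1/"
                         "apis/my_api/v1/rest"),
    cache_discovery=False,
)
```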
I get the error below when I try to access the Amazon SQS WSDL:
http://queue.amazonaws.com/doc/2012-11-05/QueueService.wsdl
Cannot access the WSDL or the WSDL file is invalid.
I believe I have .NET 4 SDK installed and I have tried downloading the WSDL file to a local drive and pointing the proxy wizard to it. Still the same error.
Can someone try to use it and let me know your outcomes?
Try running the .NET WSDL utility (wsdl.exe) directly on the WSDL. That utility reports back error information. It's also what PowerBuilder calls under the covers, but PowerBuilder does not share the error information back with you.
When I do that, I get this result:
Error: Unable to import binding 'SimpleQueueServicePostBinding' from namespace 'http://queue.amazonaws.com/doc/2012-11-05/'.
- The operation 'GetQueueUrl' on portType 'SimpleQueueServicePortType' from namespace 'http://queue.amazonaws.com/doc/2012-11-05/' had the following syntax error:
The operation has no matching binding. Check if the operation, input and output names in the Binding section match with the corresponding names in the PortType section.
It looks like it might be a problem with the format of the WSDL. It wouldn't be the first time that's happened; I've had to edit one of their other WSDL files by hand to correct an error in it.
If you choose to do that, you can download the file to your local machine, make the edits, and then run the PB proxy tool against the local file.
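For reference, the error means the binding section lacks an operation matching one declared in the portType. Schematically, the two operation names must line up; the fragment below is an illustrative sketch of the required shape, not the actual SQS WSDL:

```xml
<!-- The portType declares the operation... -->
<portType name="SimpleQueueServicePortType">
  <operation name="GetQueueUrl">
    <input message="tns:GetQueueUrlRequest"/>
    <output message="tns:GetQueueUrlResponse"/>
  </operation>
</portType>

<!-- ...and the binding must contain an <operation> with the same name,
     whose input/output declarations match the portType's. -->
<binding name="SimpleQueueServicePostBinding" type="tns:SimpleQueueServicePortType">
  <operation name="GetQueueUrl">
    <input/>
    <output/>
  </operation>
</binding>
```

When editing by hand, the fix is usually either adding the missing binding operation or correcting a misspelled name so the two sections agree.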
I implemented an application with Elastic Beanstalk. Since some classes need to be persisted, I use Apache JDO annotations in combination with DataNucleus.
When running the application on a local server everything works fine, meaning I can persist plain old Java objects to the connected Amazon RDS instance. When deploying the same application to Elastic Beanstalk, I receive the following error message:
org.datanucleus.api.jdo.exceptions.ClassNotPersistenceCapableException: The class "edu.kit.aifb.cloudcampus.dom.mgmt.Dozent" is not persistable. This means that it either hasnt been enhanced, or that the enhanced version of the file is not in the CLASSPATH (or is hidden by an unenhanced version), or the Meta-Data/annotations for the class are not found.
NestedThrowables:
org.datanucleus.exceptions.ClassNotPersistableException: The class "edu.kit.aifb.cloudcampus.dom.mgmt.Dozent" is not persistable. This means that it either hasnt been enhanced, or that the enhanced version of the file is not in the CLASSPATH (or is hidden by an unenhanced version), or the Meta-Data/annotations for the class are not found.
org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:380)
org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:731)
org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:751)
I am wondering why this is happening, since I have programmatically enhanced my classes with the following lines of code:
public void enhanceJDOClasses(String... clazzes) {
    JDOHelper.getEnhancer().addClasses(clazzes).enhance();
}
Is there any recommended way to handle this in another way or did anybody experience similar exceptions? Any help is appreciated.
So the enhanced version of the class is not in the current ClassLoader. That command doesn't put it there; it just enhances the classes, and once a class (unenhanced) is loaded in a ClassLoader it can't be replaced. You can, however, set the ClassLoader on the enhancer.
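The usual alternative is to enhance at build time, so the classes deployed to Elastic Beanstalk are already enhanced before anything loads them. A sketch of the datanucleus-maven-plugin configuration (the version number is illustrative; it should match your DataNucleus release):

```xml
<plugin>
  <groupId>org.datanucleus</groupId>
  <artifactId>datanucleus-maven-plugin</artifactId>
  <!-- illustrative version; align with your DataNucleus release -->
  <version>5.2.1</version>
  <configuration>
    <api>JDO</api>
    <verbose>true</verbose>
  </configuration>
  <executions>
    <execution>
      <!-- run the enhancer over compiled classes before packaging -->
      <phase>process-classes</phase>
      <goals>
        <goal>enhance</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this in place, the war uploaded to Beanstalk contains only enhanced class files, and the runtime ClassLoader never sees an unenhanced version.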