I'm using this tutorial: https://github.com/MKergall/osmbonuspack/wiki/Tutorial_2
I added this code to my project:
NominatimPOIProvider poiProvider = new NominatimPOIProvider();
ArrayList<POI> pois = poiProvider.getPOICloseTo(startPoint, "cinema", 50, 0.1);
But I get these errors:
NominatimPOIProvider (String) in NominatimPOIProvider cannot be applied to ()
and
java.lang.NoClassDefFoundError: Failed resolution of: Lokhttp3/Request$Builder;
at org.osmdroid.bonuspack.utils.HttpConnection.doGet(HttpConnection.java:65)
at org.osmdroid.bonuspack.utils.BonusPackHelper.requestStringFromUrl(BonusPackHelper.java:70)
at org.osmdroid.bonuspack.location.NominatimPOIProvider.getThem(NominatimPOIProvider.java:83)
at org.osmdroid.bonuspack.location.NominatimPOIProvider.getPOICloseTo(NominatimPOIProvider.java:133)
at x.x.UserArea.onCreate(UserArea.java:152)
at android.app.Activity.performCreate(Activity.java:6876)
at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1135)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:3207)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3350)
at android.app.ActivityThread.access$1100(ActivityThread.java:222)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1795)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:158)
at android.app.ActivityThread.main(ActivityThread.java:7229)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1230)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1120)
Caused by: java.lang.ClassNotFoundException: Didn't find class "okhttp3.Request$Builder"
The error:
NominatimPOIProvider (String) in NominatimPOIProvider cannot be applied to ()
is caused by the absence of a no-argument constructor for NominatimPOIProvider. You are required to specify a user agent, which will be used in the headers sent to the Nominatim service provider. More details can be found in this issue and in the OpenStreetMap usage policy.
Use something like:
NominatimPOIProvider poiProvider = new NominatimPOIProvider("YourUserAgentSpecificForYourApplicationOrWhatever");
The okhttp3 NoClassDefFoundError was resolved by adding these dependencies:
compile 'com.github.bumptech.glide:okhttp3-integration:1.4.0@aar'
compile 'com.squareup.okhttp3:okhttp:3.2.0'
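With the user agent in place, here is a minimal sketch of the whole flow. The user agent string and coordinates are placeholders I made up, and since getPOICloseTo performs network I/O, on Android it should run off the UI thread (e.g. in an AsyncTask), not directly in onCreate:
import java.util.ArrayList;
import org.osmdroid.bonuspack.location.NominatimPOIProvider;
import org.osmdroid.bonuspack.location.POI;
import org.osmdroid.util.GeoPoint;

// Nominatim's usage policy requires an application-specific user agent.
NominatimPOIProvider poiProvider = new NominatimPOIProvider("MyAppName/1.0");
// Placeholder start point; use your real location.
GeoPoint startPoint = new GeoPoint(48.13, -1.63);
// Ask for up to 50 "cinema" POIs within 0.1 degrees of startPoint.
ArrayList<POI> pois = poiProvider.getPOICloseTo(startPoint, "cinema", 50, 0.1);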
I'm using WSO2 IS 5.10 with Docker, and after making a change to the image (a change that has nothing to do with JSPs), opening the service provider list in the dashboard shows a white screen.
In the wso2carbon log I found errors like this:
Servlet.service() for servlet [bridgeservlet] threw exception org.apache.jasper.JasperException: Unable to compile class for JSP:
An error occurred at line: [17] in the generated java file: [/home/wso2carbon/wso2is-5.10.0/lib/tomcat/work/Catalina/localhost/ROOT/proxytemp/hc_1893914628/org/apache/jsp/application/list_002dservice_002dproviders_jsp.java]
Only a type can be imported. org.wso2.carbon.identity.application.common.model.xsd.ApplicationBasicInfo resolves to a package
An error occurred at line: [118] in the jsp file: [/application/list-service-providers.jsp]
ApplicationBasicInfo cannot be resolved to a type
115: <%
116: String BUNDLE = "org.wso2.carbon.identity.application.mgt.ui.i18n.Resources";
117: ResourceBundle resourceBundle = ResourceBundle.getBundle(BUNDLE, request.getLocale());
118: ApplicationBasicInfo[] applications = null;
119:
120: String filterString = request.getParameter(ApplicationMgtUIConstants.SP_NAME_FILTER);
121: filterString = ApplicationMgtUIUtil.resolveFilterString(filterString);
The error disappears when I restart the image.
I'd like to know what causes it.
Here is the code snippet:
String strIndexRole = "arn:aws:iam::<my acct no>:role/Kendra-CloudwatchRole";
AWSSecurityTokenService stsClient = AWSSecurityTokenServiceClientBuilder.standard()
.withCredentials(new DefaultAWSCredentialsProviderChain())
.withEndpointConfiguration(new EndpointConfiguration("console.aws.amazon.com/kendra/home?region=us-east-1", "us-east-1"))
.build();
AssumeRoleRequest roleRequest = new AssumeRoleRequest()
.withRoleArn(strIndexRole).withDurationSeconds(7200);
AssumeRoleResult roleResponse = stsClient.assumeRole(roleRequest);
This is the exception:
15:38:30.301 [main] DEBUG org.apache.http.impl.conn.PoolingHttpClientConnectionManager - Connection released: [id: 0][route: {s}->https://console.aws.amazon.com:443][total available: 1; route allocated: 1 of 50; total allocated: 1 of 50]
Exception in thread "main" com.amazonaws.SdkClientException: Unable to unmarshall response (ParseError at [row,col]:[19,24]
Message: The reference to entity "state" must end with the ';' delimiter.). Response Code: 200, Response Text: OK
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleResponse(AmazonHttpClient.java:1750)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleSuccessResponse(AmazonHttpClient.java:1446)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1368)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1145)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:802)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:770)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:744)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:704)
at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:686)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:550)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:530)
at com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient.doInvoke(AWSSecurityTokenServiceClient.java:1719)
at com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient.invoke(AWSSecurityTokenServiceClient.java:1686)
at com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient.invoke(AWSSecurityTokenServiceClient.java:1675)
at com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient.executeAssumeRole(AWSSecurityTokenServiceClient.java:589)
at com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient.assumeRole(AWSSecurityTokenServiceClient.java:561)
at com.aws.kendra.trial.SampleKendraTrial.main(SampleKendraTrial.java:73)
Caused by: javax.xml.stream.XMLStreamException: ParseError at [row,col]:[19,24]
Message: The reference to entity "state" must end with the ';' delimiter.
at com.sun.org.apache.xerces.internal.impl.XMLStreamReaderImpl.next(XMLStreamReaderImpl.java:604)
at com.sun.xml.internal.stream.XMLEventReaderImpl.peek(XMLEventReaderImpl.java:276)
at com.amazonaws.transform.StaxUnmarshallerContext.nextEvent(StaxUnmarshallerContext.java:220)
at com.amazonaws.services.securitytoken.model.transform.AssumeRoleResultStaxUnmarshaller.unmarshall(AssumeRoleResultStaxUnmarshaller.java:40)
at com.amazonaws.services.securitytoken.model.transform.AssumeRoleResultStaxUnmarshaller.unmarshall(AssumeRoleResultStaxUnmarshaller.java:28)
at com.amazonaws.http.StaxResponseHandler.handle(StaxResponseHandler.java:106)
at com.amazonaws.http.StaxResponseHandler.handle(StaxResponseHandler.java:42)
at com.amazonaws.http.response.AwsResponseHandlerAdapter.handle(AwsResponseHandlerAdapter.java:69)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleResponse(AmazonHttpClient.java:1726)
... 16 more
I think part of the problem here is the way you are configuring your AWSSecurityTokenService: the endpoint is set to the AWS console URL (console.aws.amazon.com/...) rather than an STS API endpoint such as sts.us-east-1.amazonaws.com, so the call returns an HTML page instead of the XML the SDK expects, and unmarshalling fails. The problem is also indicated by the following line in the exception stack trace you posted:
at com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient.assumeRole(AWSSecurityTokenServiceClient.java:561)
Please refer to this guide on how to assume an IAM role, get temporary credentials, and invoke an AWS service in Java (there, S3 is the service being called with the temporary credentials). You can use the same approach to invoke the Kendra APIs: take from that example how to build BasicSessionCredentials, and use them to build a Kendra client (just as the AmazonS3 client is built with AmazonS3ClientBuilder in the example). Once you have built the Kendra client, you can refer to this example for how to query your Kendra index.
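As a concrete sketch using the AWS SDK for Java v1 (the role ARN, session name, and region are placeholders; the Kendra client construction is only indicated in a comment, since the exact builder class depends on the SDK module you use):
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicSessionCredentials;
import com.amazonaws.services.securitytoken.AWSSecurityTokenService;
import com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClientBuilder;
import com.amazonaws.services.securitytoken.model.AssumeRoleRequest;
import com.amazonaws.services.securitytoken.model.AssumeRoleResult;
import com.amazonaws.services.securitytoken.model.Credentials;

public class AssumeRoleSketch {
    public static void main(String[] args) {
        // Let the builder resolve the regional STS endpoint; do not point
        // the client at the console URL, which returns HTML instead of XML.
        // DefaultAWSCredentialsProviderChain is already the builder default.
        AWSSecurityTokenService stsClient = AWSSecurityTokenServiceClientBuilder.standard()
                .withRegion("us-east-1")
                .build();

        AssumeRoleRequest roleRequest = new AssumeRoleRequest()
                .withRoleArn("arn:aws:iam::123456789012:role/Kendra-CloudwatchRole") // placeholder account
                .withRoleSessionName("kendra-trial") // AssumeRole requires a session name
                .withDurationSeconds(3600); // 7200 fails unless the role's max session duration is raised

        AssumeRoleResult roleResponse = stsClient.assumeRole(roleRequest);
        Credentials creds = roleResponse.getCredentials();

        // Wrap the temporary credentials; any service client builder
        // (including Kendra's) accepts these via withCredentials(...).
        BasicSessionCredentials sessionCredentials = new BasicSessionCredentials(
                creds.getAccessKeyId(),
                creds.getSecretAccessKey(),
                creds.getSessionToken());
        AWSStaticCredentialsProvider credentialsProvider =
                new AWSStaticCredentialsProvider(sessionCredentials);
        // kendraClient = <KendraClientBuilder>.standard().withCredentials(credentialsProvider).withRegion("us-east-1").build();
    }
}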
I'm trying to use a multi-delimiter SerDe for a table in a Hive job on Amazon EMR, as explained in this link. The delimiter for the file is "|".
https://cwiki.apache.org/confluence/display/Hive/MultiDelimitSerDe
However, I ended up having to use...
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe'
Instead of the documented...
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.MultiDelimitSerDe'
in order to avoid this error:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Cannot validate serde: org.apache.hadoop.hive.serde2.MultiDelimitSerDe
OK. So when I avoid that error by adding .contrib, I get a different failure at query time, caused by:
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe not found
Status: Failed
Vertex failed, vertexName=Map 1, vertexId=vertex_1548264520414_0027_1_00, diagnostics=[Task failed, taskId=task_1548264520414_0027_1_00_000021, diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task ( failure ) : attempt_1548264520414_0027_1_00_000021_0:java.lang.RuntimeException: java.lang.RuntimeException: Map operator initialization failed
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:211)
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:168)
at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:370)
at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1840)
at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Map operator initialization failed
at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.init(MapRecordProcessor.java:354)
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:184)
... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassNotFoundException: Class org.apache.hadoop.hive.contrib.serde2.MultiDelimitSerDe not found
at org.apache.hadoop.hive.ql.exec.MapOperator.getConvertedOI(MapOperator.java:328)
at org.apache.hadoop.hive.ql.exec.MapOperator.setChildren(MapOperator.java:420)
at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.init(MapRecordProcessor.java:286)
... 15 more
So I've been reading that you have to add the .jar file.
https://community.hortonworks.com/questions/82189/hive-cannot-see-jar.html
And so I've tried all kinds of things to get this to work. It says that it is adding it to the class path:
hive> add jar /usr/lib/hive/lib/hive-contrib-2.3.3-amzn-1.jar
> ;
Added [/usr/lib/hive/lib/hive-contrib-2.3.3-amzn-1.jar] to class path
Added resources: [/usr/lib/hive/lib/hive-contrib-2.3.3-amzn-1.jar]
hive> add jar /usr/lib/hive/lib/hive-contrib.jar
> ;
Added [/usr/lib/hive/lib/hive-contrib.jar] to class path
Added resources: [/usr/lib/hive/lib/hive-contrib.jar]
hive> exit;
So I'm not sure what to do. It's acting as if the .jar file for hive-contrib isn't in the class path despite me adding it. I've also tried running...
export HADOOP_USER_CLASSPATH_FIRST=true
which is found here...
How to include jars in Hive (Amazon Hadoop env)
And that doesn't fix it either.
How can I use a multi-delimiter SerDe for a Hive job on AWS?
Thank you.
I could not get MultiDelimitSerDe to work. Instead, I was lucky in that the delimiter had quotation marks on either side of the pipe, so it looks like "|". The quotes turn the values between them into strings, so the additional pipes inside those column values don't act as delimiters.
"Test | Test2 "|" Test3 | Test 4 | Test 5 "|" Test 6 "
You can see an explanation in the link below. The part that talks about it is in the comments, not the article.
https://www.ericlin.me/2015/07/how-to-create-a-hive-multi-character-delimitered-table/
If I didn't have those quotation marks around the delimiter, I'm not sure how I would have handled a multi-character delimiter, especially if any of my fields contained quotes; but after checking, out of billions of rows there is not a single quote.
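For reference, one built-in way to exploit those surrounding quotes is Hive's OpenCSVSerde, which accepts a custom separator and quote character. This is a minimal sketch, not the setup the answer above used: the table name, columns, and S3 location are placeholders, and note that OpenCSVSerde exposes every column as STRING:
CREATE EXTERNAL TABLE my_table (col1 STRING, col2 STRING)  -- placeholder table/columns
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES (
  "separatorChar" = "|",
  "quoteChar" = "\""
)
LOCATION 's3://myBucket/myPath/';  -- placeholder location
With these settings, pipes inside quoted values are treated as data rather than as delimiters.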
1) I'm getting the below error in the wso2carbon logs when I try to configure the WSO2 APIM Analytics (2.1) server with an Oracle DB (version 12c). I have tried with ojdbc6.jar and ojdbc7.jar in the lib folder, but the error is still there.
error:
Caused by: java.lang.RuntimeException: ORA-28040: No matching authentication protocol
2) Is there any REST API available for WSO2 APIM Analytics, similar to the DAS server, for extracting data?
full error:
ERROR {org.wso2.carbon.analytics.spark.core.AnalyticsTask} - Error while executing the scheduled task for the script: APIM_LAST_ACCESS_TIME_SCRIPT {org.wso2.carbon.analytics.spark.core.AnalyticsTask}
org.wso2.carbon.analytics.spark.core.exception.AnalyticsExecutionException: Exception in executing query create temporary table APILastAccessSummaryData using CarbonJDBC options (dataSource "WSO2AM_STATS_DB", tableName "API_LAST_ACCESS_TIME_SUMMARY", schema "tenantDomain STRING , apiPublisher STRING , api STRING , version STRING , userId STRING , context STRING , max_request_time LONG ", primaryKeys "tenantDomain,apiPublisher,api" )
at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQueryLocal(SparkAnalyticsExecutor.java:764)
at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.executeQuery(SparkAnalyticsExecutor.java:721)
at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeQuery(CarbonAnalyticsProcessorService.java:201)
at org.wso2.carbon.analytics.spark.core.CarbonAnalyticsProcessorService.executeScript(CarbonAnalyticsProcessorService.java:151)
at org.wso2.carbon.analytics.spark.core.AnalyticsTask.execute(AnalyticsTask.java:60)
at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:67)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: ORA-28040: No matching authentication protocol
thanks,
Santosh
This is a known issue in Oracle, and the workaround is to set SQLNET.ALLOWED_LOGON_VERSION=8 in the $crs_home/network/admin/sqlnet.ora file. [1]
[1] https://community.softwaregrp.com/t5/UCMDB-and-UD-Practitioners-Forum/ORA-28040-No-matching-authentication-protocol/m-p/253403
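For reference, the change is a single line in sqlnet.ora; the path below assumes a plain database home rather than the Grid/CRS home mentioned above:
# $ORACLE_HOME/network/admin/sqlnet.ora (or $crs_home/network/admin/sqlnet.ora)
SQLNET.ALLOWED_LOGON_VERSION=8
Note that on 12c this parameter was superseded by SQLNET.ALLOWED_LOGON_VERSION_SERVER, so setting SQLNET.ALLOWED_LOGON_VERSION_SERVER=8 may be required instead.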
I am trying to import JSON data from S3 and, after running some queries, export the output in JSON format back to S3. However, I get the error "org.apache.hadoop.hive.serde2.SerDeException: java.io.IOException: Start token not found where expected" at the Hive step on the EMR cluster. To understand the problem, I simplified both the Hive script and the JSON data, but the error persists. How can I solve this?
Cluster configuration:
Release: emr-5.3.1
Hive version: 2.1.1
Hadoop distribution: Amazon 2.7.3
Service Role: EMR_DefaultRole
MasterInstanceType: m4.large
The content of the simplified JSON data:
[{"MyID":"FOO123","MyField":"FOO"},{"MyID":"BAR123","MyField":"BAR"}]
Hive script:
DROP TABLE IF EXISTS SOURCE;
DROP TABLE IF EXISTS DESTINATION;
CREATE EXTERNAL TABLE SOURCE(MyID STRING, MyField STRING)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION 's3://myPath/subPath/';
CREATE EXTERNAL TABLE DESTINATION(MyID STRING, MyField STRING)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
LOCATION 's3://anotherPath/subPath/';
INSERT OVERWRITE TABLE DESTINATION SELECT MyID, MyField FROM SOURCE;
And here is the stack trace:
Vertex failed, vertexName=Map 4, vertexId=vertex_1278452616863_0001_1_00, diagnostics=[Task failed, taskId=task_1278452616863, diagnostics=[TaskAttempt 0 failed, info=[Error: Error while running task ( failure ) : attempt_1278452616863:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable [{"MyID":"FOO123","MyField":"FOO"},{"MyID":"BAR123","MyField":"BAR"}]
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:211)
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:168)
at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:370)
at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable [{"MyID":"FOO123","MyField":"FOO"},{"MyID":"BAR123","MyField":"BAR"}]
at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:95)
at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:70)
at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:383)
at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:185)
... 14 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable [{"MyID":"FOO123","MyField":"FOO"},{"MyID":"BAR123","MyField":"BAR"}]
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:497)
at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:86)
... 17 more
Caused by: org.apache.hadoop.hive.serde2.SerDeException: java.io.IOException: Start token not found where expected
at org.apache.hive.hcatalog.data.JsonSerDe.deserialize(JsonSerDe.java:183)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.readRow(MapOperator.java:128)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.access$200(MapOperator.java:92)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:488)
... 18 more
Caused by: java.io.IOException: Start token not found where expected
at org.apache.hive.hcatalog.data.JsonSerDe.deserialize(JsonSerDe.java:169)
... 21 more
Thanks.
The JSON should start with { and not with an array bracket ([): the Hive JsonSerDe reads the input record by record and expects each line to be a single, complete JSON object, so an array wrapping the records cannot be parsed.
I tried this approach and updated my JSON file to the following structure:
{"MyID":"FOO123","MyField":"FOO"},
{"MyID":"BAR123","MyField":"BAR"}
but after doing so, I noticed that only the first object was being inserted into the table.
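For the JsonSerDe, each line must contain exactly one standalone JSON object and nothing else: no wrapping array and no commas between records. A corrected version of the simplified data (same records, separating comma removed) would be:
{"MyID":"FOO123","MyField":"FOO"}
{"MyID":"BAR123","MyField":"BAR"}
If rows are still dropped after that change, it is worth checking that every record in the real file sits on its own line; this is a hedged suggestion, as I have not reproduced the one-row symptom.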