I am adding a new API to WSO2 as described in the WSO2 Publisher APIs documentation.
The request is the following:
http://testapiaddress:9763/publisher/site/blocks/item-add/ajax/add.jag?action=addAPI&name=PhoneVerification&context=/phoneverify&version=1.0&visibility=public&thumbUrl=&description=Verify a phone number&tags=phone,mobile,multimedia&endpointType=nonsecured&tiersCollection=Gold,Bronze&http_checked=http&https_checked=https&uriTemplate-0=/*&default_version_checked=default_version&bizOwner=xx&bizOwnerMail=xx@ee.com&techOwner=xx&techOwnerMail=ggg@ww.com&endpoint_config={"production_endpoints":{"url":"http://myaccountapi.dev.payoneer.com","config":null},"endpoint_type":"address"}&swagger={"paths" : {"/CheckPhoneNumber?PhoneNumber={number}" : {"get" : {"parameters" : [{"description" : "phone number", "name" : "number", "allowMultiple" : false, "type" : "string", "required" : true, "in" : "path"}], "responses" : {"200" : {}}, "x-auth-type" : "Application%20%26%20Application%20User", "x-throttling-tier" : "Unlimited"}}, "/test" : {}, "/" : {}}, "swagger" : "2.0", "info" : {"title" : "WeatherAPI", "version" : "1.0.0"}}
I can see that the API was created in the Publisher portal, but I get what looks like an error as a response:
{"error" : false}
The response is very generic, but maybe someone has an idea why I get this error.
This is not an error. It says that there is no error.
If there is an error, the response will look like this:
{"error" : true, "description":"error description"}
I'm trying to run a Dataflow batch job using the "Text file on Cloud Storage to BigQuery" template. The first three steps work, but the last stage fails with this error:
Error message from worker: java.lang.RuntimeException: Failed to create job with prefix beam_bq_job_LOAD_textiotobigquerydataflow0releaser1025091627592969dd_1a449a94623645758e91dcba53a86498_fc44bdad405c2c80860231502c18eb1e_00001_00000, reached max retries: 3, last failed job: { "configuration" : { "jobType" : "LOAD", "labels" : { "beam_job_id" : "2022-11-10_02_06_07-15255037958352274885" }, "load" : { "createDisposition" : "CREATE_IF_NEEDED", "destinationTable" : { "datasetId" : "minerals_test_dataset", "projectId" : "jio-big-data-poc", "tableId" : "mytable01" }, "ignoreUnknownValues" : false, "sourceFormat" : "NEWLINE_DELIMITED_JSON", "useAvroLogicalTypes" : false, "writeDisposition" : "WRITE_APPEND" } }, "etag" : "LHqft9L/H4XBWTNZ7BSRXA==", "id" : "jio-big-data-poc:asia-south1.beam_bq_job_LOAD_textiotobigquerydataflow0releaser1025091627592969dd_1a449a94623645758e91dcba53a86498_fc44bdad405c2c80860231502c18eb1e_00001_00000-2", "jobReference" : { "jobId" : "beam_bq_job_LOAD_textiotobigquerydataflow0releaser1025091627592969dd_1a449a94623645758e91dcba53a86498_fc44bdad405c2c80860231502c18eb1e_00001_00000-2", "location" : "asia-south1", "projectId" : "jio-big-data-poc" }, "kind" : "bigquery#job", "selfLink" : "https://bigquery.googleapis.com/bigquery/v2/projects/jio-big-data-poc/jobs/beam_bq_job_LOAD_textiotobigquerydataflow0releaser1025091627592969dd_1a449a94623645758e91dcba53a86498_fc44bdad405c2c80860231502c18eb1e_00001_00000-2?location=asia-south1", "statistics" : { "creationTime" : "1668074949767", "endTime" : "1668074949869", "startTime" : "1668074949869" }, "status" : { "errorResult" : { "message" : "Provided Schema does not match Table jio-big-data-poc:minerals_test_dataset.mytable01. Cannot add fields (field: marks)", "reason" : "invalid" }, "errors" : [ { "message" : "Provided Schema does not match Table jio-big-data-poc:minerals_test_dataset.mytable01. Cannot add fields (field: marks)", "reason" : "invalid" } ], "state" : "DONE" }, "user_email" : "49449455496-compute#developer.gserviceaccount.com", "principal_subject" : "serviceAccount:49449455496-compute#developer.gserviceaccount.com" }. org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers$PendingJob.runJob(BigQueryHelpers.java:200) org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers$PendingJobManager.waitForDone(BigQueryHelpers.java:153) org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn.finishBundle(WriteTables.java:378)
I tried running the same job with CSV files from other datasets, and the JavaScript UDF and JSON schema follow the documentation, but the job fails at the same stage. What could be the solution to this error?
The JSON schema you provided doesn't match the BigQuery schema of your table:
"Provided Schema does not match Table jio-big-data-poc:minerals_test_dataset.mytable01. Cannot add fields (field: marks)", "reason" : "invalid" }, "errors" : [ { "message" : "Provided Schema does not match Table jio-big-data-poc:minerals_test_dataset.mytable01. Cannot add fields (field: marks)", "reason" : "invalid" } ]
There is a field called marks that does not seem to exist in the BigQuery table.
If you update your BigQuery table schema so that it matches the fields of your input JSON lines exactly, that will solve the issue.
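For example, a minimal sketch using the google-cloud-bigquery Python client (the STRING/NULLABLE type of the marks field is an assumption; adjust it to your data):

from google.cloud import bigquery

client = bigquery.Client(project="jio-big-data-poc")
table = client.get_table("jio-big-data-poc.minerals_test_dataset.mytable01")

# Append the missing field; STRING/NULLABLE is an assumption about your data.
new_schema = list(table.schema)
new_schema.append(bigquery.SchemaField("marks", "STRING", mode="NULLABLE"))
table.schema = new_schema
client.update_table(table, ["schema"])

Adding a new NULLABLE field is a schema change BigQuery allows in place; after that, the load job should no longer be rejected.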
I am using the django-push-notifications package in a Django project.
On Android, the JSON payload should be in the following format:
{ "data": { "title" : "You have a notification", "body" : "The body of the notification", "id" : 1234,
"backgroundImage" : "assets/notifications/background.png", },
"notification" : { "alert" : "You have a notification", "title" : "You have a notification",
"body" : "The body of the notification", "sound" : "default",
"backgroundImage" : "assets/notifications/background.png", "backgroundImageTextColour" : "#FFFFFF" } }
Only with this format can Android receive both foreground and background notifications.
I tried the following code in views.py:
from push_notifications.models import APNSDevice, GCMDevice

try:
    fcm_devices = GCMDevice.objects.all()
    fcm_devices.send_message(
        "new-notification-out",
        title="title-out",
        extra={
            "data": {"title-data": "title-in", "body": "new-notification-in"},
            "notification": {"alert": "You have one new notice",
                             "title": "title-notify", "body": "new-notification"},
        },
    )
except Exception:
    # Swallowing every exception hides delivery errors; log this in real code.
    pass
But I am not receiving the "data" and "notification" payloads on the Android side.
How can I send the payload correctly from the backend?
I need to call Elasticsearch directly from API Gateway over an HTTP connection, e.g.:
https:////_doc/_search?pretty&filter_path=hits.hits._source
I have n orders in Elasticsearch that I want to fetch with a GET query, but I want only the array of JSON documents I posted, without any other information in the response. How can I do that?
For example, this is what I am getting:
{
  "hits" : {
    "hits" : [
      {
        "_index" : "gpss_orders",
        "_type" : "_doc",
        "_id" : "4867254",
        "_score" : 1.0,
        "_source" : {
          "orderId" : 4867254,
          "loadId" : 18214,
          "orderTypeId" : 1
        }
      }
    ]
  }
}
But I would want the response to be something like this:
[ {
"orderId" : 4867254,
"loadId" : 18214,
"orderTypeId" : 1
}]
Do I need to change something in the API Gateway method response?
I changed the API Gateway integration response mapping template and got the expected output:
#set($esOutput = $input.path('$.hits.hits'))
#set($orders = [])
#foreach( $elem in $esOutput )
#set($order = $elem["_source"])
#set($response = $orders.add($order) )
#end
$orders
But now the problem I am facing is that, although the response from Elasticsearch is proper JSON, the response after applying the integration template looks like this and is no longer valid JSON:
[{orderId=4867254, loadId=18214, orderTypeId=1, orderTypeName=Fuel}]
Response from Elasticsearch (truncated):
"took" : 1,
"hits" : {
"hits" : [
{
"_id" : "4867254",
"_score" : 1.0,
"_source" : {
"orderId" : 4867254,
"loadId" : 18214,
"orderTypeId" : 1,
There isn't a way to shape the return object from Elasticsearch itself. Depending on how you access this data, you could have your own server-side code act as a proxy: it makes the query and removes extraneous information before returning the result to clients. A bonus is that you can use the proxy to decide what information to return based on factors such as permissions, caching, or rate limiting.
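A minimal sketch of such a proxy, assuming Python with Flask and requests (the host, index, and route names are hypothetical placeholders):

import requests
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical Elasticsearch endpoint; replace with your cluster URL and index.
ES_SEARCH_URL = "https://your-es-host/gpss_orders/_search"

@app.route("/orders")
def orders():
    # filter_path trims the Elasticsearch response to just the hits we need.
    resp = requests.get(ES_SEARCH_URL, params={"filter_path": "hits.hits._source"})
    hits = resp.json().get("hits", {}).get("hits", [])
    # Return only the original documents, dropping the Elasticsearch metadata.
    return jsonify([hit["_source"] for hit in hits])

This also returns real JSON; the $orders output from the VTL template above is the Java toString() of the list, which is why its quotes and colons are missing.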
I have the following DynamoDB mapping template, to update an existing DynamoDB item:
{
    "version" : "2017-02-28",
    "operation" : "UpdateItem",
    "key" : {
        "id": $util.dynamodb.toDynamoDBJson($ctx.args.application.id),
        "tenant": $util.dynamodb.toDynamoDBJson($ctx.identity.claims['http://domain/tenant'])
    },
    "update" : {
        "expression" : "SET #sourceUrl = :sourceUrl, #sourceCredential = :sourceCredential, #instanceSize = :instanceSize, #users = :users",
        "expressionNames" : {
            "#sourceUrl" : "sourceUrl",
            "#sourceCredential" : "sourceCredential",
            "#instanceSize" : "instanceSize",
            "#users" : "users"
        },
        "expressionValues" : {
            ":sourceUrl" : $util.dynamodb.toDynamoDbJson($ctx.args.application.sourceUrl),
            ":sourceCredential" : $util.dynamodb.toDynamoDbJson($ctx.args.application.sourceCredential),
            ":instanceSize" : $util.dynamodb.toDynamoDbJson($ctx.args.application.instanceSize),
            ":users" : $util.dynamodb.toDynamoDbJson($ctx.args.application.users)
        }
    },
    "condition" : {
        "expression" : "attribute_exists(#id) AND attribute_exists(#tenant)",
        "expressionNames" : {
            "#id" : "id",
            "#tenant" : "tenant"
        }
    }
}
But I'm getting the following error:
message: "Unable to parse the JSON document: 'Unrecognized token '$util': was expecting ('true', 'false' or 'null')↵ at [Source: (String)"{↵ "version" : "2017-02-28",↵ "operation" : "UpdateItem",↵ "key" : {↵ "id": {"S":"abc-123"},↵ "tenant": {"S":"test"}↵ },↵ "update" : {↵ "expression" : "SET #sourceUrl = :sourceUrl, #sourceCredential = :sourceCredential, #instanceSize = :instanceSize, #users = :users",↵ "expressionNames" : {↵ "#sourceUrl" : "sourceUrl",↵ "#sourceCredential" : "sourceCredential",↵ "#instanceSize" : "instanceSize",↵ "#users" : "users"↵ }"[truncated 400 chars]; line: 17, column: 29]'"
I've tried removing parts, and it seems to be related to the expressionValues, but I can't see anything wrong with the syntax.
It seems you misspelled the toDynamoDBJson method.
Replace
$util.dynamodb.toDynamoDbJson($ctx.args.application.sourceUrl)
with
$util.dynamodb.toDynamoDBJson($ctx.args.application.sourceUrl)
Note the uppercase B in toDynamoDBJson.
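Applying the same fix to all four values, the expressionValues block becomes:

"expressionValues" : {
    ":sourceUrl" : $util.dynamodb.toDynamoDBJson($ctx.args.application.sourceUrl),
    ":sourceCredential" : $util.dynamodb.toDynamoDBJson($ctx.args.application.sourceCredential),
    ":instanceSize" : $util.dynamodb.toDynamoDBJson($ctx.args.application.instanceSize),
    ":users" : $util.dynamodb.toDynamoDBJson($ctx.args.application.users)
}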
I'm using django-haystack and ElasticSearch to index Stores.
Until now, each store had one lat/long coordinate pair. We had to change this to represent the fact that one store can deliver products to very different (disjoint) regions, so I've added up to ten locations (lat/long pairs) to each store.
When using one location field everything worked fine and I got correct results. Now, with multiple location fields, I can't get any results, not even the previous ones, for the same user and store coordinates.
My index is as follows:
class StoreIndex(indexes.SearchIndex, indexes.Indexable):
    text = indexes.CharField(document=True, use_template=True,
                             template_name='search/indexes/store/store_text.txt')
    location0 = indexes.LocationField()
    location1 = indexes.LocationField()
    location2 = indexes.LocationField()
    location3 = indexes.LocationField()
    location4 = indexes.LocationField()
    location5 = indexes.LocationField()
    location6 = indexes.LocationField()
    location7 = indexes.LocationField()
    location8 = indexes.LocationField()
    location9 = indexes.LocationField()

    def get_model(self):
        return Store

    def prepare_location0(self, obj):
        # If you're just storing the floats...
        return "%s,%s" % (obj.latitude, obj.longitude)

    # ..... up to prepare_location9

    def prepare_location9(self, obj):
        # If you're just storing the floats...
        return "%s,%s" % (obj.latitude_9, obj.longitude_9)
Is this the correct way to build my index?
From Elasticsearch I get this mapping information:
curl -XGET http://localhost:9200/stores/_mapping?pretty=True
{
"stores" : {
"modelresult" : {
"properties" : {
"django_id" : {
"type" : "string"
},
"location0" : {
"type" : "geo_point",
"store" : "yes"
},
"location1" : {
"type" : "geo_point",
"store" : "yes"
},
"location2" : {
"type" : "geo_point",
"store" : "yes"
},
"location3" : {
"type" : "geo_point",
"store" : "yes"
},
"location4" : {
"type" : "geo_point",
"store" : "yes"
},
"location5" : {
"type" : "geo_point",
"store" : "yes"
},
"location6" : {
"type" : "geo_point",
"store" : "yes"
},
"location7" : {
"type" : "geo_point",
"store" : "yes"
},
"location8" : {
"type" : "geo_point",
"store" : "yes"
},
"location9" : {
"type" : "geo_point",
"store" : "yes"
},
"text" : {
"type" : "string",
"analyzer" : "snowball",
"store" : "yes",
"term_vector" : "with_positions_offsets"
}
}
}
}
}
Then, I try to query this way:
sqs0 = SearchQuerySet().dwithin('location0', usuario, max_dist).distance('location0',usuario).using('stores')
where:
usuario is a Point instance representing the user trying to find stores near his position and
max_dist is a D instance.
If I query directly using curl, I get no results either.
Here is the result of querying using curl with multiple location fields:
$ curl -XGET http://localhost:9200/stores/modelresult/_search?pretty=true -d '{ "query" : { "match_all": {} }, "filter" : {"geo_distance" : { "distance" : "6km", "location0" : { "lat" : -23.5, "lon" : -46.6 } } } } '
{
"took" : 1,
"timed_out" : false,
"_shards" : {
"total" : 5,
"successful" : 5,
"failed" : 0
},
"hits" : {
"total" : 0,
"max_score" : null,
"hits" : [ ]
}
}
If I comment out the fields location1-9 from the StoreIndex class, everything works fine, but if I leave them in to get multiple location points, I get no results for the same query (user position). This happens both in Django and directly via curl. That is, if I have only one location (say location0), both queries return correct results; with more locations (location0-9), neither query gives any results.
Here are the results of querying directly using curl with only one location field:
$ curl -XGET http://localhost:9200/stores/modelresult/_search?pretty=true -d '{ "query" : { "match_all": {} }, "filter" : {"geo_distance" : { "distance" : "6km", "location0" : { "lat" : -23.5, "lon" : -46.6 } } } } '
{
"took" : 3,
"timed_out" : false,
"_shards" : {
"total" : 5,
"successful" : 5,
"failed" : 0
},
"hits" : {
"total" : 9,
"max_score" : 1.0,
"hits" : [ {
"_index" : "stores",
"_type" : "modelresult",
"_id" : "store.store.110",
"_score" : 1.0, "_source" : {"django_ct": "store.store", "text": "RESULT OF THE SEARCH \n\n", "django_id": "110", "id": "store.store.110", "location0": "-23.4487554,-46.58912"}
},
... lots of results here ...
]
}
}
Of course, I run rebuild_index after any change to StoreIndex.
Any help on how to get multiple location fields working with Elasticsearch and Django?
PS.: I've cross posted this question on Django-Haystack and ElasticSearch Google Groups.
https://groups.google.com/d/topic/elasticsearch/85fg7vdCBBU/discussion
https://groups.google.com/d/topic/django-haystack/m2A3_SF8-ls/discussion
Thanks in advance
Mário