failed to parse search source, expected field name but got [START_OBJECT] - amazon-web-services

My Search Body:
{
  "query": {
    "filtered": {
      "filter": {
        "bool": {
          "should": [
            { "term": { "categories": 2 } },
            { "term": { "categories": 5 } }
          ]
        }
      }
    },
    "bool": {
      "should": [
        { "match": { "name": "perferendis" } },
        { "match": { "brand": "in" } }
      ]
    }
  },
  "filter": {
    "and": {
      "filters": [
        { "bool": { "must_not": { "term": { "condition": 1 } } } },
        { "range": { "highest_sales_rank": { "gte": 96 } } },
        { "range": { "review_rating": { "gte": 1 } } },
        { "range": { "review_count": { "gte": 12 } } },
        { "range": { "upper_price": { "gte": 68 } } },
        { "bool": { "must_not": { "term": { "updated_at": 0 } } } }
      ]
    }
  },
  "sort": { "updated_at": "asc" },
  "size": 10,
  "from": 40
}
However, if I take out the filtered part, the query succeeds:
"filtered": {
  "filter": {
    "bool": {
      "should": [
        { "term": { "categories": 2 } },
        { "term": { "categories": 5 } }
      ]
    }
  }
},
I previously used this format:
"filter": {
  "bool": {
    "should": [
      { "match": { "categories": "16310211" } },
      { "match": { "categories": "493964" } }
    ]
  }
},
but it only works with Elasticsearch 2, and since AWS only supports 1.5.6 I am not able to use this format. This is related to my previous question, Narrowing search result to multiple categories.

The Query DSL changed between versions 1.x and 2.x, so you need to change your query. Here is an example:
{
  "query": {
    "filtered": {
      "filter": {
        "bool": {
          "should": [
            { "term": { "categories": 2 } },
            { "term": { "categories": 5 } },
            {
              "bool": {
                "should": [
                  { "match": { "name": "perferendis" } },
                  { "match": { "brand": "in" } }
                ]
              }
            }
          ],
          "must": [
            { "range": { "highest_sales_rank": { "gte": 96 } } },
            { "range": { "review_rating": { "gte": 1 } } },
            { "range": { "review_count": { "gte": 12 } } },
            { "range": { "upper_price": { "gte": 68 } } }
          ],
          "must_not": [
            { "term": { "condition": 1 } },
            { "term": { "updated_at": 0 } }
          ]
        }
      }
    }
  },
  "sort": { "updated_at": "asc" },
  "size": 10,
  "from": 40
}
I also removed your and filter; and filters are not cached efficiently.
Feel free to ask some questions.
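For reference, the rewritten query can also be sent from code with the official Python client. Below is a minimal sketch; the index name "products" and the localhost URL are assumptions, and the actual search call is commented out so the snippet stands alone:

```python
# Build a 1.x-style query: a single top-level filtered query, with the
# old standalone "and" filter folded into bool must/must_not clauses.
body = {
    "query": {
        "filtered": {
            "filter": {
                "bool": {
                    "should": [
                        {"term": {"categories": 2}},
                        {"term": {"categories": 5}},
                    ],
                    "must": [
                        {"range": {"highest_sales_rank": {"gte": 96}}},
                        {"range": {"review_count": {"gte": 12}}},
                    ],
                    "must_not": [
                        {"term": {"condition": 1}},
                    ],
                }
            }
        }
    },
    "sort": {"updated_at": "asc"},
    "size": 10,
    "from": 40,
}

# With elasticsearch-py (hypothetical index name):
# from elasticsearch import Elasticsearch
# es = Elasticsearch("http://localhost:9200")
# result = es.search(index="products", body=body)
```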


How to add new attributes to map in DynamoDB?

My database structure,
{
  "email": "example#mail.com",
  "products": {
    "product1": { "price": "$10", "details": "detail" },
    "product2": { "price": "$20", "details": "detail" }
  }
}
I want to add new attributes to "products" map and expected output as follow,
{
  "email": "example#mail.com",
  "products": {
    "product1": { "price": "$10", "details": "detail" },
    "product2": { "price": "$20", "details": "detail" },
    "product3": { "price": "$10", "details": "detail" }
  }
}
I am using API Gateway and UpdateItem action. Here is my mapping template,
{
  "TableName": "tableName",
  "Key": {
    "email": { "S": "$input.path('$.email')" }
  },
  "UpdateExpression": "SET #pr = :vals",
  "ExpressionAttributeNames": {
    "#pr": "products"
  },
  "ExpressionAttributeValues": {
    ":vals": {
      "M": {
        "$input.path('$.productId')": {
          "M": {
            "price": { "N": "$input.path('$.price')" },
            "details": { "S": "$input.path('$.details')" }
          }
        }
      }
    }
  },
  "ReturnValues": "NONE"
}
Using above template will replace all my attributes. Actual output,
{
  "email": "example#mail.com",
  "products": {
    "product3": { "price": "$10", "details": "detail" }
  }
}
How can I add new attributes to the map instead of replacing it?
Thanks.
In your request you are SETting the entire products map, but you only want to add one nested map. Use a document path in the update expression so that only the new product key is set; the product id itself can be supplied as an expression attribute name:
{
  "TableName": "tableName",
  "Key": {
    "email": { "S": "$input.path('$.email')" }
  },
  "UpdateExpression": "SET #pr.#prod = :vals",
  "ExpressionAttributeNames": {
    "#pr": "products",
    "#prod": "$input.path('$.productId')"
  },
  "ExpressionAttributeValues": {
    ":vals": {
      "M": {
        "price": { "N": "$input.path('$.price')" },
        "details": { "S": "$input.path('$.details')" }
      }
    }
  },
  "ReturnValues": "NONE"
}
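The same update can be issued directly with boto3 instead of an API Gateway mapping template. A minimal sketch follows; the table name, email, and product id values are placeholders taken from the example data, and the actual DynamoDB call is commented out so the snippet stands alone:

```python
# Build UpdateItem parameters: the dynamic product id goes into an
# expression attribute name, so only products.<id> is set,
# rather than overwriting the whole "products" map.
product_id = "product3"  # hypothetical id, normally taken from the request

params = {
    "TableName": "tableName",
    "Key": {"email": {"S": "example#mail.com"}},
    "UpdateExpression": "SET #pr.#prod = :vals",
    "ExpressionAttributeNames": {"#pr": "products", "#prod": product_id},
    "ExpressionAttributeValues": {
        ":vals": {"M": {"price": {"S": "$10"}, "details": {"S": "detail"}}}
    },
    "ReturnValues": "NONE",
}

# With boto3:
# import boto3
# client = boto3.client("dynamodb")
# client.update_item(**params)
```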

Request body variable from JSON not accepting integer values

My sample JSON file for postman runner:
[ { "name": "runner", "hitler_id": "4006abc", "year": "2017", "boolean": "false", "expected": 717962 } ]
Pre request script:
var member = data.name; var booking = data.boolean; var fyyear = data.year; var sid = data.hitler_id;
console.log(data.name); console.log(data.boolean); console.log(data.year); console.log(data.hitler_id);
Body with parameters:
{ "size": 0, "query": { "bool": { "filter": [ { "terms": { "name": [ "{{name}}" ] } }, { "terms": { "salesman_id": [ "{{sid}}" ] } }, { "terms": { "fyyear": [ "{{fyyear}}" ] } }, { "terms": { "boolean": [ "{{boolean}}" ] } } ] } }, "aggs": { "year": { "terms": { "field": "year" }, "aggs": { "value": { "sum": { "field": "value" } } } } } }
Only string variables are accepted: the name and boolean fields work and their values are populated, but for the other two the variable values are not passed.
Variables set like that in a pre-request script are not available in your request body. Either store them in environment or global variables via
pm.globals.set("variable_key", variable_value);
pm.environment.set("variable_key", "variable_value");
or skip the pre-request script entirely and reference the data-file fields directly in your body:
{
  "size": 0,
  "query": {
    "bool": {
      "filter": [
        { "terms": { "name": [ "{{name}}" ] } },
        { "terms": { "salesman_id": [ "{{hitler_id}}" ] } },
        { "terms": { "fyyear": [ {{year}} ] } },
        { "terms": { "boolean": [ {{boolean}} ] } }
      ]
    }
  },
  "aggs": {
    "year": {
      "terms": { "field": "year" },
      "aggs": {
        "value": {
          "sum": { "field": "value" }
        }
      }
    }
  }
}
However, take care how you store the values in your data file. You stored the boolean and the year as strings, but they should be represented without quotes, as you already did for the "expected" variable:
[ { "name": "runner", "hitler_id": "4006abc", "year": 2017, "boolean": false, "expected": 717962 } ]

How can I delete indexes including multiple conditions?

I am trying to delete documents that meet several chosen conditions.
At the moment I delete them using one condition, as shown below:
localhost:9200/pictures/picture/_delete_by_query?pretty
{
  "query": {
    "regexp": {
      "tag": ".*something.*"
    }
  }
}
I would like to delete them for instance in this way
localhost:9200/pictures/picture/_delete_by_query?pretty
{
  "query": {
    "regexp": {
      "tag": ".*something.*",
      "path": "this/is/my/path",
      "user_id": 2
    }
  }
}
Do you have any ideas how I can do this?
I guess using a bool query would be the right direction, something like this should work:
localhost:9200/pictures/picture/_delete_by_query?pretty
{
  "query": {
    "bool": {
      "must": [
        {
          "regexp": {
            "tag": ".*something.*"
          }
        },
        {
          "term": {
            "path.keyword": "this/is/my/path"
          }
        },
        {
          "term": {
            "user_id": 2
          }
        }
      ]
    }
  }
}
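The combined query can be issued the same way as the single-condition one. A minimal sketch building the body in Python follows; the index and type names come from the question, and the HTTP call itself is commented out so the snippet stands alone:

```python
# Combine the three conditions in a bool/must clause:
# all of them must match for a document to be deleted.
conditions = [
    {"regexp": {"tag": ".*something.*"}},
    {"term": {"path.keyword": "this/is/my/path"}},
    {"term": {"user_id": 2}},
]
body = {"query": {"bool": {"must": conditions}}}

# POST localhost:9200/pictures/picture/_delete_by_query?pretty
# e.g. with the requests library (hypothetical):
# import requests
# requests.post("http://localhost:9200/pictures/picture/_delete_by_query",
#               json=body)
```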

Elasticsearch query with wildcard and match conditions

I have this index:
{
  "mappings": {
    "records": {
      "properties": {
        "suggest": {
          "type": "completion",
          "contexts": [
            {
              "name": "year",
              "type": "category",
              "path": "year"
            }
          ]
        }
      }
    }
  }
}
I put some records:
POST http://localhost:9200/example/records
{
  "suggest": {
    "input": "foo123",
    "contexts": { "year": "1999" }
  }
}
POST http://localhost:9200/example/records
{
  "suggest": {
    "input": "some123",
    "contexts": { "year": "1999" }
  }
}
POST http://localhost:9200/example/records
{
  "suggest": {
    "input": "thing123",
    "contexts": { "year": "2000" }
  }
}
Now I would like to run this query (SQL-like):
SELECT * FROM example WHERE suggest LIKE '%123%' AND year = 1999
How can I do this in Elasticsearch?
I type:
POST http://localhost:9200/example/records/_search?pretty
{
  "query": {
    "bool": {
      "must": [
        { "wildcard": { "suggest": "*123*" } }
      ],
      "filter": [
        { "term": { "year": "1999" } }
      ]
    }
  }
}
I get this response with empty results:
{
  "took": 1,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 0,
    "max_score": null,
    "hits": []
  }
}
I expect these records to be returned:
foo123, year 1999
some123, year 1999
How can I do this?
You need to use a bool query with must if you care about scoring:
{
  "query": {
    "bool": {
      "must": [
        { "wildcard": { "name": "*foo*" } },
        { "term": { "year": "1999" } }
      ]
    }
  }
}
or with filter if you just want to filter values and possibly cache the filter:
{
  "query": {
    "bool": {
      "filter": [
        { "wildcard": { "name": "*foo*" } },
        { "term": { "year": "1999" } }
      ]
    }
  }
}
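Both variants select the same documents; only the scoring and caching behavior differ. A minimal sketch contrasting them, using the field names from the question (suggest, year) rather than the generic ones above:

```python
# must: the wildcard and term clauses contribute to _score.
scoring = {
    "query": {
        "bool": {
            "must": [
                {"wildcard": {"suggest": "*123*"}},
                {"term": {"year": "1999"}},
            ]
        }
    }
}

# filter: same matching, no scoring, and the clauses can be cached.
non_scoring = {
    "query": {
        "bool": {
            "filter": [
                {"wildcard": {"suggest": "*123*"}},
                {"term": {"year": "1999"}},
            ]
        }
    }
}
```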

How do I filter by geo-distance in elasticsearch-py?

Using Python 2.7 and elasticsearch-py.
Given the following JSON:
[
  {
    "Name": "xxx",
    "Addresses": [
      {
        "StreetAdd": "xxx",
        "GeoLocation": {
          "lat": xx,
          "long": yy
        }
      }
    ]
  },
  {
    // And so on.
  }
]
And the following mapping:
mapping = {
    "mappings": {
        "leads": {
            "properties": {
                "Addresses": {
                    "type": "nested",
                    "include_in_parent": "true",
                    "properties": {
                        "GeoLocation": "geo_point"
                    }
                }
            }
        }
    }
}
How would I get the locations within 10km of latitude 40, longitude -70? My attempt is as follows:
search_body = {
    "query": {
        "filtered": {
            "query": {
                "match_all": {}
            },
            "filter": {
                "geo_distance": {
                    "distance": "10km",
                    "Addresses.GeoLocation": {
                        "lat": 40.0,
                        "lon": -70.0
                    }
                }
            }
        }
    },
    "size": 50
}
result = es.search(index=ES_INDEX, body=search_body, sort="Name:asc")
for hit in result["hits"]["hits"]:
    print hit["_source"]["Name"]
However, this is throwing the following error:
...
C:\Users\xxx\AppData\Local\Continuum\Anaconda2\lib\site-packages\elasticsearch\connection\base.pyc in _raise_error(self, status_code, raw_data)
103 pass
104
--> 105 raise HTTP_EXCEPTIONS.get(status_code, TransportError)(status_code, error_message, additional_info)
106
107
RequestError: TransportError(400, u'search_phase_execution_exception')
I'm not that proficient yet with ES, so I'm having a hard time envisioning what schema I should use to approach this problem.
What gives?
The issue is in your mapping. Here is the fixed version:
mapping = {
    "mappings": {
        "leads": {
            "properties": {
                "Addresses": {
                    "type": "nested",
                    "include_in_parent": "true",
                    "properties": {
                        "GeoLocation": {
                            "type": "geo_point"  # <-- Note the type here
                        }
                    }
                }
            }
        }
    }
}
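With the corrected mapping, the index has to be recreated before the geo_distance query will work. A minimal sketch of the full flow with elasticsearch-py follows; the index name "leads" is an assumption, and the actual client calls are commented out so the snippet stands alone:

```python
# Corrected mapping: GeoLocation is declared as an object with a "type"
# key, not a bare string, so Elasticsearch registers it as a geo_point.
mapping = {
    "mappings": {
        "leads": {
            "properties": {
                "Addresses": {
                    "type": "nested",
                    "include_in_parent": "true",
                    "properties": {
                        "GeoLocation": {"type": "geo_point"}
                    },
                }
            }
        }
    }
}

# Recreate the index with the fixed mapping, then rerun the search
# (hypothetical index name):
# from elasticsearch import Elasticsearch
# es = Elasticsearch()
# es.indices.create(index="leads", body=mapping)
# result = es.search(index="leads", body=search_body, sort="Name:asc")
```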