How can I delete indexes including multiple conditions? - regex

I'm trying to delete documents that meet several chosen conditions.
At the moment I delete them using a single condition, as shown below:
localhost:9200/pictures/picture/_delete_by_query?pretty
{
  "query": {
    "regexp": {
      "tag": ".*something.*"
    }
  }
}
I would like to delete them like this, for instance:
localhost:9200/pictures/picture/_delete_by_query?pretty
{
  "query": {
    "regexp": {
      "tag": ".*something.*",
      "path": "this/is/my/path",
      "user_id": 2
    }
  }
}
Do you have any ideas how I can do this?

I guess a bool query is the right direction; something like this should work:
localhost:9200/pictures/picture/_delete_by_query?pretty
{
  "query": {
    "bool": {
      "must": [
        {
          "regexp": {
            "tag": ".*something.*"
          }
        },
        {
          "term": {
            "path.keyword": "this/is/my/path"
          }
        },
        {
          "term": {
            "user_id": 2
          }
        }
      ]
    }
  }
}
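Since none of these clauses needs relevance scoring, a minor variation (not required for correctness) is to put them in filter context, which skips scoring and allows the clauses to be cached:

localhost:9200/pictures/picture/_delete_by_query?pretty
{
  "query": {
    "bool": {
      "filter": [
        { "regexp": { "tag": ".*something.*" } },
        { "term": { "path.keyword": "this/is/my/path" } },
        { "term": { "user_id": 2 } }
      ]
    }
  }
}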

Related

Range filter in openSearch, ElasticSearch is not working correctly

I have a problem using OpenSearch (which is based on Elasticsearch): when I use a range as a filter query, I get back all the documents that match the filter, even when they don't match the search query, with a score of 0.0. Below is a sample of the query I use:
{
  "query": {
    "bool": {
      "should": [
        {
          "bool": {
            "must": [
              { "match": { "FIELD": "TEXT" } },
              { "match": { "FIELD": "TEXT" } }
            ]
          }
        },
        {
          "bool": {
            "must": [
              { "match": { "FIELD": "TEXT" } },
              { "match": { "FIELD": "TEXT" } }
            ]
          }
        }
      ],
      "filter": [
        { "range": { "FIELD": { "gt": 1, "lt": 500 } } }
      ]
    }
  }
}
What I want is to get only the documents that match the search query and the filter at the same time, and I don't know what I'm doing wrong.
Any help is appreciated; thanks in advance.
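For what it's worth, this looks like expected bool behavior rather than a bug: when a bool query contains a filter (or must) clause, minimum_should_match for the should clauses defaults to 0, so documents matching only the filter come back with a score of 0.0. A likely fix, sketched on the query above, is to require at least one should clause to match:

{
  "query": {
    "bool": {
      "minimum_should_match": 1,
      "should": [ ... ],
      "filter": [
        { "range": { "FIELD": { "gt": 1, "lt": 500 } } }
      ]
    }
  }
}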

Getting all values of 2 columns

I am looking for the appropriate Elasticsearch query for:
SELECT col1, col2 FROM myTable WHERE col1 = "value1" AND col2 = "value2"
For example, this is my mapping:
{
  "mapping": {
    "doc": {
      "properties": {
        "book": {
          "properties": {
            "name": { "type": "text" },
            "price": { "type": "integer" },
            "booktype": {
              "properties": {
                "booktype": { "type": "text" }
              }
            }
          }
        }
      }
    }
  }
}
I am trying to write a query that returns price and name for documents where booktype = Fiction.
Try this:
GET myTable/_search
{
  "size": 1000,
  "_source": ["price", "name"],
  "query": {
    "bool": {
      "must": [
        { "match": { "booktype.booktype": "Fiction" } }
      ]
    }
  }
}
Note: you might need to adapt "size" to fit your needs.
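If you also need the two-condition AND from the original SQL, the second condition simply becomes another clause in the must array; for example (the name value here is made up for illustration):

GET myTable/_search
{
  "_source": ["price", "name"],
  "query": {
    "bool": {
      "must": [
        { "match": { "booktype.booktype": "Fiction" } },
        { "match": { "name": "value2" } }
      ]
    }
  }
}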

Elasticsearch 5.4 - filter by term if the term exists and don't filter when term is not present

I'm searching across multiple types. One of the types returned has a field called my_field; the other types do not have that field. I want all results where the field does not exist, and only those results where my_field has the value True when the field does exist.
It would be great if the filter on my_field didn't contribute to the query score and only filtered.
This is as close as I got. I will self-flagellate for 1 hour if you help me please.
(Don't use this it is wrong!)
body = {
    'query': {
        'bool': {
            'must': {
                'multi_match': {
                    'query': 'pangolin',
                    'fields': ['_all', '_partials']
                }
            },
            "should": {
                "must_not": {
                    "exists": { "field": "my_field" }
                }
            },
            "should": {
                'term': { 'my_field': True }
            }
        }
    }
}
The following seems to be what I need.
Documents must match on 'pangolin' and are filtered on two should clauses; only one of the two needs to match.
https://www.elastic.co/guide/en/elasticsearch/reference/5.4/query-dsl-bool-query.html (see keywords: filter and should).
body = {
    "query": {
        'bool': {
            'must': {
                'multi_match': {
                    'query': 'pangolin',
                    'fields': ['_all', '_partials']
                }
            },
            "filter": {
                "bool": {
                    "should": [
                        { "term": { "my_field": True } },
                        {
                            "bool": {
                                "must_not": {
                                    "exists": { "field": "my_field" }
                                }
                            }
                        }
                    ]
                }
            }
        }
    }
}
Have you tried something like this?
$ curl -XPOST localhost:9200/type1,type2/_search -d '{
  "query": {
    "bool": {
      "must": [
        { "term": { "my_field": true } },
        {
          "constant_score": {
            "filter": {
              "not_missing": { "field": "type2.my_field" }
            }
          }
        }
      ]
    }
  }
}'
Let me know if this works. Good luck!
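Note that the missing/not_missing filters were removed in later Elasticsearch versions; assuming a 5.x cluster as in the question title, the "field is absent" condition is normally written as an exists query inside must_not, as in the filtered approach above:

{
  "bool": {
    "must_not": {
      "exists": { "field": "my_field" }
    }
  }
}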

failed to parse search source, expected field name but got [START_OBJECT]

My Search Body:
{
  "query": {
    "filtered": {
      "filter": {
        "bool": {
          "should": [
            { "term": { "categories": 2 } },
            { "term": { "categories": 5 } }
          ]
        }
      }
    },
    "bool": {
      "should": [
        { "match": { "name": "perferendis" } },
        { "match": { "brand": "in" } }
      ]
    }
  },
  "filter": {
    "and": {
      "filters": [
        { "bool": { "must_not": { "term": { "condition": 1 } } } },
        { "range": { "highest_sales_rank": { "gte": 96 } } },
        { "range": { "review_rating": { "gte": 1 } } },
        { "range": { "review_count": { "gte": 12 } } },
        { "range": { "upper_price": { "gte": 68 } } },
        { "bool": { "must_not": { "term": { "updated_at": 0 } } } }
      ]
    }
  },
  "sort": { "updated_at": "asc" },
  "size": 10,
  "from": 40
}
However, if I take out the filtered part, the query succeeds:
"filtered": {
  "filter": {
    "bool": {
      "should": [
        { "term": { "categories": 2 } },
        { "term": { "categories": 5 } }
      ]
    }
  }
},
I previously used this format:
"filter": {
  "bool": {
    "should": [
      { "match": { "categories": "16310211" } },
      { "match": { "categories": "493964" } }
    ]
  }
},
but it only works with Elasticsearch 2, and since AWS only supports 1.5.6 I am not able to use this format (related to my previous question, Narrowing search result to multiple categories).
The Query DSL changed between versions 1.x and 2.x, so you need to change your query. I made an example:
{
  "query": {
    "filtered": {
      "filter": {
        "bool": {
          "should": [
            { "term": { "categories": 2 } },
            { "term": { "categories": 5 } },
            {
              "bool": {
                "should": [
                  { "match": { "name": "perferendis" } },
                  { "match": { "brand": "in" } }
                ]
              }
            }
          ],
          "must": [
            { "range": { "highest_sales_rank": { "gte": 96 } } },
            { "range": { "review_rating": { "gte": 1 } } },
            { "range": { "review_count": { "gte": 12 } } },
            { "range": { "upper_price": { "gte": 68 } } }
          ],
          "must_not": [
            { "term": { "condition": 1 } },
            { "term": { "updated_at": 0 } }
          ]
        }
      }
    }
  },
  "sort": { "updated_at": "asc" },
  "size": 10,
  "from": 40
}
I also removed your AND filter; AND filters are not cached efficiently.
Feel free to ask questions.

Elasticsearch/Lucene Regex fquery/query_string not returning all documents

I currently have this mapping in Elasticsearch that I am indexing with a not_analyzed field:
PUT /twitter/_mapping/tweet
{
  "tweet": {
    "properties": {
      "user": {
        "type": "string",
        "index": "not_analyzed"
      }
    }
  }
}
PUT /twitter/tweet/1
{ "user": "CNN" }
PUT /twitter/tweet/2
{ "user": "cnn" }
PUT /twitter/tweet/3
{ "user": "Cnn" }
PUT /twitter/tweet/4
{ "user": "cNN" }
PUT /twitter/tweet/5
{ "user": "CnN" }
I want to search on this index with a case-insensitive filter like so (generated through NEST, so not too flexible in changing this query syntax):
POST /twitter/tweet/_search
{
  "from": 0,
  "size": 10,
  "query": {
    "filtered": {
      "filter": {
        "bool": {
          "must": [
            {
              "fquery": {
                "query": {
                  "query_string": {
                    "query": "user:/[cC][nN][nN]/"
                  }
                }
              }
            }
          ]
        }
      }
    }
  }
}
This query only returns one document, though: "user": "cnn" (lowercase), not all of the documents.
Why is this? The same query with "query": "user:CNN" returns the correct document with the correct casing (uppercase).
EDIT: Also, if I remove the document with cnn (lowercase), the query returns nothing.
EDIT 2: In the case that this is a problem with my NEST code, here's the code used to generate the query:
// property path would be something like "user"; queryTerm would be something like "cnn"
filterDescriptor.Query(q =>
    q.QueryString(d =>
        d.Query(string.Format("{0}:{1}", propertyPath,
            GetCaseInsentitiveRegexExpression(queryTerm))))); // returns something like /[cC][nN][nN]/
You need to set lowercase_expanded_terms: false. By default lowercase_expanded_terms is set to true, which lower-cases wildcard and regexp queries.
Example:
POST /twitter/tweet/_search
{
  "from": 0,
  "size": 10,
  "query": {
    "filtered": {
      "filter": {
        "bool": {
          "must": [
            {
              "fquery": {
                "query": {
                  "query_string": {
                    "query": "user:/[Cc][nN][nN]/",
                    "lowercase_expanded_terms": false
                  }
                }
              }
            }
          ]
        }
      }
    }
  }
}
Or in NEST code it would be something along these lines:
q.QueryString(d =>
    d.Query(string.Format("{0}:{1}", propertyPath,
        GetCaseInsentitiveRegexExpression(queryTerm)))
     .LowercaseExpandedTerms(false))
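As an aside, assuming a modern cluster: since Elasticsearch 7.10 the regexp query accepts a case_insensitive parameter, so the hand-built character classes are no longer needed (note 7.x also drops the type in the URL):

POST /twitter/_search
{
  "query": {
    "regexp": {
      "user": {
        "value": "cnn",
        "case_insensitive": true
      }
    }
  }
}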