Elasticsearch escape "¬" character in regex

I am stuck with the "¬" symbol when trying to run an Elasticsearch regex query
to return records in the format "prefix-content¬value".
Example (not limited to the website pattern, can be any value): "website-website descriptions that not required¬www.google.com".
{
  "query": {
    "filtered": {
      "query": {
        "match_all": {}
      },
      "filter": {
        "regexp": {
          "information": "(website?)(.*¬)(www.google.com?)"
        }
      }
    }
  }
}
Has anyone encountered such a problem before and managed to handle this? Thanks.
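For what it's worth, "¬" is not one of the reserved characters in Lucene's regexp syntax, so it should not need any escaping; the dots in www.google.com do, though, and the regexp has to match the entire indexed term, which usually means running it against a not_analyzed (keyword) copy of the field. A minimal sketch, assuming a raw sub-field named information.raw exists in the mapping:
{
  "query": {
    "filtered": {
      "query": {
        "match_all": {}
      },
      "filter": {
        "regexp": {
          "information.raw": "website-.*¬www\\.google\\.com"
        }
      }
    }
  }
}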

Related

Kibana Query Language - find numbers in a field

I am struggling with a simple query that is supposed to work based on many tutorials, but I cannot make it work. Having the log field
Request sent, method=GET, headers={}, queryParams={forceArray=[true]}, entity=null, payload length=null} playerId=102
I am trying to get playerId values with 3 digits. The following query fails
log: /playerId=[0-9]{1,3}/
with KQLSyntaxError: Expected AND, OR, end of input, whitespace but "{" found.
but it is supposed to work according to https://dzone.com/articles/getting-started-with-kibana-advanced-searches
This log: /playerId=[0-9][0-9][0-9]/ returns basically everything with a single '0' character.
This log: /playerId=*/ for some mysterious reason returns nothing.
Edit
A regular Elasticsearch Lucene-based query does not work either:
{
  "query": {
    "regexp": {
      "log": {
        "value": "*playerId*"
      }
    }
  }
}
mapping:
{
  "my-index" : {
    "mappings" : {
      "log" : {
        "full_name" : "log",
        "mapping" : {
          "log" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          }
        }
      }
    }
  }
}
any help appreciated
Edit
I validated my regex queries in https://regex101.com/
and they all work.
Edit 2
this works
"query": {
"match": {
"log": "playerId"
}
}
this return empty hits
"query": {
"regexp": {
"log": "playerId"
}
}
regards
This is because Kibana uses KQL (Kibana Query Language) by default and that doesn't support regular expressions.
You need to switch to the Lucene Query Language with the query string syntax which supports the regular expression you're trying.
Just click on KQL at the right end of the search bar to change the search syntax.
Also worth noting that regular expression queries are real performance hogs. You should really parse your logs before ingesting them so you can query the playerId field independently.
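For example, a small ingest pipeline with a grok processor could pull the id into its own field at index time (a sketch; the pipeline name and the target field playerId are made up here):
PUT _ingest/pipeline/extract-player-id
{
  "processors": [
    {
      "grok": {
        "field": "log",
        "patterns": ["playerId=%{NUMBER:playerId:int}"],
        "ignore_failure": true
      }
    }
  ]
}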
In any case, if you really want to do it that way, your query is not that far off from the real thing. Here is the correct version that will work for your case:
{
  "query": {
    "query_string": {
      "query": "/.*playerId=[0-9]{3}/",
      "default_field": "log.keyword"
    }
  }
}
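For reference, typed directly into the Kibana search bar with the syntax switched to Lucene, the equivalent should be something along these lines (an assumption based on the query above):
log.keyword:/.*playerId=[0-9]{3}/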

ElasticSearch wildcard not returning when value has special characters

I have an Elasticsearch service that fetches results as you type into a text input, to then populate a table. The search is working (returning filtered data) correctly for all alphanumeric values, but not special characters (hyphens in particular). For example, for the country Timor-Leste, if I pass in Timor as the term I get the result, but as soon as I add the hyphen (Timor-) I get an empty array response.
const queryService = {
  search(tableName, field, term) {
    // If there is no search term, run the wildcard search with 20 values
    // for the smaller lists to be pre-populated, like "Gender"
    return `
      {
        "size": ${term ? 200 : 20},
        "query": {
          "bool": {
            "must": [
              {
                "match": {
                  "tablename": "${tableName}"
                }
              },
              {
                "wildcard": {
                  "${field}": {
                    "value": "${term ? `*${term.trim()}*` : '*'}",
                    "boost": 1.0,
                    "rewrite": "constant_score"
                  }
                }
              }
            ]
          }
        }
      }
    `;
  },
};
Is there a way I can modify my wildcard request to allow hyphens? Another response I've seen on here suggested using "analyze_wildcard": true, which hasn't worked. I've also tried to manually escape by putting a \ before each hyphen with .replace.
It all boils down to Elasticsearch analyzers.
By default, all text fields will be run through the standard analyzer, e.g.:
GET _analyze/
{
  "text": ["Timor-Leste"],
  "analyzer": "standard"
}
This will lowercase your input, strip any special chars, and produce the tokens:
["timor", "leste"]
If you'd like to forgo this default process, add a .keyword mapping:
PUT your-index/
{
  "mappings": {
    "properties": {
      "country": {
        "type": "text",
        "fields": {          <---
          "keyword": {
            "type": "keyword"
          }
        }
      }
    }
  }
}
Then reindex your docs, and when dynamically constructing the wildcard query with the newly created .keyword field, make sure the hyphen (and all other special chars) is properly escaped:
POST your-index/_search
{
  "query": {
    "wildcard": {
      "country.keyword": {
        "value": "*Timor\\-*"    <---
      }
    }
  }
}
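As a side note, if the country field already existed and only the .keyword sub-field is new, an in-place reindex should be enough to populate it, e.g. via the update-by-query API (a sketch):
POST your-index/_update_by_query?conflicts=proceed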

ElasticSearch regexp query of a path

So far I've used a query that would match paths and get aggregations of those paths:
{
  "query": {
    "terms": {
      "path.keyword": [
        "/api/v1.0/cc-dashboard/aggregated",
        "/api/v1.1/cc-dashboard/aggregated",
        "/api/v1.2/cc-dashboard/aggregated",
        "/api/v1.3/cc-dashboard/aggregated"
      ]
    }
  },
  "size": 0,
  "aggs": { ...
Since the only difference between the paths is the version number (which keeps changing) I thought about using Regexp query.
In a normal regex I would search for \/api\/v1\.\d\/cc-dashboard\/aggregated.
I know ElasticSearch uses different reserved characters for this and I've tried everything I know, but the search comes back without hits.
Any Thoughts?
I think there are a couple of things to watch out for here. First, make sure that path.keyword is actually of type "keyword", or else you will have problems matching, because you are actually matching against tokens and Elasticsearch will split on /. Second, Elasticsearch doesn't support \d as an escape for a digit, but it does allow [0-9]. Third, to escape the . I had to use two backslashes: \\.
So all together now:
PUT /stackoverflow
{
  "mappings": {
    "properties": {
      "path.keyword": {
        "type": "keyword"
      }
    }
  }
}
POST /stackoverflow/_doc/1
{
  "path.keyword": "/api/v1.0/cc-dashboard/aggregated"
}
POST /stackoverflow/_doc/2
{
  "path.keyword": "/api/v1.1/cc-dashboard/aggregated"
}
POST /stackoverflow/_doc/3
{
  "path.keyword": "/api/not/cc-dashboard/aggregated"
}
GET /stackoverflow/_search
GET /stackoverflow/_search
{
  "query": {
    "regexp": {
      "path.keyword": {
        "value": "/api/v1\\.[0-9]/cc-dashboard/aggregated"
      }
    }
  }
}
DELETE /stackoverflow

AWS ElasticSearch Query for Keyword not getting results I expect

I have an ElasticSearch query that looks like:
{
  "query": {
    "constant_score": {
      "filter": {
        "bool": {
          "should": [
            {
              "wildcard": {
                "Message.keyword": "*System.Net.WebClient).DownloadString(*"
              }
            },
            {
              "wildcard": {
                "Message.keyword": "*system.net.webclient).downloadfile(*"
              }
            }
          ]
        }
      }
    }
  }
}
And a Doc in my Index that includes:
message:Engine state is changed from None to Available. Details: NewEngineState=Available PreviousEngineState=None SequenceNumber=13 HostName=ConsoleHost HostVersion=5.1.18362.628 HostId=3dd1a50a-cc15-45e0-bf63-4456d556fb67 HostApplication=powershell.exe -command PowerShell -ExecutionPolicy bypass -noprofile -windowstyle hidden -command (New-Object System.Net.WebClient).DownloadFile('https://drive.google.com/uc?export=download EngineVersion=5.1.18362.628 RunspaceId=de762b62-056c-4be1-90bf-a12cfe6fbc72
As you can see above it includes:
(New-Object System.Net.WebClient).DownloadFile('https:....
It seems like the filter here should be matching the message, but when I execute the Query through Kibana, nothing matches even though I can see the doc above inside my index through Kibana UI if I just query for *.
I think maybe this is because the query above is querying for Message.keyword? How do I get it to successfully hit the document above?
Edit:
mapping: https://pastebin.com/cWN4jF3d
Sample data: https://pastebin.com/SyErqaG8
There are two reasons for the query not returning the result:
The field name in the mapping is message, whereas in the query you are using Message.
A field with the keyword datatype indexes the data as-is. This means it is case sensitive as well. The document you shared has the text System.Net.WebClient).DownloadFile(, which contains upper-case characters, whereas the search query you expect to match, "*system.net.webclient).downloadfile(*", is all lower case.
Therefore the query should be:
{
  "query": {
    "constant_score": {
      "filter": {
        "bool": {
          "should": [
            {
              "wildcard": {
                "message.keyword": "*System.Net.WebClient).DownloadString(*"
              }
            },
            {
              "wildcard": {
                "message.keyword": "*System.Net.WebClient).DownloadFile(*"
              }
            }
          ]
        }
      }
    }
  }
}
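As an aside, if you are on Elasticsearch 7.10 or later, the wildcard query also accepts a case_insensitive parameter, which sidesteps the casing issue entirely (a sketch; check your cluster version first):
{
  "query": {
    "constant_score": {
      "filter": {
        "wildcard": {
          "message.keyword": {
            "value": "*system.net.webclient).downloadfile(*",
            "case_insensitive": true
          }
        }
      }
    }
  }
}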
The keyword fields are used only for exact match. You will need to match the regular fields if you only want to match a substring / subset of the string, by querying on Message instead of Message.keyword:
{
  "query": {
    "constant_score": {
      "filter": {
        "bool": {
          "should": [
            {
              "wildcard": {
                "Message": "*System.Net.WebClient).DownloadString(*"
              }
            },
            {
              "wildcard": {
                "Message": "*system.net.webclient).downloadfile(*"
              }
            }
          ]
        }
      }
    }
  }
}

Using regular expressions in elasticsearch term queries

I want to find all items filtered by ID matching some regular expression like
*TEST123* //pattern for regexp
So the expected results are items
ATEST123001
ATEST123002
ATEST123003
TTTTEST123001
...
I could create some script which scans the full storage and saves the IDs in a log file which I can check later, but I want to find a better solution.
Updated
I tried
"query" : { "match_all" : { }, "filtered" : { "filter" : { "regexp": { "id":".test123." } } } }, }
I receive
//nested: ElasticsearchParseException[Expected field name but got START_OBJECT \"filtered\"]
When I tried
{
  "regexp": {
    "id": "test123"
  }
}
//Parse Failure [No parser for element [regexp]]]
ES 1.7.4 and Lucene 4.10.4
You can use regular expression queries. The regexp query allows you to use regular expression term queries.
Ref:
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-regexp-query.html
Sample regex query:
{
  "regexp": {
    "id": ".*test123.*"
  }
}
Update:
In 2.0, the regexp filter has been replaced by the regexp query.
{
  "query": {
    "filtered": {
      "filter": {
        "regexp": {
          "id": ".*TEST123.*"
        }
      }
    }
  }
}
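On later releases where the filtered query has been removed (5.0 and above), the regexp query can sit directly under query, assuming id is indexed as a keyword / not_analyzed field (a sketch):
{
  "query": {
    "regexp": {
      "id": ".*TEST123.*"
    }
  }
}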
You can try Query String.
{
  "query": {
    "query_string": {
      "default_field": "id",
      "query": "*test123*"
    }
  }
}