Using a regex in the custom field of Filebeat - regex

Here I can read that when configuring a prospector I can add a custom field to the data, which I can later use for filtering.
So for example I can write
- type: log
  paths:
    - /my/path/app1.csv
  fields:
    app_name: app1
- type: log
  paths:
    - /my/path/app2.csv
  fields:
    app_name: app2
This means that any time I have a new CSV file to track, I have to add it to the filebeat.yml file and set the custom app_name field accordingly.
I was wondering if I could use a regex with a capture group in the prospector definition to "automatically" track any new file and assign the right app_name value. Something like this:
- type: log
  paths:
    - /my/path/(.*).csv
  fields:
    app_name: \1
What do you think? I didn't find any documentation regarding this possibility with the fields feature.

As advised here, I can use the Filebeat source field to filter the data. This field holds the path of the harvested file, so no other field is required for filtering.
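If a dedicated app_name field is still wanted rather than filtering on source, one alternative is a sketch using Filebeat's dissect processor, which can tokenize the harvested file's path into a new field. This assumes the path layout above; the field name log.file.path is the default on recent Filebeat versions, while older versions expose the path as source:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /my/path/*.csv

processors:
  # Extract the file name (without extension) from the harvested
  # file's path into a custom fields.app_name field.
  - dissect:
      tokenizer: "/my/path/%{app_name}.csv"
      field: "log.file.path"   # use "source" on older Filebeat versions
      target_prefix: "fields"
```

With this sketch, /my/path/app1.csv would produce fields.app_name: app1 without listing every file explicitly in filebeat.yml.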

Related

Add a new field in ISTIO envoy access log based on regex

I have an IstioOperator deployment with logs enabled in JSON format:
spec:
  meshConfig:
    accessLogFile: /dev/stdout
    accessLogEncoding: JSON
No specific accessLogFormat is defined, so the default one applies:
[%START_TIME%] \"%REQ(:METHOD)% %REQ(X-ENVOY-ORIGINAL-PATH?:PATH)% %PROTOCOL%\" %RESPONSE_CODE% %RESPONSE_FLAGS% %RESPONSE_CODE_DETAILS% %CONNECTION_TERMINATION_DETAILS%
\"%UPSTREAM_TRANSPORT_FAILURE_REASON%\" %BYTES_RECEIVED% %BYTES_SENT% %DURATION% %RESP(X-ENVOY-UPSTREAM-SERVICE-TIME)% \"%REQ(X-FORWARDED-FOR)%\" \"%REQ(USER-AGENT)%\" \"%REQ(X-REQUEST-ID)%\"
\"%REQ(:AUTHORITY)%\" \"%UPSTREAM_HOST%\" %UPSTREAM_CLUSTER% %UPSTREAM_LOCAL_ADDRESS% %DOWNSTREAM_LOCAL_ADDRESS% %DOWNSTREAM_REMOTE_ADDRESS% %REQUESTED_SERVER_NAME% %ROUTE_NAME%\n
However, what I want is to add another field at the end of the log, named PATH_MAIN, which is derived from the original path attribute but with some values altered based on a regex (the regex patterns are already figured out), such as redacting GUIDs.
My question is: how can I, if possible, define a new field in the log format that takes another field as input and derives its value from a regex?
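One possible approach, sketched here and not tested against this setup: inject a Lua filter via an EnvoyFilter that computes the redacted path into dynamic metadata, then reference it from a custom accessLogFormat with the %DYNAMIC_METADATA(...)% operator. The Lua pattern below is only a placeholder for the actual regexes, and the resource name is illustrative:

```yaml
apiVersion: networking.istio.io/v1alpha3
kind: EnvoyFilter
metadata:
  name: path-main-metadata
spec:
  configPatches:
    - applyTo: HTTP_FILTER
      match:
        context: ANY
        listener:
          filterChain:
            filter:
              name: envoy.filters.network.http_connection_manager
      patch:
        operation: INSERT_BEFORE
        value:
          name: envoy.filters.http.lua
          typed_config:
            "@type": type.googleapis.com/envoy.extensions.filters.http.lua.v3.Lua
            inline_code: |
              function envoy_on_request(request_handle)
                local path = request_handle:headers():get(":path") or ""
                -- placeholder redaction: collapse GUID-like hex runs
                local redacted = string.gsub(path, "%x+%-%x+%-%x+%-%x+%-%x+", "{guid}")
                request_handle:streamInfo():dynamicMetadata():set(
                  "envoy.lua", "path_main", redacted)
              end
```

A custom accessLogFormat could then append something like "PATH_MAIN": "%DYNAMIC_METADATA(envoy.lua:path_main)%" to the JSON format.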

GCP Ops agent add static field

I'm trying to add a static-value field to the Ops Agent, without success. This is the processor I'm using:
modify_fields:
  type: modify_fields
  fields:
    env:
      static_value: somenv
Also tried:
modify_fields:
  type: modify_fields
  fields:
    env:
      default_value: somenv
I just need all the documents sent by that machine to have an "env" field with the value "someenv".
The error I'm getting is "env field not found".
Thank you
The issue was that this feature was not available in the version I was using (2.16), so I upgraded to 2.18 and now it works.
You also need to follow the LogEntry structure:
modify_fields:
  type: modify_fields
  fields:
    jsonPayload.env:
      static_value: somenv
The record_log_file_path property, which wasn't working before, is also working now.
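For context, here is a minimal sketch of where such a processor sits in the Ops Agent config; the receiver name, processor name, and log paths are illustrative, not from the original question:

```yaml
# /etc/google-cloud-ops-agent/config.yaml
logging:
  receivers:
    myapp:
      type: files
      include_paths:
        - /var/log/myapp/*.log
      record_log_file_path: true
  processors:
    add_env:
      type: modify_fields
      fields:
        jsonPayload.env:
          static_value: somenv
  service:
    pipelines:
      default_pipeline:
        receivers: [myapp]
        processors: [add_env]
```

The processor only takes effect once it is listed in a pipeline's processors list.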

How to customize field representation in django drf AutoSchema?

I am new to Django REST framework (DRF) schema generation. I installed and configured the schema according to the documentation, and it generates the OpenAPI schema. Now I want to customize the choice field representation and add a description for each choice.
The current schema for the choice field looks like this:
condition:
  enum:
    - used
    - new
    - like new
  type: string
  nullable: true
I want the following:
condition:
  enum:
    - used: "for used items"
    - new: "for new items"
    - like new: "brand new items"
  type: string
  nullable: true
The documentation says little about customizing the AutoSchema class, and what it does say was beyond my level. I would be grateful if someone could show me how to override the schema methods to customize the field representation.
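OpenAPI 3.0 enums cannot carry per-value descriptions, so a common workaround is to fold the labels into the field's description instead. The helper below is a self-contained sketch of that idea; the choice labels are illustrative, and in real DRF code this would live in a custom AutoSchema subclass, for example by overriding its map_field method:

```python
def describe_choice_field(choices, nullable=True):
    """Build an OpenAPI schema fragment for a choice field, folding
    per-choice labels into the description text, since OpenAPI 3.0
    enums cannot describe individual values."""
    return {
        "enum": list(choices),
        "type": "string",
        "nullable": nullable,
        "description": "\n".join(
            f"* `{value}` - {label}" for value, label in choices.items()
        ),
    }

# Choices from the question, with illustrative labels.
CONDITION_CHOICES = {
    "used": "for used items",
    "new": "for new items",
    "like new": "brand new items",
}

schema = describe_choice_field(CONDITION_CHOICES)
```

Most OpenAPI renderers (Swagger UI, ReDoc) display the description as Markdown, so the bullet list reads as a per-choice legend next to the enum.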

Tiddlywiki: Configure '$:/config/FileSystemPaths' to save to a folder based on a field

I have a TiddlyWiki where the tiddlers have a field "saveto". I want to add a line to the file '$:/config/FileSystemPaths' to prefix the name of the tiddler with the value of the "saveto" field. For example, the tiddler
created: 20200114160408003
modified: 20200114160440095
saveto: test
tags:
title: New Tiddler
type: text/vnd.tiddlywiki
should be saved at test/New Tiddler.tid
Is it possible? I don't know much about filters, these are some things I have tried:
[has[saveto]addprefix[get[saveto]]]
[has[saveto]addprefix:get[saveto]]
[has[saveto]addprefix{!!saveto}]
Thanks for any help!
Here is something that works:
[has[saveto]get[saveto]] [get[title]] +[join[/]]
This solution was created by Jeremy Ruston as an answer to my post on the TiddlyWiki Google Groups forum.

Ansible Lineinfile - escaping single quote when using a back reference

I am trying to use the following Ansible play to change the default weak password in a number of SQL scripts:
- name: amend SQL User Passwords
  sudo: yes
  lineinfile: dest=/path/to/script/{{ item.file }} state=present backup=yes
              regexp="CREATE USER (.*) PASSWORD 'password';$" line='CREATE USER \1 PASSWORD ''strongpassword'';' backrefs=yes
  with_items:
    - { file: create_db1_users_tables.sql }
    - { file: create_db2_users_tables.sql }
    - ...
    - { file: create_dbNN_users_tables.sql }
The YAML specification suggests that this should work:
Inside single quotes, you can represent a single quote in your string by using two single quotes next to each other.
However, the doubled single quotes are not acting as an escape character; instead, they are removed entirely from the output line:
CREATE USER user1 PASSWORD stR0ngP#55w0rD
instead of
CREATE USER user1 PASSWORD 'stR0ngP#55w0rD'
So far I have tried:
Using double quotes - this allows the single quotes around the password, but the backref is no longer parsed and just prints out as \1. Attempting to escape it as \\1 appears to have no effect.
Various ways of escaping the single quote - backslashes, ticks. Most produce syntax errors or output nothing.
Surrounding all the lineinfile arguments in double quotes (and suitably escaping everything within)
I can't use the Ansible database modules
I can't use templates: the scripts create and populate a number of tables and can change between versions of the product, so keeping templates up to date would be too much overhead.
I've found a similar Ansible question, Quotes in ansible lineinfile, but the solutions did not help.
Does anyone have any further suggestions?
Not that I enjoy answering my own posts, but having trawled more of the internet I came across this post, where the suggestion was made to use the replace module instead of lineinfile.** The play:
- name: amend SQL User Passwords
  sudo: yes
  replace: dest=/path/to/script/{{ item.file }}
           backup=yes
           regexp="^CREATE USER (.*) PASSWORD 'password';$"
           replace="CREATE USER \\1 PASSWORD 'strongpassword';"
  with_items:
    - { file: create_db1_users_tables.sql }
    - { file: create_db2_users_tables.sql }
    - ...
    - { file: create_dbNN_users_tables.sql }
Works as expected, generating the desired SQL.
** This also reinforces the fact that I should spend more time reading the Ansible manuals.
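For reference, the same play can be written as an equivalent sketch in modern fully-YAML module syntax, where become replaces the long-deprecated sudo and loop replaces with_items. With YAML block style, the single quotes inside the replacement need no doubling at all:

```yaml
- name: Amend SQL user passwords
  become: yes
  ansible.builtin.replace:
    path: /path/to/script/{{ item }}
    backup: yes
    # \1 carries the captured user name through to the replacement
    regexp: "^CREATE USER (.*) PASSWORD 'password';$"
    replace: "CREATE USER \\1 PASSWORD 'strongpassword';"
  loop:
    - create_db1_users_tables.sql
    - create_db2_users_tables.sql
    - create_dbNN_users_tables.sql
```

Because each argument is its own YAML scalar, the quoting of the regexp and replacement no longer interacts with the key=value parser that caused the original problem.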