Resource regex causes panic - regex

This code is an attempt to replace a service that already works in production, written in Java, with one written in Rust.
This service will act as a sidecar proxy for a Redis cluster, exposing a REST API. It needs to maintain compatibility with the current API.
The route is:
"/api/keys/{path:*}"
In path we can put the name of the Redis key, which can have any of the formats below:
/api/keys/users/41728391
/api/keys/users/1000/followers
/api/keys/users/{1234}/data
This is my attempt:
HttpServer::new(move || {
    App::new()
        .data(redis_config)
        .service(
            web::resource("/set/{path:*}").route(web::put().to(set_key))
        )
})
.bind(("127.0.0.1", 8080))?
.run()
.await
I tried it like this too:
#[get("/set/{path:*}")]...
But in both cases I get this error:
.service(web::resource("/set/{path:*}").route(web::put().to(path_regex)))
| ^^^^^^^^^^ the trait `Factory<_, _, _>` is not implemented for `path_regex`
thread 'actix-rt:worker:0' panicked at 'Wrong path pattern: "/set/{path:*}" regex parse error:
^/set/(?P<path>*)$
^
error: repetition operator missing expression'
(thread 'actix-rt:worker:1' panics with the same message)
I have read https://actix.rs/actix-web/actix_web/web/fn.resource.html
My code is: https://github.com/rogeriob2br/enge-sidecar-redis
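The panic message points at the fix: in a {name:regex} segment, everything after the colon must be a valid regex, and a bare * is a repetition operator with nothing to repeat, so the tail match has to be written as {path:.*}. The separate Factory error means the handler's signature did not match what .to() expects; extractor arguments fix that. A minimal, self-contained sketch, with the redis_config wiring left out and a hypothetical set_key handler:

    use actix_web::{web, App, HttpServer, Responder};

    // Hypothetical handler: `path` receives everything matched by `.*`,
    // e.g. "users/1000/followers"; `String` extracts the request body as text.
    async fn set_key(path: web::Path<String>, body: String) -> impl Responder {
        format!("SET {} = {}", path.into_inner(), body)
    }

    #[actix_web::main]
    async fn main() -> std::io::Result<()> {
        HttpServer::new(|| {
            App::new().service(
                // `.*` instead of `*`: the part after the colon must be a valid regex
                web::resource("/api/keys/{path:.*}").route(web::put().to(set_key))
            )
        })
        .bind(("127.0.0.1", 8080))?
        .run()
        .await
    }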

Related

How to disable JSON format and send only the log message to Sumologic with Fluentbit?

We are using Fluent Bit as a sidecar container in our ECS Fargate cluster, which is running a .NET application. Initially we faced the issue of Fluent Bit splitting the logs across multiple lines, and we solved it using the Fluent Bit multiline feature. The logs are now sent to Sumologic as single records, however they are being sent in JSON format, whereas we just want Fluent Bit to send only the raw log.
The logs currently look like:
{
    date: 1675120653.269619,
    container_id: "xvgbertytyuuyuyu",
    container_name: "XXXXXXXXXX",
    source: "stdout",
    log: "2023-01-30 23:17:33.269Z DEBUG [.NET ThreadPool Worker] Connection.ManagedDbConnection - ComponentInstanceEntityAsync - Executing stored proc: dbo.prcGetComponentInstance"
}
We want only the line:
2023-01-30 23:17:33.269Z DEBUG [.NET ThreadPool Worker] Connection.ManagedDbConnection - ComponentInstanceEntityAsync - Executing stored proc: dbo.prcGetComponentInstance
You need to modify the Fluent Bit configuration to have the following filters and output configuration:
fluent.conf:
## prepare headers for Sumo Logic
[FILTER]
    Name   record_modifier
    Match  *
    Record headers.content-type text/plain

## Set headers as headers attribute
[FILTER]
    Name          nest
    Match         *
    Operation     nest
    Wildcard      headers.*
    Nest_under    headers
    Remove_prefix headers.

[OUTPUT]
    Name http
    ...
    # use log key as body
    body_key    $log
    # use headers key as headers
    headers_key $headers
That way you are crafting the HTTP request manually. This sends one request per log line, which is not necessarily a good idea. To mitigate that, you can add the following parser and use it (flush_timeout may need adjustment):
parsers.conf
# merge everything as one big log
[MULTILINE_PARSER]
    name          multiline-all
    type          regex
    flush_timeout 500
    #
    # Regex rules for multiline parsing
    # ---------------------------------
    #
    # configuration hints:
    #
    #  - first state always has the name: start_state
    #  - every field in the rule must be inside double quotes
    #
    # rules | state name    | regex pattern | next state
    # ------|---------------|---------------|-----------
    rule      "start_state"   ".*"            "cont"
    rule      "cont"          ".*"            "cont"
fluent.conf:
[INPUT]
    name tail
    ...
    multiline.parser multiline-all

Gradle copy task expand yaml file escape whole string

I have the following Gradle task:
processResources {
    inputs.properties(project.properties.findAll { it.value instanceof String })
    filesMatching("**/*.yaml") {
        filteringCharset = 'UTF-8'
        expand project.properties
    }
}
that I use to process a Spring Boot application.yaml file that contains variable placeholders.
How can I escape a whole log pattern without escaping every single special character, thus keeping the pattern clean?
application.yaml:
logging:
  pattern:
    console: %clr(%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}
Tried using slashy strings, with no success.
1. Tried:
/%clr(%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}/
Error:
Caused by: groovy.lang.GroovyRuntimeException: Failed to parse template script (your template may contain an error or be trying to use expressions not currently supported): startup failed:
SimpleTemplateScript8.groovy: 138: expecting '}', found 'HH' # line 138, column 60.
ATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.S
^
2. Tried:
$/%clr(%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}/$
Error:
Caused by: groovy.lang.GroovyRuntimeException: Failed to parse template script (your template may contain an error or be trying to use expressions not currently supported): startup failed:
SimpleTemplateScript10.groovy: 138: illegal string body character after dollar sign;
solution: either escape a literal dollar sign "\$5" or bracket the value expression "${5}" # line 138, column 15.
console: $/%clr(%d{${LOG_DATEFORMAT_PATTERN:-yyyy-MM-dd HH:mm:ss.SSS}}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([t]){faint} %clr(%-40.40logger{39}){cyan} %clr(:){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}/$
^
Given that your YAML file contains characters which are interpreted as tokens by the expand method (which relies on Groovy's SimpleTemplateEngine), you should use an alternative such as the filter methods.
The documentation shows a number of examples of filter. By using the underlying Ant ReplaceTokens you might have better luck, as it uses @token@ as the notation.
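A minimal sketch of that approach, with an illustrative appVersion token (the property name and value here are assumptions, not from the original build):

    processResources {
        filesMatching("**/*.yaml") {
            filteringCharset = 'UTF-8'
            // ReplaceTokens only rewrites @token@ markers, so the % and ${...}
            // sequences in the logging pattern pass through untouched
            filter(org.apache.tools.ant.filters.ReplaceTokens,
                   tokens: [appVersion: project.version.toString()])
        }
    }

In application.yaml you would then write @appVersion@ wherever the value should be injected, and the log pattern can stay as plain text.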

Ansible replace seems to throw a parsing error?

I'm trying to use the Ansible replace module to change some text in a file on a Windows Server 2019 Standard target. I'm using Ansible 2.8.3 running on Python 2.7
- name: REPLACE | Replace baseline.local with FQDN in InternetSettings.xml
  replace:
    path: '/path/to/settings.xml'
    regexp: 'baseline\.local'
    replace: '{{ FQDN }}'
I don't think the issue is the path, although one of the directories on the path to the file has brackets '{}' in its name. Could that be it?
I've tried to do the same thing with win_lineinfile, and it didn't throw an error with the same path, but it'd be difficult in this case to replicate the functionality of replace, which is really what I need.
EDIT 2: It works when I copy the file over to my local machine and delegate to 127.0.0.1. I'm running Ansible from a Windows Subsystem for Linux (WSL) installation. It also works when I copy the file to a remote Linux system and run replace there, so it seems to be a Windows problem.
EDIT: The stack trace of the error I'm getting:
"Exception calling "Create" with "1" argument(s): "At line:4 char:21
+ def _ansiballz_main():
+ ~
An expression was expected after '('.
At line:13 char:27
+ except (AttributeError, OSError):
+ ~
Missing argument in parameter list.
At line:15 char:7
+ if scriptdir is not None:
+ ~
Missing '(' after 'if' in if statement.
At line:22 char:7
+ if sys.version_info < (3,):
+ ~
Missing '(' after 'if' in if statement.
At line:22 char:30
+ if sys.version_info < (3,):
+ ~
Missing expression after ','.
At line:22 char:25
+ if sys.version_info < (3,):
+ ~
The '<' operator is reserved for future use.
At line:24 char:32
+
My teammates and I suspect that this issue is the result of attempting to use an Ansible module not built for Windows on a Windows target - more particularly, a target that doesn't have Python installed. We suspect that some part of the assembly of the Python module on the Ansible controller, or the subsequent execution of the Python payload on the Windows target, is what's really causing the problem here (the stack trace above is PowerShell trying to parse Python source). We've decided on the workaround of just using win_shell with PowerShell's -replace to do what we need to do; a sketch of that is below.
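A minimal sketch of that workaround, reusing the path and FQDN variable from the task above (illustrative, untested on the original system):

    - name: REPLACE | Replace baseline.local with FQDN via PowerShell
      win_shell: |
        (Get-Content -Raw '/path/to/settings.xml') -replace 'baseline\.local', '{{ FQDN }}' |
          Set-Content '/path/to/settings.xml'

PowerShell's -replace takes a regex on its left-hand side, so the escaped dot carries over unchanged.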
Try changing your regexp string to double quotes:
regexp: "baseline.local"

expect regex to search a complete line or text

My expect_out(buffer) is as below, in multiple lines:
status QM1
QM(QM1) Status(Running)
CPU: 0.02%
Memory: 106MB
Queue manager file system: 391MB used, 47.3GB allocated
[1%]
HA role: Primary
HA status: Normal
HA control: Enabled
HA preferred location: Here
mqa(mqcli)#
I tried multiple regex options, but I keep getting 0, i.e. no match:
regexp /^.*\b(Running)\b.*$/ $qmgrstat
Similarly, I want to get the value of "HA status".
What would be the correct regex syntax?
Not sure I understand. Do you only want to match the value of HA status? If yes, then this regex would do that:
HA status:\W*(.*)
If you want to capture the HA status when the "QM" status is Running, you can do
if {[string match {*Status(Running)*} $qmgrstat]} {
    regexp {HA status:\s+(\S+)} $qmgrstat -> status
}
You can use the following regex to get the whole output:
status[^\v]*\vQM[^\v]*\vCPU:[^\v]*\vMemory:[^\v]*\vQueue manager file system:[^\v]*\vHA role:[^\v]*\vHA status:[^\v]*\vHA control:[^\v]*\vHA preferred location:[^\v]*\vmqa[^\v]*#\v
If you want the complete line of text with the status, use: [^\v]*Status\([^\v]*
To get the line with the HA status you can use: HA status:[^\v]*
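For example, a minimal sketch pulling that line straight out of expect's buffer (ha_line is an illustrative variable name):

    # grab the whole "HA status: ..." line from the buffer
    regexp {HA status:[^\v]*} $expect_out(buffer) ha_line
    puts $ha_line   ;# prints: HA status: Normal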
Good luck!

grok regex parsing not matching a log when specifying a group as optional, but not the last group

Example:
info: 2014-10-28T22:39:46.593Z - info: an error occurred while trying
to handle command: PlaceMarketOrderCommand, xkkdAAGRIl. Error:
Insufficient Cash #userId=5 #orderId=Y5545
pattern:
%{LOGLEVEL:stream_level}: %{TIMESTAMP_ISO8601:timestamp} - %{LOGLEVEL:log_level}: %{MESSAGE:message} (#userId=%{USER_ID:user_id})? (#orderId=%{ORDER_ID:order_id})?
extra patterns used:
USER_ID (\d+|None)
ORDER_ID .*
ORDER_ID_HASH \s*(#orderId=%{ORDER_ID:order_id})?
USER_ID_HASH \s*(#userId=%{USER_ID:user_id})?
MESSAGE (.*?)
This works fine. Removing the optional orderId at the end also works:
info: 2014-10-28T22:39:46.593Z - info: an error occurred while trying
to handle command: PlaceMarketOrderCommand, xkkdAAGRIl. Error:
Insufficient Cash #userId=5
but if I keep the orderId and remove the userId then I get a "no match"
info: 2014-10-28T22:39:46.593Z - info: an error occurred while trying
to handle command: PlaceMarketOrderCommand, xkkdAAGRIl. Error:
Insufficient Cash #orderId=Y5545
Also, the user_id group ends with a ?, so it is an optional group.
I am working with the grok debugger on Heroku.
Is this a bug (logstash 1.4.2), or am I missing something with the regex (more probable, but what)?
I looked at the regex library grok is using, and it looks like this syntax is supposed to work. It does work for the last group (orderId), but not for the one before.
Thanks for the help!
You are forcing a space to appear before each optional group: with only #orderId present there is a single space in the input, but the pattern demands two literal spaces. You need to make the space itself optional with ?:
%{LOGLEVEL:stream_level}: %{TIMESTAMP_ISO8601:timestamp} - %{LOGLEVEL:log_level}: %{MESSAGE:message} ?(#userId=%{USER_ID:user_id})? ?(#orderId=%{ORDER_ID:order_id})?