Regex reached the limit? - regex

I have some code to route my URLs (every user has their own address, like domain.com/username).
The problem is that I get errors because my regex is too large(?).
How can I solve this?
$user_nick = user::getAllUserNick(); // this returns a string like: username1|username2 etc.
$user_url_firstpage = new Zend_Controller_Router_Route_Regex(
    '('.$user_nick.'+)',
    array('controller' => 'ads', 'action' => 'category', 'page' => 1),
    array(1 => 'nick', 2 => 'page'),
    '%s/'
);
error log:
[28-Sep-2020 07:18:10 UTC] PHP Warning: preg_match(): Compilation failed: regular expression is too large at offset 83837 in D:\netbeans-projects\mysite\library\Zend\Controller\Router\Route\Regex.php on line 83
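One way around this limit is to stop embedding every nick in the route and instead match any nick-shaped segment with a small generic pattern, then validate the nick against the user table afterwards. A minimal sketch, assuming the same route defaults as above and a hypothetical user::nickExists() lookup:
$user_url_firstpage = new Zend_Controller_Router_Route_Regex(
    '([a-zA-Z0-9_-]+)',                 // small generic pattern instead of a huge alternation
    array('controller' => 'ads', 'action' => 'category', 'page' => 1),
    array(1 => 'nick', 2 => 'page'),
    '%s/'
);

// later, inside the ads controller's category action:
$nick = $this->getRequest()->getParam('nick');
if (!user::nickExists($nick)) {         // hypothetical single-row lookup in the user table
    throw new Zend_Controller_Action_Exception('Unknown user', 404);
}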

Related

WP_Query meta_query REGEXP

I am pretty much below beginner level with REGEX/REGEXP and have hit a blocking point in a project I am working on. I am trying to get the IDs of posts that match the search criteria, but I want to restrict the search to between two sub-strings. What I am trying to figure out is how to write the REGEXP in the meta_query:
$args = array(
    'post_type'      => 'custom',
    'order'          => 'DESC',
    'posts_per_page' => 10,
    'paged'          => $page,
    'meta_query'     => array(
        array(
            'key'     => 'key',
            'value'   => "title*".$search."*_title",
            'compare' => 'REGEXP',
        )
    ),
);
And an example of the field in the DB :
a:302:{s:5:"title";s:10:"Test title";s:6:"_title";s:19:"
Unfortunately, none of the combinations I tried based on the SQL REGEXP documentation return any values, and I am trying to understand how I can pull this off; I would appreciate any input.
Also, I would rather stick to WP_Query for now, even though an SQL LIKE "%title%{$search}%_title%" works perfectly. So an alternative solution would be how to set the compare to 'LIKE' and pass it '%', since that is not possible out of the box, as the % gets escaped I believe.
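Since the stored value is a serialized array, one option is to anchor the pattern on the serialized "title" key and allow anything except a closing quote before the search term. A sketch, assuming MySQL's POSIX-style REGEXP syntax and a $search term that contains no regex metacharacters:
$args = array(
    'post_type'      => 'custom',
    'order'          => 'DESC',
    'posts_per_page' => 10,
    'paged'          => $page,
    'meta_query'     => array(
        array(
            'key'     => 'key',
            // matches: "title";s:<length>:"<anything but a quote><search term>
            'value'   => '"title";s:[0-9]+:"[^"]*' . $search,
            'compare' => 'REGEXP',
        )
    ),
);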

How to allow a route regular expression for date and string in Laravel

I want to allow my URL in the below format:
->where(['id'=>'[0-9]+','from'=>'[A-Za-z]+','to'=>'[A-Z]+']);
and sometimes I want to access it like this:
1: /get_acc_customer_info/1/2016-8-1/2016-09-29
2: /get_acc_customer_info/1/from/to
You can use a pipe | to build an alternation:
->where([
'id' => '[0-9]+',
'from' => '[A-Za-z]+|\d{4}(?:-\d{1,2}){2}',
'to' => '[A-Z]+|\d{4}(?:-\d{1,2}){2}'
]);
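For context, a full route definition this constraint could attach to might look like the following sketch (the controller name is an assumption; the where() array is the one above):
Route::get('/get_acc_customer_info/{id}/{from}/{to}', 'AccountController@getAccCustomerInfo')
    ->where([
        'id'   => '[0-9]+',
        // each parameter accepts either letters or a date such as 2016-8-1
        'from' => '[A-Za-z]+|\d{4}(?:-\d{1,2}){2}',
        'to'   => '[A-Z]+|\d{4}(?:-\d{1,2}){2}'
    ]);
Note that the literal segment to in the second example URL is lowercase, so it would only match if the to constraint also allowed lowercase letters ([A-Za-z]+).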

Getting uninitialized constant Twilio::REST::RequestError

I keep getting the same error since I upgraded to:
gem 'twilio-ruby', '~> 5.0.0.rc4'
The call is successfully sent to Twilio, but then I get an error:
app/controllers/home_controller.rb:59:in `rescue in call'
require "rubygems"
require "twilio-ruby"
def call
  @twilio = Twilio::REST::Client.new account_sid, auth_token
  begin
    @call = @twilio.account.calls.create({
      :to => ,
      :from => twilio_number,
      :url => url,
      :method => "GET",
      :if_machine => "Hangup",
      :timeout => "20"
    })
    # Getting current call status (seems like error here...!)
    get_status(@call.sid)
  rescue Twilio::REST::RequestError => error
    @err_msg = error.message
    puts @err_msg
    # returned error is like below:
    # NameError (uninitialized constant Twilio::REST::RequestError)
  end
end
Code for getting current call status:
def get_status(sid)
  @twilio = Twilio::REST::Client.new account_sid, auth_token
  @call = @twilio.account.calls.get(sid)
  puts "Process Status : " + @call.status
  return @call.status
end
Please help to figure it out.
Thank you!
For version 5, try Twilio::REST::RestError.
This is documented here:
There are new classes to rescue errors from. The new library now uses the Twilio::REST::RestError class.

How to process multiline log entry with logstash filter?

Background:
I have a custom-generated log file that has the following pattern:
[2014-03-02 17:34:20] - 127.0.0.1|ERROR| E:\xampp\htdocs\test.php|123|subject|The error message goes here ; array (
'create' =>
array (
'key1' => 'value1',
'key2' => 'value2',
'key3' => 'value3'
),
)
[2014-03-02 17:34:20] - 127.0.0.1|DEBUG| flush_multi_line
The second entry, [2014-03-02 17:34:20] - 127.0.0.1|DEBUG| flush_multi_line, is a dummy line just to let Logstash know that the multiline event is over; this line is dropped later on.
My config file is the following:
input {
stdin{}
}
filter{
multiline{
pattern => "^\["
what => "previous"
negate=> true
}
grok{
match => ['message',"\[.+\] - %{IP:ip}\|%{LOGLEVEL:loglevel}"]
}
if [loglevel] == "DEBUG"{ # the event flush line
drop{}
}else if [loglevel] == "ERROR" { # the first line of multievent
grok{
match => ['message',".+\|.+\| %{PATH:file}\|%{NUMBER:line}\|%{WORD:tag}\|%{GREEDYDATA:content}"]
}
}else{ # it's a new line (from the multiline event)
mutate{
replace => ["content", "%{content} %{message}"] # Supposing each new line will override the message field
}
}
}
output {
stdout{ debug=>true }
}
The output for the content field is: The error message goes here ; array (
Problem:
My problem is that I want to store the rest of the multiline event in the content field:
The error message goes here ; array (
'create' =>
array (
'key1' => 'value1',
'key2' => 'value2',
'key3' => 'value3'
),
)
so that I can remove the message field later.
The message field contains the whole multiline event, so I tried the mutate filter with the replace function on it, but I just can't get it working :(.
I don't understand how the multiline filter works; if someone could shed some light on this, it would be really appreciated.
Thanks,
Abdou.
I went through the source code and found out that:
The multiline filter cancels all events that are considered a follow-up of a pending event, and appends each such line to the original message field, which means any filters placed after the multiline filter won't apply in this case.
The only event that will ever pass the filter is one that is considered new (something that starts with [ in my case).
Here is the working code :
input {
stdin{}
}
filter{
if "|ERROR|" in [message]{ #if this is the 1st message in many lines message
grok{
match => ['message',"\[.+\] - %{IP:ip}\|%{LOGLEVEL:loglevel}\| %{PATH:file}\|%{NUMBER:line}\|%{WORD:tag}\|%{GREEDYDATA:content}"]
}
mutate {
replace => [ "message", "%{content}" ] #replace the message field with the content field ( so it auto append later in it )
remove_field => ["content"] # we no longer need this field
}
}
multiline{ # nothing will pass this filter unless it is a new event (a new [2014-03-02 1.... line)
pattern => "^\["
what => "previous"
negate=> true
}
if "|DEBUG| flush_multi_line" in [message]{
drop{} # We don't need the dummy line so drop it
}
}
output {
stdout{ debug=>true }
}
Cheers,
Abdou
Grok and multiline handling is discussed in this issue: https://logstash.jira.com/browse/LOGSTASH-509
Simply add "(?m)" in front of your grok regex and you won't need the mutate filter. Example from the issue:
pattern => "(?m)<%{POSINT:syslog_pri}>(?:%{SPACE})%{GREEDYDATA:message_remainder}"
The multiline filter will add the "\n" to the message. For example:
"[2014-03-02 17:34:20] - 127.0.0.1|ERROR| E:\\xampp\\htdocs\\test.php|123|subject|The error message goes here ; array (\n 'create' => \n array (\n 'key1' => 'value1',\n 'key2' => 'value2',\n 'key3' => 'value3'\n ),\n)"
However, the grok filter can't parse the "\n". Therefore you need to substitute the \n with another character, say, a blank space.
mutate {
gsub => ['message', "\n", " "]
}
Then, grok pattern can parse the message. For example:
"content" => "The error message goes here ; array ( 'create' => array ( 'key1' => 'value1', 'key2' => 'value2', 'key3' => 'value3' ), )"
Isn't the issue simply the ordering of the filters? Order is very important to Logstash. You don't need another line to indicate that you've finished outputting the multiline log entry; just ensure the multiline filter appears before the grok filter (see below).
P.S. I've managed to parse a multiline log entry fine where XML was appended to the end of the log line and spanned multiple lines, and I still got a nice clean XML object into my content-equivalent field (named xmlrequest below). Before you say anything about logging XML in logs... I know, it's not ideal, but that's a debate for another day. :)
filter {
multiline{
pattern => "^\["
what => "previous"
negate=> true
}
mutate {
gsub => ['message', "\n", " "]
}
mutate {
gsub => ['message', "\r", " "]
}
grok{
match => ['message',"\[%{WORD:ONE}\] \[%{WORD:TWO}\] \[%{WORD:THREE}\] %{GREEDYDATA:xmlrequest}"]
}
xml {
source => "xmlrequest"
remove_field => ["xmlrequest"]
target => "request"
}
}

PayPal Express Checkout in osCommerce gives the error "The totals of the cart item amounts do not match order amounts"

I am working with PayPal Express Checkout and am having issues with a discount code. Below are my parameters:
PAYMENTREQUEST_0_CURRENCYCODE=CAD
&L_PAYMENTREQUEST_0_NAME0=test
&L_PAYMENTREQUEST_0_NUMBER0=
&L_PAYMENTREQUEST_0_AMT0=125.00
&L_PAYMENTREQUEST_0_QTY0=1
&PAYMENTREQUEST_0_ITEMAMT=125.00
&PAYMENTREQUEST_0_TAXAMT=0.00
&PAYMENTREQUEST_0_SHIPPINGAMT=0.00
&PAYMENTREQUEST_0_SHIPDISCAMT=0.00
&PAYMENTREQUEST_0_HANDLINGAMT=12.50
&PAYMENTREQUEST_0_AMT=112.50
It's returning the following error:
Array (
    [TIMESTAMP] => 2013-09-02T08:45:26Z
    [CORRELATIONID] => b5acd1c6276c4
    [ACK] => Failure
    [VERSION] => 85.0
    [BUILD] => 7539191
    [L_ERRORCODE0] => 10413
    [L_SHORTMESSAGE0] => Transaction refused because of an invalid argument. See additional error messages for details.
    [L_LONGMESSAGE0] => The totals of the cart item amounts do not match order amounts.
    [L_SEVERITYCODE0] => Error
)
Can anyone help?
PAYMENTREQUEST_0_AMT has to match the total of the other amounts, so the request should be:
PAYMENTREQUEST_0_CURRENCYCODE=CAD
&L_PAYMENTREQUEST_0_NAME0=test
&L_PAYMENTREQUEST_0_NUMBER0=
&L_PAYMENTREQUEST_0_AMT0=125.00
&L_PAYMENTREQUEST_0_QTY0=1
&PAYMENTREQUEST_0_ITEMAMT=125.00
&PAYMENTREQUEST_0_TAXAMT=0.00
&PAYMENTREQUEST_0_SHIPPINGAMT=0.00
&PAYMENTREQUEST_0_SHIPDISCAMT=0.00
&PAYMENTREQUEST_0_HANDLINGAMT=12.50
&PAYMENTREQUEST_0_AMT=137.50