I have a folder containing all the log files; the filenames are colour-red, colour-green, colour-blue, colour-yellow, etc. I am writing the SPL to include all the files except one, e.g. colour-white.
I know that * performs a wildcard search and that [^c] excludes the specific character in the brackets, but I don't know how to combine them to exclude a certain word. I am also not sure the same regex rules apply in Splunk.
source= "log/colour-*"
source= "log/colour-[^w]"
The desired result of the query is to retrieve all the files, except colour-white.
Maybe some filters can be applied to retrieve the desired result, but so far the filters I know are for the file contents, not the file names.
You can also use something like this in your search query,
source!="log/colour-white"
You can also check the difference between != and NOT at the link below to get clearer info on what to use.
Splunk Answers
The search command (the implicit command before the first |) does not support regex. To exclude something, use NOT.
(source = "log/colour-*" NOT source = "log/colour-w*")
I would like WinMerge to compare the full text but exclude a variable substring.
Orientation="West" PhysicalAddress="2395226" DefFieldFrmt="Uf4d0" UnitCustomText="sec"
Orientation="West" PhysicalAddress="2395230" DefFieldFrmt="Uf4d1" UnitCustomText="sec"
In the lines above, I want to ignore PhysicalAddress="xxx" and locate the changed DefFieldFrmt="Uf4d1".
I have tried adding the filter:
PhysicalAddress=".*"
However this filters the complete line.
The actual text before and after the PhysicalAddress="xxx" will vary so I need a filter that says: match prefix and match suffix but ignore target variable substring.
Help please.
According to the documentation, it is not possible to use line filters for this:
When a rule matches any part of the line, the entire difference is ignored. Therefore, you cannot filter just part of a line.
However, since WinMerge's source code is on GitHub, it is possible to add a feature request for this to its list of issues.
I need a nice column for the Centrify tool which includes all the log files under the different folders, for example:
/oradata1/oracle/admin/A/scripts/rman_logs/*.log
/oracle/oracle/admin/B/scripts/rman_logs/*.log
/oradata2/admin/C/scripts/logs/*.log
I used this, but after the * character the user can see all logs:
/ora(data(1|2)|cle)/oracle|admin/admin/*/scripts/rman_logs
Which expression must I use?
If I understand your question correctly, you want only .log files. You can use a positive lookahead to assert that it is indeed a log file (contains .log at the end of the filename), and match the filename whatever it is (.*).
Then it's really easy. (?=.*\.log(?:$|\s)).* Of course, you can also add specific folders if you wish to restrict the matches, but the positive lookahead will still do its work. I.e. (?=.*\.log(?:$|\s)).*/scripts/.*
EDIT: As per your comment, you only need those folders, so you just specify their file paths in alternations and add [^.\s\/]*\.log at the end. So:
(?:\/oradata1\/oracle\/admin\/A\/scripts\/rman_logs\/|\/oracle\/oracle\/admin\/B\/scripts\/rman_logs\/|\/oradata2\/admin\/C\/scripts\/logs\/)[^\s.\/]*\.log You may shorten the regex by trying to combine filepath elements, but, imo, not necessary as you might as well specify each filepath individually, if they don't overlap too much.
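If you want to sanity-check that pattern before dropping it into Centrify, here is a minimal sketch using base R's grepl with perl = TRUE. The sample paths are made up for illustration, and Centrify's regex flavor may differ slightly from PCRE; note also that / needs no escaping in an R string, so the backslashes before the slashes are dropped:
pattern <- "(?:/oradata1/oracle/admin/A/scripts/rman_logs/|/oracle/oracle/admin/B/scripts/rman_logs/|/oradata2/admin/C/scripts/logs/)[^\\s./]*\\.log"
paths <- c("/oradata1/oracle/admin/A/scripts/rman_logs/backup.log",   # should match
           "/oracle/oracle/admin/B/scripts/rman_logs/daily.log",      # should match
           "/oradata2/admin/C/scripts/logs/arch.log",                 # should match
           "/oradata1/oracle/admin/A/scripts/rman_logs/backup.trc")   # should not match
grepl(pattern, paths, perl = TRUE)   # TRUE TRUE TRUE FALSE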
I have found a global expression.
This is not a good way, but it works and saves me a lot of work. The main files are under ....../scripts/rman_logs/ for all servers, so I use this approach.
I can produce these lines, and they can be a command group for users, so this works well:
tail /////scripts/rman_logs/*.log
tail ////scripts/rman_logs/.log
Thanks for your help.
I have many files containing the following type of line:
* #version $Revision: 1.xxx
I want this type of line to be ignored while comparing using WinMerge. I have tried with line filters but have still not been able to do that.
Can anyone help me in this regard?
You can actually do this with line filters.
Select the filter option from the Selection Dialog
Go to the Linefilters tab, enable line filters and then compose a regular expression that matches lines you want to filter out.
In the results, it will highlight filtered lines in a different color, but it won't treat them as differences in the individual file or folder summary.
In your particular case, you'll want to find a regex pattern that matches your naming convention, but something like this should work:
#version \$Revision: \d\.\d*
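If you want to check a candidate pattern against a sample line before putting it into WinMerge, a quick sketch in R works (PCRE-style matching; WinMerge's filter engine may differ slightly, but escaping $ and . is the same idea, and the revision number below is made up):
pattern <- "#version \\$Revision: \\d\\.\\d*"
lines <- c(" * #version $Revision: 1.105",   # should be filtered out
           "int foo = 42;")                  # ordinary line, should be kept
grepl(pattern, lines, perl = TRUE)   # TRUE FALSE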
After many tries, the following regular expression is working for me:
^ \* #version \$Revision:
Check the WinMerge release notes:
Filters only applied when using full compare.
Line filtering is only applied in folder compare when using the Full Contents compare method. If you are using any other compare method, line filters are not applied. Files marked different in folder compare can get status changed to identical when opening them to file compare.
So, you will not be able to use regex to filter out the lines that are open in the right/left pane.
You will have to install and use 3rd party filters, e.g. http://regexfilterforw.sourceforge.net/.
This is probably real simple, but I can't seem to figure out how to do it.
I have an application in R (Shiny) where a user uploads a *.zip file that contains all the components of an ESRI shapefile. I unpack these files into their own directory. This folder may or may not contain a *.shp.xml file. At some point in my R code, I need to find the exact name of the *.shp file that has been unpacked and distinguish it from the *.shp.xml file. How do I write the expression that will do that? I was thinking of using list.files, but I am unsure how to write the rest of the expression.
thanks!
With R regex patterns, the "$" has special meaning as the end of a character element (and the 'dots' need to be escaped with \\), so:
shpfils <- list.files(path, pattern="\\.shp$")
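To see why the $ anchor matters here, a tiny check with made-up file names:
fnames <- c("parcels.shp", "parcels.shp.xml", "parcels.dbf")   # hypothetical names
grepl("\\.shp$", fnames)   # TRUE FALSE FALSE -- only the .shp file matches
grepl("\\.shp", fnames)    # TRUE TRUE FALSE -- without $ the .shp.xml slips through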
This should isolate your file -
Sys.glob("*shp")
as compared to
Sys.glob("*shp*")
which should give both files
or
Sys.glob("*shp.xml")
which should give the .shp.xml file
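A quick way to convince yourself, in a throw-away directory with made-up file names:
d <- file.path(tempdir(), "shp_demo")
dir.create(d)
file.create(file.path(d, c("parcels.shp", "parcels.shp.xml", "parcels.dbf")))
Sys.glob(file.path(d, "*shp"))      # only .../parcels.shp
Sys.glob(file.path(d, "*shp*"))     # .../parcels.shp and .../parcels.shp.xml
Sys.glob(file.path(d, "*shp.xml"))  # only .../parcels.shp.xml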
I'm trying to load multiple .txt files in R, from different folders.
I have problems writing the path and pattern using regular expressions.
My path has this structure:
'/Users/folderA/folderB/folderC/folderD/01_01_2012/folderE/file.txt'
So, the path is almost the same, except that the folder with the date name always changes.
I have tried to load it like this:
filesToProcess <- list.files(path = "/Users/folderA/folderB/folderC/folderD/",
pattern = "*_*_*/folderE/*.txt")
But this doesn't seem to work.
Could someone please help me write this with regular expressions?
Thanks a lot!
The key here is to use argument recursive=TRUE so that you can search inside the folders that are in the original directory:
filesToProcess <- list.files(path = "/Users/folderA/folderB/folderC/folderD",
pattern = "txt", recursive = TRUE, full.names = TRUE)
The pattern has to correspond to the names of the files; it can't refer to the names of the folders (see ?list.files). That's why you need a second step where you narrow down to the specific folders you wanted. Note the use of the argument full.names=TRUE in the previous call, which allows us to keep the path of each file (NB: you also have to drop the final / of the path argument, or else it ends up doubled in the output and leads to an error when you try to load the files).
filesToProcess[grep("folderE", filesToProcess)]
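For completeness, here are the same two steps run end to end on a mock directory tree (all names below are made up to mirror the question):
base <- file.path(tempdir(), "folderD")
dir.create(file.path(base, "01_01_2012", "folderE"), recursive = TRUE)
dir.create(file.path(base, "02_01_2012", "folderE"), recursive = TRUE)
file.create(file.path(base, "01_01_2012", "folderE", "a.txt"),
            file.path(base, "02_01_2012", "folderE", "b.txt"))
filesToProcess <- list.files(path = base, pattern = "txt",
                             recursive = TRUE, full.names = TRUE)
filesToProcess[grep("folderE", filesToProcess)]   # both .txt files, with full paths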
A final note:
Your regular expression was flawed anyway: * means
The preceding item will be matched zero or more times.
What you wanted was .: see ?regexp
The period . matches any single character.
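A concrete illustration of that difference (the file names here are hypothetical), plus glob2rx() from base R, which converts a glob into the corresponding regex:
grepl("^.*\\.txt$", c("file.txt", "file.csv"))   # TRUE FALSE -- in a regex, . means "any character"
glob2rx("*.txt")   # turns the glob into an equivalent regex you can pass to list.files()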
Although the subject refers to regular expressions, it seems from the example that you really want to use globs. In that case, try:
Sys.glob("/Users/folderA/folderB/folderC/folderD/*_*_*/folderE/*.txt")