How do GeoLite region codes work? - geoip

We are using the region codes from GeoLite (MaxMind) that we found here: http://geolite.maxmind.com/download/geoip/misc/region_codes.csv
Does anyone know how they select the regions they cover? I'm asking because for Latvia (LV) we cannot find any logical reason for those regions. There are also duplicated entries, and entries which are exactly the same except for an "s" at the end.
If anyone has already used them and knows the reason, it would be very helpful. Thanks!

Those are FIPS 10-4 region codes. Looking at Latvia, it appears that the distinction between "Municipality" and "District" got lost in MaxMind's data, which would explain the near-duplicate names you're seeing.
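If it helps to see the duplicates concretely, here is a minimal Python sketch that loads the CSV and flags names within a country that differ only by a trailing "s". It assumes the file has three unlabelled columns (ISO country code, FIPS 10-4 region code, region name); check your copy of the file before relying on that.

# Minimal sketch: flag near-duplicate region names in region_codes.csv.
# Assumes three columns (country code, FIPS 10-4 region code, region name)
# and no header row -- adjust if your copy of the file differs.
import csv
from collections import defaultdict

names_by_country = defaultdict(list)
with open("region_codes.csv", newline="", encoding="utf-8") as f:
    for row in csv.reader(f):
        if len(row) != 3:
            continue  # skip blank or malformed rows
        country, region_code, name = row
        names_by_country[country].append((region_code, name))

# Flag pairs within a country whose names differ only by a trailing "s",
# the pattern described in the question for Latvia (LV).
for country, entries in names_by_country.items():
    seen = {name: code for code, name in entries}
    for code, name in entries:
        if name.endswith("s") and name[:-1] in seen:
            print(country, code, repr(name), "vs", seen[name[:-1]], repr(name[:-1]))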

Related

NiFi: ReplaceTextWithMapping doesn't work as expected

I have been trying out examples in NiFi and have been experimenting with the ReplaceTextWithMapping processor. The processor config is given below.
The mapping file is shown below, and I have used a tab between each key and value.
I have followed this article here. There are lots of IDs in the input file to the processor, and there are no errors in the logs. Can someone help me understand the problem?
The only workaround I could find was giving the values directly, like (ID|POS). This does work, but I still don't know why the regex is not working.
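For debugging the mapping file outside NiFi, a rough Python stand-in for what ReplaceTextWithMapping does can help confirm whether the file itself is the problem. This is only an approximation of the processor's behaviour, not its actual implementation; the file format assumed here (one key<TAB>value pair per line) is the usual one, and a common failure mode is spaces where the tab should be.

# Rough stand-in for ReplaceTextWithMapping, for testing a mapping file
# outside NiFi. Assumed format: one "key<TAB>value" pair per line.
import re

def load_mapping(path):
    mapping = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line or "\t" not in line:
                continue  # skip blank lines and lines without a real tab
            key, value = line.split("\t", 1)
            mapping[key] = value
    return mapping

def apply_mapping(text, mapping):
    # Match whole words only, longest keys first, so "ID" doesn't clobber "IDX".
    keys = sorted(mapping, key=len, reverse=True)
    pattern = re.compile(r"\b(" + "|".join(map(re.escape, keys)) + r")\b")
    return pattern.sub(lambda m: mapping[m.group(1)], text)

# Example run with an inline mapping instead of a file:
print(apply_mapping("ID=42 POS=7", {"ID": "identifier", "POS": "position"}))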

Kibana mapping conflict: how to make sure this error won't get repeated

I am new to Kibana; we are using AWS ES 5.5. I set up the dashboards yesterday and they were working fine, but this morning all of the dashboards are empty, with no data. I found it was due to a mapping conflict. On Google I found one answer, which was to reindex the data. How can we prevent this type of error in the future?
Any answers would be greatly appreciated.
You probably have the same field mapped in two different ways, for example gender defined as a string in one place and as a number in another.
You need to check for this and prevent it next time.
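One way to prevent the conflict, assuming the usual time-based-index setup, is to pin the mapping with an index template so every new index maps the field the same way. Below is a sketch against a plain ES 5.x endpoint; the index pattern, type name ("doc"), and field are illustrative, and on AWS ES you may need signed requests rather than plain HTTP.

# Sketch: pin the mapping for a field that conflicted, via an index template,
# so every new index matching the pattern maps it identically. Assumes ES 5.x;
# the pattern, type name, and field below are illustrative placeholders.
import json
import requests

template = {
    "template": "myapp-*",  # applies to every index matching this pattern
    "mappings": {
        "doc": {
            "properties": {
                "gender": {"type": "keyword"}  # always a string, never a number
            }
        }
    },
}

resp = requests.put(
    "http://localhost:9200/_template/myapp-fields",
    headers={"Content-Type": "application/json"},
    data=json.dumps(template),
)
print(resp.status_code, resp.text)

Note that a template only affects indices created after it exists; the indices that already conflict still need the reindex mentioned above.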

How to link multiple ports from an Expression to multiple groups of a Union

I've added an image in order to explain myself better.
I have 300-something ports in an Expression, and I have created the equivalent number of groups in a Union. I want each port of this Expression to go to a port/field of the Union, in a one-to-one relationship. It seems like PowerCenter is not able to do this with autolink, or at least I'm unable to find the proper way to do it. How could I work around this issue? I've been told it's likely that in a few days there will be more than 700 ports, and the time it takes to do this by hand is quite insane. Thanks in advance.
I'm surprised it validates... a Union is for homogeneous sources, but you seem to be trying to pivot your data (in which case I'd suggest using another transformation, i.e. a Normalizer, and Informatica will start behaving as expected).
Possible solution: make a few of the connections by hand, save and export the mapping as XML, find the lines where those connections are defined, and replace that section with as many rows as you need.
What I did specifically was to take the original rows, change the names as appropriate with the help of Notepad++ and Excel, and then go back to the original file and replace all of them. Check everything three times, and import the file back into PowerCenter.
I say "possible solution" because it's messy and dirty, but even though it may lead to mistakes, I feel the amount of manual work is vastly smaller, and you have versioning on your side, so just save before exporting. If someone with more experience could share their thoughts on this, it would be a great opportunity to learn; I'm just leaving this here in case the question goes unanswered.
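As an illustration of the XML trick, here is a small Python sketch that generates the connector rows instead of editing them by hand in Notepad++. The element and attribute names (CONNECTOR, FROMFIELD, TOFIELD, and so on) reflect what a PowerCenter export typically looks like, but they are assumptions here: copy one real row from your own export and adjust the template to match it exactly before importing anything.

# Sketch: generate one CONNECTOR row per port for a PowerCenter mapping XML.
# The element/attribute names and the transformation names are placeholders
# -- compare against a real row from your own export before pasting.
FROM_INSTANCE = "EXP_MyExpression"  # illustrative Expression name
TO_INSTANCE = "UN_MyUnion"          # illustrative Union name

def connector(i):
    return (
        f'<CONNECTOR FROMFIELD="PORT_{i}" FROMINSTANCE="{FROM_INSTANCE}" '
        f'FROMINSTANCETYPE="Expression" TOFIELD="FIELD_{i}" '
        f'TOINSTANCE="{TO_INSTANCE}" TOINSTANCETYPE="Union Transformation"/>'
    )

with open("connectors.xml", "w", encoding="utf-8") as out:
    for i in range(1, 301):  # 300 one-to-one links
        out.write(connector(i) + "\n")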

Transport Rule Logical AND for Exchange 2010

Good Afternoon,
I have exhausted my googling and best-guess ideas, so I hope someone here has an idea of whether this is possible or not.
I am using Exchange Server 2010 (vanilla) in a test environment and trying to create a Hub Transport rule using the Exchange Management Console. The requirements of the rule's filtering are similar to the following scenario:
1.) If a recipient's address matches (ends with) "#testdomain.com" AND (begins with) "john"
2.) If the sender's address matches (ends with) "#testdomain.com"
3.) Copy the message to the "SupervisorOfJohns#testdomain.com" mailbox
I have no problems doing items 2 and 3, but I cannot figure out how to get item 1 into the same condition. I have come across some threads that simply concluded that MS goofed on this, but I am hesitant to fault them for something that seems like it should be really straightforward. I must be missing something. Expressions I have tried so far:
1.) (^john)(#testdomain.com$)
2.) ^(john)(#testdomain.com)$
3.) (^john)#testdomain.com
4.) ^john #testdomain.com$
5.) ^(john)#testdomain.com
If you use the interface and +Add them as two separate entries, it treats them as an OR clause (if a recipient address begins with "john" OR it ends with "#testdomain.com"). As you can see from my simplistic attempts, I have barely any clue what can/should work in this case. Any suggestions or ideas would be appreciated.
Respectfully,
B. Whitman
Here's what I ended up using:
john\w*#testdomain.com
The reasoning behind the question is that I'm trying to make a service to catch certain e-mails and do some processing with them. I also wanted to restrict the senders/recipients to certain domains (though some checking will also be done by the processing service). Thanks to hjpotter92 for his solutions!
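For anyone wanting to sanity-check the pattern before touching the transport rule, here is a quick Python approximation. Keep in mind that Exchange 2010's "matches text patterns" syntax is its own simplified dialect rather than a full regex engine, so a match here is only suggestive; also, the dot is escaped in this version for strictness, which the accepted pattern above doesn't bother with.

# Quick sanity check of the accepted pattern outside Exchange. The "#" stands
# in for "@" as in the question. Python's re is only an approximation of
# Exchange 2010's transport-rule pattern dialect.
import re

pattern = re.compile(r"john\w*#testdomain\.com")

for addr in ["john#testdomain.com",       # should match
             "johnsmith#testdomain.com",  # should match
             "mary#testdomain.com",       # should not match
             "john#otherdomain.com"]:     # should not match
    print(addr, "->", bool(pattern.search(addr)))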

How to access Facebook Insights data for consecutive months?

I'm new to Facebook app development, and I hope I can get an answer here.
Is it possible to retrieve Facebook Insights data for consecutive months?
I tried since=2010-01-01 to end_time=2010-01-31 with period=month, but I got:
The specified date range cannot exceed 3024000 seconds!
How would I get, say, 2010-02-01 to 2010-02-28 and then 2010-03-01 to 2010-03-31?
I have tried lots of examples but couldn't succeed. How can I solve this problem?
The thing that has worked for me is very similar to what you are doing, the difference being that I use UNIX timestamps for SINCE and UNTIL.
Example:
https://graph.facebook.com/212686148747689/insights/
page_impressions_by_city_unique/week/?
access_token=QWERTYUI&since=1315699200&until=1320796800
(That's all supposed to be on one line, but it's easier to read it this way, at least for me.)
With this approach, you want to be careful to make sure that the difference between SINCE and UNTIL is not bigger than 90 days. Otherwise, you'll get an error like this:
(#604) The specified date range cannot exceed 7776000 seconds
Finally, if you don't have a way of generating the UNIX timestamp automatically, go to a web site like:
http://www.epochconverter.com/
If anyone else has some better insights, please share. I hope this helps.
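To avoid the converter site entirely, here is a small Python sketch that computes a (since, until) UNIX-timestamp pair per calendar month and builds the request URLs, keeping each window far below the 90-day cap. The page ID, metric, and token are the placeholder values from the example above, and the Insights API has changed over time, so treat this as a sketch rather than a current recipe.

# Sketch: one (since, until) epoch pair per calendar month, then build the
# Graph API URLs from the example above. Page ID, metric, and token are the
# placeholders already used in this answer.
import calendar
import datetime
import urllib.parse

PAGE_ID = "212686148747689"
METRIC = "page_impressions_by_city_unique"
TOKEN = "QWERTYUI"

def month_windows(year, first_month, last_month):
    for month in range(first_month, last_month + 1):
        start = datetime.datetime(year, month, 1)
        days = calendar.monthrange(year, month)[1]  # length of this month
        end = start + datetime.timedelta(days=days)  # first day of next month
        # calendar.timegm treats the tuple as UTC, matching epoch timestamps
        yield calendar.timegm(start.timetuple()), calendar.timegm(end.timetuple())

for since, until in month_windows(2010, 1, 3):  # Jan, Feb, Mar 2010
    query = urllib.parse.urlencode(
        {"access_token": TOKEN, "since": since, "until": until}
    )
    print(f"https://graph.facebook.com/{PAGE_ID}/insights/{METRIC}/week/?{query}")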