Changes in maxmind.com free GeoLite databases - geoip

MaxMind.com changed some fields and the naming scheme of their free CSV databases as of 01/06/2015. Furthermore, the IPv6 database information has been split off and is now provided separately.
I would like to know:
- Did this also happen to the databases in MMDB format?
- Did MaxMind give developers any information prior to making this change?
I'm running the github.com/AndreasBriese/ipLocator repo and was surprised by the changes (which forced me to update the software yesterday).

If you are referring to the GeoLite2 CSV format, it was an alpha pre-release until recently in order to get feedback on the format. This was clearly specified on the download page as well as the CSV spec page.
The MMDB GeoIP2 databases have not changed as they have been production releases for well over a year. Similarly, the Legacy CSVs have not changed.
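If you want to sanity-check that an existing .mmdb file still reads the same way, a minimal Python sketch using the maxminddb package looks like the following (the file name and IP address are only examples, not part of the original question):

import maxminddb

# Open a local GeoLite2 country database (file name is an example)
reader = maxminddb.open_database("GeoLite2-Country.mmdb")
record = reader.get("8.8.8.8")            # returns a dict, or None if the IP is not found
if record:
    print(record["country"]["iso_code"])  # e.g. "US"
reader.close()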

Access to ImageNet data download

I've already been granted access by the ImageNet website http://www.image-net.org/download-images to download the image data, and the page shows:
You have been granted access to the whole ImageNet database through our site. By doing so you agree to the terms of access.
Download as one tar file
The full ImageNet data is currently unavailable. Data for ILSVRC is available.
ImageNet Fall 2011 release MD5: ...
ImageNet10K from Deng et al. ECCV2010
But both of the links show "OOPS The url is not valid." when opened. (It's not a problem with my network or browser; I can tell by the consistency of the ImageNet page style. I guess these links are outdated and were moved to other URLs, but the website wasn't updated.)
I have two questions here.
(1) Where and how can I actually download the ImageNet data (as well as the labels, for a classification task)?
(2) I want the data to validate the method in my paper, but even if I download it, I'm afraid the dataset is unnecessarily big. Do I have to validate on ImageNet (since it's adopted in many papers anyway)? The Tiny ImageNet page on their website doesn't seem broken, but it's a much smaller dataset.
ImageNet Download:
Go to https://www.kaggle.com/c/imagenet-object-localization-challenge and click on the data tab. You can use the Kaggle API to download on a remote computer, or that page to download all the files you want directly.
There, they provide both the labels and the image data.
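As a rough sketch of the Kaggle API route (the competition name comes from the URL above; the output path is just an example), something like this should work once your API token is in ~/.kaggle/kaggle.json:

# Sketch: downloading the competition files via the Kaggle API (pip install kaggle)
from kaggle.api.kaggle_api_extended import KaggleApi

api = KaggleApi()
api.authenticate()                        # reads ~/.kaggle/kaggle.json
# Downloads the competition data as a zip archive into ./imagenet/
api.competition_download_files("imagenet-object-localization-challenge",
                               path="imagenet", quiet=False)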
I don't know what is going on with the ImageNet website; however, the URL list links were also broken for me today. One way you can still get the data is from an alternate mirror, such as the Kaggle ImageNet download I linked above. From what I have heard, the Kaggle ImageNet is equivalent to the ImageNet from their website.
I'm unsure about how to answer your second question, as I don't know enough about your project. However, ImageNet will probably work to validate your model.
It can be downloaded in Python using the datasets library:
>>> from datasets import load_dataset
>>> ds = load_dataset("imagenet-1k")
>>> train_ds = ds["train"]
>>> train_ds[0]["image"] # a PIL Image
You may need to install it as well as Pillow, and log in to Hugging Face after accepting the terms of access:
pip install datasets Pillow
huggingface-cli login
You can find more info and links to download the files on the ImageNet page on Hugging Face: https://huggingface.co/datasets/imagenet-1k

How do I clean xDB in Sitecore?

I have recently tried working with xDB in Sitecore 8 and am now looking for a way to clean out the current stats from xDB without re-installing Sitecore. I deleted the data files for Mongo (as was suggested) but still see figures in Analytics in Sitecore; I also did an iisreset, but that did not help either. What am I doing wrong? (I am new to Sitecore, so I might be missing something.)
Have you tried to clean up only the MongoDB files, without the Reporting database?
If so, I think that is the point of your confusion. The way xDB works is that all tracking analytics data is written into Mongo and then, on SessionEnd, processed and saved into the Reporting database, which is a SQL database, much the same way as it previously was in DMS. So you need to clean that database as well.
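For the Mongo side (which you seem to have already handled by deleting the data files), the clean-up can also be scripted; here is a minimal pymongo sketch, where the connection string and database names are assumptions and should be taken from your ConnectionStrings.config:

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")                 # assumption: local Mongo instance
for db_name in ["sitecore_analytics", "sitecore_tracking_live"]:   # assumption: your xDB database names
    client.drop_database(db_name)                                  # removes every collection in that database
client.close()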
If you have access to SQL, the quickest option is to run the __DeleteAllReportingData stored procedure against the Reporting database.
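If you would rather script it than run it from SQL Server Management Studio, a minimal Python sketch with pyodbc could look like this (server, database name and driver are assumptions; only the stored procedure name comes from the answer above):

import pyodbc

# Connection details are placeholders - adjust to your environment
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=Sitecore_Reporting;Trusted_Connection=yes;"
)
cursor = conn.cursor()
cursor.execute("EXEC [dbo].[__DeleteAllReportingData]")    # wipes the aggregated reporting data
conn.commit()
conn.close()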
A more correct approach, which works well for instances where there is no direct access to the database, is to use the admin tool located at /sitecore/admin/RebuildReportingDB.aspx. There was also previously a module called Analytics Database Manager; however, I do not know its current state.
Reference: Walkthrough: Rebuilding the reporting database (from official documentation)

Migrating From ColdFusion 7 to ColdFusion 11

I'm planning a migration on a server from ColdFusion MX7 (Server 2003) to ColdFusion 11 (Server 2012). There is another server where I need to migrate from ColdFusion 8 (Server 2008) to ColdFusion 11. Will my system be affected in any way when upgrading, for example by tag or compatibility issues? Does anyone know which steps I should take to avoid problems? I know about the Code Analyzer in the ColdFusion Administrator. I want to know whether anything is seriously affected when migrating.
Thanks in Advance
Kiran Kumar
The Code Analyzer helps in migrating your applications to ColdFusion 11 from earlier versions of ColdFusion. However, it only checks two versions back. The Code Analyzer reviews the CFML pages that you specify and informs you of any potential compatibility issues. It detects unsupported and deprecated CFML features, and outlines the required implementation changes to ensure a smooth migration.
As far as the code compatibility is concerned, everything "should" work. However, it is recommended to check the code compatibility and deprecated tags (if any). You can refer to https://wikidocs.adobe.com/wiki/display/coldfusionen/Deprecated+Features & https://wikidocs.adobe.com/wiki/display/coldfusionen/Deprecated+tags,+attributes,+and+values.
I have briefly covered the entire migration process here, so I will not repeat it. You can also have a look at another helpful article on migration tweaks.
Having said all that, it's strongly recommended to test your website in a testing/development environment before moving it to production.
Hope this gives a better picture of the migration process.
I did this migration in the past and did not face any major issues. Since everyone has a different system, the best approach would be to:
- Backup
- Test the upgrade and see
If it's a production machine, you can copy it to a VM and test the upgrade there. It may be a lot of work, but you cannot know without testing.
I am currently moving a ColdFusion 9 site to ColdFusion 11, and the way I tested it was to create a separate set of folders on the ms2013 server. I ran them side by side, with a duplicate database under a different name for the test site.
I have moved sites up from 5 to 9 with few issues, and the only thing that really got me in ColdFusion 11 is dbtype in database functions. It has not only been deprecated but will always throw an error if found.
It also depends on how ColdFusion 11 will react to CFCs and other special tags if you use them. I don't use them, so it was a snap.
Examples:
mydatabase
mydatabase1
mypagesfolder
mypagesfolder1
index.cfm
index1.cfm
Going live was a snap. I just renamed the folders, links*, and DSN, and renamed index1.cfm to index.cfm.
*Links only need to be changed if they post outside of the folder, and if so, only the path.

Migrating Sitecore 6.6 to Sitecore 8

Sitecore 8 was released recently, and it comes with a lot of exciting new features, so our team decided to move from Sitecore 6.6 to Sitecore 8. Before migrating, I would like to know what I should have in hand, such as the .NET Framework version, hardware configuration, environment, etc.
I would also like to know the procedure to migrate from 6.6 to 8. I have never been involved in a Sitecore migration project before. Please suggest some good articles or post your thoughts here.
Thanks in advance. :)
See the Sitecore Compatibility Table for the .NET Framework, SQL Server version and Windows version.
Two common approaches:
1) Follow the Sitecore upgrade path.
2) Package the content, and start with a clean install.
Currently I am working on an upgrade using a scripted process that follows the Sitecore path, so I can easily repeat the steps and keep the latest content in the databases.
I have put down some of my findings here: Sitecore update and modules. That article also contains a Related Links section, with items such as the upgrade white paper from Varun.
Depending on how 'cluttered' your existing instance is, you may also want to consider installing a fresh copy of Sitecore 8 and then migrating your data/code, to avoid all the hops that would otherwise be necessary to get to 8.
The following blog post might also help; take a look at it.
https://varunvns.wordpress.com/2014/11/11/sitecore-version-upgrade-whitepaper/
I would recommend you make a backup of your site to use as a "sandbox" for the upgrade. Copy your databases and the web root for your site to a new location and then set up an IIS site with appropriate permissions pointing to your copy, and change your connection strings in the copy to point to a copy of the databases you backed up.
Perform the update there and ensure everything is working correctly. Work slowly to make sure you are following instructions correctly and note any special actions you had to take to perform the upgrade. Once you have it upgraded, perform the same process on the "real" site.
If you work with a Sitecore partner, I would highly encourage you to discuss the process with them to learn more specifics about the risks and challenges you may encounter during your upgrade.

Options for maintaining MySQL databases for a django development team

What are some options to avoid the latency of pointing local django development servers to a remote MySQL database?
If developers use local MySQL databases to avoid the latency, what are some useful tools to sync schema updates of the remote db with the local db and avoid manually creating, downloading, and loading dumps?
Thanks!
One possibility is to configure the remote MySQL database to replicate to the developers' local machines, assuming you have control of the remote database's configuration.
See the MySQL docs for replication notes. Using MySQL replication the remote node would be the Master and the developer machines would be Slaves. The main advantage of this approach is your developer machines would always remain synchronized to the Master database. One possible disadvantage (depending on the number of developer machines you are slaving) is a degradation in the remote database's performance due to extra load introduced by replication.
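On the Django side, each developer then only needs to point their settings at the local replica (or at a purely local database); a minimal sketch of a per-developer settings module, where all names and credentials are placeholders, might look like this:

# settings_local.py - per-developer database settings (all values are placeholders)
from settings import *    # assumes the shared settings module is importable as "settings"

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "myproject_dev",     # local replica (or copy) of the remote schema
        "USER": "dev",
        "PASSWORD": "devpassword",
        "HOST": "127.0.0.1",         # local MySQL instance instead of the remote server
        "PORT": "3306",
    }
}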
It sounds like you want to do schema migrations. Basically, it's a way to log schema changes so that you can update and even roll back along with your source changes (if you change a model, you also check in a new migration that has up and down commands). While this will likely become an official feature at some point, there are several third-party solutions to choose from. It's really a personal preference; here are some popular ones (a sketch of what such a migration looks like follows the list):
South
Django Evolution
dmigrations
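To illustrate the "up and down commands" mentioned above, here is a rough sketch of a hand-written South schema migration (the app, table and column names are made up):

from south.db import db
from south.v2 import SchemaMigration
from django.db import models

class Migration(SchemaMigration):

    def forwards(self, orm):
        # Apply the change: add a nullable "subtitle" column to the articles table
        db.add_column("blog_article", "subtitle",
                      models.CharField(max_length=200, null=True))

    def backwards(self, orm):
        # Roll the change back: drop the column again
        db.delete_column("blog_article", "subtitle")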
I use a combination of South for schema migrations and JSON fixtures (or SQL dumps) of useful test data stored in the project's VCS repo. It works pretty seamlessly.
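If you go the fixtures route, the dump and load steps can also be scripted; a minimal sketch (the app name and file path are placeholders, and this assumes it runs with your Django settings loaded) might be:

from django.core.management import call_command

# Dump useful test data from the "blog" app into a fixture kept in the repo
with open("fixtures/dev_data.json", "w") as f:
    call_command("dumpdata", "blog", indent=2, stdout=f)

# Load it into a freshly created local database
call_command("loaddata", "fixtures/dev_data.json")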