To increase performance, we want to duplicate part of our PostgreSQL data into in-memory SQL storage. I am currently looking for a recommended in-memory relational database; it may run on either Windows or Linux. I would really appreciate any suggestions based on usage experience or good references.
If you need an in-memory RDBMS for extreme SQL OLTP and analytics performance, try Oracle TimesTen In-Memory Database.
TimesTen runs on both Linux and Windows [and Solaris, AIX and HP-UX].
TimesTen runs either as a system of record [a standalone RDBMS] or as a read/write cache for the Oracle RDBMS.
Oracle will also soon release the shared-nothing scale-out version of TimesTen, called TimesTen Scaleout.
I would like to build an admin dashboard in Django framework.
So far I have only worked with sqlite3 databases in Django.
However, the admin dashboard should read statistics from an MSSQL database and display them accordingly. (Sales figures as a graph, other sales in a table, etc.)
The turnover figures are very extensive. There are several thousand entries per month and the client would like to be able to filter by any date period.
Only data from the MSSQL database should be read. Writing or updating the data is not desired.
So far this is no problem, but I wonder which is the better way to implement the project.
Should I connect the MSSQL database directly to Django or should I read the MSSQL database in certain intervals and cache the "results" in a sqlite3 database?
Caching seems to me to be the nicer solution, as we don't need real-time data and the performance of the MSSQL server might not suffer as a result. But I would have to build an additional connector to transfer the data from MSSQL to sqlite3.
How would you approach such a project?
Short version: I'd like to display data in a Django app. Should I read directly from the MSSQL server, or should I import the MSSQL dataset into a local sqlite3 database?
Thanks in advance for your answers.
From the official SQLite page:
Situations Where SQLite Works Well:

Application file format

SQLite is often used as the on-disk file format for desktop applications such as version control systems, financial analysis tools, media cataloging and editing suites, CAD packages, record keeping programs, and so forth.

Cache for enterprise data

Many applications use SQLite as a cache of relevant content from an enterprise RDBMS. This reduces latency, since most queries now occur against the local cache and avoid a network round-trip. It also reduces the load on the network and on the central database server. And in many cases, it means that the client-side application can continue operating during network outages.
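If you go the caching route, the connector can be quite small. Here is a minimal sketch: the table and column names are made up for illustration, and the MSSQL side is assumed to be fetched separately (e.g. with pyodbc) so the demo below uses stand-in rows.

```python
import os
import sqlite3
import tempfile

def refresh_cache(rows, cache_path):
    """Replace the local sqlite3 cache with a fresh snapshot.

    `rows` is an iterable of (sale_date, amount) tuples.  In the real
    connector they would come from the MSSQL server, e.g. via pyodbc:
        cursor.execute("SELECT sale_date, amount FROM sales")  # hypothetical table
        rows = cursor.fetchall()
    """
    conn = sqlite3.connect(cache_path)
    with conn:  # one transaction: readers never see a half-built cache
        conn.execute("DROP TABLE IF EXISTS sales_cache")
        conn.execute("CREATE TABLE sales_cache (sale_date TEXT, amount REAL)")
        conn.executemany("INSERT INTO sales_cache VALUES (?, ?)", rows)
        # Index the date column, since the client filters by date range.
        conn.execute("CREATE INDEX idx_sales_date ON sales_cache (sale_date)")
    conn.close()

# Demo with stand-in data; in production, call refresh_cache() from a
# cron job or a Django management command at the chosen interval.
cache = os.path.join(tempfile.mkdtemp(), "cache.db")
refresh_cache([("2023-01-05", 120.0), ("2023-01-06", 80.5)], cache)
total = sqlite3.connect(cache).execute(
    "SELECT SUM(amount) FROM sales_cache").fetchone()[0]
print(total)  # 200.5
```

Rebuilding the whole table on each refresh keeps the connector trivial; with only a few thousand rows per month, that is cheap enough to run hourly.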
For performance reasons I would like to run an optimization algorithm against an in-memory database in Django (it is likely to execute a lot of queries). I know it's possible to use SQLite in memory (How to run Django's test database only in memory?), but I would rather use PostgreSQL because our production database is a PostgreSQL one.
Does someone know how to tell Django to create the PostgreSQL database in memory?
Thanks in advance
This is premature optimization. PostgreSQL is very, very fast even running on an ordinary hard disk, provided you use the right indexes. If you don't persist the data on disk, you are opening yourself up to a world of pain.
If, on the other hand, you want to speed up your tests by running an in-memory PostgreSQL database, you can try non-durability optimizations like these:
http://www.postgresql.org/docs/9.1/static/non-durability.html
The most drastic suggestion on that page is to use a ramdisk. Here's how to set one up. After following the OS/PostgreSQL steps, edit Django's settings.py to point it at the ramdisk-backed tablespace.
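A sketch of what that settings change might look like. Note that Django configures tablespaces via the DEFAULT_TABLESPACE setting rather than a key inside DATABASES; the database name, credentials, and tablespace name below are all hypothetical.

```python
# settings.py -- sketch only; assumes you already ran, as a Postgres
# superuser, something like:
#   CREATE TABLESPACE ramdisk_space LOCATION '/mnt/ramdisk/pgdata';
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": "optimizer_db",
        "USER": "django",
        "PASSWORD": "secret",
        "HOST": "localhost",
        "PORT": "5432",
    }
}

# Newly created tables (including test tables) land in the
# ramdisk-backed tablespace instead of the default on-disk one.
DEFAULT_TABLESPACE = "ramdisk_space"
```

Remember that anything in that tablespace disappears on reboot, so this is only suitable for test databases.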
Last but not least: This is just a complete waste of time.
This is not possible. You cannot use PostgreSQL exclusively in memory; at least not without defeating the purpose of using PostgreSQL over something else. An in-memory data store like Redis running alongside PostgreSQL is the best you can do.
Also, at this point, the configuration is far out of Django's hands. It will have to be done outside of Django's environment.
It appears you may be missing some understanding about how these systems work. Build your app first to verify that everything is working, then worry about optimizing the database in such ways.
I'm trying to figure out which database would suit my needs. My C++ project needs a database that will run on devices sold to customers. Mainly it will just log data and events to a database on a local SSD. Write speed is the most important factor, as the logging frequency can be up to 1000 Hz (one write per 1 ms). It must be possible to access the data remotely from other devices, to build graphical visualisations of it. I have tested SQLite with a 3rd-party server, MySQL and Postgres. Postgres seems quite slow compared to the others. From what I've read, Postgres does well as concurrency increases, but in my case concurrency is and will remain quite low.
I'm wondering whether there is any other database suited to such needs. It also feels like MySQL and Postgres are a little overkill for these requirements. Any suggestions?
PostgreSQL is an enterprise-quality database and not a fit for embedded devices. MySQL, while smaller, will also be a tight fit on an embedded device. SQLite is the most common choice and is widely used in embedded devices, even quite small ones.
Go for SQLite: your requirements state that your app will be running on devices, and I would guess these are mostly mobile devices; almost all mobile devices support SQLite, so go for it.
Consider BerkeleyDB. It is a small-footprint embedded DB with a big commercial backer if you needed support, etc. There are open source versions as well as commercially licensed ones. There's no support for SQL querying, but unless you're doing quite complex relational queries this should not be a problem. Concurrency support is excellent, though initial database configuration tends to be awkward.
There's a Microsoft-only alternative in the form of the Extensible Storage Engine, that's free and available on most versions of Windows. There are various other 'DBM'-like simple embedded databases out there, so long as you don't feel you need SQL.
You might also consider an in-memory 'NoSQL'-style database; something like Redis will be very performant.
RDM Embedded may be a good fit for you. I'm with Raima and this product allows you to access data remotely and you can utilize the in-memory or a hybrid on-disk/in-memory database capabilities (www.raima.com/in-memory-database) if you need to. What could be useful for you in this particular case is that RDM products can be used together to manage data between embedded, mobile, desktop or server devices. This can be easily setup through our products, RDM Embedded, RDM Mobile, RDM Workgroup and RDM Server.
If you want to test performance of our database quickly before downloading the full product, go to our Database Performance Popcorn Samples.
I have a corporate website built with django-cms and a SQLite database that gets only a few updates per month. The majority of accesses are reads, and it will see fewer than 5000k requests per day.
My deployment is on a cPanel server with Apache and WSGI. I first need to know whether I should be worried about using SQLite, and whether PostgreSQL would be faster and consume fewer resources in this situation (it is already installed and running on the server).
This site uses an 11 MB SQLite file. Is this file kept in memory?
SQLite will be faster than PostgreSQL except in cases of high contention. If you only update a few times per month, then SQLite will almost certainly be faster than PostgreSQL.
You may still choose to use PostgreSQL for other reasons (i.e. if you need network access to your data), but probably not for performance reasons.
I am working on a cross-platform application that needs to use a database to store information. I was thinking that, because MySQL is open source, it might be possible to remove the networking components from MySQL so that the program can interact with it directly. Is this possible, or should I just ship the installer with a copy of MySQL Server with all the settings predefined and use a connector?
SQLite has what you need. http://www.sqlite.org/
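To illustrate what "embedded" means here: SQLite is linked into your process, so there is nothing network-related to strip out in the first place (table name below is illustrative):

```python
import sqlite3

# SQLite runs in-process: no server, no sockets -- the "connection"
# is just a handle to a local file (or to RAM, with ":memory:").
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE settings (key TEXT PRIMARY KEY, value TEXT)")
conn.execute("INSERT INTO settings VALUES ('mode', 'embedded')")
mode = conn.execute(
    "SELECT value FROM settings WHERE key = 'mode'").fetchone()[0]
print(mode)  # embedded
```

This is exactly the deployment model the question is asking MySQL to be cut down to, and it is what SQLite was designed for.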
I think in theory you could do that, but I'm not sure the amount of work would be worth it, and the chances of breaking something would be pretty high. I would just ship MySQL with your application.
Or use SQLite, as suggested by someone else.
It could be possible, but I am not sure it is worth it (otherwise, use something like SQLite or even GDBM).
MySQL is quite robust (thousands of developers, millions of users), so in practice you can assume it won't crash.
Your own application might be less robust; it probably will crash at some point. Having MySQL running as a separate process then ensures that the data is left in a sane state.
And you might later be interested in having some other (perhaps external) application issue SQL requests to your MySQL database, or in hosting the MySQL database on a remote server.