Interacting with a simple database in C++?

I'm making a game in C++ that will need to store and retrieve information about players (name, email, high score, etc.). I thought of trying to just do it myself with XML, but I think a real database (maybe SQL?) would do a better job, since over time there may be thousands of users.
Are there libraries to do simple interactions with databases like queries, retrieving information, and storing information?
Thanks

Yes, SQLite will do exactly this. It stores the database as a local file, so if you want an online database server then this is perhaps not the best option.

It sounds like SQLite might be a good fit for you. It doesn't need a constantly running database server, and it handles storage more intelligently than hand-rolled XML serialization.
SQLite has documentation.
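To make that concrete, here is a minimal sketch using the raw SQLite C API to store and retrieve a player row; the file, table, and column names (game.db, players, name/email/high_score) are invented to match the question:

    #include <sqlite3.h>
    #include <cstdio>

    int main() {
        sqlite3 *db = nullptr;
        if (sqlite3_open("game.db", &db) != SQLITE_OK) {
            std::fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
            return 1;
        }

        // One table holding exactly the fields from the question.
        sqlite3_exec(db,
            "CREATE TABLE IF NOT EXISTS players ("
            "name TEXT PRIMARY KEY, email TEXT, high_score INTEGER)",
            nullptr, nullptr, nullptr);

        // Prepared statements with '?' placeholders avoid SQL injection.
        sqlite3_stmt *stmt = nullptr;
        sqlite3_prepare_v2(db,
            "INSERT OR REPLACE INTO players (name, email, high_score) "
            "VALUES (?, ?, ?)", -1, &stmt, nullptr);
        sqlite3_bind_text(stmt, 1, "alice", -1, SQLITE_STATIC);
        sqlite3_bind_text(stmt, 2, "alice@example.com", -1, SQLITE_STATIC);
        sqlite3_bind_int(stmt, 3, 4200);
        sqlite3_step(stmt);
        sqlite3_finalize(stmt);

        // Read the row back.
        sqlite3_prepare_v2(db,
            "SELECT email, high_score FROM players WHERE name = ?",
            -1, &stmt, nullptr);
        sqlite3_bind_text(stmt, 1, "alice", -1, SQLITE_STATIC);
        if (sqlite3_step(stmt) == SQLITE_ROW) {
            std::printf("%s -> %d\n",
                        (const char *)sqlite3_column_text(stmt, 0),
                        sqlite3_column_int(stmt, 1));
        }
        sqlite3_finalize(stmt);
        sqlite3_close(db);
        return 0;
    }

Link against libsqlite3 (or just compile sqlite3.c into the project); the whole engine is a single C file, which is part of its appeal here.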

Related

Django/SQLite: Improve database performance

We are developing an online school diary application using django. The prototype is ready and the project will go live next year with about 500 students.
Initially we used SQLite and hoped that it would perform well enough for the first implementation.
The data tables are such that to obtain the details of a school day (periods, classes, teachers, classrooms), many tables are used, and the database access takes 67ms on a reasonably fast PC.
Most of the data is static once the year starts, with perhaps minor changes to classrooms. I thought of extracting the timetable for each student for each term day so no table joins would be needed. I put this data into a text file for one student; the file is 100K in size. The time taken to read this data and process it for a day's timetable is about 8ms. If I pre-load the data at login and store it in sessions, it takes 7ms at login and 2ms for each query.
With 500 students what would be the impact on the web server using this approach and what other options are there (putting the student text files into a sort of memory cache rather than session for example?)
There will not be a great deal of data entry (students adding notes, teachers likewise), so it will mostly be checking the timetable status and looking to see what events exist for that day or week.
What is your expected response time, and what is your expected number of requests per minute? One twentieth of a second for the database access (which is likely to be the slow part) of a request doesn't sound like a problem to me. SQLite should perform fine in a read-mostly situation like this, so I'm not convinced you even have a performance problem.
If you want faster response you could consider:
First, ensuring that you have the best response time by checking your indexes and profiling individual retrievals to look for performance bottlenecks.
Pre-computing the static parts of the system and storing the HTML. You can put the HTML right back into the database or store it as disk files.
Using the database as a backing store only (to preserve state of the system when the server is down) and reading the entire thing into in-memory structures at system start-up. This eliminates disk access for the data, although it limits you to one physical server.
This sounds like premature optimization. 67ms is scarcely longer than the ~50ms threshold at which we humans can perceive that there was a delay.
SQLite's representation of your data is going to be more efficient than a text format, and unlike a text file that you have to parse, the operating system can efficiently cache just the portions of your database that you're actually using in RAM.
You can lock down ~50MB of RAM to cache a parsed representation of the data for all the students, but you'll probably get better performance using that RAM for something else, like the OS disk cache.
I agree with the other answers which suggest using MySQL or PostgreSQL instead of SQLite. SQLite is not designed to be used as a production DB. It is great for storing data for single-user applications such as mobile apps or even desktop applications, but it falls short very quickly in server applications. With Django it is trivial to switch to any other full-fledged database backend.
If you switch to one of those, you should not really have any performance issues, especially if you do all the necessary joins using select_related and prefetch_related.
If you still need more performance after that, considering that "most of the data is static", you might actually want to convert the Django site into a static site (a collection of HTML files) and then serve those using nginx or something similar. The simplest way I can think of to do that is to write a cron job which loops over all the needed URL configs, requests each page from Django, and saves it as an HTML file. If you want to go in that direction, you might also want to take a look at Python's static site generators: Hyde and Pelican.
This approach will certainly be much faster than any caching system; however, you will lose any dynamic components of the site. If you need those, then caching seems like the best and fastest solution.
You should use MySQL or PostgreSQL for your production database. sqlite3 isn't a good idea.
You should also avoid pre-loading data at login. Since your records can be inserted in advance, write Django management commands and run the import into your chosen database beforehand, and design your models so that when a user logs in, he or she can already access and view/edit the related data (which was pre-inserted before the application even went live). Hardcoding data operations at login does not smell right at all from an application-design point of view.
https://docs.djangoproject.com/en/dev/howto/custom-management-commands/
The benefit of designing your Django models and using custom management commands to insert the records the right way before your application goes live is that you can use the Django ORM to make the appropriate relationships between users and their records.
I suspect, based on your description of what you need above, that you should re-examine the approach you are taking to build this application.
With 500 students, we shouldn't even be talking about caching. If you want response speed, you should deal with the following issues, in priority order:
Use a production-quality database.
Design your application's use cases correctly and get your application model right.
Pre-load any data you need into the production database.
Optimize the front end (CSS/JS compression, etc.).
Use the Django debug toolbar to figure out if any of your SQL is slow, and optimize specifically those queries.
Implement caching (memcached, etc.) as needed.
As a general guideline.

Is SQLite right for my game?

I have been looking into different database libraries for my online card game (PostgreSQL, Oracle, etc.) and, while SOCI + PostgreSQL or Oracle are much more powerful, they are also tricky to compile and integrate, and they do a whole lot more than what I need.
Quite simply these are my requirements:
Store the username, hash, wins, losses, and email. Very simple.
The game itself will actually not communicate with the database that often. When the player logs in, I will log them in by retrieving the row by username and verifying the stored hash against the hash generated from the password they enter.
Other than that, the server only accesses the database to add a user, record a win or loss after playing a round, or to update personal information.
Even though SQLite supports only limited concurrency, this should be fine for my needs, even if I have 100 or so card games running concurrently.
Reflecting on the above, is SQLite right for me or should I seriously think of opting for a more complex solution? Bearing in mind that databases are not my strongest point.
Thanks
With that many clients, SQLite is a perfect fit. However, I would recommend you wrap this functionality up in a simple, database-agnostic interface and implement it for SQLite. Once you get millions of clients, along with the concurrency/performance issues they bring, you can simply drop in another implementation that uses a more powerful database, without changing your application code much.
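A sketch of what that wrapper could look like, with the fields taken from the question; the names (PlayerRecord, PlayerStore, SqlitePlayerStore) are hypothetical:

    #include <optional>
    #include <string>

    struct PlayerRecord {
        std::string username;
        std::string password_hash;
        std::string email;
        int wins = 0;
        int losses = 0;
    };

    // The game code talks only to this interface and never mentions SQLite.
    class PlayerStore {
    public:
        virtual ~PlayerStore() = default;
        virtual std::optional<PlayerRecord> find(const std::string &username) = 0;
        virtual void add(const PlayerRecord &player) = 0;
        virtual void recordResult(const std::string &username, bool won) = 0;
    };

    // Today's implementation wraps sqlite3_prepare_v2/bind/step calls; a
    // future PostgresPlayerStore could replace it without touching game code.
    class SqlitePlayerStore : public PlayerStore {
    public:
        std::optional<PlayerRecord> find(const std::string &username) override;
        void add(const PlayerRecord &player) override;
        void recordResult(const std::string &username, bool won) override;
    };

Since the game only ever sees PlayerStore, swapping databases later is a one-class change.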

Database in a C++ Program

I've got a C++ program that needs to deal with a lot of typical database problems - looking at tables, inserting and deleting values, searching for records. All of the database information has to be stored locally. Let me emphasise that - I don't want to communicate with a server, I want the information to be stored on the user's computer.
Are there any libraries that can easily implement all this functionality, preferably with SQL-style syntax? Or what are some ways to implement this functionality easily and robustly?
You can use an embedded DB.
I think SQLite is one of the more popular ones.
My personal preference would be SOCI, with a SQLite backend.
http://soci.sourceforge.net/
http://soci.sourceforge.net/doc/backends/sqlite3.html
http://www.sqlite.org/
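For a taste of SOCI over the SQLite3 backend, here is a small sketch; the database file and table are invented, and the header paths assume a reasonably recent SOCI release:

    #include <soci/soci.h>
    #include <soci/sqlite3/soci-sqlite3.h>
    #include <iostream>
    #include <string>

    int main() {
        // Open (or create) a local SQLite file through SOCI.
        soci::session sql(soci::sqlite3, "local.db");

        sql << "CREATE TABLE IF NOT EXISTS players ("
               "name TEXT PRIMARY KEY, high_score INTEGER)";

        // Named parameters are bound with soci::use.
        std::string name = "alice";
        int score = 4200;
        sql << "INSERT OR REPLACE INTO players (name, high_score) "
               "VALUES (:name, :score)",
            soci::use(name), soci::use(score);

        // Results come back through soci::into.
        int best = 0;
        sql << "SELECT high_score FROM players WHERE name = :name",
            soci::into(best), soci::use(name);
        std::cout << name << " -> " << best << "\n";
        return 0;
    }

The stream-like syntax is the main draw: it reads almost like embedded SQL, while the backend stays swappable.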

Have you tried to use SQLite as the query engine for your *raw* database?

I have made a custom report generator for our database (Oracle Berkeley DB engine). Now it's time for me to add more flexibility, and I am in a dilemma: do a partial or a fundamental redesign?
Let's assume that I have plenty of time.
I can only read the database, I don't have the right to modify it.
Inspired by the Query Anything with SQLite article, I would like to let the SQLite engine do the dirty work (grouping, filtering, etc.).
Have you tried it? Any examples? What about performance issues?
It works fine for what I am using it for :-) However, I don't use it together with another database, just standalone. There's a list of well-known users of SQLite on their website.
You need to tell us more about your use case before anyone can speculate about performance, but I'd rather make a POC and measure.
There is a nice quickstart article on the SQLite site.
Here's the C/C++ API Reference.
You should be able to create a temporary SQLite table by querying your other DB and inserting the data into it. Then you can run different queries on that temporary table to do your grouping, filtering, etc.
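Something along the lines of this sketch, which stands in for the Berkeley DB side with a plain array (its specifics aren't given) and uses an in-memory SQLite database as the temporary table:

    #include <sqlite3.h>
    #include <cstdio>

    struct RawRow { const char *category; int amount; };

    int main() {
        sqlite3 *db = nullptr;
        sqlite3_open(":memory:", &db);  // temporary: nothing touches disk

        sqlite3_exec(db,
            "CREATE TABLE events (category TEXT, amount INTEGER)",
            nullptr, nullptr, nullptr);

        // In the real program this loop would walk a Berkeley DB cursor.
        RawRow rows[] = { {"a", 10}, {"b", 5}, {"a", 7} };
        sqlite3_stmt *ins = nullptr;
        sqlite3_prepare_v2(db, "INSERT INTO events VALUES (?, ?)",
                           -1, &ins, nullptr);
        for (const RawRow &r : rows) {
            sqlite3_bind_text(ins, 1, r.category, -1, SQLITE_STATIC);
            sqlite3_bind_int(ins, 2, r.amount);
            sqlite3_step(ins);
            sqlite3_reset(ins);
        }
        sqlite3_finalize(ins);

        // SQLite now does the grouping/filtering for the report.
        sqlite3_stmt *q = nullptr;
        sqlite3_prepare_v2(db,
            "SELECT category, SUM(amount) FROM events GROUP BY category",
            -1, &q, nullptr);
        while (sqlite3_step(q) == SQLITE_ROW)
            std::printf("%s: %d\n",
                        (const char *)sqlite3_column_text(q, 0),
                        sqlite3_column_int(q, 1));
        sqlite3_finalize(q);
        sqlite3_close(db);
        return 0;
    }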

SQLite as a replacement for fopen()?

On the official SQLite website it is written that I should think of SQLite as a replacement for the fopen() function.
What do you think about that? Is it always a good solution to replace an application's internal data storage with SQLite? What are the pluses and minuses of such a solution?
Do you have some experience in it?
EDIT:
How about your experience? Is it easy to use? Was it painful or rather joyful? Do you like it?
It depends. There are some contra-indications:
for configuration files, use of plain text or XML is much easier to debug or to alter than using a relational database, even one as lightweight as SQLite.
tree structures are easier to describe using (for example) XML than by using relational tables
the SQLite API is quite badly documented - there are not enough examples, and the hyperlinking is poor. OTOH, the information is all there if you care to dig for it.
use of app-specific binary formats directly will be faster than storing the same format as a BLOB in a database
database corruption can mean the loss of all your data, rather than just the data in a single bad file
OTOH, if your internal data fits in well with the relational model and there is a lot of it, I'd recommend SQLite; I use it myself for one of my projects.
Regarding experience - I use it, it works well and is easy to integrate with existing code. If the documentation were easier to navigate I'd give it 5 stars - as it is I'd give it four.
As always, it depends; there are no "one size fits all" solutions.
If you need to store data in a stand-alone file and you can take advantage of the relational capabilities of an SQL database, then SQLite is great.
If your data is not a good fit for the relational model (hierarchical data, for example), or you want your data to be human-readable (config files), or you need to interoperate with another system, then SQLite won't be very helpful and XML might be better.
If, on the other hand, you need to access the data from multiple programs or computers at the same time, then again SQLite is not an optimal choice and you need a "real" database server (MS SQL, Oracle, MySQL, PostgreSQL, ...).
The atomicity of SQLite is a plus: knowing that if you write some data half-way (maybe crashing in the middle), it won't corrupt your data file. I normally accomplish something similar with XML config files by backing up the file on a successful load; any future failed load (indicating corruption) automatically restores the last backup. Of course it's not as granular, nor is it atomic, but it is sufficient for my desires.
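In SQLite that all-or-nothing behaviour is just a transaction. A minimal sketch (the settings table and values are made up for illustration):

    #include <sqlite3.h>

    // Wrapping related writes in a transaction means a crash part-way
    // through leaves the file exactly as it was before.
    void save_settings(sqlite3 *db) {
        sqlite3_exec(db, "BEGIN", nullptr, nullptr, nullptr);
        int rc = sqlite3_exec(db,
            "UPDATE settings SET value = 'dark' WHERE key = 'theme';"
            "UPDATE settings SET value = '42'   WHERE key = 'volume';",
            nullptr, nullptr, nullptr);
        // Either both updates become durable, or neither does.
        sqlite3_exec(db, rc == SQLITE_OK ? "COMMIT" : "ROLLBACK",
                     nullptr, nullptr, nullptr);
    }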
I find SQLite a pleasure to work with, but I would not consider it a one-size-fits-all replacement for fopen().
As an example, I just wrote a piece of software that downloads images from a web server and caches them locally. Storing them as individual files, I can browse them in Windows Explorer, which certainly has benefits. But to use the cache I need to keep an index that maps each URL to its image file.
Storing them in a SQLite database, they all sit in one neat little file, and I can access them by URL (select imgdata from cache where url='http://foo.bar.jpg') with little effort.
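For illustration, a sketch of the two cache operations with the SQLite C API; the cache (url, imgdata) schema follows the SELECT above, and everything else is invented:

    #include <sqlite3.h>
    #include <string>
    #include <vector>

    // Store image bytes under their URL.
    void store_image(sqlite3 *db, const std::string &url,
                     const std::vector<unsigned char> &bytes) {
        sqlite3_stmt *stmt = nullptr;
        sqlite3_prepare_v2(db,
            "INSERT OR REPLACE INTO cache (url, imgdata) VALUES (?, ?)",
            -1, &stmt, nullptr);
        sqlite3_bind_text(stmt, 1, url.c_str(), -1, SQLITE_TRANSIENT);
        sqlite3_bind_blob(stmt, 2, bytes.data(),
                          static_cast<int>(bytes.size()), SQLITE_TRANSIENT);
        sqlite3_step(stmt);
        sqlite3_finalize(stmt);
    }

    // Look an image up by URL; returns an empty vector on a cache miss.
    std::vector<unsigned char> load_image(sqlite3 *db, const std::string &url) {
        std::vector<unsigned char> bytes;
        sqlite3_stmt *stmt = nullptr;
        sqlite3_prepare_v2(db,
            "SELECT imgdata FROM cache WHERE url = ?", -1, &stmt, nullptr);
        sqlite3_bind_text(stmt, 1, url.c_str(), -1, SQLITE_TRANSIENT);
        if (sqlite3_step(stmt) == SQLITE_ROW) {
            const unsigned char *p = static_cast<const unsigned char *>(
                sqlite3_column_blob(stmt, 0));
            bytes.assign(p, p + sqlite3_column_bytes(stmt, 0));
        }
        sqlite3_finalize(stmt);
        return bytes;
    }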