Design of data storage in a C++ application (maybe relational database)

I need to store and load some data in a C++ application. This data is basically going to end up as a set of tables as per a relational database.
I could write the data to tables in something like CSV format, then parse the files myself and apply the database logic I need in my C++ code. But it seems stupid to reinvent the wheel like this and end up effectively writing my own database engine.
However, using something like a MySQL database seems like massive overkill for what is going to be a single-user local system. I have tried setting up a MySQL daemon on my Windows system and found it rather complex, and possibly even impossible without admin privileges. It would also be a serious obstacle to deployment, as it would require MySQL to be set up and running on each user's system.
Is there a reasonable middle ground solution? Something that can provide me with a simple database, accessible from C++, without all the complexities of setting up a full MySQL install?
NB. I have edited this question so that I hope it satisfies those who have voted to close it. I am not asking for a recommendation for a tool, or for someone's favourite tool, or for the best tools; that would be asking which database engine I should use. I am asking what tools and design patterns are available to solve a specific programming problem - i.e. how can I get access to database-like functionality from a C++ program without writing my own database engine or setting up a full database server. This is conceptually no different from asking, e.g., "How do I print out the contents of a vector?" - it's just a bigger problem. I have described the problem and what has been done so far to solve it. My understanding from the On Topic page is that this is within scope.

You can try SQLite.
Here are some simple code examples: https://www.tutorialspoint.com/sqlite/sqlite_c_cpp.htm
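A minimal sketch of what using the SQLite C API from C++ looks like (the file name "example.db" and the table are made up for illustration):

```cpp
#include <cstdio>
#include <sqlite3.h>

int main() {
    sqlite3* db = nullptr;
    // Opens the database file, creating it if it doesn't exist.
    if (sqlite3_open("example.db", &db) != SQLITE_OK) {
        std::fprintf(stderr, "Cannot open database: %s\n", sqlite3_errmsg(db));
        return 1;
    }

    char* err = nullptr;
    const char* sql =
        "CREATE TABLE IF NOT EXISTS scores(name TEXT, score INTEGER);"
        "INSERT INTO scores VALUES('alice', 42);";
    if (sqlite3_exec(db, sql, nullptr, nullptr, &err) != SQLITE_OK) {
        std::fprintf(stderr, "SQL error: %s\n", err);
        sqlite3_free(err);
    }

    sqlite3_close(db);
    return 0;
}
```

Link against the SQLite library (e.g. `g++ app.cpp -lsqlite3`); the whole database lives in a single file, with nothing to install or administer on the user's machine.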

Related

Which local databases (easy to handle from C++) are suitable for a small data set?

The computer already has ORACLE installed, but I haven't used ORACLE before; I have only used SQLite... So, for now I want to create a database locally and just insert one table. I want to use a test C++ program to read and write this database. Is there a useful API for this? BTW, on the computer I saw SQL Plus and SQL Developer, but I didn't find DBCA, which I know can be used to create tables... help me, thanks a lot!!
BTW, my supervisor wants me to test different kinds of databases, which will be written up in my report. The data is quite small (100 rows in a table is enough), but it will be used in a big program, so I need to try different kinds of databases locally, preferably with a C++ API that isn't difficult to use... I know very little about databases, so I need some suggestions. Thanks a lot!
While I wouldn't recommend Oracle for handling small data sets, I do have a pointer to C++ API documentation.
http://www.oracle.com/pls/db112/portal.portal_db?selected=5&frame=
... and scroll to the bottom of the page for links to further information.
There's a choice of either using C++ with the Oracle OCI API, or using C++ with the Oracle Pro*C precompiler. The precompiler actually does produce code that uses the OCI API.
I have done some work with the Pro*C precompiler (using C as the implementation language, though, not C++), and it wasn't too bad. OCI tends to be quite low-level, but apparently writing direct OCI code has its uses, too.
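For a feel of the Pro*C route, here is a rough, untested sketch of an embedded-SQL source file; the connect string, table and column names are invented, and the file would be run through the `proc` precompiler before being compiled as C or C++:

```cpp
#include <iostream>
#include <cstring>

EXEC SQL INCLUDE SQLCA;

EXEC SQL BEGIN DECLARE SECTION;
    char connect_string[64];
    int  emp_id;
    char emp_name[64];
EXEC SQL END DECLARE SECTION;

int main() {
    // Placeholder credentials; in practice these come from configuration.
    std::strcpy(connect_string, "scott/tiger@localhost/orcl");
    EXEC SQL CONNECT :connect_string;

    emp_id = 1;
    EXEC SQL SELECT name INTO :emp_name FROM test_table WHERE id = :emp_id;
    std::cout << "name = " << emp_name << "\n";

    EXEC SQL COMMIT WORK RELEASE;
    return 0;
}
```

The precompiler rewrites the EXEC SQL statements into OCI calls, so you get the low-level API underneath without having to write it by hand.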

Database in a C++ Program

I've got a C++ program that needs to deal with a lot of typical database problems - looking at tables, inserting and deleting values, searching for records. All of the database information has to be stored locally. Let me emphasise that - I don't want to communicate with a server, I want the information to be stored on the user's computer.
Are there any libraries that can easily implement all this functionality, preferably in a SQL style syntax? Or what are some ways to easily and robustly implement this functionality?
You can use an embedded DB.
I think SQLite is one of the more popular ones.
My personal preference would be SOCI, with a SQLite backend.
http://soci.sourceforge.net/
http://soci.sourceforge.net/doc/backends/sqlite3.html
http://www.sqlite.org/
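A rough sketch of what SOCI with the SQLite backend looks like (header paths vary between SOCI versions; the file name and table here are made up):

```cpp
#include <iostream>
#include <string>
#include <soci/soci.h>
#include <soci/sqlite3/soci-sqlite3.h>

int main() {
    // Opens (or creates) a local SQLite file via the SQLite backend.
    soci::session sql(soci::sqlite3, "people.db");

    sql << "CREATE TABLE IF NOT EXISTS person(id INTEGER PRIMARY KEY, name TEXT)";
    sql << "INSERT INTO person(id, name) VALUES(1, 'Alice')";

    // Bound parameters and into() give SQL-style queries with C++ variables.
    std::string name;
    int id = 1;
    sql << "SELECT name FROM person WHERE id = :id", soci::use(id), soci::into(name);
    std::cout << "name = " << name << "\n";
    return 0;
}
```

The nice thing about SOCI is that the same code can later be pointed at another backend (PostgreSQL, MySQL, Oracle) by changing the session constructor.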

Implementing a user registry in a c++ based server

So I'm planning to develop a Community feature for a game I'm developing. Currently, the high score server, which I want to integrate this user registry with, is developed in pure C++.
Is there a C++ library for developing user registries? Currently, I am thinking of implementing the user registry by saving to the file system, but if there are libraries for this it would be even better.
This sounds like a very open-ended question.
If you intend to have quite a lot of users, and possibly do reporting on the data or have to manage pieces of data manually, you may want to look into SQLite. There would be slightly less administration and overhead than using a fully featured DBMS.
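As a rough sketch of what that might look like with the SQLite C API (the table layout, column names and function are made up; the `users` table is assumed to have been created elsewhere):

```cpp
#include <string>
#include <sqlite3.h>

// Hypothetical user-registry helper: stores a username and password hash
// in a local SQLite file. The password should be hashed before calling this.
bool register_user(sqlite3* db, const std::string& user, const std::string& pw_hash) {
    const char* sql = "INSERT INTO users(username, pw_hash) VALUES(?, ?)";
    sqlite3_stmt* stmt = nullptr;
    if (sqlite3_prepare_v2(db, sql, -1, &stmt, nullptr) != SQLITE_OK)
        return false;

    // Bound parameters keep user-supplied strings out of the SQL text itself.
    sqlite3_bind_text(stmt, 1, user.c_str(), -1, SQLITE_TRANSIENT);
    sqlite3_bind_text(stmt, 2, pw_hash.c_str(), -1, SQLITE_TRANSIENT);

    bool ok = (sqlite3_step(stmt) == SQLITE_DONE);
    sqlite3_finalize(stmt);
    return ok;
}
```

Using prepared statements with bound parameters, rather than building SQL strings from user input, also avoids SQL injection in the registration path.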
Are you sure you want the high score server in C++? Seems a bit out of place to me.
I would suggest something higher-level like PHP or Python with a SQLite database. Undoubtedly, example scripts for user registration can be found on the internet.

Are there cross-platform tools to write XSS attacks directly to the database?

I've recently found this blog entry on a tool that writes XSS attacks directly to the database. It looks like a terribly good way to scan my applications for weaknesses.
I've tried to run it on Mono, since my development platform is Linux. Unfortunately it crashes with a System.ArgumentNullException deep inside Microsoft.Practices.EnterpriseLibrary and I seem to be unable to find sufficient information about the software (it seems to be a single-shot project, with no homepage and no further development).
Is anyone aware of a similar tool? Preferably it should be:
cross-platform (Java, Python, .NET/Mono, even cross-platform C is ok)
open source (I really like being able to audit my security tools)
able to talk to a wide range of DB products (the big ones are most important: MySQL, Oracle, SQL Server, ...)
Edit: I'd like to clarify my goal: I'd like a tool that directly writes the result of a successful XSS/SQL injection attack into the database. The idea is that I want to check that every place in my app does correct output encoding. Detecting and avoiding the data getting there in the first place is an entirely different thing (and might not be possible when I display data that's written to the DB by a third-party application).
Edit 2: Corneliu Tusnea, the author of the tool I linked to above, has since released the tool as free software on codeplex: http://xssattack.codeplex.com/
I think Metasploit has most of the attributes you are looking for. It may even be the only one that has all of what you specify, since all the others I can think of are closed source. There are a few existing modules that deal with XSS, and one in particular that you should take a peek at: HTTP Microsoft SQL Injection Table XSS Infection. From the sounds of that module, it is capable of doing exactly what you want to do.
The framework is written in Ruby, I believe, and is supposed to be easy to extend with your own modules, which you may need/want to do.
I hope that helps.
http://www.metasploit.com/
Not sure if this is what you're after; it's a parameter fuzzer for HTTP/HTTPS.
I haven't used it in a while, but IIRC it acts as a proxy between you and the web application in question, and will insert XSS/SQL injection attack strings into any input fields before judging whether the response was "interesting" or not, and thus whether the application is vulnerable or not.
http://www.owasp.org/index.php/Category:OWASP_WebScarab_Project
From your question I'm guessing it is a type of fuzzer you're looking for, one specifically for XSS and web applications; if I'm right, then that might help you!
It's part of the Open Web Application Security Project (OWASP) that "jah" has linked you to above.
There are some Firefox plugins to do some XSS testing here:
http://labs.securitycompass.com/index.php/exploit-me/
A friend of mine keeps saying that php-ids is pretty good. I haven't tried it myself, but it sounds as if it could approximately match your description:
Open Source (LGPL),
Cross Platform - PHP is not in your list, but maybe it's ok?
Detects "all sorts of XSS, SQL Injection, header injection, directory traversal, RFE/LFI, DoS and LDAP attacks" (this is from the FAQ)
Logs to databases.
I don't think there is such a tool, other than the one you pointed us to. I think there's a good reason for that: It's probably not the best way to test that each and every output is properly encoded for the applicable context.
From reading about that tool, it seems the premise is to insert random XSS vectors into the database and then browse your application to see if any of those vectors succeed. This is rather a hit-and-miss methodology, to say the least.
A much better idea, I think, would be to perform code reviews.
You may find it helpful to have a look at some of the resources available at http://owasp.org - namely the Application Security Verification Standard (ASVS), the Testing Guide and the Code Review Guide.

Sqlite as a replacement for fopen()?

On the official SQLite web page it is written that I should think of SQLite as a replacement for the fopen() function.
What do you think about that? Is it always a good solution to replace an application's internal data storage with SQLite? What are the pluses and minuses of such a solution?
Do you have some experience in it?
EDIT:
How about your experience? Is it easy to use? Was it painful or rather joyful? Do you like it?
It depends. There are some contra-indications:
for configuration files, use of plain text or XML is much easier to debug or to alter than using a relational database, even one as lightweight as SQLite.
tree structures are easier to describe using (for example) XML than by using relational tables
the SQLite API is quite badly documented - there are not enough examples, and the hyperlinking is poor. OTOH, the information is all there if you care to dig for it.
use of app-specific binary formats directly will be faster than storing the same format as a BLOB in a database
database corruption can mean the loss of all your data, rather than just the data in a single bad file
OTOH, if your internal data fits in well with the relational model and if there is a lot of it, I'd recommend SQLite - I use it myself for one of my projects.
Regarding experience - I use it, it works well and is easy to integrate with existing code. If the documentation were easier to navigate I'd give it 5 stars - as it is I'd give it four.
As always, it depends; there are no "one size fits all" solutions.
If you need to store data in a stand-alone file and you can take advantage of the relational capabilities of an SQL database, then SQLite is great.
If your data is not a good fit for the relational model (hierarchical data, for example), or you want your data to be human-readable (config files), or you need to interoperate with another system, then SQLite won't be very helpful and XML might be better.
If, on the other hand, you need to access the data from multiple programs or computers at the same time, then again SQLite is not an optimal choice and you need a "real" database server (MS SQL, Oracle, MySQL, PostgreSQL ...).
The atomicity of SQLite is a plus: knowing that if you write some data halfway through (maybe crashing in the middle), it won't corrupt your data file. I normally accomplish something similar with XML config files by backing up the file on a successful load; any future failed load (indicating corruption) automatically restores the last backup. Of course it's not as granular, nor is it atomic, but it is sufficient for my needs.
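A rough illustration of that atomicity using the SQLite C API (the settings table and values are made up; `db` is assumed to be an already-open handle): wrapping the writes in a transaction means either all of them reach the file or none do.

```cpp
#include <sqlite3.h>

// Writes two related settings atomically: after a crash, either both
// rows are visible in the file or neither is.
bool save_settings(sqlite3* db) {
    const char* sql =
        "BEGIN TRANSACTION;"
        "INSERT INTO settings VALUES('volume', '0.8');"
        "INSERT INTO settings VALUES('lang', 'en');"
        "COMMIT;";
    char* err = nullptr;
    if (sqlite3_exec(db, sql, nullptr, nullptr, &err) != SQLITE_OK) {
        sqlite3_free(err);
        // Undo any partial write so the file stays consistent.
        sqlite3_exec(db, "ROLLBACK", nullptr, nullptr, nullptr);
        return false;
    }
    return true;
}
```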
I find SQLite a pleasure to work with, but I would not consider it a one-size-fits-all replacement for fopen().
As an example, I just wrote a piece of software that's downloading images from a web server and caching them locally. Storing them as individual files, I can watch them in Windows Explorer, which certainly has benefits. But I need to keep an index that maps between a URL and the image file in order to use the cache.
Storing them in a SQLite database, they all sit in one neat little file, and I can access them by URL (select imgdata from cache where url='http://foo.bar.jpg') with little effort.
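A rough sketch of that kind of lookup with the SQLite C API, using a bound parameter rather than splicing the URL into the SQL string (the schema is the one implied above, `cache(url TEXT PRIMARY KEY, imgdata BLOB)`; the function name is made up):

```cpp
#include <string>
#include <vector>
#include <sqlite3.h>

// Returns the cached image bytes for a URL, or an empty vector on a cache miss.
std::vector<unsigned char> fetch_from_cache(sqlite3* db, const std::string& url) {
    std::vector<unsigned char> data;
    sqlite3_stmt* stmt = nullptr;
    const char* sql = "SELECT imgdata FROM cache WHERE url = ?";
    if (sqlite3_prepare_v2(db, sql, -1, &stmt, nullptr) != SQLITE_OK)
        return data;

    sqlite3_bind_text(stmt, 1, url.c_str(), -1, SQLITE_TRANSIENT);
    if (sqlite3_step(stmt) == SQLITE_ROW) {
        // Copy the BLOB column out before finalizing the statement.
        const auto* blob = static_cast<const unsigned char*>(sqlite3_column_blob(stmt, 0));
        int size = sqlite3_column_bytes(stmt, 0);
        data.assign(blob, blob + size);
    }
    sqlite3_finalize(stmt);
    return data;
}
```

The bound parameter also means URLs containing quotes or other special characters don't break (or subvert) the query.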