Store MySQL table to file (sqlite?) - c++

I wrote a C++/Qt application that uses a MySQL database for its data, via the mysql++ library. I now want it to be able to export and import its data to/from files.
I could write my own file format, but I'd like to avoid that effort if possible.
Is there an easy way to export a MySQL table to a file and re-import that file with C++?
I've heard of SQLite, but from what I've read, migrating from mysql++ to SQLite is not that easy, because it means switching the complete database backend.

You can use "LOAD DATA " and "SELECT ... INTO OUTFILE"
That should have great performance. You may not use the outfiles further as easily as you want.
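Roughly, from C++ with mysql++ it could look like the sketch below (table name, file path and credentials are made up; the outfile is written by the MySQL server process, so the path must be writable by the server and the MySQL user needs the FILE privilege):

#include <mysql++/mysql++.h>

int main()
{
    // Placeholder credentials; adjust to your setup.
    mysqlpp::Connection conn("mydb", "localhost", "user", "password");

    // Export: the server writes the file, so the path is on the server host.
    mysqlpp::Query exp = conn.query(
        "SELECT * FROM mytable INTO OUTFILE '/tmp/mytable.txt' "
        "FIELDS TERMINATED BY '\\t'");
    exp.execute();

    // Later: re-import the file into a table with the same layout.
    mysqlpp::Query imp = conn.query(
        "LOAD DATA INFILE '/tmp/mytable.txt' INTO TABLE mytable "
        "FIELDS TERMINATED BY '\\t'");
    imp.execute();

    return 0;
}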

The best way to export/import data from/to a database is XML files.
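Since your application already uses Qt, a rough sketch of writing a table out as XML with QXmlStreamWriter might look like this (table and column names are made up; importing would be the mirror image with QXmlStreamReader plus INSERT statements):

#include <mysql++/mysql++.h>
#include <QFile>
#include <QXmlStreamWriter>

void exportTableToXml(mysqlpp::Connection& conn, const QString& path)
{
    QFile file(path);
    if (!file.open(QIODevice::WriteOnly))
        return;

    QXmlStreamWriter xml(&file);
    xml.setAutoFormatting(true);
    xml.writeStartDocument();
    xml.writeStartElement("rows");

    // Hypothetical table "mytable" with columns "id" and "name".
    mysqlpp::Query query = conn.query("SELECT id, name FROM mytable");
    mysqlpp::StoreQueryResult res = query.store();
    for (size_t i = 0; i < res.num_rows(); ++i) {
        xml.writeStartElement("row");
        xml.writeTextElement("id", QString::fromUtf8(res[i]["id"].c_str()));
        xml.writeTextElement("name", QString::fromUtf8(res[i]["name"].c_str()));
        xml.writeEndElement(); // row
    }

    xml.writeEndElement(); // rows
    xml.writeEndDocument();
}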

Related

Is there a way to use SQLite commands in the C++ interface for SQLite?

I am trying to import a CSV file into a SQLite Database through C++ code. Normally through the SQLite command line I would go
sqlite3 db
.import statistics.csv stats
I am wondering how to do this exact same thing through the C++ Interface for SQLite.
There is no public SQLite C/C++ API that does exactly what .import does; it is implemented inside the sqlite3 shell. You need to read the CSV file yourself and insert the values with normal INSERT statements.
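A minimal sketch of that with the SQLite C API (the two-column layout of the stats table is an assumption, and this parser handles neither quoting nor embedded commas):

#include <sqlite3.h>
#include <fstream>
#include <sstream>
#include <string>

// Read statistics.csv and insert each line into stats(col1, col2).
int importCsv(sqlite3* db)
{
    sqlite3_stmt* stmt = 0;
    if (sqlite3_prepare_v2(db, "INSERT INTO stats(col1, col2) VALUES (?, ?)",
                           -1, &stmt, 0) != SQLITE_OK)
        return 1;

    std::ifstream in("statistics.csv");
    std::string line;
    sqlite3_exec(db, "BEGIN", 0, 0, 0);           // one transaction for speed
    while (std::getline(in, line)) {
        std::istringstream fields(line);
        std::string a, b;
        std::getline(fields, a, ',');
        std::getline(fields, b, ',');
        sqlite3_bind_text(stmt, 1, a.c_str(), -1, SQLITE_TRANSIENT);
        sqlite3_bind_text(stmt, 2, b.c_str(), -1, SQLITE_TRANSIENT);
        sqlite3_step(stmt);
        sqlite3_reset(stmt);
    }
    sqlite3_exec(db, "COMMIT", 0, 0, 0);
    sqlite3_finalize(stmt);
    return 0;
}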

Convert an unknown database file from a Windows application into a MySQL database

I have installed a piece of software on my system and it holds a lot of client data. Each file inside the software's DB folder has its own extension for each individual party.
I want to convert these files into a MySQL database.
A sample file from the DB folder can be downloaded from here.
I have tried to understand the Firebird service that this software uses to connect to these database files.
I want to extract the database and import it into MySQL (phpMyAdmin).
The linked file seems to be a renamed Firebird database with structure version ODS 11.2, which corresponds to the Firebird 2.5.x line.
To take a quick peek into the database you can use
IBSurgeon First Aid -- http://ib-aid.com
IB Expert (the Database Explorer feature) -- http://ibexpert.net
The free mode of FirstAID would let you peek into the data, but not extract it, and probably not even scroll through ALL the tables. It would also most probably ignore any database structures that are not tables (UDF functions, procedures, VIEWs, auto-computed columns in tables) - after all, it is just a low-level format parser, not an SQL engine.
IB Expert has a non-commercial Personal edition, but it probably does not include the Database Explorer; you can, however, try a trial of the full version. Even then, IBE's Database Explorer would probably only show the basic structures of the database, though maybe that would be enough.
Alternatively, you can install Firebird 2.5.8 - either the standalone version or perhaps the embedded one (a set of DLLs used instead of the FB server process) if your application can use it - and then explore the database with any DB IDE suite. The ones most often mentioned for Firebird are IBExpert, FlameRobin and Firebird Maestro, but any other will do. You would then be able to try different SQL queries, including SPs, VIEWs and UDF functions, if any were registered for the database and actually used.
BTW IBExpert comes bundled with FB 2.5 Embedded, which one can use to open the database file.
After you figure out the format, you can either export the required data into some intermediate format like CSV (for example with http://fbutils.sourceforge.net/ ), or read it from your C++ application (though why anyone would develop a web application in C++ is another question) using libraries like IB++ or OLE DB, etc. Maybe it would be better to just use the Firebird server and the original DB files from PHP, or whatever you end up writing the application in.

converting a large number of excel files into a single mysql table

I am using Microsoft Visual Studio 10 (C++) and MySQL Workbench. I have a large number of Excel files, and I want to load the contents of all of them into a single MySQL table. I can create a CSV file for each Excel file and then import it, but I would like that to be done via a stored procedure. I want to use C++, and the procedure has to be repeated for different Excel files.
I was thinking of connecting my C++ program to both Excel and MySQL simultaneously (is that possible?), reading the Excel files and adding the data to the MySQL table.
I have already connected my program to the MySQL database.
Any other approach would be appreciated.
A MySQL stored procedure will not accept a bulk load or CSV import (LOAD DATA INFILE is not allowed inside stored procedures), but you can do it without one. It is better to run the import from your C++ code.
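If you save each Excel file as a CSV first, a rough sketch with the MySQL C API could look like the following (the table name and file list are placeholders; LOAD DATA LOCAL INFILE must be enabled on both client and server, e.g. with mysql_options(conn, MYSQL_OPT_LOCAL_INFILE, ...) before connecting):

#include <mysql.h>
#include <string>
#include <vector>

// Load a set of CSV files (one per Excel file) into a single table.
int loadCsvFiles(MYSQL* conn, const std::vector<std::string>& csvFiles)
{
    for (size_t i = 0; i < csvFiles.size(); ++i) {
        std::string sql =
            "LOAD DATA LOCAL INFILE '" + csvFiles[i] + "' "
            "INTO TABLE mytable "
            "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n' "
            "IGNORE 1 LINES";                     // skip the header row
        if (mysql_query(conn, sql.c_str()) != 0)
            return 1;                             // see mysql_error(conn)
    }
    return 0;
}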

Where to store data?

I have created a C++ program, and now I need to permanently store some settings for it on the hard disk.
Where should I store this data?
In an SQLite database?
In the Windows registry (and where on Linux)?
In an XML file?
In other files?
It completely depends on how much information it is and which platform you are targeting.
Viable options are an .ini file (http://en.wikipedia.org/wiki/INI_file#Accessing_INI_files) and, for more data, an SQLite database. I'm not a big fan of XML files.
It depends on the type and size of the data. For small and less complex data, simple text files are better; for complex data you can use XML or an SQLite database. If you need to write complex queries, go for SQLite: it stores its data in files but gives you better query options.
Modern applications use the system registry to store configuration information.
See the Windows article: Using the Registry in a C++ Application
http://msdn.microsoft.com/en-us/library/ms838625.aspx
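A minimal Win32 sketch, with a made-up key path and value name:

#include <windows.h>
#include <cstring>

// Store one string setting under HKEY_CURRENT_USER\Software\MyApp.
void saveSetting(const char* value)
{
    HKEY key;
    if (RegCreateKeyExA(HKEY_CURRENT_USER, "Software\\MyApp", 0, NULL, 0,
                        KEY_WRITE, NULL, &key, NULL) == ERROR_SUCCESS) {
        RegSetValueExA(key, "DataPath", 0, REG_SZ,
                       reinterpret_cast<const BYTE*>(value),
                       static_cast<DWORD>(std::strlen(value) + 1));
        RegCloseKey(key);
    }
}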

Best way to store data in C++

I'm just learning C++, have just started to mess around with Qt, and I am sitting here wondering how most applications save their data. Is there an industry standard? Do they store it in an XML file, a text file, SQLite? What about sensitive data that, say, accounting software would need to save? I'm just interested in learning what the best practices for this are.
Thanks
This question is way too broad. The only answer is it depends on the nature of the particular application and the data, and whether or not it is written in C++ has very little to do with it.
For example, user-configurable application settings are often stored in text files, but on Windows they are typically stored in the Registry. Accounting applications typically keep their data in a database of some sort.
There are many good ways to store application data (call it serialization).
Personally, I think that for larger datasets an open format is much, much easier to debug. If you go with XML, for example, you store your data in an open form, so if you have file corruption issues (i.e. a client can't open your file for some reason) the problem is easier to find. If you have sensitive data in there, you can always encrypt it with a key before writing it to the file. Microsoft, for instance, has gone from a proprietary format to Open XML in their Office documents. They use the *x extensions (.docx, .xlsx, etc.); such a file is really just a compressed folder of XML files.
Binary serialization is, of course, still the industry standard for most standalone applications, most likely because of the application framework they are built on (such as MFC, which is old). But if you look at the serialization techniques in modern application frameworks, XML serialization is very well supported.
First you need to clarify what kind of data you would like to save.
If you just want to save some application settings, use QSettings to save your settings to an INI file or registry.
If it is much more than just some application settings, go for XML files or SQL.
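For the settings case, a minimal QSettings sketch (organization, application and key names are made up; on Windows this lands in the registry by default, elsewhere in an INI-style file):

#include <QSettings>
#include <QString>

void saveAndLoadSettings()
{
    QSettings settings("MyCompany", "MyApp");

    // Write a couple of settings.
    settings.setValue("window/width", 800);
    settings.setValue("user/name", QString("alice"));

    // Read them back, supplying defaults for first runs.
    int width = settings.value("window/width", 640).toInt();
    QString name = settings.value("user/name", QString()).toString();

    (void)width;
    (void)name;
}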
There is no standard practice, however if you want to use complex structured data, consider using an embedded database engine such as SQLite or Metakit, or Berkeley DB files. XML files would also do the job and be human readable/writable. Preferences can use INI files or the Windows registry, and so on. In short, it really depends on your usage pattern.
This is a general question. Like many things, the right answer depends on your application and its needs.
Most desktop applications save end-user data to a file (think Word and Excel). The format is up to you, XML, binary, etc. And if you can serialize/deserialize objects to file it will probably make your life easier.
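As one concrete, purely illustrative example of that serialize/deserialize idea, here is a sketch using Qt's QDataStream (assuming Qt 4.8 or later) for a simple binary document format:

#include <QFile>
#include <QDataStream>
#include <QString>

// Write a hypothetical "document" (a title and a revision number) to disk.
void saveDocument(const QString& path, const QString& title, int revision)
{
    QFile file(path);
    if (!file.open(QIODevice::WriteOnly))
        return;
    QDataStream out(&file);
    out.setVersion(QDataStream::Qt_4_8);   // pin the stream format
    out << title << revision;
}

// Read it back in the same order it was written.
void loadDocument(const QString& path, QString& title, int& revision)
{
    QFile file(path);
    if (!file.open(QIODevice::ReadOnly))
        return;
    QDataStream in(&file);
    in.setVersion(QDataStream::Qt_4_8);
    in >> title >> revision;
}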
Internal application data such as configuration files or temporary data might be saved to an XML file or a lightweight, local database such as SQLite.
Often, "enterprise" applications used internally by a business will save their data to a back-end database such as SQL Server or Oracle. This is so all of the enterprise's data is saved to a single central location. And then it is available for reporting, etc.
For accounting software, you would need to consider the business domain and end users. For example, if the software is to be sold to large businesses you would probably use some form of a database to save data. Otherwise a binary file would be fine, perhaps with some form of encryption if you are really paranoid.
When you say "the best way", then you have to define what you mean by "good".
The problem is that various requirements conflict with each other, therefore so you can't satisfy all of them simultaneously.
For example, if one requirement is "concurrent multi-user access to the data" then this suggests using a database engine, but that conflicts with "as small as possible" and "minimize dependencies on 3rd-party software".
If a requirement is "portable data format" then this suggests XML, but that conflicts with "compact" and "indexed".
Do they store it in a XML file, text file, SQLite?
Yes.
Also, binary files and relational databases.
Anything else?