I want to build a simple program that searches through a local database and returns a list of results. Essentially the program will be a searchable archive. This will be my first project of this type and I'm looking for some advice.
Fast reading of the database is important. Writing can be slow.
There will be at most a few thousand records, and most will probably contain less than 3 KB of text.
Records should be flexible when it comes to their fields. Some records will have the field "abc", others will not. The number of fields per record may vary.
Should I simply write my data structures in C++, and then serialize them? Does it make sense to use an existing (lightweight) database framework? If so, can you recommend any that are free and easy to use and preferably written in modern C++?
My target platform is Windows and I'm using the Visual Studio compiler.
Edit: the consensus is to use SQLite. Any recommendations as to which of the many SQLite C++ wrappers to use?
As commented by @Joachim, I would suggest SQLite. I have used it in C++ projects and it's straightforward to use. You basically put two files in your project, sqlite3.c and sqlite3.h, and then start coding to the interface (read the last paragraph of http://www.sqlite.org/selfcontained.html). You don't have to worry about configuring a database server, and it's fast and lightweight. If you need to speed some operations up, read about transactions in SQLite.
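To make that concrete, here is a minimal sketch of coding to the SQLite C interface, assuming the amalgamation files (sqlite3.c and sqlite3.h) are compiled into the project. The database file name, table, and query are made up for illustration, and error handling is trimmed.

```cpp
// Minimal sketch of the SQLite C API from C++. The file name, table,
// and query below are illustrative placeholders.
#include <cstdio>
#include "sqlite3.h" // from the SQLite amalgamation

int main() {
    sqlite3* db = nullptr;
    if (sqlite3_open("archive.db", &db) != SQLITE_OK) return 1;

    // Create a table and insert a record; sqlite3_exec runs plain SQL.
    sqlite3_exec(db,
        "CREATE TABLE IF NOT EXISTS records(id INTEGER PRIMARY KEY, body TEXT);"
        "INSERT INTO records(body) VALUES('hello archive');",
        nullptr, nullptr, nullptr);

    // Search with a prepared statement and a bound parameter.
    sqlite3_stmt* stmt = nullptr;
    sqlite3_prepare_v2(db, "SELECT id, body FROM records WHERE body LIKE ?",
                       -1, &stmt, nullptr);
    sqlite3_bind_text(stmt, 1, "%archive%", -1, SQLITE_TRANSIENT);
    while (sqlite3_step(stmt) == SQLITE_ROW) {
        std::printf("%d: %s\n", sqlite3_column_int(stmt, 0),
                    reinterpret_cast<const char*>(sqlite3_column_text(stmt, 1)));
    }
    sqlite3_finalize(stmt);
    sqlite3_close(db);
    return 0;
}
```

A LIKE query is plenty for a few thousand records; if the search needs ever grow, SQLite's FTS full-text extension is worth a look.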
First off, my company is in the power grid business, not IT, so software plays kind of a secondary role here.
I work on power system simulation software with a very old C++/MFC code base, around 15 years old. What we do is take a large amount of data, ~100,000 floating-point values, then format it and write it to a text file (most of the code actually uses the C FILE API for this). That file is then read by a separate engine exe which runs the electrical algorithm (the electrical algorithms are mostly numerical solutions of systems of differential equations) and writes a huge amount of data back to another text file, which we read to update the UI.
My question is: is this how it should be done? Is there a way to skip the text files and pass the data to the exes directly?
The exes are launched with the Win32 CreateProcess() function.
EDIT:
Sorry, the site won't let me comment.
@Vlad Feinstein: Well, yes, it's like a ladder. A module called load flow solves the power flow through the lines, which in turn is used to find the stability of the system, which in turn feeds overvoltage analysis, etc. It's huge; the UI is a million+ lines of code, and the engine exes are maybe another million.
Doesn't MFC already implement IPC using Dynamic Data Exchange? I can pass strings to another process's PreTranslateMessage() function. Would this be a scaled-up version of that?
There is no such thing as "how it should be done"; there are multiple ways to do IPC, and while the method you describe might not be the fastest, it is a viable solution nevertheless. If performance doesn't bother you in this particular case, you should not bother modifying it. This is exactly the case where the phrase "if it ain't broke, don't fix it" applies.
You probably would not want to build any new IPC in the application that way, though.
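If you do add a new data path later, one option on Windows is to hand the engine an anonymous pipe as its stdin so the data never touches disk. The sketch below makes some assumptions: "engine.exe" is a placeholder name, both sides would have to agree on the binary layout of the values, and error checks are trimmed.

```cpp
// Hedged sketch: streaming values to a child engine process through an
// anonymous pipe instead of a temp file (Win32). "engine.exe" is a
// placeholder; error handling is abbreviated.
#include <windows.h>

int main() {
    SECURITY_ATTRIBUTES sa = { sizeof(sa), nullptr, TRUE }; // inheritable
    HANDLE readEnd = nullptr, writeEnd = nullptr;
    CreatePipe(&readEnd, &writeEnd, &sa, 0);
    // Keep the parent's write end out of the child's handle table.
    SetHandleInformation(writeEnd, HANDLE_FLAG_INHERIT, 0);

    STARTUPINFOA si = { sizeof(si) };
    si.dwFlags = STARTF_USESTDHANDLES;
    si.hStdInput  = readEnd;                          // child reads the pipe
    si.hStdOutput = GetStdHandle(STD_OUTPUT_HANDLE);
    si.hStdError  = GetStdHandle(STD_ERROR_HANDLE);

    PROCESS_INFORMATION pi = {};
    char cmd[] = "engine.exe";                        // placeholder engine
    if (!CreateProcessA(nullptr, cmd, nullptr, nullptr,
                        TRUE /* inherit handles */, 0,
                        nullptr, nullptr, &si, &pi)) return 1;
    CloseHandle(readEnd); // the parent no longer needs the child's end

    // Stream the floating-point values down the pipe in binary form.
    double samples[] = { 1.0, 2.5, 3.75 };
    DWORD written = 0;
    WriteFile(writeEnd, samples, sizeof(samples), &written, nullptr);
    CloseHandle(writeEnd); // the child sees EOF

    WaitForSingleObject(pi.hProcess, INFINITE);
    CloseHandle(pi.hProcess);
    CloseHandle(pi.hThread);
    return 0;
}
```

The reverse direction (engine results back to the UI) works the same way with a second pipe attached to the child's stdout.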
I'm looking to write a reporting tool. The data resides in a ~6 GB PostgreSQL database. The application is an online store/catalog application that has items and orders. The stakeholders are requesting a feature that will allow them to search for an item and get a count of all orders for it in the last 2 years.
Some rows contain quantities, and units of measure, which would require multiplication of quantity and UoM for each row.
It's also possible that other reporting functions will be necessary in the future.
I have not delved much into the data analysis aspect of programming. I enjoy Clojure, so I would be thrilled to find a solution that uses Clojure, but only if Clojure offers competitive tools for my needs.
Here are some options I'm considering:
plain SQL
Clojure
core.reducers
a Clojure Hadoop library
Hadoop
Can anyone shed some light on these kinds of problems for me? Are there articles you would recommend?
Hadoop is likely overkill for this project. Simply using clojure-jdbc or Korma to read the data from the database and filter/reduce it in Clojure is likely to be fine; at work we routinely work with sequences of that size. It does depend on the expected response time, though: you may need to do some preprocessing and caching if instantaneous responses are expected.
A while ago I made a database framework in C++ and have been using it in various places; I even made a VB.NET wrapper for it.
Now I have a new project that requires multiple programs accessing a single database, and it would be wasteful to load the database once per program, not to mention the syncing horrors.
So I figured I would turn the framework into a standalone application, and the other programs would access the data in some magical way. From what I've seen, PHP and MySQL do something like this..?
Problem is, I have no clue where to start. The only kind of cross-program communication I've done is one program reading and writing directly into the other one's memory. That seems kind of hacky, though, and I'm not sure it would fly with managed languages (I want to make it accessible from VB.NET too).
Tips?
The most portable way to do IPC (inter-process communication) is probably going to be sockets.
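For instance, here is a minimal sketch of a local TCP server loop using Winsock, since the target is Windows (a VB.NET client could connect with System.Net.Sockets.TcpClient). The port number and the echo "protocol" are placeholders; a real server would parse a small request format and answer from the database.

```cpp
// Minimal sketch: a loopback TCP server loop with Winsock.
// The port and echo behavior are illustrative placeholders.
#include <winsock2.h>
#pragma comment(lib, "Ws2_32.lib")

int main() {
    WSADATA wsa;
    if (WSAStartup(MAKEWORD(2, 2), &wsa) != 0) return 1;

    SOCKET listener = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);
    sockaddr_in addr = {};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK); // local clients only
    addr.sin_port = htons(5555);                   // arbitrary example port
    bind(listener, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));
    listen(listener, SOMAXCONN);

    // Handle one client at a time; a real server would dispatch queries
    // against the database instead of echoing.
    for (;;) {
        SOCKET client = accept(listener, nullptr, nullptr);
        if (client == INVALID_SOCKET) break;
        char buf[512];
        int n = recv(client, buf, sizeof(buf), 0);
        if (n > 0) send(client, buf, n, 0);        // placeholder reply
        closesocket(client);
    }
    closesocket(listener);
    WSACleanup();
    return 0;
}
```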
What about D-Bus? There is a port for Windows.
I have recently seen people use XML files as a database to store settings, but I don't know exactly why it is done. I come from a C/C++, Linux background, so please help me understand this concept. A simple C/C++ example would help me see its benefits better.
XML is a very common format with tons of libraries to handle it. Although it isn't the most beautiful format in the world, it can be read and modified both by hand and by program. You would probably want it when the program's configuration is modified by some GUI or tool. If you intend manual configuration, it's probably better to choose something else, for example INI; this is why Linux tools rarely use XML, by the way.
As a C++ programmer you'd probably find the boost::property_tree library interesting for dealing with configs. Examples of usage are included in the documentation. It also provides plenty of different backends for storing configuration, so you don't have to stick to any one format.
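As a rough illustration, here is what reading and writing an XML settings file with boost::property_tree looks like. The file name, element paths, and defaults are invented for this sketch, and read_xml throws if the file is missing, which isn't handled here.

```cpp
// Sketch: XML settings via boost::property_tree. The file name and
// keys below are illustrative placeholders.
#include <boost/property_tree/ptree.hpp>
#include <boost/property_tree/xml_parser.hpp>
#include <iostream>
#include <string>

int main() {
    boost::property_tree::ptree pt;
    boost::property_tree::read_xml("settings.xml", pt);

    // e.g. <settings><window><width>800</width></window></settings>
    int width = pt.get<int>("settings.window.width", 800);        // default 800
    std::string theme = pt.get<std::string>("settings.theme", "light");
    std::cout << "width=" << width << " theme=" << theme << '\n';

    // Change a value and write the tree back out.
    pt.put("settings.theme", "dark");
    boost::property_tree::write_xml("settings.xml", pt);
    return 0;
}
```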
I have a lot of data kept in a file. Which would be faster for looking up keywords in that file multiple times, Java or C++? And would I gain some speed by keeping the data in a database like SQLite instead of using file operations?
Because C++ is a low-level language while Java runs in a virtual machine, well-written C++ code will typically be faster than well-written Java code, especially for low-level operations such as file access; Java incurs significant overhead whenever it performs an operation outside its virtual machine.
For large amounts of data, a database will be much faster than direct file operations; fast lookup is exactly what a database is designed for.
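To illustrate with the SQLite route you mentioned: once the keyword column is indexed, each lookup becomes a B-tree search instead of the full scan a flat file needs on every query. This is only a sketch; the file, table, and keyword are made up, and error handling is trimmed.

```cpp
// Sketch: repeated keyword lookups against an indexed SQLite column.
// File, table, and keyword are illustrative placeholders.
#include <cstdio>
#include "sqlite3.h"

int main() {
    sqlite3* db = nullptr;
    if (sqlite3_open("data.db", &db) != SQLITE_OK) return 1;

    // The index makes each keyword lookup O(log n) rather than a scan.
    sqlite3_exec(db,
        "CREATE TABLE IF NOT EXISTS entries(keyword TEXT, payload TEXT);"
        "CREATE INDEX IF NOT EXISTS idx_keyword ON entries(keyword);",
        nullptr, nullptr, nullptr);

    sqlite3_stmt* stmt = nullptr;
    sqlite3_prepare_v2(db,
        "SELECT payload FROM entries WHERE keyword = ?", -1, &stmt, nullptr);
    sqlite3_bind_text(stmt, 1, "example", -1, SQLITE_TRANSIENT);
    while (sqlite3_step(stmt) == SQLITE_ROW)
        std::printf("%s\n",
                    reinterpret_cast<const char*>(sqlite3_column_text(stmt, 0)));
    sqlite3_finalize(stmt);
    sqlite3_close(db);
    return 0;
}
```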
Use C++, because it compiles directly to native machine code. While some JVMs JIT-compile to native code as well, Java can't be guaranteed to always run that way. A database engine such as the SQLite you mentioned would be even better, because its query layer is specifically optimized for exactly this kind of lookup.