Pull multiple attributes from WMI - wmi

Is it possible to query multiple WMI sources the way you can query multiple tables in SQL?
#"SELECT Win32_Process.ExecutablePath, Win32_Process.Handle, Win32_Process.Name, Win32_PerfFormattedData_PerfProc_Process.WorkingSetPrivate, Win32_PerfFormattedData_PerfProc_Process.PercentProcessorTime FROM Win32_Process, Win32_PerfFormattedData_PerfProc_Process
This didn't work but hopefully you get the idea. I've searched around and I couldn't even find anyone asking this question. I'm guessing it's not possible but I thought I'd try the experts here before giving up.

No, you can't. WQL is only a subset of SQL and doesn't support queries over multiple classes at the same time. Instead you can try the ASSOCIATORS OF statement, but that only works in some scenarios.
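For what it's worth, here is a minimal sketch of the ASSOCIATORS OF approach using Python's third-party wmi package; the target process name and the CIM_ProcessExecutable association are purely illustrative, and this only helps when WMI already defines an association between the classes you care about.

# Illustrative sketch only (pip install wmi). ASSOCIATORS OF follows an
# existing WMI association from one instance to its related instances;
# it is not a general-purpose join between arbitrary classes.
import wmi

c = wmi.WMI()

for process in c.Win32_Process(Name="notepad.exe"):   # example process, adjust as needed
    # CIM_ProcessExecutable associates a process with the files it has loaded.
    wql = ("ASSOCIATORS OF {Win32_Process.Handle='%s'} "
           "WHERE AssocClass = CIM_ProcessExecutable" % process.Handle)
    for related in c.query(wql):
        print(related.Name)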

Related

Runtime model generation in Django or other languages

I'm asking this question to get advice and a sense of whether this kind of work is feasible. I'm not asking for code, just advice and a little guidance; if this is still the wrong place to ask, I apologize.
I would like to build an application that lets users upload a CSV or Excel file, generates a SQL table on the backend at that moment, stores the data in the table, and then displays the loaded data in the UI.
I did this before in a terrible, messy way, using a raw CREATE TABLE query and a SELECT query to display the data; you would probably wonder how a programmer could write such code if you saw it. That's why I want to learn whether there is a better, standard way to do the same thing, or whether I shouldn't do it at all, and how other companies handle this kind of work. You can suggest different programming languages if they are better suited.
I read the documentation on dynamic models. I understand it a little, but it advises against building things that way, so I don't know whether it is worth digging deeper and putting in the effort.

Creating and annotating simple geographical maps in Django

I am looking for a simple way to create geographical maps in Django, in which I could then select, highlight and annotate countries or groups thereof.
"Annotate": insert a label displaying textual information about the said country.
Is there anything that comes to mind?
Many thanks
EDIT: I already checked GeoDjango, and it looks like a lot of work to get where I need to go. Don't get me wrong: I'm not trying to minimize my own investment in learning new tools, but for this project I have a trade-off between time allocated to learning and the relative importance of this geographical feature in my app. It's more of a nice-to-have feature I'd like to add to an already 'complete' app. So I wondered whether there is a 'simpler' Python library for this task.
I think this is more a question of whether there is a front-end library that handles this elegantly. However, if you need to generate the maps yourself, you could try something like this:
https://kartograph.org/
I have personally used this http://jvectormap.com/ and found it to be really good.
In your database you could just have a Countries model with any associated information you might need to display, and create a view to handle that appropriately.
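As a rough illustration of that model/view split (field names are placeholders, and the template and front-end wiring are omitted):

# Minimal sketch only; field names are hypothetical. The view hands the
# countries to a template, which feeds them to a JS map library such as
# jVectorMap for highlighting and labelling.
from django.db import models
from django.shortcuts import render

class Country(models.Model):
    iso_code = models.CharField(max_length=2, unique=True)   # e.g. "FR"
    name = models.CharField(max_length=100)
    annotation = models.TextField(blank=True)                 # label text shown on the map
    highlighted = models.BooleanField(default=False)

def map_view(request):
    countries = Country.objects.all()
    return render(request, "map.html", {"countries": countries})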

How to design a full text search algorithm when the keyword quantity is huge (like Google Alerts)?

I am building something very similar to Google Alerts. If you don't know what it is, consider the following scenario,
Thousands of new textual articles and blog posts come in every day
Each user has a list of favorite "keywords" that he'd like to subscribe to
There are a million users with a million keywords
We scan every article/blog post looking for every keyword
Notify each user if a specific keyword matches.
For one keyword, doing a basic full text search against thousands of articles is easy, but how do you do full text search effectively with a million keywords?
Since I don't have a strong CS background, the only idea I came up with is compiling all the keywords into a regex or an automaton (like Google's re2). Will this work well?
I think I am missing something important here, like compiling those keywords into some advanced data structure, since many keywords are alike (e.g. plural forms, simple AND/NOT logic, etc.). Is there any prior theory I need to know before heading into this?
All suggestions are welcome, thanks in advance!
I can think of the following: (1) Make sure each search query is really fast. Millisecond performance is very important. (2) Group multiple queries with the same keywords and do a single query for each group.
Since different queries use different keywords and AND/OR operations, I don't see any other way to group them.
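To make suggestion (2) concrete, here is a very rough sketch of inverting the subscriptions by keyword so that each distinct keyword is matched only once per article, no matter how many users subscribe to it; the sample data and naive matching are purely illustrative (a real system would use a multi-pattern automaton such as re2 or Aho-Corasick for the matching step, as the question suggests).

# Illustrative sketch only. Subscriptions are inverted into keyword -> users,
# so a keyword shared by thousands of users is still scanned just once.
from collections import defaultdict

subscriptions = defaultdict(set)          # keyword -> set of user ids
subscriptions["python"].update({1, 7})    # made-up sample data
subscriptions["full text search"].add(7)

def matches_for(article_text):
    text = article_text.lower()
    words = set(text.split())
    hits = defaultdict(set)               # user id -> matched keywords
    for keyword, users in subscriptions.items():
        # Naive matching: token lookup for single words, substring check for phrases.
        found = keyword in text if " " in keyword else keyword in words
        if found:
            for user in users:
                hits[user].add(keyword)
    return hits

print(matches_for("A primer on full text search in Python"))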

Best way to build a crystal report with subreports

I have been tasked with creating a Crystal report that includes non-linked subreports. It is meant to replicate the following. I am just having a hard time wrapping my mind around where to begin.
My application consumes a web service which returns a list of objects for each query made. I figured that since Crystal Reports tends to work natively with datasets, I would create a custom dataset containing all the tables that the queries would involve.
Now that I have created a dataset and the data is loaded from the webservice I am consuming, I have come to a point where I am attempting to figure out how to query the dataset in such a way as to join columns from each datatable and build the report from that query.
Can someone tell me whether there is an easier way to do this, or offer any suggestions as to what route they might take to accomplish it? The report needs to include subreports, which complicates things a bit more.
I've found that it is cleaner and easier to maintain if you write a stored procedure in your database, and then just use that as your source in Crystal. If you have multiple sets of data to report, use multiple stored procedures. If you're going to have multiple subreports, it helps to have a common set of parameters for the procedures, although that is not required.
By getting your data using stored procedures, you can verify that you're getting the correct data before you write the report. Then Crystal is used mostly for formatting and totalling.

Query building in a database agnostic way

In a C++ application that can use just about any relational database, what would be the best way of generating queries that can be easily extended to allow for a database engine's eccentricities?
In other words, the code may need to retrieve data in a way that is not consistent among the various database engines. What's the best way to design the code on the client side to generate queries so that supporting a new database engine is a relatively painless affair?
For example, if I have (MFC) code that looks like this:
CString query = "SELECT id FROM table";
results = dbConnection->Query(query);
and we decide to support some database that uses, um, "AVEC" instead of "FROM". Now whenever the user uses that database engine, this query will fail.
Options so far:
Worst option: have the code making the query check the database type.
Better option: Create query request method on the db connection object that takes a unique query "code" and returns the appropriate query based on the database engine in use.
Betterer option: Create a query builder class that allows the caller to construct queries without using any SQL directly. Once the query is complete, the caller can invoke a "Generate" method which returns a query string appropriate for the active database engine
Best option: ??
Note: The database engine itself is abstracted away through some thin layers of our own creation. The queries themselves are the only remaining problem.
Solution:
I've decided to go with the "better" option (query "selector") for two reasons.
Debugging: As mentioned below, debugging is going to be slightly easier with the selector approach since the queries are pre-built and listed out in a readable form in code.
Flexibility: It occurred to me that some databases might have vastly better and completely different ways of solving a particular query. For example, with Access I perform a complicated query across multiple tables each time because I have to, but on SQL Server I'd like to set up a view. Selecting from the view and selecting from several tables are completely different queries (I think), and this query selector would handle that easily.
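For illustration, here is a very small sketch of such a query selector, shown in Python purely for brevity (the structure maps directly onto the C++ connection object; the query codes, engine names, and SQL are all made up):

# Illustrative sketch only: a query "code" is looked up against the active
# engine, so callers never assemble engine-specific SQL themselves.
QUERIES = {
    "fetch_ids": {
        "access": "SELECT id FROM table1 INNER JOIN table2 ON table1.fk = table2.id",
        "sqlserver": "SELECT id FROM v_combined",   # the pre-built view mentioned above
    },
}

class Connection:
    def __init__(self, engine):
        self.engine = engine                  # e.g. "access" or "sqlserver"

    def query_by_code(self, code):
        sql = QUERIES[code][self.engine]
        return self.execute(sql)

    def execute(self, sql):
        print("running:", sql)                # stand-in for the real database call
        return []

Connection("sqlserver").query_by_code("fetch_ids")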
You need your own query-writing object, which can be inherited from by database-specific implementations.
So you would do something like:
DbAgnosticQueryObject* query = new PostgresSQLQuery();
query->setFrom("foo");
query->setSelect("id");
// and so on
CString queryString = query->toString();
It can get pretty complicated in there once you go past simple selects from a single table. There are already ORM packages out there that deal with a lot of these nuances; it may be worth looking at them instead of writing your own.
Best option: Pick a database, and code to it.
How often are you going to up and swap out the database on the back end of a production system? And even if you did, you'd have a lot more to worry about than just minor syntax issues. (Major stuff like join syntax, even datatypes can differ widely between databases.)
Now, if you are designing a commercial application where you want the customer to be able to use one of several back-end options when they implement it, then you may have to specify "we support Oracle, MS SQL, or MySQL" and code to those specific options.
All of your options can be reduced to
Worst option: have the code making the query check the database type.
It's just a matter of where you're putting the logic to check the database type.
The option that I've seen work best in practice is
Better option: Create query request method on the db connection object that takes a unique query "code" and returns the appropriate query based on the database engine in use.
In my experience it is much easier to test queries independently from the rest of your code. It gets a lot harder if you have objects that are piecing together queries from bits of syntax, because then you have to test the query-creation code and the query itself.
If you pull all of your SQL out into separate files that are written and maintained by hand, you can have someone who is an expert in SQL write them (you can still automate the testing of these queries). If you try to write query-generating functions you'll essentially have a C++ expert writing SQL.
Choose an ORM, and start mapping.
If you are to support more than one DB, your problem is only going to get worse.
And just think of the databases that are coming: cloud DBs with no (or close to no) SQL, and object databases.
Take your queries outside the code - put them in the DB or in a resource file and allow overrides for different database engines.
If you use SPs it's potentially even easier, since the SPs abstract away your database differences.
I would think that what you would want to do, if you need the ability to support multiple databases, is create a data provider interface (or abstract class) and associated concrete implementations. The data provider would need to support your standard query operators and the other common functionality required to support your query operations (have a look at the IEnumerable extension methods in .NET 3.5). Each concrete provider would then translate these into specific queries based on the target database engine.
Essentially, what you do is create a database abstraction layer and have your code interact with it. If you can find one of these for C++, it would probably be worth buying instead of writing. You may also want to look for Inversion of Control (IoC) containers for C++ that would basically do this and more. I know of several for Java and C#, but I'm not familiar with any for C++.