Does This Method Have A Name? [closed] - data-mining

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 8 years ago.
Not sure if this method exists for data analysis... or even if I can word my question clearly:
If you took multiple transparencies, each with a map of the world on it, and placed a very light dot of color at a place of interest (one dot per transparency), then when you stacked all of the transparencies on top of each other (in any order, really), the light dots of color would combine to form darker spots indicating increased interest in those locations. Likewise, the 'answer' would become readily apparent just by looking at the overlaid maps, with little to no calculation.
Does this sound like any established technique that you have heard of? And if so, what is its name?

Yes.
This technique is commonly known as a heat map (Wikipedia), and it is a standard visualization technique.
It is often used together with multivariate density estimation (Wikipedia).
Picture from Wikipedia, see File:Old Faithful Geyser KDE with plugin bandwidth.png (CC by-sa-3.0 licensed).
I would not call this "data mining"; the technique is much, much older. It is a visualization technique popular in statistics, though, not so much an "analysis" technique.
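The transparency-stacking idea maps directly onto a 2D histogram, the discrete cousin of a kernel density estimate: each dot adds a little weight to its cell, and overlaps make the cell "darker". A minimal sketch with NumPy (the coordinates here are synthetic, purely for illustration):

```python
import numpy as np

# Hypothetical points of interest (x, y); each row plays the role of
# one transparency with a single light dot on it.
rng = np.random.default_rng(0)
points = rng.normal(loc=[10.0, 50.0], scale=[5.0, 3.0], size=(200, 2))

# Stacking the transparencies is equivalent to binning: each dot adds 1
# to its cell, and overlapping dots produce higher counts (darker spots).
heat, xedges, yedges = np.histogram2d(points[:, 0], points[:, 1], bins=20)

# The "answer" is simply the densest cell of the stacked image.
hottest = np.unravel_index(np.argmax(heat), heat.shape)
```

Plotting `heat` with any image viewer (e.g. `imshow`) reproduces the overlaid-transparencies picture; swapping the histogram for `scipy.stats.gaussian_kde` gives the smooth version shown in the Wikipedia figure.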


Heat Conduction 2D Fourier libmesh / deal.II [closed]

We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 7 years ago.
Are there any examples of solving 2D heat conduction problems, with Fourier's law as the governing equation, using finite elements and either the libMesh or deal.II libraries?
The 2D heat equation is the governing equation for such heat conduction problems. There are lots of examples using finite difference, finite element, and boundary element methods; all require a mesh of some kind. Which one do you want to apply?
OK, so now we know you want to solve 2D heat conduction problems using FEA. It's a three-step process:
Pre-process: create the mesh for your geometry, and apply material properties, boundary conditions, and initial conditions (if the problem is transient or non-linear).
Perform the analysis: formulate and solve the matrix equations for the node and element unknowns.
Post-process: display the results graphically, since pictures are worth thousands of words.
Which solver do you wish to use? Is your objective to write one or just use one? Do you want open source? Must it be written in C++? (Not likely. FORTRAN is by far the most common language for such programs.)
Is yours a large problem? I'm guessing no, but massive parallelization might be of interest to you:
http://www.cas.usf.edu/~cconnor/parallel/2dheat/2dheat.html
FEMHub likes Python, probably because of the nice libraries NumPy and SciPy.
Here's a site that lists open source libraries for Java.
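To get a feel for what the analysis step actually does, here is a minimal sketch: not libMesh or deal.II, but a plain explicit finite-difference scheme for the 2D heat equation on a unit square. The grid size, diffusivity, and boundary values are all made up for illustration; a real FEM solve follows the same pre-process/analyze/post-process shape.

```python
import numpy as np

# Explicit finite differences for dT/dt = alpha * (d2T/dx2 + d2T/dy2).
n = 21                       # grid points per side (pre-process: the "mesh")
h = 1.0 / (n - 1)            # grid spacing
alpha = 1.0                  # thermal diffusivity (made-up material property)
dt = 0.2 * h * h / alpha     # time step inside the 2D stability limit h^2/(4*alpha)

T = np.zeros((n, n))
T[0, :] = 1.0                # boundary condition: hot top edge, cold elsewhere

for _ in range(500):         # analysis: march the interior toward steady state
    lap = (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:]
           - 4.0 * T[1:-1, 1:-1]) / (h * h)
    T[1:-1, 1:-1] += dt * alpha * lap

# Post-process: T now holds the temperature field, ready for a contour plot.
```

The FEM libraries replace the structured grid with an unstructured mesh and the Laplacian stencil with assembled stiffness matrices, but the workflow is the same.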

Performance data collection in Linux (API) [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
I need a library with a comprehensive set of APIs to help collect performance data about the current machine. It would be very useful if the library were written in C++ or Perl.
I tried googling, but since I don't know the right terminology, I only found a lot of big, already established projects that I cannot embed in my code.
What you are looking for is called PAPI (Performance Application Programming Interface). It lets you collect data from all available performance counters, e.g. floating point operations (FLOPs) if you wish to validate your theoretical FLOP count. It also offers an API to compute MFLOPS or even find the cache hit ratio for your application. I have used the library extensively on supported platforms, in addition to Intel VTune.
Here is a list of preset PAPI events; everything else is exposed as native CPU counters.
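If a full counter library is more than you need, a fair amount of machine performance data on Linux is readable straight from /proc with no API at all. A hedged sketch (the field order follows the proc(5) man page; only the leading CPU tick counters are parsed):

```python
def parse_cpu_line(line):
    """Parse the aggregate 'cpu' line of /proc/stat into named tick counts."""
    fields = line.split()
    assert fields[0] == "cpu", "expected the aggregate cpu line"
    # Per proc(5), the first fields after 'cpu' are ticks spent in:
    names = ["user", "nice", "system", "idle", "iowait"]
    return dict(zip(names, map(int, fields[1:1 + len(names)])))

# Usage on a real Linux machine (requires /proc to be mounted):
#   with open("/proc/stat") as f:
#       times = parse_cpu_line(f.readline())
```

Sampling this twice and differencing the tick counts gives CPU utilization; PAPI is the right tool once you need hardware events (cache misses, FLOPs) rather than OS-level accounting.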

What are common industry standards for creating game assets? [closed]

Closed 10 years ago.
I would like to gain some insight from more knowledgeable game designers on what exactly the standards (or most commonly accepted conventions) for creating game assets (textures, models, etc.) are, or whether there is anything I have missed.
Here are some things I would like to know in particular...
Conventions for texture files.
Sizes (512x512, 1024x1024, etc.).
File type (JPEG, BMP, PNG, etc.).
Conventions for polygon counts of models (in a modern game).
What polygon count makes something a "low poly" model.
How to use a "high poly" model to generate a good normal map for a simpler model.
Proper way to design UV maps for an object.
Usually my unwraps look alright but nowhere near production quality. Is there a proper technique for this?
Proper way to generate normal maps in realistic fashion.
Also, what are some good applications for developing assets, particularly audio? I already use Blender for modeling.
Essentially I'm looking for the tricks of the trade, if anyone is willing to share of course =)
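On the normal-map point: one common approach, when you have a height map rather than a high-poly model to bake from, is to derive normals from the height gradients. A minimal sketch (the `strength` parameter is a made-up tuning knob, not any tool's convention):

```python
import numpy as np

def height_to_normals(height, strength=1.0):
    """Turn a 2D height map into an RGB-encodable tangent-space normal map."""
    dy, dx = np.gradient(height.astype(float))
    # Surface normal of z = h(x, y) is proportional to (-dh/dx, -dh/dy, 1).
    normals = np.dstack((-strength * dx,
                         -strength * dy,
                         np.ones_like(height, dtype=float)))
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)
    # Remap components from [-1, 1] to [0, 1] for storage in an RGB texture.
    return 0.5 * normals + 0.5
```

Baking from an actual high-poly mesh (as Blender's bake tools do) instead casts rays from the low-poly surface and records the hit normals, which captures detail a height map cannot.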

Does every large project include a Lisp interpreter? [closed]

Closed 11 years ago.
I had the impression that there was a paper or article somewhere that claimed every sufficiently large project (not written in a Lisp variant) contained a poorly implemented Lisp interpreter. Google turns up nothing and a quick search of SO doesn't either. Is this something well known and documented somewhere I have forgotten, or just a figment of my imagination?
An actual document or link to such an article would be appreciated, if it exists. Otherwise, I will remove the question.
What Greenspun meant when he uttered this quip was that Lisp provides a great many foundational technologies for writing good software, and that programs written in other languages informally (and inferiorly) reproduce a number of them as they grow.
Yes, this claim is Greenspun's tenth rule (actually the only rule):
Any sufficiently complicated C or Fortran program contains an ad hoc,
informally-specified, bug-ridden, slow implementation of half of
Common Lisp.
It makes a valid point about the expressiveness of Lisp-style features (particularly its kind of macros), but it isn't serious enough that anyone would write a paper on it.

How would one approach creating a morphable 3D model? [closed]

Closed 7 years ago.
A Morphable Model for the Synthesis of 3D Faces
The above video is over 12 years old. How was that software built?
I need something way simpler but basically the same: a morphable model (thorax) that can be altered after being pre-morphed using a picture.
Any links that might provide useful information are appreciated.
Are there any open source projects that might have helpful code that could be studied?
The details are all in their paper:
www.mpi-inf.mpg.de/~blanz/html/data/morphmod2.pdf
In short, you need:
A collection of complete 3D scans of samples of the object class you want to characterize
A way of performing 'non-rigid registration' to align a reference template to each sample
Standard statistical analysis (principal components) of the aligned samples
Note that their choice of the name 'Morphable Model' is misleading. They are referring to something much more specific than a set of difference morphs or morph targets.
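The statistical step above can be sketched in miniature: flatten each aligned scan into a row vector, then take principal components via the SVD. The data here is synthetic; in the paper, the rows would be registered 3D face scans.

```python
import numpy as np

# Each row is one "scan", flattened to (x1, y1, z1, x2, y2, z2, ...).
rng = np.random.default_rng(1)
shapes = rng.normal(size=(10, 15))      # 10 scans, 5 vertices * 3 coords

mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

# Rows of Vt are the principal modes of shape variation. A new instance
# is the mean plus a weighted sum of the first few modes.
coeffs = np.array([2.0, -1.0])          # made-up mode weights
new_shape = mean_shape + coeffs @ Vt[:2]
```

This is what makes the model "morphable": varying the coefficients sweeps through plausible members of the object class, rather than just interpolating between two fixed targets.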
What you need is called Morph target animation. Blender implements this, but the feature is called Shape Keys.
You can see an example of morphing at NeHe Productions.
This process works by creating a base vector of points, such as a face, and a set of change vectors that contain the differences from various morph targets. A possible morph target would be 'smile', and it would contain the offset values that, added to the original face, result in a smiling face.
You can do linear combinations of morph targets and you can even create caricatures, by exaggerating the factors (original + 2*smile).
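The morph-target arithmetic described above is just vector addition. A tiny sketch with made-up coordinates (note the 'smile' target stores offsets, not absolute positions):

```python
import numpy as np

# A tiny base "mesh" of three 2D vertices, plus one morph target of offsets.
base = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])
smile = np.array([[0.0, 0.1], [0.0, 0.1], [0.0, 0.0]])   # per-vertex deltas

smiling = base + 1.0 * smile        # ordinary morph: base plus full target
caricature = base + 2.0 * smile     # exaggeration: original + 2*smile
```

With several targets, a blended pose is `base + sum(w_i * target_i)`, which is exactly what Blender evaluates when you drag Shape Key sliders.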