I've decided to take the advice in this question: create-excel-chart-programmatically-in-php and NOT attempt to do graphics myself.
I'd also like to follow the advice to use Google Charts API. This will be the first time for me.
But I don't want to do an Excel chart — however much that might appeal to my co-workers who use Excel for everything. I like the idea of receiving an image of my data.
Obviously, if you're familiar with Statistical Process Control, we're talking about a line chart.
Has anyone been able to 'configure' a Google line chart for SPC?
Because I haven't seen any examples, I'm mostly interested in whether or not it's even possible.
EDIT: well, of course it is possible. Maybe the dearth of previous 'solutions' is created by reluctance to hand sensitive data to Google and wait for a chart to come back.
Accepting your own answer is OK, but for those of you who might wonder, I got no rep points for it.
I could accept something better, if anyone suggests it.
I decided to take the paranoid approach (on the grounds that it would create alarm if I sent a whole lot of confidential company information to Google), so here is a fragment of my code. It embeds a Google line chart in a simple table.
The table provides labelling for the chart, and Google gets just 'meaningless' numbers.
The array $values contains real measurements, but they are scaled with respect to the upper and lower control limits, so that the LCL sits at 10% and the UCL at 90% of the vertical axis.
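(In other words, each raw measurement x goes into $values as roughly 10 + 80 * (x - LCL) / (UCL - LCL), so the control limits land at the same heights on the chart whatever the actual units are.)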
echo "<table class='noborder'>\n";
echo "<caption class='spc'>Product : Measurement</caption>\n";
echo "<tr><th> </th><th> </th><th> </th></tr>\n";
echo "<tr><td>AVG</td><td>\n";
echo "<img src='http://chart.apis.google.com/chart?cht=lc\n&chd=t:";
printf("%s", implode(',', $values));
echo "\n&chm=o,FF9900,0,-1,10|R,FFDDDD,0,0.0,1.0|r,F9ECEC,0,0.1,0.9";
echo "\n&chs=600x500&chco=0000CC'><td> </td>\n";
echo "<tr><td> </td><th>";
printf("LCL=%6.3f, AVG=%6.3f, UCL=%6.3f", $lcl, $avg, $ucl);
echo "</th><td> </td></tr>\n";
I am writing a Windows program (no MFC) and need to output a status line to the operator every few seconds or so. I tried using rich text boxes, but after so many hours it seems to hang up. Does anybody have any suggestions on what I can use instead?
People mentioned that my buffers might have been exhausted. I thought I had planned for that. After I had about 1000 lines displayed I would take the first 500 and remove them using the select and cut options in rich text boxes. I still ran into the same problem.
This question appears relevant, and this one too. But they don't give any concrete recommendations for an alternative to rich text boxes.
You might try the Scintilla control (scintilla.org), which does not appear to have any hard limitations on text size. It has a permissive license, and it is used by many text editors such as Notepad++, Notepad2, Code::Blocks and FlashDevelop. I haven't tried it personally, but from the documentation it looks easy to use in a Windows API application. Of course, it might be overkill for your purposes.
If you keep appending to the text in the control every few seconds for hours, then you are probably running into some memory constraint on the control or the process. I think you would have this problem with any control you choose, given the update frequency and how long you're running the program.
Have you considered implementing a simple circular buffer for the content of the text box? Say only keep the last hour's messages. You could maintain a separate log file for history if the operator needed to go back in time for hours.
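A minimal sketch of that idea in C++ (MAX_LINES, g_lines and AddStatusLine are just illustrative names, not part of any real control):

#include <cstddef>
#include <deque>
#include <string>

// Keep only the most recent MAX_LINES messages; older ones fall off the front.
const std::size_t MAX_LINES = 500;
std::deque<std::string> g_lines;

void AddStatusLine(const std::string& msg)
{
    g_lines.push_back(msg);
    if (g_lines.size() > MAX_LINES)
        g_lines.pop_front();
    // ...then repaint the control from g_lines, and optionally append msg
    // to a log file so the full history is still available to the operator.
}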
I ended up writing my own control to do this, essentially duplicating the Output window in Visual Studio. It was a success, but it ended up being much more code than I thought it would be when I started - I insisted on features such as auto-scrolling when the cursor was on the last line, select/copy, bold text, etc. It was backed by a std::deque so I could limit the number of lines stored for the window.
Unfortunately the code belongs to a former employer so I can't share it here.
Hi, so I'm making a game through the console window, and I was wondering if there is any way to change or erase just one or two characters on screen. Usually to accomplish this I would have to tell the console to re-type every single character and line all over again, but this just takes too long (1 second per frame, plus 0.5 seconds spent re-typing the scene).
Is there some way I could refresh or change one or two lines or 'characters' on the console, so that so much time is not spent waiting for the console to re-type my 24 lines (the scene made up of text), each a string?
Thanks! =)
BTW... does anyone remember that little easter egg in Windows which was an entire Star Wars movie made out of text in the console?? I want the game to be smooth like that!
You'll need to use an external library to interface with the console as C++ doesn't have these capabilities, but it is possible.
My old go-to for this sort of thing is ncurses. It's straightforward, quick to set up, and cross-platform. But it's old, and its age shows. (If you're on Windows you'll have to use PDCurses; same capabilities, different package.)
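A minimal curses program looks roughly like this (just a sketch; on Linux link with -lncurses, on Windows build against PDCurses instead):

#include <curses.h>

int main()
{
    initscr();                    // take over the terminal
    noecho();                     // don't echo keypresses
    curs_set(0);                  // hide the cursor

    mvprintw(0, 0, "score: 0");   // draw the status line
    mvaddch(12, 40, '@');         // draw the player at row 12, column 40
    refresh();                    // push only the changed cells to the screen

    // ...game loop: update individual cells with mvaddch()/mvprintw(),
    // then call refresh() again; nothing else gets re-typed...

    getch();                      // wait for a key before quitting
    endwin();                     // restore the terminal
    return 0;
}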
There are also console-specific ways of doing this. In particular, Windows provides an API for performing these sorts of actions.
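For example, SetConsoleCursorPosition and WriteConsoleA let you jump to a single cell and overwrite just that character instead of reprinting all 24 lines (a sketch, error handling omitted):

#include <windows.h>

// Overwrite the character at column x, row y without touching the rest of the screen.
void PutCharAt(short x, short y, char ch)
{
    HANDLE out = GetStdHandle(STD_OUTPUT_HANDLE);
    COORD pos = { x, y };
    SetConsoleCursorPosition(out, pos);
    DWORD written = 0;
    WriteConsoleA(out, &ch, 1, &written, NULL);
}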
You need the ncurses library.
See console print w/o scrolling for reasons and examples.
Also google for the source to the rogue/urogue/nethack games which do that already.
Because the open source geo-coders cannot begin to compare to Google's or even Yahoo's, I would like to start a project to create a good open source geo-coder. Just to clarify, a geo-coder takes some text (usually with some constraints) and returns one or more lat/lon pairs.
I realize that this is a difficult and gargantuan task, so I am wondering how you might get started. What would you read? What algorithms would you familiarize yourself with? What code would you review?
And also, assuming you were going to develop this very agilely, what would you want the first prototype to be able to do?
EDIT: Let's set aside the data question for now. I am going to use OpenStreetMap data, along with a database of waypoints that I have. I would later plan to include other data sets as well, and I realize the geo-coder would be inherently limited by the quality of the original data.
The first (and probably blocking) problem would be: where do you get your data from? (unless you are willing to pay thousands of dollars for proprietary sets).
I guess you could build a geocoding API on top of OpenStreetMap (they publish their data in dumps on a regular basis), but that data was still very incomplete last time I checked.
Algorithms are easy. Good mapping data, however, is expensive. Very expensive.
Google drove their cars all over the world, collecting this data among other things.
From a .NET point of view these articles might be interesting for you:
Writing Your Own GPS Applications: Part I
Writing Your Own GPS Applications: Part 2
Writing GIS and Mapping Software for .NET
I've only glanced at the articles but they've been on CodeProject's 'Most Popular' list for a long time.
And maybe this CodePlex project which the author of the articles above made available.
I would start at the absolute beginning by figuring out how you're going to get the data that matches a street address with a geocode. Either Google had people going around with GPS units, OR they got the information from some existing source. That existing source may have been... (all guesses)
The Postal Service
Some existing maps(printed)
A bunch of enthusiastic users who were early adopters of GPS technology and were more than willing to enter street addresses and GPS coordinates
Some government entity (or entities)
Their own satellites
etc
I guess what I'm getting at is the information was either imported from somewhere or was input by someone via some interface. As my starting point I would look at how to get that information. In an open source situation, you may be able to get a bunch of enthusiastic people to enter information.
So for my first prototype, boring as it would be, I would create a form for entering information.
Then you need to know the math for figuring out the closest distance (as the crow flies). From there, try to figure out how to include roads. (My guess is you would have to have a data point for each and every curve, where you hold the geocoded location of the curve and the angle of the road on a north/south and east/west vector. You'd probably need to take incline into account too, to get accurate road measurements.)
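For the "as the crow flies" part, the usual tool is the haversine formula; a quick sketch (distances in kilometres, roads and incline ignored):

#include <cmath>

// Great-circle distance between two lat/lon points given in degrees, returned in km.
double HaversineKm(double lat1, double lon1, double lat2, double lon2)
{
    const double kEarthRadiusKm = 6371.0;
    const double kDegToRad = 3.14159265358979323846 / 180.0;

    const double dLat = (lat2 - lat1) * kDegToRad;
    const double dLon = (lon2 - lon1) * kDegToRad;
    const double a = std::sin(dLat / 2) * std::sin(dLat / 2) +
                     std::cos(lat1 * kDegToRad) * std::cos(lat2 * kDegToRad) *
                     std::sin(dLon / 2) * std::sin(dLon / 2);
    return 2.0 * kEarthRadiusKm * std::asin(std::sqrt(a));
}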
That's just where I'd start.
But in all honesty, I wouldn't even start on this. Other programmers have done it already; I'm more interested in what hasn't already been done.
get my free raw data from somewhere like http://ipinfodb.com/ip_database.php
load it into a database, denormalizing for fast lookups
design my API
build it out as a RESTful web service
return results in varying formats: JSON, XML, CSV, raw text
The first prototype should accept a ZIP code and return lat/lon in raw text.
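The heart of that prototype is nothing more than a table lookup; the REST layer can be bolted on afterwards. A rough C++ sketch of the lookup, assuming a hypothetical zips.csv file with one "zip,lat,lon" row per line:

#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <unordered_map>

struct LatLon { double lat; double lon; };

int main(int argc, char* argv[])
{
    if (argc < 2) { std::cerr << "usage: geocode ZIP\n"; return 1; }

    // Load the (hypothetical) zips.csv file into a lookup table.
    std::unordered_map<std::string, LatLon> table;
    std::ifstream in("zips.csv");
    std::string line;
    while (std::getline(in, line)) {
        std::istringstream row(line);
        std::string zip, lat, lon;
        if (std::getline(row, zip, ',') &&
            std::getline(row, lat, ',') &&
            std::getline(row, lon, ','))
            table[zip] = { std::stod(lat), std::stod(lon) };
    }

    auto it = table.find(argv[1]);
    if (it == table.end()) { std::cerr << "not found\n"; return 1; }
    std::cout << it->second.lat << "," << it->second.lon << "\n";  // raw text output
    return 0;
}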
Last night before going to bed, I browsed through the Scalar Data section of Learning Perl again and came across the following sentence:
the ability to have any character in a string means you can create, scan, and manipulate raw binary data as strings.
An idea immediately hit me that I could actually let Perl scan the pictures that I have stored on my hard disk to check if they contain the string Adobe. It seems by doing so, I can tell which of them have been photoshopped. So I tried to implement the idea and came up with the following code:
#!perl
use autodie;
use strict;
use warnings;

{
    local $/ = "\n\n";       # read the file in multi-line chunks rather than byte-sized lines
    my $dir = 'f:/TestPix/';
    my @pix = glob "$dir/*";
    foreach my $file (@pix) {
        open my $pic, '<', $file;
        binmode $pic;        # treat the picture as raw binary data (important on Windows)
        while (<$pic>) {
            if (/Adobe/) {
                print "$file\n";
                last;        # one hit per file is enough
            }
        }
    }
}
Excitingly, the code seems to really work, and it does the job of filtering out the pictures that have been photoshopped. But the problem is that many pictures are edited by other utilities, so I think I'm kind of stuck there. Do we have some simple but universal method to tell if a digital picture has been edited or not, something like
if (!= /the original format/) {...}
Or do we simply have to add more conditions? like
if (/Adobe/|/ACDSee/|/some other picture editors/)
Any ideas on this? Or am I oversimplifying due to my miserably limited programming knowledge?
Thanks, as always, for any guidance.
Your best bet in Perl is probably ExifTool. This gives you access to whatever non-image information is embedded into the image. However, as other people said, it's possible to strip this information out, of course.
I'm not going to say there is absolutely no way to detect alterations in an image, but the problem is extremely difficult.
The only person I know of who claims to have an answer is Dr. Neal Krawetz, who claims that digitally altered parts of an image will have different compression error rates from the original portions. He claims that re-saving a JPEG at different quality levels will highlight these differences.
I have not found this to be the case, in my investigations, but perhaps you might have better results.
No. There is no functional distinction between a perfectly edited image and one that was that way from the start - it's all just a bag of pixels in the end, and any metadata can be removed or forged at will.
The name of the graphics program used to edit the image is not part of the image data itself but of something called metadata, which may be stored in the image file but, as others have noted, is neither required (some programs may not store it, and some offer an option not to) nor reliable - if you forged an image, you might have forged the metadata as well.
So the answer to your question is "no, there's no way to universally tell if the pic was edited or not", although some image-editing software writes its signature into the image file, and a careless editor may leave it there.
If you're inclined to learn more about image processing in Perl, you could take a look at some of the excellent modules CPAN has to offer:
Image::Magick - read, manipulate and write of a large number of image file formats
GD - create colour drawings using a large number of graphics primitives, and emit the drawings in various formats.
GD::Graph - create charts
GD::Graph3d - create 3D Graphs with GD and GD::Graph
However, there are other utilities available for identifying various image formats. It's more of a question for Super User, but for various unix distros you can use file to identify many different types of files, and for MacOSX, Graphic Converter has never let me down. (It was even able to open the bizarre multi-file X-ray of my cat's shattered pelvis that I got on a disc from the vet.)
How would you know what the original format was? I'm pretty sure there's no guaranteed way to tell if an image has been modified.
I can just open the file (with my favourite programming language and filesystem API) and just write whatever I want into that file willy-nilly. As long as I don't screw something up with the file format, you'd never know it happened.
Heck, I could print the image out and then scan it back in; how would you tell it from an original?
As others have stated, there is no way to know whether the image was doctored. I'm guessing what you basically want to know is the difference between a realistic photograph and one that has been enhanced or modified.
There's always the option of running some extremely complex image-recognition algorithm that analyzes every pixel in your image to decide whether it was doctored. Such a solution would probably involve AI trained on millions of photos, both doctored and untouched. However, this is more of a theoretical solution than a practical one - it would be extremely complex to develop, probably take years, and even then it probably still wouldn't be correct 100% of the time. I'm guessing AI technology isn't at that level yet and won't be for a while.
A not-commonly-known feature of exiftool allows you to recognize the originating software through an analysis of the JPEG quantization tables (not relying on image metadata). It recognizes tables written by many applications. Note that some cameras may use the same quantization tables as some applications, so this isn't a 100% solution, but it is worth looking into. Here is an example of exiftool run on two images, the first was edited by photoshop.
> exiftool -jpegdigest a.jpg b.jpg
======== a.jpg
JPEG Digest : Adobe Photoshop, Quality 10
======== b.jpg
JPEG Digest : Canon EOS 30D/40D/50D/300D, Normal
2 image files read
This will work even if the metadata has been removed.
There is existing software out there which uses various techniques (compression artifacting, comparison to signature profiles in a database of cameras, etc.) to analyze the actual image data for evidence of alteration. If you have access to such software and the software available to you provides an API for external access to these analysis functions, then there's a decent chance that a Perl module exists which will interface with that API and, if no such module exists, it could probably be created rather quickly.
In theory, it would also be possible to implement the image analysis code directly in native Perl, but I'm not aware of anyone having done so and I expect that you'd be better off writing something that low-level and processor-intensive in a fully-compiled language (e.g., C/C++) rather than in Perl.
http://www.impulseadventure.com/photo/jpeg-snoop.html
is a tool that does the job reasonably well.
If there has been any cloning, there is sometimes a variation in pixel density or concentration that shows up on manual inspection: a Photoshop-cloned area will have an unusually even pixel density (what I mean is the variation of the pixels, compared with a scanned image).
I have a small Win32 console application which is essentially a test harness. I read data in, do some processing on it and currently just output some of the numbers to the console. This isn't a huge problem - I can get an idea of what the data looks like, but it would be much easier to analyse if there was a way of getting that information into a graph for each run of the software.
I've been getting to grips with GNUPlot recently, but can't work out a simple way to get the data sent to it. Has anyone tried this? ..or is there another graphing application I should try?
Excel and OO Calc are great tools and I've loaded .csv data into them for graphing data plenty of times myself. I was, however, hoping for a way to dynamically pipe data into a graphing application to avoid having to close/reopen excel and plot a graph each time I want to look at some data.
I think you can pipe data into GNUPlot (which is why I mentioned it) but the details of how to do so are rather scant.
A simple approach is to write the data out as CSV and then import it into a spreadsheet like Excel or OpenOffice to do the graph drawing.
Edit: Following your question, I got interested in GNUPlot myself - this is the simplest description of using it from the command line that I found: http://www.goldb.org/goldblog/CommentView,guid,f378e279-eaa5-4d85-b7d2-0339a7c72864.aspx
Never underestimate the power of Excel and a .csv data dump.
Writing data to a .csv file from C++ is not very difficult, and there are lots of articles out there regarding the subject, for example: here, or just google.
Excel can easily load .csv's, and then you can just use that to plot whatever graphs you require. This is particularly useful if you just want a quick visual sanity check of results etc.
You don't really need to touch VBA to do this.
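A dump routine really can be this small (results.csv and the column names here are just placeholders):

#include <fstream>

int main()
{
    std::ofstream csv("results.csv");
    csv << "sample,input,output\n";                    // header row
    for (int i = 0; i < 100; ++i)
        csv << i << ',' << i * 2 << ',' << i * i << '\n';
    return 0;
}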
In Excel you can set up a Data Connection to a file; it supports many file types, but CSV works fine.
Go to the Data tab
Click Connections
Click Add
Select the file
Go to the connection properties and un-tick 'Prompt for file name'
Set the required refresh period
Close the Connections dialog
Select the start cell for importing the data - e.g. cell A1 on worksheet 2
Click Existing Connections
Select your data connection
Flip to worksheet 1, add your chart, and hook up the data
The chart will now update automatically.
This is Excel 2007, but I think older versions had this too, and I think OO Calc can do it as well.
You might also want to look into XMGrace, which you can launch and drive directly from C/Fortran programs, as shown here.
Excel is completely scriptable. Use the macro recorder to figure out the steps. Create the chart in its own sheet. Then save the chart using the GIF filter.
The actual export is something like:
ActiveChart.Export FileName:=something_dot_gif, FilterName:="GIF"
I just found an example of piping data into gnuplot on Cardiff University's website. Not tried it yet, but it looks promising!
[edit] ...and another which includes some notes for Windows.
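From those pages, the gist seems to be to open a pipe to a gnuplot process and feed it inline data; something like this untested sketch (POSIX popen shown; on Windows you would presumably use _popen, and older gnuplot builds ship a pgnuplot helper for exactly this):

#include <cstdio>

int main()
{
    // Open a write pipe to gnuplot.
    FILE* gp = popen("gnuplot -persist", "w");
    if (!gp) return 1;

    std::fprintf(gp, "plot '-' with lines title 'test run'\n");
    for (int i = 0; i < 100; ++i)
        std::fprintf(gp, "%d %f\n", i, i * 0.1);    // one "x y" pair per line
    std::fprintf(gp, "e\n");                         // 'e' terminates the inline data
    std::fflush(gp);

    pclose(gp);
    return 0;
}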
You can use MathGL - it can create a window (FLTK, GLUT or Qt) and display the plot inside. It also has a large set of plot types and can work in the console.