Quantum Computing - Hype Or Hyper? [closed] - error-correction

I have recently been web-researching quantum computing.
Will we see quantum computers in our lifetimes (ever)? The error-correction issue, for example, seems intractable to me.

Just looking at the results from one website, I'd say it doesn't look impossible:
http://arstechnica.com/journals/science.ars/2008/03/28/encoding-more-than-one-bit-in-a-photon
http://arstechnica.com/journals/science.ars/2008/10/28/scalable-quantum-computing-in-the-next-5-years
http://arstechnica.com/news.ars/post/20080729-finding-lost-qubits.html
http://arstechnica.com/news.ars/post/20080509-new-quantum-dot-logic-gates-a-step-towards-quantum-computers.html
http://arstechnica.com/news.ars/post/20080626-three-dimensional-qubits-on-the-way.html
http://arstechnica.com/news.ars/post/20080527-molecular-magnets-in-soap-bubbles-could-lead-to-quantum-ram.html
For a more technical overview of why it's not as hard as it used to be, there's a four-part series on self-correcting quantum computers:
http://scienceblogs.com/pontiff/2008/08/selfcorrecting_quantum_compute.php

I vote: Hype.
...but hope I'm wrong.
Randy

It's already here, just with very limited uses so far.

Quantum computing isn't much past the "idea" stage. Sure, they can multiply two 2-bit integers, but it takes a dozen grad students a week to set up for the run, and another week to validate the results.
Long-term it's probably got a lot of potential, though it may never be stable enough for use outside of a highly controlled lab-based "supercomputer" environment.
At this point I'd classify it more as Physics than Computer Science. In a way, it's as if Charles Babbage got his hands on one of Michael Faraday's papers and started thinking about maybe, possibly, someday, being able to use electromagnetism as a basis for calculation.
There's been a fair amount written about Quantum Computing over the last couple of years in Scientific American, much of it by the primary researchers themselves: http://www.sciam.com

Error correction and loss of coherence are the big problems in quantum computing, as I understand it. Lots of smart people are hard at work on solving these problems, but last I read, it was looking like error-correction requirements might grow exponentially with the number of qubits, which really detracts from the "we'll solve NP problems in an instant!" attraction of quantum computation.

Quantum computing is a tool; it's just a tool too raw to have any sort of useful application as of this moment, but who knows.

Nice, I get to re-use my answer from another SO question word for word. :)
A few answers mention quantum computers as if they're still far in the future, but I beg to differ.
There were vague mentions of the possibility of quantum computers in the 1970s and 1980s (see the timeline on Wikipedia), but the first "working" 3-qubit NMR quantum computer was built in 1998. The field is still in its infancy, and almost all progress is still theoretical and confined to academia, but in 2007 a company called D-Wave Systems presented a prototype of a working 16-qubit, and later that year a 28-qubit, adiabatic quantum computer. Their effort is notable since they claim that their technology is commercially viable and scalable. As of 2010, they have 7 rigs, and the current generation of their chips has 128 qubits. They seem to have partnered with Google to find interesting problems to test their hardware on.
I recommend this short 24-minute video and the Wikipedia article on D-Wave for a quick overview, and there are a lot more resources on this blog written by D-Wave's founder and CTO.


How to write fast (low level) code? [closed]

I would like to learn more about low level code optimization, and how to take advantage of the underlying machine architecture. I am looking for good pointers on where to read about this topic.
More details:
I am interested in optimization in the context of scientific computing (which is a lot of number crunching but not only) in low level languages such as C/C++. I am in particular interested in optimization methods that are not obvious unless one has a good understanding of how the machine works (which I don't---yet).
For example, it's clear that a better algorithm is faster, without knowing anything about the machine it's run on. It's not at all obvious that it matters if one loops through the columns or the rows of a matrix first. (It's better to loop through the matrix so that elements that are stored at adjacent locations are read successively.)
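To make the loop-order point concrete, here's a minimal C++ sketch (the matrix size is arbitrary and the exact speed difference depends on your cache sizes and compiler flags, but the row-major traversal is typically several times faster):

```cpp
#include <chrono>
#include <cstdio>
#include <vector>

int main() {
    const std::size_t N = 4096;          // illustrative size only
    std::vector<double> a(N * N, 1.0);   // row-major: element (i, j) lives at a[i*N + j]

    auto time_sum = [&](bool row_first) {
        const auto t0 = std::chrono::steady_clock::now();
        double sum = 0.0;
        for (std::size_t i = 0; i < N; ++i)
            for (std::size_t j = 0; j < N; ++j)
                sum += row_first ? a[i * N + j]   // adjacent addresses: cache-friendly
                                 : a[j * N + i];  // stride of N doubles: cache-hostile
        const auto t1 = std::chrono::steady_clock::now();
        const auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();
        std::printf("%s sum=%g took %lld ms\n",
                    row_first ? "row-major   " : "column-major",
                    sum, static_cast<long long>(ms));
    };

    time_sum(true);   // visits memory in storage order
    time_sum(false);  // jumps around, so loaded cache lines are wasted
}
```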
Basic advice on the topic or pointers to articles are most welcome.
Answers
Got answers with lots of great pointers, a lot more than I'll ever have time to read. Here's a list of all of them:
The software optimization cookbook from Intel (book)
What every programmer should know about memory (pdf book)
Write Great Code, Volume 2: Thinking Low-Level, Writing High-Level (book)
Software optimization resources by Agner Fog (five detailed pdf manuals)
I'll need a bit of skim time to decide which one to use (not having time for all).
Drepper's What Every Programmer Should Know About Memory [pdf] is a good reference to one aspect of low-level optimisation.
For Intel architectures this is priceless: The Software Optimization Cookbook, Second Edition
It's been a few years since I read it, but Write Great Code, Volume 2: Thinking Low-Level, Writing High-Level by Randall Hyde was quite good. It gives good examples of how C/C++ code translates into assembly, e.g. what really happens when you have a big switch statement.
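For a taste of the switch-statement point (this is not taken from the book, just a generic illustration): with a dense, contiguous set of case labels, most compilers at -O2 lower the switch to a bounds check plus an indirect jump through a table of addresses, while a sparse switch usually becomes a tree of comparisons. Comparing the source below with the assembly from `g++ -O2 -S` is exactly the kind of exercise the book encourages.

```cpp
#include <cstdio>

// Hypothetical opcode handler, just to have something to compile.
// Dense case labels 0..7 usually become a jump table at -O2.
int handle(int opcode) {
    switch (opcode) {
        case 0: return 10;
        case 1: return 11;
        case 2: return 12;
        case 3: return 13;
        case 4: return 14;
        case 5: return 15;
        case 6: return 16;
        case 7: return 17;
        default: return -1;
    }
}

int main() { std::printf("%d\n", handle(3)); }  // prints 13
```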
Also, altdevblogaday.com is focused on game development, but the programming articles might give you some ideas.
An interesting book about bit manipulation and smart ways of doing low-level things is Hacker's Delight.
This is definitely worth a read for everyone interested in low-level coding.
Check out: http://www.agner.org/optimize/
C and C++ are usually the languages that are used for this because of their speed (ignoring Fortran as you didn't mention it). What you can take advantage of (which the icc compiler does a lot) is the SSE instruction sets for a lot of floating point number crunching. Another possibility is using the CUDA and Stream APIs for Nvidia and ATI respectively, to do VERY fast floating point operations on the graphics card while leaving the CPU free to do the rest of the work.
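As a rough illustration of the SSE point (a hand-written sketch rather than compiler output; it assumes an x86 target with SSE and, to keep it short, array lengths that are multiples of 4):

```cpp
#include <xmmintrin.h>  // SSE intrinsics (x86 only)
#include <cstdio>

// out[i] += a[i] * b[i], processed four float lanes at a time.
void madd(const float* a, const float* b, float* out, int n) {
    for (int i = 0; i < n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);          // load 4 floats from a
        __m128 vb = _mm_loadu_ps(b + i);          // load 4 floats from b
        __m128 vo = _mm_loadu_ps(out + i);        // current partial results
        vo = _mm_add_ps(vo, _mm_mul_ps(va, vb));  // 4 multiplies and 4 adds per instruction
        _mm_storeu_ps(out + i, vo);               // write the 4 results back
    }
}

int main() {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float out[8] = {0};
    madd(a, b, out, 8);
    for (float v : out) std::printf("%g ", v);    // 8 14 18 20 20 18 14 8
    std::printf("\n");
}
```

Modern compilers will often auto-vectorise the plain scalar loop anyway, but writing the intrinsics once makes it clear what the hardware is actually doing.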
Another approach to this is hands-on comparison. You can get a library like Blitz++ (http://www.oonumerics.org/blitz/) which - I've been told - implements aggressive optimisations for numeric/scientific computing, then write some simple programs doing operations of interest to you (e.g. matrix multiplications). As you use Blitz++ to perform them, write your own class that does the same, and if Blitz++ proves faster start investigating its implementation until you realise why. (If yours is significantly faster you can tell the Blitz++ developers!)
You should end up learning about a lot of things, for example:
memory cache access patterns
expression templates (there are some bad links atop Google search results for expression templates - the key property you want to find discussed is that they can encode many successive steps in a chain of operations so that they are all applied during one loop over a data set; see the sketch after this list)
some CPU-specific instructions (though I haven't checked they've used such non-portable techniques)...
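Here is the promised expression-template sketch. It's a toy version of the idea, not Blitz++'s actual implementation: `a + b + c` is captured as a lightweight object describing the computation, and the whole chain is evaluated element by element in a single loop when it is assigned to a vector, with no temporary vectors created.

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Represents "L + R" without evaluating it; evaluation happens per element.
template <typename L, typename R>
struct Add {
    const L& l;
    const R& r;
    double operator[](std::size_t i) const { return l[i] + r[i]; }
    std::size_t size() const { return l.size(); }
};

struct Vec {
    std::vector<double> data;
    explicit Vec(std::size_t n, double v = 0.0) : data(n, v) {}
    double operator[](std::size_t i) const { return data[i]; }
    double& operator[](std::size_t i) { return data[i]; }
    std::size_t size() const { return data.size(); }

    // Assigning any expression runs ONE loop over the data set.
    template <typename E>
    Vec& operator=(const E& e) {
        for (std::size_t i = 0; i < size(); ++i) data[i] = e[i];
        return *this;
    }
};

template <typename L, typename R>
Add<L, R> operator+(const L& l, const R& r) { return {l, r}; }

int main() {
    Vec a(4, 1.0), b(4, 2.0), c(4, 3.0), v(4);
    v = a + b + c;   // one pass over the data, no temporaries
    std::printf("%g %g %g %g\n", v[0], v[1], v[2], v[3]);  // 6 6 6 6
}
```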
I learned a lot from the book Inner Loops. It's ancient now, in computer terms, but it's very well written and Rick Booth is so enthusiastic about his subject I would still say it's worth looking at to see the kind of mindset you need to make a CPU fly.

How to understand the design and code flow of any product quickly? [closed]

I have switched to a new company and I am working on a product that has a huge code base without documentation. I want to quickly get acquainted with the design and the code flow of the product so that I can become a productive member ASAP.
Slowly and steadily one does get to understand the code, but what is the best and smartest way to approach the code base so that one understands it quickly and can start delivering?
Note: I tried my hand at StarUML and tried to reverse-engineer the class diagrams so that I might get a rough idea of the product's internal design, but failed miserably.
EDIT: The question is not about gaining knowledge about what the product does but how the internals are designed.
Fixing bugs and debugging using breakpoints does provide one way of achieving this, but I was wondering if there is an even faster way to achieve it.
In Keith's Words:
This may work for some code-bases, but in general I think it's a bad idea. You tend to be too focused on the details, while at first you want to get the big picture: what the classes are, what the communication patterns are, etc. Plus, if you have a distributed application (client-server, n-tier, etc.) or code that takes a long time to run, it may not be practical to run it through a debugger.
I'm a contract engineer, and this situation is routine several times per year—for the last few decades.
I find it quite helpful to first run the application and play with it—before looking at any code:
What the heck does it do? If necessary, read the user documentation.
What happens with extreme values?
What if I leave out some values?
What happens if I click on a control rapidly?
Is there any way to misuse the program?
Explore the edges of the application: are there seldom used or hard-to-find sub-menus? Is there a configuration facility which exposes more functionality?
While I'm doing that, I'm constructing a mental model of how I would have implemented it. Surprisingly, this user-oriented first encounter with the product usually causes my understanding of the application to be head and shoulders above that of the developers who have worked on it for a long time. A side effect of this approach is that I tend to find quite a few bugs (often quite an avalanche of them), and think of quite a few improvements which should be made.
After that, I look at the general structure of the program, whether it be modules, classes, files, or schema. I don't look at individual lines of code, except those showing the program's architecture. Once I think I understand over half of the structure, I try to make a small bug fix or improvement - something which takes a few minutes to write, but may take hours to properly understand. If it works, I make a slightly bigger change somewhere, preferably in another section of the code.
In this way, I've found it possible to get a good-enough understanding of roughly 50,000 to 100,000 lines of code per day.
If you have a development environment to run the code in, the best way I've found is to use a debugger and watch the flow of the code while executing it. You can set up breakpoints and step through it to see how the code interacts.
The way I have always learned, besides just reading through the code / data model is to start fixing some bugs. That gives me exposure to various parts of the system, and having the 'purpose' while reading the code makes it a bit more meaningful.
Ask everyone you can find for help and ask them to ask anyone else they think could be helpful.
There are tools which suck up the source code and draw pictures. Try Enterprise Architect from Sparx. It's under $200 per seat and will show you the object layout very effectively.

Is C++ a "waste of time"? [closed]

I ran into this supposed interview of Bjarne Stroustrup, the inventor of C++.
http://artlung.com/smorgasborg/Invention_of_Cplusplus.shtml
Stroustrup: Well, it's been long enough, now, and I believe most people have figured out for themselves that C++ is a waste of time but, I must say, it's taken them a lot longer than I thought it would...
Interviewer: Yes, but C++ is basically a sound language.
Stroustrup: You really believe that, don't you? Have you ever sat down and worked on a C++ project? Here's what happens: First, I've put in enough pitfalls to make sure that only the most trivial projects will work first time. Take operator overloading. At the end of the project, almost every module has it, usually, because guys feel they really should do it, as it was in their training course. The same operator then means something totally different in every module. Try pulling that lot together, when you have a hundred or so modules. And as for data hiding, God, I sometimes can't help laughing when I hear about the problems companies have making their modules talk to each other.
Is this a hoax? Do any of these points seem true for any of the veteran C++ programmers out there?
You just have to check Stroustrup's website (the FAQ section) to find that it's wrong - a well-known hoax, as Judah Himango already pointed out:
Did you really give an interview to IEEE in which you confessed that C++ was deliberately created as an awful language for writing unmaintainable code to increase programmers' salaries? Of course not. Read the real IEEE interview.
It's a well-known hoax.
And no, learning C++ isn't a waste of your time, something that's been discussed on StackOverflow many times.
As mentioned, this is a well-known hoax.
But it does provoke some interesting points. These days C++ is a waste of time, except for when you can't afford to waste time. Less opaquely: C++ is a waste of development time, except for when you can't afford to waste execution time.
From the article titled "The Real Stroustrup Interview" in IEEE Computer Magazine Vol. 31 Issue 6 pp.110-114 (June 1998):
For the past few months, a hoax interview between Stroustrup and Computer has been making the rounds in cyberspace. While we regret the incident, it offers us a welcome opportunity to have the father of C++ share his insights on Standard C++ and software development in general. We can also attest to his continued sense of proportion and humor—he suggests that the fictitious interview would have been a much funnier parody had he written it himself.
As others have mentioned, this interview is a hoax.
Well, I am one of the people who hate C++ and normally don't use it, but learning it was definitely not a waste of time. At least now I know why I hate C++, and I understand why other people use this language and think it is good.
If you want to learn this language to know about its concepts, its benefits and its drawbacks, to be able to read code written in it, and in general to be able to "talk about" it, it is never a waste of time. The same goes for any other programming language. It will increase your experience. For example, C++ shows one common way of doing OOP - a way I don't like, but a way many other people use.
But if you want to learn it because "people say it is the best" (as I sometimes read), then it really is a waste of time. The same goes for any other programming language.
Programmers who feel attracted to higher-level languages that take care of memory management and other tasks for them could feel that C++ is a waste of time.
It certainly is if you can achieve the same goal with another language in less time, with less bug fixing, and you don't mind downsides such as reduced efficiency.
But I don't regret having learned and spent so many hours coding in C/C++ for it's such a beautiful language and allows you to produce things that not many other languages can.
I mean, don't you want to use the language with which operating systems and compilers are written? That's not a waste of time at all from my perspective.
C++ is far from being a waste of your time. You'll learn valuable concepts that will help you understand many other concepts in different programming languages, e.g. the vtable.
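As a minimal example of the mechanism behind that term (the actual table layout is implementation-defined; this just shows the observable behaviour of dynamic dispatch):

```cpp
#include <cstdio>

// Each class with virtual functions gets a table of function pointers
// (the vtable); a call through a base reference or pointer is resolved
// at run time by looking up the right entry in that table.
struct Shape {
    virtual double area() const { return 0.0; }
    virtual ~Shape() = default;
};

struct Square : Shape {
    double side;
    explicit Square(double s) : side(s) {}
    double area() const override { return side * side; }
};

void report(const Shape& s) {
    // Dispatched through the vtable: picks Square::area even though the
    // static type here is Shape.
    std::printf("area = %g\n", s.area());
}

int main() {
    Square sq(3.0);
    report(sq);  // prints "area = 9"
}
```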
There is not a single framework which uses all the language features of C++. This introduces a huge inconsistency into the language's ecosystem.
Qt is one of the few APIs which I would call a framework (or an API for a lot of things):
But it defines its own string, its own array, ...
What's the point of a "standard" library when no one can use it in a portable and compatible way (from the aspect of interaction with other APIs)?
I know there is Boost, but what is Boost compared to an API such as Qt? Nothing.
Look at Java: there is the standard Java API, and every "foreign" API uses it; it's all perfectly compatible.
C++ (and Java) are probably the best languages to learn in order to understand the concepts of OOP.
I remember learning it in college benefited me a lot.
Stroustrup is not so stupid as to say that! It is definitely a hoax!

'Head First' Style Data Structures & Algorithms Book? [closed]

I loved the Head First series book on object oriented design. It was a very gentle and funny introduction to the subject. I am currently taking a data structures class and find the text we are using (Kruse/Ryba, Data Structures and Program Design in C++) to be very dry and hard to comprehend. This is mostly due, I think, to my own limitations in the area of mathematics.
Does anyone know of a Data Structures text that is written in a lighter style, with a sense of humor, that still covers all the basics like Binary Trees, B Trees, and Graphs?
The Algorithm Design Manual by Steve Skiena isn't exactly a barrel of laughs, but it's relatively light on the deeper mathematics and contains lots of what he calls "War Stories", which are illustrative examples from real world situations where algorithm work really paid off (or, sometimes, totally failed). He's also got his audio and video lectures online, and he's got a nice lecture style with bits of humor interspersed, so it might be what you are looking for.
This one is not light either, but it is pretty decent:
Algorithms and data structures by Robert Lafore
There is nothing more readable and meaningful, in my opinion, than http://www.amazon.com/Bundle-Algorithms-Parts-1-5-Fundamentals/dp/020172684X
It's two books, with part 5 covering graphs; that part is not as useful as the other book - unless, of course, you want to use graphs to solve a problem. :)
How to Solve It by Computer by Dromey, though not exactly an algorithms book, takes the approach of re-discovering the process by which many data structures and algorithms were arrived at over the years. This lets us understand the flow of thought behind the code and some of the forces at work.
Related: This book follows in the foot-steps of another great book: How to Solve It by G. Polya which talks about how great mathematicians go about the problem-solving process.
I'm currently using Larry Nyhoff's ADTs, Data Structures, and Problem Solving with C++.
It's not as light or enjoyable to read as a Head First series book, but it's really well detailed on binary trees, b trees, and graphs. Its code samples have been really helpful for completing my assignments. No higher math knowledge is required to understand the text (except, of course, on the chapter dedicated to algorithm analysis).
Beginning Algorithms by Harris and Ross (a Wrox Press book) was one I liked, although its examples are presented in Java, not C++. Might be a nice accompaniment to the text you're trudging through in class.
I've heard good things about "Introduction to Algorithms: A Creative Approach" by Udi Manber.
I can't verify it though, since it's not available locally :(
http://www.amazon.com/Introduction-Algorithms-Creative-Udi-Manber/dp/0201120372

How would you go about evaluating a programmer? [closed]

A few weeks ago, I was assigned to evaluate all our programmers. I'm very uncomfortable with this since I was the one who taught everyone the shop's programming language (they all came out of college not knowing the language and, as luck would have it, I'm very proficient with it). On the evaluation, I was very biased about their performance (perfect scores).
I'm glad that our programming shop doesn't require an average performance level but I heard horror stories of shops which do require an average level.
My questions are as follows:
As a programmer, what evaluation questions would you like to see?
As a manager, what evaluation questions would you like to see?
As the evaluator, how can you prevent bias in your evaluation?
I would love to remove the evaluation test. Are there any advantages to having an evaluation test? Any disadvantages?
"Gets things done" is really all you need to evaluate a developer. After that you look at the quality the developer generates. Do they write unit tests and believe in testing and being responsible for the code they generate? Do they take the initiative to fix bugs without being assigned them? Are they passionate about coding? Are they constantly learning, trying to find better ways to accomplish a task or make a process better? These questions are pretty much how I judge developers directly under me. If they are not directly under you and they don't report to you, then you really shouldn't be evaluating them. If you are assigned to evaluate programmers who aren't under you, then you need to be proactive to answer the above questions about them, which can be hard.
You can't remove the evaluation test. I know it can become tedious sometimes, but I actually enjoy doing it, and it's invaluable for the developer you are evaluating. You need to be a manager who cares about how your developers do. You are a direct reflection of them, as they are of you. One question I always leave up to the developer is for them to evaluate me. The evaluation needs to be a two-way street.
I also have to evaluate against a cookie-cutter list of questions, which I do, but I always add the above and try to make the evaluation fun and a learning exercise during the one-on-one time I have with the developer; it is all about the developer you are reviewing.
I would first consider not necessarily the number of lines of code, but the value the person's code adds, relative of course to what they are assigned to do. Someone told to maintain code versus building a new app is very different. Also consider how the person uses new techniques to keep the code relevant and updated. How maintainable is the code the person creates? Do they do things in a manner that is logical and understandable to the rest of the team? Does their coding improve the app or just wreck it? Last but not least, does their coding improve over time?
What about getting everyone's input? Everyone that a person is working with will have a unique insight into that person. One person might think someone is a slacker, while another person sees that they are spending a lot of time planning before they start coding, etc.
What about getting everyone's input? Everyone that a person is working with will have a unique insight into that person.
That would work if (1) the evaluation is conducted with open doors and (2) you've worked with that person on at least one project or even on the same module. As the person evaluating them, I couldn't judge the programmers whom I didn't directly work with.
One person might think someone is a slacker, while another person sees that they are spending a lot of time planning before they start coding
Unfortunately, this is debatable. Someone who looks like a slacker might be deep in thought, or maybe not. And is someone who spends a long time planning necessarily a bad programmer?
I believe a good evaluation question would be able to answer this.