In quantum mechanics, the uncertainty principle (also known as Heisenberg's uncertainty principle) is any of a variety of mathematical inequalities asserting a fundamental limit to the precision with which certain pairs of physical properties of a particle, known as complementary variables or canonically conjugate variables such as position x and momentum p, can be known. - Wikipedia
The basic idea behind this principle is that taking a measurement influences the thing you are measuring.
In microcontrollers we all get introduced to this idea at some point: you are trying to figure out whether the oscillator is running, and when you measure it with a scope probe you realize - of course only after a couple of hours of struggling - that the 10 pF capacitive load the probe adds to the pin is causing the oscillator, which was working just fine, to stop dead.
In software we have similar problems. First we find out we cannot set breakpoints because the optimizer has reordered and merged the instructions so much that they no longer correlate with the C statements. We are advised to disable optimization so we can set a breakpoint and inspect what is going on, only to discover that with optimizations disabled the problem is gone. We think we are smart and add printf's to the code, side-stepping the optimization problem entirely, only to end up with working code once again - and the moment we remove the prints it breaks! It is almost as if the bug disappears the moment you start looking at it ...
These "Heisenbugs" are usually an indication of some kind of concurrency problem or race condition, where timing is a critical part of exposing the problem.
The oldest reference to a "Heisenbug" is a 1983 paper published by the ACM.
As I often do, here is the link to Heisenbug on Wikipedia.