Search the Community
Showing results for tags 'lore'.
-
The story of Mel Kaye is perhaps my favorite bit of Lore. Mel is often referred to as a "Real Programmer", a nod to Real Men Don't Eat Quiche, Bruce Feirstein's tongue-in-cheek book satirizing stereotypes about masculinity, which in turn led Ed Post to speculate about the properties of the "Real Programmer" in his essay Real Programmers Don't Use Pascal, but I digress. The story itself is famous enough to have its own Wikipedia page. Even though it was written back in 1983, 36 years ago, it still resonates today. The story shows us, by the mistakes of others, that doing something in an impressively complex way may well demonstrate impressive skill and truly awe onlookers, but it is not the best way to do things in the real world, where our code needs to be maintained later. A good friend once told me that genius is like a fertile field: whatever its potential to grow fantastic crops, it is equally capable of growing the most impressive weeds if left untended, as Mel shows us first-hand here. Without further ado, here it is for your reading pleasure:

The story of Mel

A recent article devoted to the *macho* side of programming made the bald and unvarnished statement: Real Programmers write in Fortran.

Maybe they do now, in this decadent era of Lite beer, hand calculators and "user-friendly" software but back in the Good Old Days, when the term "software" sounded funny and Real Computers were made out of drums and vacuum tubes, Real Programmers wrote in machine code. Not Fortran. Not RATFOR. Not, even, assembly language. Machine Code. Raw, unadorned, inscrutable hexadecimal numbers. Directly.

Lest a whole new generation of programmers grow up in ignorance of this glorious past, I feel duty-bound to describe, as best I can through the generation gap, how a Real Programmer wrote code. I'll call him Mel, because that was his name.

I first met Mel when I went to work for Royal McBee Computer Corp., a now-defunct subsidiary of the typewriter company. The firm manufactured the LGP-30, a small, cheap (by the standards of the day) drum-memory computer, and had just started to manufacture the RPC-4000, a much-improved, bigger, better, faster -- drum-memory computer. Cores cost too much, and weren't here to stay, anyway. (That's why you haven't heard of the company, or the computer.)

I had been hired to write a Fortran compiler for this new marvel and Mel was my guide to its wonders. Mel didn't approve of compilers. "If a program can't rewrite its own code," he asked, "what good is it?"

Mel had written, in hexadecimal, the most popular computer program the company owned. It ran on the LGP-30 and played blackjack with potential customers at computer shows. Its effect was always dramatic. The LGP-30 booth was packed at every show, and the IBM salesmen stood around talking to each other. Whether or not this actually sold computers was a question we never discussed.

Mel's job was to re-write the blackjack program for the RPC-4000. (Port? What does that mean?) The new computer had a one-plus-one addressing scheme, in which each machine instruction, in addition to the operation code and the address of the needed operand, had a second address that indicated where, on the revolving drum, the next instruction was located. In modern parlance, every single instruction was followed by a GO TO! Put *that* in Pascal's pipe and smoke it.
Mel loved the RPC-4000 because he could optimize his code: that is, locate instructions on the drum so that just as one finished its job, the next would be just arriving at the "read head" and available for immediate execution. There was a program to do that job, an "optimizing assembler", but Mel refused to use it. "You never know where it's going to put things", he explained, "so you'd have to use separate constants". It was a long time before I understood that remark.

Since Mel knew the numerical value of every operation code, and assigned his own drum addresses, every instruction he wrote could also be considered a numerical constant. He could pick up an earlier "add" instruction, say, and multiply by it, if it had the right numeric value. His code was not easy for someone else to modify.

I compared Mel's hand-optimized programs with the same code massaged by the optimizing assembler program, and Mel's always ran faster. That was because the "top-down" method of program design hadn't been invented yet, and Mel wouldn't have used it anyway. He wrote the innermost parts of his program loops first, so they would get first choice of the optimum address locations on the drum. The optimizing assembler wasn't smart enough to do it that way.

Mel never wrote time-delay loops, either, even when the balky Flexowriter required a delay between output characters to work right. He just located instructions on the drum so each successive one was just *past* the read head when it was needed; the drum had to execute another complete revolution to find the next instruction. He coined an unforgettable term for this procedure. Although "optimum" is an absolute term, like "unique", it became common verbal practice to make it relative: "not quite optimum" or "less optimum" or "not very optimum". Mel called the maximum time-delay locations the "most pessimum".

After he finished the blackjack program and got it to run, ("Even the initializer is optimized", he said proudly) he got a Change Request from the sales department. The program used an elegant (optimized) random number generator to shuffle the "cards" and deal from the "deck", and some of the salesmen felt it was too fair, since sometimes the customers lost. They wanted Mel to modify the program so, at the setting of a sense switch on the console, they could change the odds and let the customer win.

Mel balked. He felt this was patently dishonest, which it was, and that it impinged on his personal integrity as a programmer, which it did, so he refused to do it. The Head Salesman talked to Mel, as did the Big Boss and, at the boss's urging, a few Fellow Programmers. Mel finally gave in and wrote the code, but he got the test backwards, and, when the sense switch was turned on, the program would cheat, winning every time. Mel was delighted with this, claiming his subconscious was uncontrollably ethical, and adamantly refused to fix it.

After Mel had left the company for greener pa$ture$, the Big Boss asked me to look at the code and see if I could find the test and reverse it. Somewhat reluctantly, I agreed to look. Tracking Mel's code was a real adventure.

I have often felt that programming is an art form, whose real value can only be appreciated by another versed in the same arcane art; there are lovely gems and brilliant coups hidden from human view and admiration, sometimes forever, by the very nature of the process. You can learn a lot about an individual just by reading through his code, even in hexadecimal. Mel was, I think, an unsung genius.
Perhaps my greatest shock came when I found an innocent loop that had no test in it. No test. *None*. Common sense said it had to be a closed loop, where the program would circle, forever, endlessly. Program control passed right through it, however, and safely out the other side. It took me two weeks to figure it out.

The RPC-4000 computer had a really modern facility called an index register. It allowed the programmer to write a program loop that used an indexed instruction inside; each time through, the number in the index register was added to the address of that instruction, so it would refer to the next datum in a series. He had only to increment the index register each time through. Mel never used it.

Instead, he would pull the instruction into a machine register, add one to its address, and store it back. He would then execute the modified instruction right from the register. The loop was written so this additional execution time was taken into account -- just as this instruction finished, the next one was right under the drum's read head, ready to go. But the loop had no test in it.

The vital clue came when I noticed the index register bit, the bit that lay between the address and the operation code in the instruction word, was turned on -- yet Mel never used the index register, leaving it zero all the time. When the light went on it nearly blinded me. He had located the data he was working on near the top of memory -- the largest locations the instructions could address -- so, after the last datum was handled, incrementing the instruction address would make it overflow. The carry would add one to the operation code, changing it to the next one in the instruction set: a jump instruction. Sure enough, the next program instruction was in address location zero, and the program went happily on its way.

I haven't kept in touch with Mel, so I don't know if he ever gave in to the flood of change that has washed over programming techniques since those long-gone days. I like to think he didn't. In any event, I was impressed enough that I quit looking for the offending test, telling the Big Boss I couldn't find it. He didn't seem surprised. When I left the company, the blackjack program would still cheat if you turned on the right sense switch, and I think that's how it should be. I didn't feel comfortable hacking up the code of a Real Programmer.

-- Source: usenet: utastro!nather, May 21, 1983.
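The overflow trick is easier to see in code than in prose. Here is a minimal sketch in C of the mechanism the narrator describes, using a made-up instruction encoding (the real RPC-4000 word layout, with its second "next instruction" drum address, was different): incrementing an operand address that already sits at the top of memory carries over into the opcode field and turns the instruction into the next opcode in the set.

```c
#include <stdio.h>
#include <stdint.h>

/* Toy encoding, purely illustrative: the low 12 bits hold the operand
   address, the bits above them hold the opcode. */
#define ADDR_BITS 12u
#define ADDR_MASK ((1u << ADDR_BITS) - 1u)

int main(void)
{
    uint16_t opcode = 0x3; /* pretend 0x3 is "load" and 0x4 is "jump" */
    /* The instruction's operand address is at the very top of memory. */
    uint16_t insn = (uint16_t)((opcode << ADDR_BITS) | ADDR_MASK);

    insn += 1; /* Mel's loop: bump the address to reach the next datum... */

    /* ...but the address was already maximal, so the carry ripples into
       the opcode field: the "load" silently becomes a "jump" to zero. */
    printf("opcode 0x%X, address 0x%03X\n",
           insn >> ADDR_BITS, insn & ADDR_MASK); /* opcode 0x4, address 0x000 */
    return 0;
}
```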
-
Melvin Conway quipped, in his 1968 paper "How Do Committees Invent?", that "organizations which design systems ... are constrained to produce designs which are copies of the communication structures of these organizations." Over the decades this old adage has proven to be quite accurate, and it has become known as "Conway's Law". Researchers from MIT and Harvard have since shown that there is strong evidence for this correlation; they called it "The Mirroring Hypothesis".

When you read "The Mythical Man-Month" by Fred Brooks you see that we already knew back in the seventies that there is no silver bullet when it comes to Software Engineering, and that the reason for this is essentially the complexity of software and how we deal with it. It turns out that adding more people to a software project increases the number of people each of us needs to communicate with and the number of people who need to understand the code. When we just make one big team where everyone has to communicate with everyone, the code tends to reflect this structure. As we can see, the more people we add to such a team, the more the structure quickly starts to resemble something we all know all too well!

When we follow the age-old technique of divide and conquer, making small Agile teams that each work on a part of the code which is their single responsibility, it turns out that we end up getting encapsulation and modularity, with dependencies managed between the modules. No wonder the world is embracing Agile everywhere nowadays! You can of course do your own research on this; here are some org charts of some well known companies out there you can use to check the hypothesis for yourself!
-
Exactly how long is a nanosecond? This Lore blog is all about standing on the shoulders of giants. Back in February 1944 IBM shipped the Harvard Mark I to Harvard University. It looked like this:

The Mark I was a remarkable machine for its time: it could perform an addition in 1 cycle (which took roughly 0.3 seconds) and a multiplication in 20 cycles, or 6 seconds. Calculating sin(x) could take up to 60 seconds (a full minute). The team that ran this electromechanical computer included a remarkable young giant, Grace Brewster Murray Hopper, who went on to become a Rear Admiral in the US Navy before her retirement some 43 years later. During her career as one of the first and finest computer scientists she was involved in a myriad of innovations. In 1949 she proposed the concept of a human-readable language made up entirely of English words to program a computer. Her idea was readily dismissed because "computers do not understand English". It was only 3 years later that her idea gained traction, when she published a paper entitled "The Education of a Computer" on the subject. She called this program a "compiler". In 1959 she was part of the team commissioned to develop a new programming language for business use. It was called COBOL.

But enough of the introductions already! If we had to go over everything Grace Hopper accomplished we would be here all day - here is the Wikipedia page on Grace Hopper for the uninitiated who do not know who this remarkable woman was.

I have never seen a better explanation of orders of magnitude and scale than the one Grace Hopper was known for: explaining how long a nanosecond really is. Here is a video from the archives of her explaining this brilliantly in layman's terms in just under 2 minutes. I also love the interview Letterman did with her back in 1986, where she explained this briefly on national television. For television she stepped her game up a notch and took the nanosecond to the next level, also explaining through an analogy how long a picosecond is! This 10 minute interview really goes a long way in capturing the essence of who Grace Hopper really was - a remarkable pioneer in Engineering indeed!
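Her prop works because light covers just under 30 cm in one nanosecond - that is where her famous 11.8 inch pieces of wire came from. A quick back-of-the-envelope in C (nothing assumed here beyond the speed of light) reproduces her numbers:

```c
#include <stdio.h>

int main(void)
{
    const double c = 299792458.0; /* speed of light in m/s */
    double per_ns = c * 1e-9;     /* metres light travels in one nanosecond */
    double per_ps = c * 1e-12;    /* ...and in one picosecond */

    /* Hopper's nanosecond wire: just under 30 cm, about 11.8 inches. */
    printf("1 ns of light travel: %.1f cm (%.1f in)\n",
           per_ns * 100.0, per_ns * 100.0 / 2.54);
    /* Her television-grade picosecond prop was a packet of ground pepper:
       each grain is roughly a picosecond of light travel across. */
    printf("1 ps of light travel: %.2f mm\n", per_ps * 1000.0);
    return 0;
}
```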
-
The basic idea behind this principle is that taking a measurement influences the thing you are measuring. In microcontrollers we all get introduced to this idea at some point: trying to figure out whether the oscillator is running, you measure it with a scope probe, and you realize - of course only after a couple of hours of struggling - that the 10 pF load the probe adds to the pin is actually causing the oscillator, which was working just fine, to stop dead.

In software we have similar problems. First we find out we cannot set breakpoints because the optimizer has rearranged the instructions so much that they no longer correlate to the C statements. We are advised to disable optimization so that we can set a breakpoint and inspect what is going on, only to discover that with optimizations disabled the problem is gone. We think we are smart and add printf's to the code, side-stepping the optimization problem entirely, only to end up with working code once again - and the moment we remove the prints, it breaks! It is almost like the bug disappears the moment you start looking at it ...

These "Heisenbugs" are usually an indication that there is some kind of concurrency problem or race condition, where the timing of the code is a critical part of the problem being exposed. The oldest reference to a "Heisenbug" is a 1983 paper in the ACM. As I often do, here is the link to Heisenbug on Wikipedia.
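Here is a minimal sketch of how such a bug can arise in embedded C (the names and scenario are invented for illustration): a flag shared with an interrupt handler is missing its volatile qualifier, the optimizer hoists the load out of the loop, and adding a printf - an external call the compiler must assume can change any global - silently puts the re-read back and hides the bug.

```c
#include <stdio.h>

/* Hypothetical flag set from an interrupt handler.
   BUG: it should be declared "volatile int data_ready;". */
int data_ready = 0;

void wait_for_data(void)
{
    /* With optimization on, the compiler may legally read data_ready
       once, cache it in a register, and spin here forever. */
    while (!data_ready) {
        /* printf("waiting...\n"); */
        /* Un-commenting the printf forces data_ready to be re-read on
           every pass (an external call might modify any global), so the
           "fix" is really the observer effect in action. */
    }
}

int main(void)
{
    data_ready = 1;  /* stand-in for the interrupt firing */
    wait_for_data(); /* returns immediately in this toy setup */
    puts("done");
    return 0;
}
```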
-
Epigrams on Programming
Alan J. Perlis, Yale University
This text has been published in SIGPLAN Notices Vol. 17, No. 9, September 1982, pages 7 - 13.

The phenomena surrounding computers are diverse and yield a surprisingly rich base for launching metaphors at individual and group activities. Conversely, classical human endeavors provide an inexhaustible source of metaphor for those of us who are in labor within computation. Such relationships between society and device are not new, but the incredible growth of the computer's influence (both real and implied) lends this symbiotic dependency a vitality like a gangly youth growing out of his clothes within an endless puberty. The epigrams that follow attempt to capture some of the dimensions of this traffic in imagery that sharpens, focuses, clarifies, enlarges and beclouds our view of this most remarkable of all man's artifacts, the computer.

One man's constant is another man's variable.
Functions delay binding: data structures induce binding. Moral: Structure data late in the programming process.
Syntactic sugar causes cancer of the semi-colons.
Every program is a part of some other program and rarely fits.
If a program manipulates a large amount of data, it does so in a small number of ways.
Symmetry is a complexity reducing concept (co-routines include sub-routines); seek it everywhere.
It is easier to write an incorrect program than understand a correct one.
A programming language is low level when its programs require attention to the irrelevant.
It is better to have 100 functions operate on one data structure than 10 functions on 10 data structures.
Get into a rut early: Do the same processes the same way. Accumulate idioms. Standardize. The only difference (!) between Shakespeare and you was the size of his idiom list - not the size of his vocabulary.
If you have a procedure with 10 parameters, you probably missed some.
Recursion is the root of computation since it trades description for time.
If two people write exactly the same program, each should be put in micro-code and then they certainly won't be the same.
In the long run every program becomes rococo - then rubble.
Everything should be built top-down, except the first time.
Every program has (at least) two purposes: the one for which it was written and another for which it wasn't.
If a listener nods his head when you're explaining your program, wake him up.
A program without a loop and a structured variable isn't worth writing.
A language that doesn't affect the way you think about programming, is not worth knowing.
Wherever there is modularity there is the potential for misunderstanding: Hiding information implies a need to check communication.
Optimization hinders evolution.
A good system can't have a weak command language.
To understand a program you must become both the machine and the program.
Perhaps if we wrote programs from childhood on, as adults we'd be able to read them.
One can only display complex information in the mind. Like seeing, movement or flow or alteration of view is more important than the static picture, no matter how lovely.
There will always be things we wish to say in our programs that in all known languages can only be said poorly.
Once you understand how to write a program get someone else to write it.
Around computers it is difficult to find the correct unit of time to measure progress. Some cathedrals took a century to complete. Can you imagine the grandeur and scope of a program that would take as long?
For systems, the analogue of a face-lift is to add to the control graph an edge that creates a cycle, not just an additional node.
In programming, everything we do is a special case of something more general - and often we know it too quickly.
Simplicity does not precede complexity, but follows it.
Programmers are not to be measured by their ingenuity and their logic but by the completeness of their case analysis.
The 11th commandment was "Thou Shalt Compute" or "Thou Shalt Not Compute" - I forget which.
The string is a stark data structure and everywhere it is passed there is much duplication of process. It is a perfect vehicle for hiding information.
Everyone can be taught to sculpt: Michelangelo would have had to be taught how not to. So it is with the great programmers.
The use of a program to prove the 4-color theorem will not change mathematics - it merely demonstrates that the theorem, a challenge for a century, is probably not important to mathematics.
The most important computer is the one that rages in our skulls and ever seeks that satisfactory external emulator. The standardization of real computers would be a disaster - and so it probably won't happen.
Structured Programming supports the law of the excluded muddle.
Re graphics: A picture is worth 10K words - but only those to describe the picture. Hardly any sets of 10K words can be adequately described with pictures.
There are two ways to write error-free programs; only the third one works.
Some programming languages manage to absorb change, but withstand progress.
You can measure a programmer's perspective by noting his attitude on the continuing vitality of FORTRAN.
In software systems it is often the early bird that makes the worm.
Sometimes I think the only universal in the computing field is the fetch-execute-cycle.
The goal of computation is the emulation of our synthetic abilities, not the understanding of our analytic ones.
Like punning, programming is a play on words.
As Will Rogers would have said, "There is no such thing as a free variable."
The best book on programming for the layman is "Alice in Wonderland"; but that's because it's the best book on anything for the layman.
Giving up on assembly language was the apple in our Garden of Eden: Languages whose use squanders machine cycles are sinful. The LISP machine now permits LISP programmers to abandon bra and fig-leaf.
When we understand knowledge-based systems, it will be as before - except our finger-tips will have been singed.
Bringing computers into the home won't change either one, but may revitalize the corner saloon.
Systems have sub-systems and sub-systems have sub-systems and so on ad infinitum - which is why we're always starting over.
So many good ideas are never heard from again once they embark in a voyage on the semantic gulf.
Beware of the Turing tar-pit in which everything is possible but nothing of interest is easy.
A LISP programmer knows the value of everything, but the cost of nothing.
Software is under a constant tension. Being symbolic it is arbitrarily perfectible; but also it is arbitrarily changeable.
It is easier to change the specification to fit the program than vice versa.
Fools ignore complexity. Pragmatists suffer it. Some can avoid it. Geniuses remove it.
In English every word can be verbed. Would that it were so in our programming languages.
Dana Scott is the Church of the Lattice-Way Saints.
In programming, as in everything else, to be in error is to be reborn.
In computing, invariants are ephemeral.
When we write programs that "learn", it turns out we do and they don't.
Often it is means that justify ends: Goals advance technique and technique survives even when goal structures crumble.
Make no mistake about it: Computers process numbers - not symbols. We measure our understanding (and control) by the extent to which we can arithmetize an activity.
Making something variable is easy. Controlling duration of constancy is the trick.
Think of all the psychic energy expended in seeking a fundamental distinction between "algorithm" and "program".
If we believe in data structures, we must believe in independent (hence simultaneous) processing. For why else would we collect items within a structure? Why do we tolerate languages that give us the one without the other?
In a 5 year period we get one superb programming language. Only we can't control when the 5 year period will begin.
Over the centuries the Indians developed sign language for communicating phenomena of interest. Programmers from different tribes (FORTRAN, LISP, ALGOL, SNOBOL, etc.) could use one that doesn't require them to carry a blackboard on their ponies.
Documentation is like term insurance: It satisfies because almost no one who subscribes to it depends on its benefits.
An adequate bootstrap is a contradiction in terms.
It is not a language's weaknesses but its strengths that control the gradient of its change: Alas, a language never escapes its embryonic sac.
It is possible that software is not like anything else, that it is meant to be discarded: that the whole point is to always see it as a soap bubble?
Because of its vitality, the computing field is always in desperate need of new cliches: Banality soothes our nerves.
It is the user who should parameterize procedures, not their creators.
The cybernetic exchange between man, computer and algorithm is like a game of musical chairs: The frantic search for balance always leaves one of the three standing ill at ease.
If your computer speaks English it was probably made in Japan.
A year spent in artificial intelligence is enough to make one believe in God.
Prolonged contact with the computer turns mathematicians into clerks and vice versa.
In computing, turning the obvious into the useful is a living definition of the word "frustration".
We are on the verge: Today our program proved Fermat's next-to-last theorem!
What is the difference between a Turing machine and the modern computer? It's the same as that between Hillary's ascent of Everest and the establishment of a Hilton hotel on its peak.
Motto for a research laboratory: What we work on today, others will first think of tomorrow.
Though the Chinese should adore APL, it's FORTRAN they put their money on.
We kid ourselves if we think that the ratio of procedure to data in an active data-base system can be made arbitrarily small or even kept small.
We have the mini and the micro computer. In what semantic niche would the pico computer fall?
It is not the computer's fault that Maxwell's equations are not adequate to design the electric motor.
One does not learn computing by using a hand calculator, but one can forget arithmetic.
Computation has made the tree flower.
The computer reminds one of Lon Chaney - it is the machine of a thousand faces.
The computer is the ultimate polluter. Its feces are indistinguishable from the food it produces.
When someone says "I want a programming language in which I need only say what I wish done," give him a lollipop.
Interfaces keep things tidy, but don't accelerate growth: Functions do.
Don't have good ideas if you aren't willing to be responsible for them.
Computers don't introduce order anywhere as much as they expose opportunities.
When a professor insists computer science is X but not Y, have compassion for his graduate students.
In computing, the mean time to failure keeps getting shorter.
In man-machine symbiosis, it is man who must adjust: The machines can't.
We will never run out of things to program as long as there is a single program around.
Dealing with failure is easy: Work hard to improve. Success is also easy to handle: You've solved the wrong problem. Work hard to improve.
One can't proceed from the informal to the formal by formal means.
Purely applicative languages are poorly applicable.
The proof of a system's value is its existence.
You can't communicate complexity, only an awareness of it.
It's difficult to extract sense from strings, but they're the only communication coin we can count on.
The debate rages on: Is PL/I Bactrian or Dromedary?
Whenever two programmers meet to criticize their programs, both are silent.
Think of it! With VLSI we can pack 100 ENIACs in 1 sq.cm.
Editing is a rewording activity.
Why did the Roman Empire collapse? What is the Latin for office automation?
Computer Science is embarrassed by the computer.
The only constructive theory connecting neuroscience and psychology will arise from the study of software.
Within a computer natural language is unnatural.
Most people find the concept of programming obvious, but the doing impossible.
You think you know when you learn, are more sure when you can write, even more when you can teach, but certain when you can program.
It goes against the grain of modern education to teach children to program. What fun is there in making plans, acquiring discipline in organizing thoughts, devoting attention to detail and learning to be self-critical?
If you can imagine a society in which the computer-robot is the only menial, you can imagine anything.
Programming is an unnatural act.
Adapting old programs to fit new machines usually means adapting new machines to behave like old ones.
In seeking the unattainable, simplicity only gets in the way.
If there are epigrams, there must be meta-epigrams.
Epigrams are interfaces across which appreciation and insight flow.
Epigrams parameterize auras.
Epigrams are macros, since they are executed at read time.
Epigrams crystallize incongruities.
Epigrams retrieve deep semantics from a data base that is all procedure.
Epigrams scorn detail and make a point: They are a superb high-level documentation.
Epigrams are more like vitamins than protein.
Epigrams have extremely low entropy.
The last epigram? Neither eat nor drink them, snuff epigrams.
-
If you are going to be writing any code you can probably use all the help you can get, and along those lines you had better be aware of the "Ballmer Peak". Legend has it that drinking alcohol impairs your ability to write code, BUT there is a curious peak somewhere in the vicinity of a 0.14 BAC where programmers attain almost super-human programming skill. The XKCD below tries to explain the finer nuances.

But seriously, some studies suggest there is a grain of truth to this, in the sense that an increase in BAC may remove inhibitions which stifle creativity. The name is believed to come from the Balmer series (with one L), the lines in the emission spectrum of hydrogen described in 1885 by Johann Balmer; but since this is about programming, it soon got converted to Ballmer, after the former CEO of Microsoft, who did tend to behave like he was on such a peak on stage from time to time (see https://www.youtube.com/watch?v=I14b-C67EXY). Either way, we are just going to believe that this one is true - confirmation bias and all, really - just Google it! 🙂
-
Some reading is just compulsory in Computer Science, like Shakespeare is to English; you will get no admiration from your peers if it comes out that you have never heard of the Bastard Operator From Hell. There is a whole collection of BOFH stories online here http://bofh.bjash.com/index.html#Original and even a Wikipedia page of course https://en.wikipedia.org/wiki/Bastard_Operator_From_Hell. The stories were originally posted on Usenet around 1992 by Simon Travaglia. I would recommend you start at #1 here http://bofh.bjash.com/bofh/bofh1.html For those who need some motivation to click, here is a short excerpt ...
-
They Write the Right Stuff - what we can learn from the way the Space Shuttle software was developed

As a youngster I watched the movie "The Right Stuff" about the Mercury Seven. The film deified the picture of an astronaut in my mind as some kind of superhero. Many years later I read and just absolutely loved an article written by Charles Fishman in Fast Company in 1996, entitled "They Write the Right Stuff", which did something similar for the developers who wrote the code that put those guys up there. Read the article here: https://www.fastcompany.com/28121/they-write-right-stuff

I find myself going back to this article time and again when confronted by the question of how many bugs are acceptable in our code, or even just what is possible in terms of quality. The article explores the code which ran the Space Shuttle and the processes followed by the team responsible for it. This team is one of very few certified to CMM level 5, and the quality of the code is achieved not through intelligence or heroic effort, but by a culture of quality ingrained in the processes they have created.

Here are the important numbers for the Space Shuttle code:

Lines of code: 420,000
Known bugs per version: 1
Total development cost: $200,000,000.00
Cost per line of code (1975 value): $476.00
Cost per line of code (2019 inflation-adjusted value): $2,223.00
Development team: ~100 people
Average productivity: 8.04 lines/developer/workday

The moral of the story here is that the quality everyone desires is indeed achievable, but we tend to severely underestimate the cost which goes along with this level of quality. I have used the numbers from this article countless times in arguments about the trade-off between quality and cost on my projects.

The most important lesson of this case study is that quality seems to depend primarily on the development processes the team follows and the culture of quality they adhere to. As Fishman points out, in software development projects most people focus on attempting to test quality into the software, which, as Steve McConnell pointed out, is "like weighing yourself more often" in an attempt to lose weight. Another lesson is that the process inherently accepts that people will make mistakes. Quality is not ensured by people trying really hard to avoid mistakes; instead, the process accepts that mistakes will be made and builds their detection and correction into the process itself. This means that if something slips through, it is inappropriate to blame any individual; the only sensible thing to do is to acknowledge a gap in the process, so when a bug does make it out into the wild the team will focus on how to fix the process instead of trying to lay blame on the person who made the mistake.

Overall this article is a great story which is chock-full of lessons to learn and pass on.

References
The full article is archived here: https://www.fastcompany.com/28121/they-write-right-stuff
NASA has a great piece published on the history of the software: https://history.nasa.gov/computers/Ch4-5.html
A fantastic read - the interview with Tony Macina published in Communications of the ACM in 1984: http://klabs.org/DEI/Processor/shuttle/shuttle_primary_computer_system.pdf
-
Real Programmers don't use Pascal

I mentioned this one briefly before in the story of Mel, but this series would not be complete without it.

Real Programmers Don't Use PASCAL
Ed Post
Graphic Software Systems
P.O. Box 673
25117 S.W. Parkway
Wilsonville, OR 97070
Copyright (c) 1982
(decvax | ucbvax | cbosg | pur-ee | lbl-unix)!teklabs!ogcvax!gss1144!evp

Back in the good old days -- the "Golden Era" of computers, it was easy to separate the men from the boys (sometimes called "Real Men" and "Quiche Eaters" in the literature). During this period, the Real Men were the ones that understood computer programming, and the Quiche Eaters were the ones that didn't. A real computer programmer said things like "DO 10 I=1,10" and "ABEND" (they actually talked in capital letters, you understand), and the rest of the world said things like "computers are too complicated for me" and "I can't relate to computers -- they're so impersonal". (A previous work [1] points out that Real Men don't "relate" to anything, and aren't afraid of being impersonal.)

But, as usual, times change. We are faced today with a world in which little old ladies can get computerized microwave ovens, 12 year old kids can blow Real Men out of the water playing Asteroids and Pac-Man, and anyone can buy and even understand their very own Personal Computer. The Real Programmer is in danger of becoming extinct, of being replaced by high-school students with TRASH-80s!

There is a clear need to point out the differences between the typical high-school junior Pac-Man player and a Real Programmer. Understanding these differences will give these kids something to aspire to -- a role model, a Father Figure. It will also help employers of Real Programmers to realize why it would be a mistake to replace the Real Programmers on their staff with 12 year old Pac-Man players (at a considerable salary savings).

LANGUAGES

The easiest way to tell a Real Programmer from the crowd is by the programming language he (or she) uses. Real Programmers use FORTRAN. Quiche Eaters use PASCAL. Nicklaus Wirth, the designer of PASCAL, was once asked, "How do you pronounce your name?". He replied "You can either call me by name, pronouncing it 'Veert', or call me by value, 'Worth'." One can tell immediately from this comment that Nicklaus Wirth is a Quiche Eater. The only parameter passing mechanism endorsed by Real Programmers is call-by-value-return, as implemented in the IBM/370 FORTRAN G and H compilers. Real programmers don't need abstract concepts to get their jobs done: they are perfectly happy with a keypunch, a FORTRAN IV compiler, and a beer.

Real Programmers do List Processing in FORTRAN.
Real Programmers do String Manipulation in FORTRAN.
Real Programmers do Accounting (if they do it at all) in FORTRAN.
Real Programmers do Artificial Intelligence programs in FORTRAN.

If you can't do it in FORTRAN, do it in assembly language. If you can't do it in assembly language, it isn't worth doing.

STRUCTURED PROGRAMMING

Computer science academicians have gotten into the "structured programming" rut over the past several years. They claim that programs are more easily understood if the programmer uses some special language constructs and techniques. They don't all agree on exactly which constructs, of course, and the examples they use to show their particular point of view invariably fit on a single page of some obscure journal or another -- clearly not enough of an example to convince anyone.
When I got out of school, I thought I was the best programmer in the world. I could write an unbeatable tic-tac-toe program, use five different computer languages, and create 1000 line programs that WORKED. (Really!) Then I got out into the Real World. My first task in the Real World was to read and understand a 200,000 line FORTRAN program, then speed it up by a factor of two. Any Real Programmer will tell you that all the Structured Coding in the world won't help you solve a problem like that -- it takes actual talent.

Some quick observations on Real Programmers and Structured Programming:

Real Programmers aren't afraid to use GOTOs.
Real Programmers can write five page long DO loops without getting confused.
Real Programmers enjoy Arithmetic IF statements because they make the code more interesting.
Real Programmers write self-modifying code, especially if it saves them 20 nanoseconds in the middle of a tight loop.
Real Programmers don't need comments: the code is obvious.
Since FORTRAN doesn't have a structured IF, REPEAT ... UNTIL, or CASE statement, Real Programmers don't have to worry about not using them. Besides, they can be simulated when necessary using assigned GOTOs.

Data structures have also gotten a lot of press lately. Abstract Data Types, Structures, Pointers, Lists, and Strings have become popular in certain circles. Wirth (the above-mentioned Quiche Eater) actually wrote an entire book [2] contending that you could write a program based on data structures, instead of the other way around. As all Real Programmers know, the only useful data structure is the array. Strings, lists, structures, sets -- these are all special cases of arrays and can be treated that way just as easily without messing up your programming language with all sorts of complications. The worst thing about fancy data types is that you have to declare them, and Real Programming Languages, as we all know, have implicit typing based on the first letter of the (six character) variable name.

OPERATING SYSTEMS

What kind of operating system is used by a Real Programmer? CP/M? God forbid -- CP/M, after all, is basically a toy operating system. Even little old ladies and grade school students can understand and use CP/M. Unix is a lot more complicated of course -- the typical Unix hacker never can remember what the PRINT command is called this week -- but when it gets right down to it, Unix is a glorified video game. People don't do Serious Work on Unix systems: they send jokes around the world on USENET and write adventure games and research papers.

No, your Real Programmer uses OS/370. A good programmer can find and understand the description of the IJK305I error he just got in his JCL manual. A great programmer can write JCL without referring to the manual at all. A truly outstanding programmer can find bugs buried in a 6 megabyte core dump without using a hex calculator. (I have actually seen this done.) OS/370 is a truly remarkable operating system. It's possible to destroy days of work with a single misplaced space, so alertness in the programming staff is encouraged. The best way to approach the system is through a keypunch. Some people claim there is a Time Sharing system that runs on OS/370, but after careful study I have come to the conclusion that they are mistaken.

PROGRAMMING TOOLS

What kind of tools does a Real Programmer use? In theory, a Real Programmer could run his programs by keying them into the front panel of the computer.
Back in the days when computers had front panels, this was actually done occasionally. Your typical Real Programmer knew the entire bootstrap loader by memory in hex, and toggled it in whenever it got destroyed by his program. (Back then, memory was memory -- it didn't go away when the power went off. Today, memory either forgets things when you don't want it to, or remembers things long after they're better forgotten.) Legend has it that Seymour Cray, inventor of the Cray I supercomputer and most of Control Data's computers, actually toggled the first operating system for the CDC7600 in on the front panel from memory when it was first powered on. Seymour, needless to say, is a Real Programmer.

One of my favorite Real Programmers was a systems programmer for Texas Instruments. One day, he got a long distance call from a user whose system had crashed in the middle of some important work. Jim was able to repair the damage over the phone, getting the user to toggle in disk I/O instructions at the front panel, repairing system tables in hex, reading register contents back over the phone. The moral of this story: while a Real Programmer usually includes a keypunch and lineprinter in his toolkit, he can get along with just a front panel and a telephone in emergencies.

In some companies, text editing no longer consists of ten engineers standing in line to use an 029 keypunch. In fact, the building I work in doesn't contain a single keypunch. The Real Programmer in this situation has to do his work with a text editor program. Most systems supply several text editors to select from, and the Real Programmer must be careful to pick one that reflects his personal style. Many people believe that the best text editors in the world were written at Xerox Palo Alto Research Center for use on their Alto and Dorado computers [3]. Unfortunately, no Real Programmer would ever use a computer whose operating system is called SmallTalk, and would certainly not talk to the computer with a mouse.

Some of the concepts in these Xerox editors have been incorporated into editors running on more reasonably named operating systems. EMACS and VI are probably the most well known of this class of editors. The problem with these editors is that Real Programmers consider "what you see is what you get" to be just as bad a concept in text editors as it is in women. No, the Real Programmer wants a "you asked for it, you got it" text editor -- complicated, cryptic, powerful, unforgiving, dangerous. TECO, to be precise.

It has been observed that a TECO command sequence more closely resembles transmission line noise than readable text [4]. One of the more entertaining games to play with TECO is to type your name in as a command line and try to guess what it does. Just about any possible typing error while talking with TECO will probably destroy your program, or even worse -- introduce subtle and mysterious bugs in a once working subroutine. For this reason, Real Programmers are reluctant to actually edit a program that is close to working. They find it much easier to just patch the binary object code directly, using a wonderful program called SUPERZAP (or its equivalent on non-IBM machines). This works so well that many working programs on IBM systems bear no relation to the original FORTRAN code. In many cases, the original source code is no longer available.
When it comes time to fix a program like this, no manager would even think of sending anything less than a Real Programmer to do the job -- no Quiche Eating structured programmer would even know where to start. This is called "job security".

Some programming tools NOT used by Real Programmers:

FORTRAN preprocessors like MORTRAN and RATFOR. The Cuisinarts of programming -- great for making Quiche. See comments above on structured programming.
Source language debuggers. Real Programmers can read core dumps.
Compilers with array bounds checking. They stifle creativity, destroy most of the interesting uses for EQUIVALENCE, and make it impossible to modify the operating system code with negative subscripts. Worst of all, bounds checking is inefficient.
Source code maintenance systems. A Real Programmer keeps his code locked up in a card file, because it implies that its owner cannot leave his important programs unguarded [5].

THE REAL PROGRAMMER AT WORK

Where does the typical Real Programmer work? What kind of programs are worthy of the efforts of so talented an individual? You can be sure that no real Programmer would be caught dead writing accounts-receivable programs in COBOL, or sorting mailing lists for People magazine. A Real Programmer wants tasks of earth-shaking importance (literally!):

Real Programmers work for Los Alamos National Laboratory, writing atomic bomb simulations to run on Cray I supercomputers.
Real Programmers work for the National Security Agency, decoding Russian transmissions.
It was largely due to the efforts of thousands of Real Programmers working for NASA that our boys got to the moon and back before the cosmonauts.
The computers in the Space Shuttle were programmed by Real Programmers.
Real Programmers are at work for Boeing designing the operating systems for cruise missiles.

Some of the most awesome Real Programmers of all work at the Jet Propulsion Laboratory in California. Many of them know the entire operating system of the Pioneer and Voyager spacecraft by heart. With a combination of large ground-based FORTRAN programs and small spacecraft-based assembly language programs, they can do incredible feats of navigation and improvisation, such as hitting ten-kilometer wide windows at Saturn after six years in space, and repairing or bypassing damaged sensor platforms, radios, and batteries. Allegedly, one Real Programmer managed to tuck a pattern-matching program into a few hundred bytes of unused memory in a Voyager spacecraft that searched for, located, and photographed a new moon of Jupiter.

One plan for the upcoming Galileo spacecraft mission is to use a gravity assist trajectory past Mars on the way to Jupiter. This trajectory passes within 80 +/- 3 kilometers of the surface of Mars. Nobody is going to trust a PASCAL program (or PASCAL programmer) for navigation to these tolerances.

As you can tell, many of the world's Real Programmers work for the U.S. Government, mainly the Defense Department. This is as it should be. Recently, however, a black cloud has formed on the Real Programmer horizon. It seems that some highly placed Quiche Eaters at the Defense Department decided that all Defense programs should be written in some grand unified language called "ADA" (registered trademark, DoD). For a while, it seemed that ADA was destined to become a language that went against all the precepts of Real Programming -- a language with structure, a language with data types, strong typing, and semicolons.
In short, a language designed to cripple the creativity of the typical Real Programmer. Fortunately, the language adopted by DoD has enough interesting features to make it approachable: it's incredibly complex, includes methods for messing with the operating system and rearranging memory, and Edsger Dijkstra doesn't like it [6]. (Dijkstra, as I'm sure you know, was the author of "GoTos Considered Harmful" -- a landmark work in programming methodology, applauded by Pascal Programmers and Quiche Eaters alike.) Besides, the determined Real Programmer can write FORTRAN programs in any language.

The real programmer might compromise his principles and work on something slightly more trivial than the destruction of life as we know it, providing there's enough money in it. There are several Real Programmers building video games at Atari, for example. (But not playing them. A Real Programmer knows how to beat the machine every time: no challenge in that.) Everyone working at LucasFilm is a Real Programmer. (It would be crazy to turn down the money of 50 million Star Wars fans.) The proportion of Real Programmers in Computer Graphics is somewhat lower than the norm, mostly because nobody has found a use for Computer Graphics yet. On the other hand, all Computer Graphics is done in FORTRAN, so there are a fair number of people doing Graphics in order to avoid having to write COBOL programs.

THE REAL PROGRAMMER AT PLAY

Generally, the Real Programmer plays the same way he works -- with computers. He is constantly amazed that his employer actually pays him to do what he would be doing for fun anyway, although he is careful not to express this opinion out loud. Occasionally, the Real Programmer does step out of the office for a breath of fresh air and a beer or two. Some tips on recognizing real programmers away from the computer room:

At a party, the Real Programmers are the ones in the corner talking about operating system security and how to get around it.
At a football game, the Real Programmer is the one comparing the plays against his simulations printed on 11 by 14 fanfold paper.
At the beach, the Real Programmer is the one drawing flowcharts in the sand.
A Real Programmer goes to a disco to watch the light show.
At a funeral, the Real Programmer is the one saying "Poor George. And he almost had the sort routine working before the coronary."
In a grocery store, the Real Programmer is the one who insists on running the cans past the laser checkout scanner himself, because he never could trust keypunch operators to get it right the first time.

THE REAL PROGRAMMER'S NATURAL HABITAT

What sort of environment does the Real Programmer function best in? This is an important question for the managers of Real Programmers. Considering the amount of money it costs to keep one on the staff, it's best to put him (or her) in an environment where he can get his work done.

The typical Real Programmer lives in front of a computer terminal. Surrounding this terminal are:

Listings of all programs the Real Programmer has ever worked on, piled in roughly chronological order on every flat surface in the office.
Some half-dozen or so partly filled cups of cold coffee. Occasionally, there will be cigarette butts floating in the coffee. In some cases, the cups will contain Orange Crush.
Unless he is very good, there will be copies of the OS JCL manual and the Principles of Operation open to some particularly interesting pages.
Taped to the wall is a line-printer Snoopy calendar for the year 1969.
Strewn about the floor are several wrappers for peanut butter filled cheese bars (the type that are made stale at the bakery so they can't get any worse while waiting in the vending machine).
Hiding in the top left-hand drawer of the desk is a stash of double stuff Oreos for special occasions.
Underneath the Oreos is a flow-charting template, left there by the previous occupant of the office. (Real Programmers write programs, not documentation. Leave that to the maintenance people.)

The Real Programmer is capable of working 30, 40, even 50 hours at a stretch, under intense pressure. In fact, he prefers it that way. Bad response time doesn't bother the Real Programmer -- it gives him a chance to catch a little sleep between compiles. If there is not enough schedule pressure on the Real Programmer, he tends to make things more challenging by working on some small but interesting part of the problem for the first nine weeks, then finishing the rest in the last week, in two or three 50-hour marathons. This not only impresses his manager, who was despairing of ever getting the project done on time, but creates a convenient excuse for not doing the documentation.

In general:

No Real Programmer works 9 to 5. (Unless it's 9 in the evening to 5 in the morning.)
Real Programmers don't wear neckties.
Real Programmers don't wear high heeled shoes.
Real Programmers arrive at work in time for lunch. [9]
A Real Programmer might or might not know his wife's name. He does, however, know the entire ASCII (or EBCDIC) code table.
Real Programmers don't know how to cook. Grocery stores aren't often open at 3 a.m., so they survive on Twinkies and coffee.

THE FUTURE

What of the future? It is a matter of some concern to Real Programmers that the latest generation of computer programmers are not being brought up with the same outlook on life as their elders. Many of them have never seen a computer with a front panel. Hardly anyone graduating from school these days can do hex arithmetic without a calculator. College graduates these days are soft -- protected from the realities of programming by source level debuggers, text editors that count parentheses, and user friendly operating systems. Worst of all, some of these alleged computer scientists manage to get degrees without ever learning FORTRAN! Are we destined to become an industry of Unix hackers and Pascal programmers?

On the contrary. From my experience, I can only report that the future is bright for Real Programmers everywhere. Neither OS/370 nor FORTRAN show any signs of dying out, despite all the efforts of Pascal programmers the world over. Even more subtle tricks, like adding structured coding constructs to FORTRAN have failed. Oh sure, some computer vendors have come out with FORTRAN 77 compilers, but every one of them has a way of converting itself back into a FORTRAN 66 compiler at the drop of an option card -- to compile DO loops like God meant them to be.

Even Unix might not be as bad on Real Programmers as it once was. The latest release of Unix has the potential of an operating system worthy of any Real Programmer. It has two different and subtly incompatible user interfaces, an arcane and complicated terminal driver, virtual memory. If you ignore the fact that it's structured, even C programming can be appreciated by the Real Programmer: after all, there's no type checking, variable names are seven (ten? eight?) characters long, and the added bonus of the Pointer data type is thrown in.
It's like having the best parts of FORTRAN and assembly language in one place. (Not to mention some of the more creative uses for #define.)

No, the future isn't all that bad. Why, in the past few years, the popular press has even commented on the bright new crop of computer nerds and hackers ([7] and [8]) leaving places like Stanford and M.I.T. for the Real World. From all evidence, the spirit of Real Programming lives on in these young men and women. As long as there are ill-defined goals, bizarre bugs, and unrealistic schedules, there will be Real Programmers willing to jump in and Solve The Problem, saving the documentation for later. Long live FORTRAN!

ACKNOWLEDGEMENT

I would like to thank Jan E., Dave S., Rich G., Rich E. for their help in characterizing the Real Programmer, Heather B. for the illustration, Kathy E. for putting up with it, and atd!avsdS:mark for the initial inspiration.

REFERENCES

[1] Feirstein, B., Real Men Don't Eat Quiche, New York, Pocket Books, 1982.
[2] Wirth, N., Algorithms + Datastructures = Programs, Prentice Hall, 1976.
[3] Xerox PARC editors . . .
[4] Finseth, C., Theory and Practice of Text Editors - or - a Cookbook for an EMACS, B.S. Thesis, MIT/LCS/TM-165, Massachusetts Institute of Technology, May 1980.
[5] Weinberg, G., The Psychology of Computer Programming, New York, Van Nostrand Reinhold, 1971, page 110.
[6] Dijkstra, E., On the GREEN Language Submitted to the DoD, Sigplan Notices, Volume 3, Number 10, October 1978.
[7] Rose, Frank, Joy of Hacking, Science 82, Volume 3, Number 9, November 1982, pages 58 - 66.
[8] The Hacker Papers, Psychology Today, August 1980.
[9] Datamation, July, 1983, pp. 263-265.
-
Abandon all sanity, ye who enter here! (The line above is, of course, a play on Dante's inscription over the gates of Hell.)

Computers all work in essentially the same way, executing instructions, moving data around, etc. Programming languages are mere abstractions allowing us to tell the same computer how to do the same things using different "words and methods". The abstractions provided by languages like C, C++ or even Java, GoLang or LISP were created on the back of many years of research, yet when we learn to program we seldom spend the time to understand WHY a language works the way it does, focussing rather on HOW to use it. Some concepts are best explored by imagining what life would be like if the language we were using looked very different. For example, every embedded programmer I ask has told me that GOTO is bad, but very seldom could anyone actually explain why, and it would be even more rare to find someone familiar with Dijkstra's paper on the subject (see this page for that one and many more examples). I have struggled to find a better explanation of what Dijkstra was trying to convey than the "COME FROM" statement in INTERCAL!

The following article appeared in Computer Shopper (the British magazine of that name, not the Ziff-Davis title) around September 1992.

Intercal -- the Language From Hell

Where would we be without effective programming languages? Wait, don't turn the page; that was a rhetorical question. (The answer isn't: word-processing on slide rules.) Every once in a while it's worth thinking about the matter. Most of us (the lucky ones, that is) don't come within spitting distance of programming from one day to the next. Those who do usually don't give any thought to them: it's time to write that function, so do it -- in C or Cobol or dBase, whatever comes to hand. We don't waste time contemplating the deeper significance of the tools we use to do the job. But what would we do without well-designed, efficient programming languages? Go back to slide rules, maybe.

A computer, in a very real sense, is its language. The processor speaks in a native tongue, its own machine code: unreadable by humans and next to impossible to program in, but nevertheless essential. It's the existence of this language which makes the computer such a general purpose tool; a hardware platform upon which we layer the abstractions, the virtual machines, of our operating systems. Which are, when you stop to think about it, simply a set of grammatically correct expressions in the computer's internal language. Take the humble 386 PC for example. Feed it one set of instructions and it pretends to be a cute little Windows machine; feed it another set, and something hairy happens -- it turns into a big, bad file server and starts demanding passwords and issuing threats.

This flexibility is not an accident. Almost from the start, the feature which distinguished computers from their complex calculating predecessors was the fact that they were programmable -- indeed, universally programmable, capable of tackling any computationally tractable problem. Strip away the programmability of the machine and you lose the universality. Not to mention the usefulness.

Programming languages, as any fule kno, originated in the nineteen-fifties as a partial solution to a major difficulty; machine code is difficult to write and maintain and next to impossible to move (or port) from one computer to another of a different type.
Assemblers, which allowed programmers to write machine code using relatively simple mnemonic commands (which were then "assembled" into machine code), were an early innovation, but still failed to address the portability issue. The solution that finally took off was the concept of an abstract "high level" language: and the first to achieve widespread use was Fortran. A high level language is an artificial language with a rigidly specified grammar, which can be translated into machine code via a program called a "compiler". Statements in the high level language may correspond to individual statements, or whole series of statements, in the machine language of the computer. Indeed, the only thing that makes compilation practical is the fact that all computer language systems are effectively equivalent; any algorithm which can be expressed in one language can be expressed in any other, in principle. So why are there so many languages?

A load of old cobol'ers

There are several answers to the question of language proliferation. Besides the facetious (some of us don't like Cobol) and the obvious (designing languages looks like a Fun Thing to a certain type of warped personality), there's the pragmatic reason. Simply put, some languages are excellent for expressing certain types of problem, but are lousy at dealing with other situations. Take Prolog, for example. Prolog is a brilliant language for resolving formal logic propositions expressible in the first order predicate calculus, but you'd need your head examining if you tried to write an operating system in it. (The Japanese MITI tried to do something similar with their Fifth Generation Project, and when was the last time you heard of them? Right.) Alternatively, take C. C is a wonderful language. It combines the flexibility and speed of raw machine code with the readability of ... er. Yes, you can re-jig a C compiler to run on anything. You can fine-tune it to produce tight, fast machine code that will execute on your toaster. Yes, you can write wonderful device drivers in it. But, again, you'd need your head examining if you set out to write a humongous database application in it. That's what Cobol is for; or SQL. So to the point of this article ... INTERCAL.

An icky mess

INTERCAL is a programming language which is to other languages as elephants are to deep-sea fishing -- it has nothing whatsoever to do with them. Nothing ... except that it's a programming language. And it serves as a wonderful example of the fact that, despite their theoretical, abstract interchangeability, not all languages are born equal. Here's a potted history of the offender: INTERCAL is short for "Computer Language With No Readily Pronounceable Acronym". It was not so much designed as perpetrated at Princeton University, on the morning of May 26th, 1972, by Donald R. Woods and James M. Lyon. They have been trying to live it down ever since. The current implementation, C-INTERCAL, was written by Eric S. Raymond (also the author of The New Hacker's Dictionary), and -- god help us -- runs on anything that supports C (and the C programming tools lex and yacc).

INTERCAL is, in its own terms, elegant, simple and concise. It is also flexible and forgiving; if the compiler (called, appropriately enough, ick) encounters something it doesn't understand it simply ignores it and carries on. In order to insert a comment, just type in something that ick thinks is wrong; but be careful not to embed any valid INTERCAL code in your comments, or ick will get all excited and compile it.
There are only two variable types: 16-bit integers and 32-bit integers, denoted by .1 (for the 16-bit variable called "1") or :1 (for the 32-bit variable named "1"); note that .1 is not equivalent to :1 and definitely has nothing to do with 0.1 (unless it happens to be storing a value of 0.1).

INTERCAL supports two unique bitwise operators, interleave (or mingle) and select, denoted by the "$" and "~" symbols respectively. You interleave two variables by alternating the bits of the two operands; you select two variables by taking from the first operand whichever bits correspond to ones in the second operand, then packing these bits to the right in the result. (A rough C model of these two operators appears at the end of this article.) There are also numerous unary bitwise operators, and in order to resolve matters of precedence the pedantic may use sparks (') or rabbit-ears (") to group expressions. (Don't blame me for the silly names: INTERCAL has a character set which is best described as creative.) It is not surprising that these operators are unique to INTERCAL; the parlous readability of C would not be enhanced by the addition of syntax like:

PLEASE DO IGNORE .1 <-".1^C'&:51~"#V1^C!12~;&75SUB"V'V.1~

Like any other language, INTERCAL has flow-of-control statements and input and output statements. To write something or other into a variable, you need the WRITE IN list statement, where list is a string of variables and/or array elements. The format of the input data should be as numbers, the digits of which are spelt out in English in the range ZERO (or OH) to FOUR TWO NINE FOUR NINE SIX SEVEN TWO NINE FIVE. To output some information, you need the READ OUT list statement, where list again consists of variables. Numbers are printed, by default, in the form of "extended" Roman numerals (the syntax of which I will draw a merciful veil over), although the scholarly may make use of the standard library, which contains routines for formatting output in Sanskrit.

Like FORTRAN, INTERCAL uses line numbers which are optional and follow in no particular order. Unlike FORTRAN, INTERCAL has no evil, unspeakable GOTO command, and not even an IF statement. However, you would be wrong to ascribe this to INTERCAL being designed for structured programming; it is actually because C-INTERCAL is the only known language that implements the legendary COME FROM ... control statement (originally described by R. L. Clark in "A Linguistic Contribution to GOTO-less Programming", Comm. ACM 27 (1984), pp. 349-350). For many years the GOTO statement -- once the primary means of controlling the sequence of execution of a program -- has been reviled as contributing to unreadable, unpredictable code that is virtually impossible to follow because it jumps about the place like a kangaroo on amphetamines. The COME FROM statement enables INTERCAL to do away with nasty GOTOs, while still preserving for the programmer that sense of warm self-esteem and achievement that comes from successfully writing a really nasty piece of self-modifying code involving computed GOTOs in FORTRAN. (Or, for the terminally hip and hacker-ish, involving a triple-indirect pointer to a union of UNIX kernel data structures in C. Capisce?) Basically, the COME FROM statement specifies a line, or lines, which -- when the program executes them -- will jump to the COME FROM: effectively the opposite of GOTO. Because INTERCAL contains no equivalent of the NEXT statement for controlling whether or not some statement is executed, it provides a creative, endearing and unique substitute: abstention.
For example, you can abstain from executing a given line of code with the ABSTAIN FROM (label) form of the command. Alternatively, and more uselessly, you can abstain from executing all statements of a specified type; for example, you can say

PLEASE ABSTAIN FROM IGNORING + FORGETTING

or

DO ABSTAIN FROM ABSTAINING

The abstention command is revoked by the REINSTATE statement. It is a Bad Idea to ABSTAIN FROM REINSTATING. It is also worth noting that the ABSTAIN syntax is rather confusing; for example, DO ABSTAIN FROM GIVING UP is not accepted, although a valid synonym for this is DON'T GIVE UP. (GIVE UP is the statement that terminates execution of a program. You are encouraged to use this statement at every opportunity. It is your only hope of escape.)

The designers of the INTERCAL language believed that source code should be easy to understand or, failing that, friendly; in extremis, good old-fashioned politeness will do. Consequently, the syntax of the language looks a bit odd at first. After the line label (if any) there should be a statement identifier; this can be one of DO, PLEASE, or PLEASE DO. Programs which are insufficiently polite to the compiler may be rejected with the error message PROGRAMMER IS INSUFFICIENTLY POLITE; likewise, programs which spend too much time grovelling to the compiler will be terminated with extreme prejudice. The DO, PLEASE or PLEASE DO is then followed by (optionally) one of NOT, N'T, or %n, then a statement: the meaning of NOT or N'T should be self-evident, while the %n is merely the percentage probability that the following statement will be executed.

Of course, with only two binary and three unary operators, it is rather difficult for programmers to get to grips with the language. Therefore, in a fit of quite uncharacteristic generosity, the designers have supplied -- and even partially documented -- a library of rather outre subroutines that carry out such esoteric operations as addition, subtraction, logical comparison, and generation of random numbers. The subroutines will be charmingly familiar to those PDP-11 assembly-language hackers among you who are also fluent in FORTRAN and SPITBOL.

Why you should program in Intercal

INTERCAL, despite being afflicted with these unique features (which, arguably, should remain that way), has survived for twenty years and is indeed thriving. The relatively recent C-INTERCAL dialect (which introduced the COME FROM statement, among other things) has spread it far beyond its original domain; the infamy continues, transmitted like a malign influence across the Internet. It is a matter of record that no computer language has ever succeeded in gaining widespread acceptance on the basis of elegance, comprehensibility or necessity. Look at FORTRAN, Cobol or C; all of them spread despite the fact that better alternatives were available. In fact, there are reasons to believe that INTERCAL is the language of the future.

Firstly, INTERCAL creates jobs. Yes, it's true. In general, if a particular programming tool is unfriendly, ugly, and absolutely essential, the response of management is to throw more programmers at it instead of replacing it. INTERCAL is so monumentally difficult to use for anything sensible that it is surely destined to be the cause of full employment, massive increases in the DP budget of every company where it is used, and general prosperity for programmers.

Secondly, once you have learned INTERCAL you will be able to win friends and influence people.
As the authors point out, if you were to state that the simplest way to store a value of 65536 in a 32-bit variable is

DO :1 <- #0$#256

any sensible programmer would say that this was absurd. Since this is indeed the simplest way of doing it, they'd be made to look like a fool in front of their boss (who, by Murphy's law, would have turned up at just that moment). This will have a devastating effect on their ego and simultaneously make you look brilliant (until they realise that you cribbed this example from the manual, like me. Deep shame.)

Thirdly, INTERCAL helps sell hardware. There's been a lot of fuss recently about big corporations inventing gigantic, bloated windowing and operating systems that require monstrously powerful computers to run on; some cynics suggest that this is because personal computers are now so powerful that they can already do anything any reasonable individual would want them to, and it's necessary to invent warm furry buttons that soak up millions of processor cycles in order to sell new equipment. With INTERCAL, you no longer need Windows or OS/2 to slow your 486 PC to a crawl! A Sieve of Eratosthenes benchmark (that computes all the prime numbers less than 65536), when coded in INTERCAL, clocked over seventeen hours on a SPARCstation-1. The same program, in C, took less than 0.5 seconds -- thus proving, quite clearly, that INTERCAL software is so slow that you absolutely must buy a CRAY-3 or equivalent in order to have any hope of getting any work out of it. Consequently, it is quite clear that INTERCAL represents a major alternative to conventional programming languages. Anyone who learns this language is going to go far -- at least as far as the nearest psychiatric institution.

Rumour has it that INTERCAL was not entirely unrelated to the collapse of the Soviet Union and the success of the Apollo missions. It is even reported to improve your sex life and restore hair loss! So we warmly advise you to take this utterly unbiased report at face value and go forth and get your friendly neighbourhood software company to switch to INTERCAL, wherever or whatever they may be using right now. You know it makes sense ...

References

The INTERCAL Resources Page
INTERCAL on Wikipedia
INTERCAL on Rosetta Code
The INTERCAL reference manual
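For the curious, here is the promised rough C model of the two bitwise operators -- a sketch of the semantics described above, written from the article's own description rather than taken from any INTERCAL source -- which also shows why #0$#256 really does come out to 65536:

#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Mingle ($): interleave two 16-bit values into a 32-bit result by
   alternating their bits; the first operand supplies the more
   significant bit of each pair. */
static uint32_t mingle(uint16_t a, uint16_t b)
{
    uint32_t r = 0;
    for (int i = 0; i < 16; i++) {
        r |= (uint32_t)((a >> i) & 1) << (2 * i + 1);
        r |= (uint32_t)((b >> i) & 1) << (2 * i);
    }
    return r;
}

/* Select (~): take from a whichever bits correspond to ones in b,
   then pack those bits to the right. */
static uint32_t select_bits(uint32_t a, uint32_t b)
{
    uint32_t r = 0;
    int out = 0;
    for (int i = 0; i < 32; i++)
        if ((b >> i) & 1)
            r |= ((a >> i) & 1u) << out++;
    return r;
}

int main(void)
{
    /* DO :1 <- #0$#256 -- bit 8 of the second operand lands on
       result bit 16, and bit 16 is worth exactly 65536. */
    assert(mingle(0, 256) == 65536);
    printf("%u\n", mingle(0, 256));          /* prints 65536 */
    printf("%u\n", select_bits(0xFF, 0x0F)); /* prints 15    */
    return 0;
}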
-
While we are on the topic of the wisdom of Dijkstra, let us not forget what he said about computer architecture. I refer you to EWD 32, Paragraph 5. http://www.cs.utexas.edu/users/EWD/transcriptions/EWD00xx/EWD32.html The first time I read that I almost fell on the floor. It is so true. Most of the time we are awed at how the CPU architects have clearly anticipated our needs and built the correct facilities into the MCUs. But occasionally you get a strange oversight, like the TRMT bit without an interrupt in the EUART on PIC MCUs (see the polling sketch below). Or the advanced math options of the ADC with Computation, which work on every conversion and don't respect the channel; that is, you are limited to operating on a single ADC channel if you use the advanced features. We all need a good laugh, so post your favorite "features" below.
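To make the TRMT gripe concrete: because that bit has no interrupt of its own, the only way to know the last byte has fully left the shift register (before entering sleep, say, or before turning an RS-485 driver around) is to poll it. A minimal XC8-style sketch -- the register and bit names here are device-specific and merely illustrative, so check your own datasheet:

#include <xc.h>   /* Microchip XC8; register names vary by device */

/* Block until the EUART has completely shifted out the last byte.
   TRMT (shift register empty) raises no interrupt, so unlike TXIF
   there is nothing to wait on -- we can only busy-poll the flag. */
static void uart_drain(void)
{
    while (!TX1STAbits.TRMT)
        ;   /* spin on the shift-register-empty flag */
}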
-
The King's Toaster Anonymous Once upon a time, in a kingdom not far from here, a king summoned two of his advisors for a test. He showed them both a shiny metal box with two slots in the top, a control knob and a lever. "What do you think this is?" One advisor, an engineer, answered first. "It is a toaster," he said. The king asked, "How would you design an embedded computer for it?" The engineer replied, "Using a four-bit microcontroller, I would write a simple program that reads the darkness knob and quantizes its position to one of 16 shades of darkness, from snow white to coal black. The program would use that darkness level as the index to a 16-element table of initial timer values. Then it would turn on the heating elements and start the timer with the initial value selected from the table. At the end of the time delay, it would turn off the heat and pop up the toast. Come back next week, and I'll show you a working prototype." The second advisor, a computer scientist, immediately recognized the danger of such short-sighted thinking. He said, "Toasters don't just turn bread into toast, they are also used to warm frozen waffles. What you see before you is really a breakfast food cooker. As the subjects of your kingdom become more sophisticated, they will demand more capabilities. They will need a breakfast food cooker that can also cook sausage, fry bacon, and make scrambled eggs. A toaster that only makes toast will soon be obsolete. If we don't look to the future, we will have to completely redesign the toaster in just a few years. With this in mind, we can formulate a more intelligent solution to the problem. First, create a class of breakfast foods. Specialize this class into subclasses: grains, pork and poultry. The specialization process should be repeated with grains divided into toast, muffins, pancakes and waffles; pork divided into sausage, links and bacon; and poultry divided into scrambled eggs, hard-boiled eggs, poached eggs, fried eggs, and various omelet classes. The ham and cheese omelet class is worth special attention because it must inherit characteristics from the pork, dairy and poultry classes. Thus, we see that the problem cannot be properly solved without multiple inheritance. At run time, the program must create the proper object and send a message to the object that says, 'Cook yourself'. The semantics of this message depend, of course, on the kind of object, so they have a different meaning to a piece of toast than to scrambled eggs. Reviewing the process so far, we see that the analysis phase has revealed that the primary requirement is to cook any kind of breakfast food. In the design phase, we have discovered some derived requirements. Specifically, we need an object-oriented language with multiple inheritance. Of course, users don't want the eggs to get cold while the bacon is frying, so concurrent processing is required, too. We must not forget the user interface. The lever that lowers the food lacks versatility and the darkness knob is confusing. Users won't buy the product unless it has a user-friendly, graphical interface. When the breakfast cooker is plugged in, users should see a cowboy boot on the screen. Users click on it and the message 'Booting UNIX v. 8.3' appears on the screen. (UNIX 8.3 should be out by the time the product gets to the market.) Users can pull down a menu and click on the foods they want to cook. 
Having made the wise decision of specifying the software first in the design phase, all that remains is to pick an adequate hardware platform for the implementation phase. An Intel 80386 with 8MB of memory, a 30MB hard disk and a VGA monitor should be sufficient. If you select a multitasking, object oriented language that supports multiple inheritance and has a built-in GUI, writing the program will be a snap. (Imagine the difficulty we would have had if we had foolishly allowed a hardware-first design strategy to lock us into a four-bit microcontroller!)." The king had the computer scientist thrown in the moat, and they all lived happily ever after.
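For what it's worth, the engineer's design really does fit on one page of C. A sketch under obvious assumptions -- the I/O helpers and the table values below are invented for illustration, standing in for whatever a real four-bit micro would provide:

#include <stdint.h>
#include <stdio.h>

#define SHADES 16

/* 16-element table of toasting times in seconds, from snow white
   to coal black (values made up for the example). */
static const uint16_t toast_time[SHADES] = {
     30,  40,  50,  60,  70,  80,  90, 100,
    110, 120, 130, 140, 150, 160, 170, 180
};

/* Stand-ins for the real hardware access. */
static uint8_t read_darkness_knob(void) { return 128; } /* 0..255 from the ADC */
static void heater(int on) { printf("heater %s\n", on ? "on" : "off"); }
static void wait_seconds(uint16_t s) { printf("toasting for %u s\n", s); }
static void pop_up_toast(void) { printf("pop!\n"); }

void toast(void)
{
    /* Quantize the knob position to one of 16 shades of darkness
       and use it as the index into the timer table. */
    uint8_t shade = read_darkness_knob() >> 4;

    heater(1);                       /* turn on the heating elements */
    wait_seconds(toast_time[shade]); /* run the timer from the table */
    heater(0);                       /* turn off the heat...         */
    pop_up_toast();                  /* ...and pop up the toast      */
}

int main(void)
{
    toast();
    return 0;
}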
-
Today’s Lore blog is about Edsger Dijkstra. One of my all-time favorites!
-
Dijkstra on Webster, users, bugs and Aristotle
Every programmer should read what Edsger Dijkstra wrote as part of their education. The University of Texas hosts a comprehensive archive of Dijkstra's writings. Perhaps my favourite piece is what he wrote about "Users" in EWD618; in the context of the article, Dijkstra was calling this kind of stereotyping out. Read the full paper here: On Webster, users, bugs and Aristotle
-
One of those classic little jokes has always been the sign warning the average "non-technieschen peeper" to "Keep your hands in your pockets, relax and watch the Blinkenlights!". It too has its Wikipedia page, of course. Another version of the story was later created, "simplified" for those not proficient in German, to this:

ATTENTION

This room is fullfilled mit special electronische equippment. Fingergrabbing and pressing the cnoeppkes from the computers is allowed for die experts only! So all the "lefthanders" stay away and do not disturben the brainstorming von here working intelligencies. Otherwise you will be out thrown and kicked anderswhere! Also: please keep still and only watchen astaunished the blinkenlights.
-
Software Engineering is complex. The essence of Fred Brooks's "No Silver Bullet" was that software takes a long time to develop because it takes humans a long time to deal with this complexity. Today we so often run into the situation where someone publishes some clever idea or solution, and others enthusiastically implement it in their project, only to be disappointed when it does not seem to give them the expected benefit. Things that come to mind first are "modular code", Design Patterns and "Object Oriented". More often than not the root cause is a lack of a deeper understanding of the essence of the solution, or of the original problem.

This mistake of ritualistically copying what the "Textbook" says has become so systemic in our industry that we needed a name for it, and that name is "Cargo Cult Programming". The Wikipedia entry has a pretty neat description: "Cargo cult programming is a style of computer programming characterized by the ritual inclusion of code or program structures that serve no real purpose. Cargo cult programming is typically symptomatic of a programmer not understanding either a bug they were attempting to solve or the apparent solution (compare shotgun debugging, deep magic)." In short, it is a reminder that you should not ritualistically follow any solution without truly understanding its essence. What are the advantages and disadvantages, the trade-offs involved, and why does it apply in your situation? (A contrived C example of the ritual appears at the end of this post.)

For example, I recently had a debate with some colleagues about what constituted "Industrial Strength Code", and the claim was made that "Industrial Quality Code" is code that uses State Machines and in which all function calls are non-blocking. To me this just sounds like a textbook case of Cargo Culting the rituals and hoping for the quality to follow. It is, after all, possible to produce the highest quality code using any programming paradigm, provided it is applied at an appropriate place and time. But I digress... In order to appreciate the term, you really have to read the story of the Cargo Cult!

(This image was found at http://cargocultsoa11.files.wordpress.com/2010/09/cargo-cult2.jpg and was taken by Steve Axford.)

The Cargo Cult

“Reporter Paul Raffaele: "John [Frum] promised you much cargo more than 60 years ago, and none has come. … Why do you still believe in him?" Chief Isaac Wan: "You Christians have been waiting 2,000 years for Jesus to return to earth and you haven’t given up hope."”

The earliest known cargo cult was the Tuka Movement in Fiji from 1885. During World War II, the Allies set up many temporary military bases in the Pacific, introducing isolated peoples to Western manufactured goods, or "cargo". While military personnel were stationed there, many islanders noticed these newcomers engaging in ritualized behaviors, like marching around with rifles on their shoulders in formations. After the Allies left, the source of cargo was removed and the people were nearly as isolated as before. In their desire to keep getting more goods, various peoples throughout the Pacific introduced new religious rituals mimicking what they had seen the strangers do.

Melanesia

In one instance well studied by anthropologists, the Tanna Islanders of what is now Vanuatu interpreted the US military drill as religious rituals, leading them to conclude that these behaviors brought cargo to the islands.
Hoping that the cargo would return by duplicating these behaviors, they continued to maintain airstrips and replaced their facilities using native materials. These included remarkably detailed full-size replicas of airplanes made of wood, bark, and vines, a hut-like radio shack complete with headphones made of coconut halves, and attempts at recreating military uniforms and flags. Many Melanesians believed that Western manufactured goods were created by ancestral spirits, but that the occupiers had unfairly gained control of them (as the occupiers in question had no visible means of producing said goods themselves). The islanders expected that a messianic Western figure, John Frum, would return to deliver the cargo. No one knows who Frum is, nor is there physical evidence he existed, but the islanders continue to ceremoniously honor him. After the war the US Navy attempted to talk the people out of it, but by that point it was too late and the religious movement had taken hold. Subsequently the people of Tanna have been waiting over sixty years for the cargo to return. Then again, as mentioned in the quote above, Christians have been waiting more than two thousand years for their guy to come back. Modern cargo cult believers do exist, although most see John Frum and the like merely as manifestations of the same divinity worshiped in other parts of the world, and treat the trappings of the belief as a worship service rather than a magical collection of talismans.

(The full story at https://rationalwiki.org/wiki/Cargo_cult, reproduced here under CC-BY-SA.) More about Cargo Cults here on Wikipedia: https://en.wikipedia.org/wiki/Cargo_cult
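And to bring it back to code, here is the contrived C fragment promised above, showing what the ritual looks like in practice. Every construct in it is copied superstition that serves no purpose: free(NULL) is already a safe no-op, NULLing a local parameter does nothing for the caller, and volatile is no substitute for understanding a race condition.

#include <stdlib.h>

volatile int busy;   /* ritual: "volatile fixes threading bugs" (it doesn't) */

static void discard(char *buf)
{
    if (buf != NULL) {   /* ritual: free(NULL) is already a safe no-op */
        free(buf);
        buf = NULL;      /* ritual: clears only our local copy of the */
    }                    /* pointer; the caller's copy still dangles  */
}

int main(void)
{
    char *p = malloc(16);
    discard(p);          /* p itself is unchanged -- the NULLing was theatre */
    return 0;
}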
-
There is this beautiful story about testing. I think the moral of the story is that if you do something really well, and people get recognition for doing it, it can become an inspiration to excel. The credit here should not just go to The Black Team, but also to the management that allowed this to develop into such a "thing". When I was establishing a testing team at my previous employer I started off by sending this story to each of the hand-picked members. They felt that programming was much more exotic and interesting than testing, and nobody wanted to be on the testing team. To my delight the team took the story to heart, and in their endeavors to improve testing more and more they ended up trying out all kinds of exotic languages, like Golang, which they would never have touched as embedded C programmers alone. I am tremendously proud of what that testing team of mine has become (you know who you are!), and perhaps this story can be an inspiration to more teams out there.

The Story

From: http://www.penzba.co.uk/GreybeardStories/TheBlackTeam.html

Sometimes, because of the nature of my work, I get to hear stories from a greybeard with a past that is, well, "interesting." And again, because of the nature of my work, or more accurately, because of the nature of their work, the stories can't be verified. That's just life. Sometimes the stories can't even be told. But some, after time, can. I'm starting to write them up, and this is one of them.

Once upon a time most hackers knew about the infamous IBM "Black Team". In case you don't, I suggest you go read about them first, before you carry on here: The Black Team (That should open in a new window or tab - just close it when you're done.)

So let me tell you a story involving a member of the Black Team. This has come to me 13th hand, so I can't vouch for its veracity, but I can believe it. Oh yes, I can believe it.

There was a programmer at IBM who was terribly excited to get the job writing the driver/interface software for a brand spanking new tape-drive. This was one of those fascinating machines - cabinet sized - that you might see in movies from the 60s, 70s and beyond. Reel to reel tape, running forward, then backwards, stopping, starting, no apparent reason for the direction, speed or duration. Almost hypnotic. Maybe I should write a simulator as a screen-saver. Hmm ...

Anyway, our programmer was very excited by the idea of writing the software for this new piece of hardware, but also a little anxious. After all, his work would attract the attention of the Black Team, something no programmer ever really wanted. So he decided to thwart them. He decided to write his code perfectly. He decided to prove the code, formally. He decided to write code that was so clean, so neat, so perfect, that nothing wrong could be found with it.

There are two ways to write code: write code so simple there are obviously no bugs in it, or write code so complex that there are no obvious bugs in it. -- C.A.R. Hoare

He worked really hard at getting this code perfect, and when the time came for a member of the Black Team to inspect his code and test his software, they found no bugs. None! And they were annoyed. They came back. At first in twos and threes, but finally the entire team descended on him and his code to find the bugs they simply knew must be there, all to no avail. So our programmer was happy, content, confident, and above all, smug.

So the day came when the hardware was to be unveiled and demonstrated to the world. This sort of event was always a big one.
"New Hardware" was a big deal, and duly trumpeted. Then, at the last minute before the demonstration began, a member of the Black Team hurried up to the console and began frantically typing in commands. Our programmer was confident - he knew the code was perfect. It was proven and tested. Nothing could go wrong. He wasn't even perturbed when the tape started spinning at full speed and running right to the end. It stopped before the end, as he knew it would. It was safe, it was working, it was perfect. The tape then started to rewind at full speed, but that wasn't a problem either. Again, it stopped just short of the end. No problem. Again and again the tape ran all the way to the end, then all the way to the start. Again and again it stopped within tolerances. Our programmer smirked, knowing the Black Team were beaten. And then he saw it. The cabinet had started to build up a gentle, rocking motion. And it was growing. What could be happening? The Black Team had found the fundamental frequency at which the cabinet would rock, and had programmed the tape to resonate. In mounting horror, matching the mounting amplitude, the programmer watch as the cabinet at first subtly, then clearly, and finally unmistakably began rocking ever more violently, until finally, inevitably, it fell over. In front of the World's press. History doesn't relate what happened to the programmer, or to the product. Despite the tale being utterly believable, I've been able to find no record of it. The greybeard who told me the story has moved on to a place where I can no longer ask questions, and so I'm left with what might just be a tall tale. Surely there would survive some report of this event, some record, somewhere. Do you have a copy? [you can read Colin Wright's blog here https://www.solipsys.co.uk/new/ColinsBlog.html?ColinWright ] 8<-------8<-------8<-------8<-------8<-------8<-------8<-------8<-------8<------- The epilog to that background story linked there sums it up best: Epilog Readers not familiar with the software industry might not grasp the full significance of what the Black Team accomplished within IBM, but the real lesson to be learned from the story has nothing to do with software. A group of slightly above-average people assigned to do what many considered an unglamorous and thankless task not only achieved success beyond anyone's wildest expectations, but undoubtedly had a great time doing it and wound up becoming legends in their field. As I read through the end-of-year lists of all the problems the computer industry and the world as a whole is facing, I just can't seem to bring myself to view them with gravity the authors seem to intended. After all, even the worst of are problems seem solvable by a few like-minded people with a bit of dedication.