"Debugging is twice as hard as writing the code in the first place.
Therefore, if you write the code as cleverly as possible, you are, by
definition, not smart enough to debug it." (Brian Kernighan)
I'm working on a major project and including code from stuff I wrote in 1988. It reminds me of how much I've forgotten. Well, not forgotten, but not recently accessed. The feeling's similar to the rush of memories you get when viewing an old picture album. And it's reassuring to see my overall state of mind hasn't changed, based on the comments in the code, like:
* Years from now you will review this function and say to yourself,
* "Boy, this is really crappy code. I should take the time and
* optimize it."
* You probably won't remember, but you spent an entire
* weekend of unbillable time tweaking this. You increased
* its execution speed by 30%, but in the process crafted
* a function of such blinding elegance that when you
* reviewed it the next day, you discovered it was totally
* incomprehensible. So you put the old code back in.
* This is running on a 386 machine with a 12 MHz clock
* and 640K, and the profiler lists the execution time as 211
* milliseconds. A bit slow, but acceptable.
* So forget about it.
"12 MHz clock and a full 640K."
My current laptop has a 2.4 GHz clock, which is 200 times faster than that old 386 desktop. That kludgy, awkwardly written function that required 211 milliseconds to run now takes a little over one millisecond, and the file that took 20 seconds to process runs so fast that the command prompt appears immediately after I hit the return key.
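The back-of-the-envelope arithmetic behind that estimate can be sketched as follows. This assumes execution time scales linearly with clock speed, which is a deliberate simplification: real speedups also depend on instructions per cycle, caches, and memory bandwidth, so a modern machine would likely do even better.

```python
# Naive linear-scaling estimate: new runtime = old runtime / clock ratio.
old_clock_hz = 12e6    # 1988 386: 12 MHz
new_clock_hz = 2.4e9   # modern laptop: 2.4 GHz

speedup = new_clock_hz / old_clock_hz   # 200x

old_runtime_ms = 211.0                  # profiler figure from the old comment
new_runtime_ms = old_runtime_ms / speedup

print(f"speedup: {speedup:.0f}x")
print(f"estimated runtime today: {new_runtime_ms:.2f} ms")
# speedup: 200x, estimated runtime today: 1.06 ms
```

Which matches the "little over one millisecond" figure above: 211 ms divided by 200 is about 1.06 ms.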
Thank you, 1988 KGB, for the unexpectedly wise advice. And by the way, Fox canceled Tracey Ullman, but the Simpsons got their own show and are still on the air. And that "Naked Gun" movie you saw with Doug last weekend? Keep an eye on O.J. Simpson. Trust me.
You may ask, how did I remember taking my son to see "The Naked Gun"? Thanks to Google and the Internet Movie Database, this program comment now makes sense:
* "Hey Look! It's Enrico Pallazzo!"