Something I'd almost forgotten from about six years ago:
I was working as a computer programmer, using a proprietary language similar to Visual Basic. One day the boss decided that we needed to standardize all our variable names, using three simple and (apparently) sensible rules:
Does this seem reasonable to you? If so, you haven't thought it through. In many screen fonts, including the one used by our editing interface, I (capital Eye) and l (small Ell) are indistinguishable, so we ended up unable to tell the difference between variables the software was treating as different. It only took a few days to convince the boss to let us change all the variable names back to the old chaotic but legible system, but very little got done that week except the electronic equivalent of digging holes and filling them up again.
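A minimal sketch of the trap (in Python purely for illustration; the original language was proprietary, and the variable names here are made up): capital Eye and small Ell produce distinct identifiers that many screen fonts render identically.

```python
# Two distinct variables whose names differ only in I (Eye) vs l (Ell).
# In many screen fonts these two lines look exactly the same.
ICount = 10  # Integer-prefixed name (capital Eye)
lCount = 99  # Long-prefixed name (small Ell)

# The interpreter happily keeps them separate; the reader cannot.
print(ICount, lCount)  # prints: 10 99
```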
Posted by Dr. Weevil at July 21, 2004 10:42 PM

Well, you could have used a two-character designator. For example, you could have used "EYE-EN" for Integers and "ELL-EN" for Longs.
*Ducks and runs for cover.*
Posted by: Pious Agnostic on July 22, 2004 10:25 AM

Oh, man. Large-steaming-mess-on-a-plate. I work with a website where the potential for confusion is a lot lower than that (I'd have said practically nil, before certain
I forgot to mention: variable names were limited to (I think) 32 characters, and there were hundreds of them that needed to be distinguished, so adding two characters at the beginning was enough to force us to shorten a few. Three would have been worse.
Posted by: Dr. Weevil on July 23, 2004 08:17 AM

By any chance was this language "Turbo Pascal"?
Posted by: Pious Agnostic on July 23, 2004 09:17 AM

I have a vague memory that in Fortran77 you could actually specify an include which would redefine a given glyph as a completely different alphanumeric value in the code itself. For example, all L's could be turned into the code equivalent of a number 3. Pure evil ensued when one did the following to an unsuspecting sophomore physics student:
1. Do a text find-and-replace swapping l and 1 (Ell and One), causing a visual mess.
2. Redefine "l" as "1," causing a logical mess that behaves the same way.
3. Sit back, relax, and drink in the schadenfreude as the victim beats his head against the computer for several hours trying to figure out what the hell is wrong with his code...
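The text-swap half of the prank can be sketched like this (Python for illustration; the Fortran-include redefinition that keeps the program behaving identically is left to the imagination):

```python
# Swap every Ell and One in the source text. The result is visually
# garbled, even though (after the glyph redefinition) it runs the same.
source = "label1 = 1\nwhile label1 < 11:\n    label1 = label1 + 1\n"
swap = str.maketrans({"l": "1", "1": "l"})
garbled = source.translate(swap)
print(garbled)  # "1abe1l = l" and so on -- a visual mess

# The swap is its own inverse, so the prankster can undo it later.
assert garbled.translate(swap) == source
```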
Posted by: Old Oligarch on July 26, 2004 01:30 AM

You would have loved Hungarian notation. 1 byte - ha! They had special values for, say, a pointer to an array of longs, etc.
Hungarian notation isn't based on how the code looks but rather on the practices of Microsoft's Charles Simonyi.
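For flavor, a few classic Hungarian-style prefixes rendered as Python assignments (a hedged sketch: the prefix meanings follow Simonyi's conventions, but the variables themselves are invented for this example):

```python
# Classic Hungarian prefixes: c = count, sz = zero-terminated string,
# rg = array ("range"), l = long, p = pointer/reference.
cItems = 3                 # a count
szName = "weevil"          # a (conceptually zero-terminated) string
rglValues = [1, 2, 3]      # an array of longs
prglValues = rglValues     # a "pointer" to that array of longs

print(cItems, szName, prglValues)  # prints: 3 weevil [1, 2, 3]
```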
Posted by: J Bowen on August 1, 2004 04:16 PM

Couldn't you have just used "N" instead of "I"?
Posted by: Dodd on August 3, 2004 02:28 PM