July 21, 2004
It Seemed Like A Good Idea At The Time

Something I'd almost forgotten from about six years ago:

I was working as a computer programmer, using a proprietary language similar to Visual Basic. One day the boss decided that we needed to standardize all our variable names, using three simple and (apparently) sensible rules:

  1. All variable names must begin with a single letter followed by an underscore.
  2. The letter must be capital for global variables, small for local. Or maybe it was the other way around: it hardly matters.
  3. The letter must indicate the variable type: S or s for character strings, I or i for integers, L or l for long integers, F or f for floating point, B or b for binary/logic variables (True/False) -- there may have been one or two other kinds.

Does this seem reasonable to you? If so, you haven't thought it through. In many screen fonts, including the one used by our editing interface, I (capital Eye) and l (small Ell) are indistinguishable, so we ended up unable to tell the difference between variables the software was treating as different. It only took a few days to convince the boss to let us change all the variable names back to the old chaotic but legible system, but very little got done that week except the electronic equivalent of digging holes and filling them up again.
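For anyone who wants to see the problem in miniature, here is a rough sketch of the kind of collision the scheme invited -- in C rather than our actual language, and with made-up names:

    #include <stdio.h>

    int I_total = 0;                  /* global integer: capital Eye, then underscore */

    long add_order(long amount)
    {
        long l_total;                 /* local long: small Ell, then underscore */
        /* In a font where capital Eye and small Ell share a glyph, the two
           names above look identical on screen, yet the compiler treats
           them as entirely different variables. */
        l_total = I_total + amount;
        return l_total;
    }

    int main(void)
    {
        printf("%ld\n", add_order(5));   /* prints 5 */
        return 0;
    }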

Posted by Dr. Weevil at July 21, 2004 10:42 PM
Comments

Well, you could have used a two-character designator. For example, you could have used "EYE-EN" for Integers and "ELL-EN" for Longs.

*Ducks and runs for cover.*

Posted by: Pious Agnostic on July 22, 2004 10:25 AM

Oh, man. Large-steaming-mess-on-a-plate. I work with a website where the potential for confusion is a lot lower than that (I'd have said practically nil, before certain experiences publishing it last week). But. Never, ever introduce a new naming scheme unless you know exactly what you're doing. The consequences are not pleasant.

Posted by: Michelle Dulak Thomson on July 22, 2004 11:07 PM

I forgot to mention: variable names were limited to (I think) 32 characters, and there were hundreds of them that needed to be distinguished, so adding two characters at the beginning was enough to force us to shorten a few. Three would have been worse.

Posted by: Dr. Weevil on July 23, 2004 08:17 AM

By any chance was this language "Turbo Pascal"?

Posted by: Pious Agnostic on July 23, 2004 09:17 AM

I have a vague memory that in Fortran77 you could actually specify an include which would redefine a given glyph as a completely different alphanumeric value in the code itself. For example, all L's could be turned into the code equivalent of a number 3. Pure evil ensued when one did the following to an unsuspecting sophomore physics student:

  1. Text find-and-replace l and 1 (Ell and One), causing a visual mess.
  2. Redefine "l" as "1," causing a logical mess that behaves the same way.
  3. Sit back, relax, and drink in the schadenfreude as the victim beats his head against the computer for several hours trying to figure out what the hell is wrong with his code...

Posted by: Old Oligarch on July 26, 2004 01:30 AM

You would have loved Hungarian notation. One byte -- ha! They had special prefixes for, say, a pointer to an array of longs, and so on.

Hungarian notation isn't based on the way it looks, but rather on the practices of Microsoft's Charles Simonyi.
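For the curious, a rough sketch of what those Systems Hungarian prefixes looked like -- in C, with invented variable names:

    #include <stdio.h>

    char  szName[32] = "widget";   /* sz   = zero-terminated string            */
    char *pszTitle   = szName;     /* psz  = pointer to zero-terminated string */
    long  rglCounts[8];            /* rgl  = array ("range") of longs          */
    long *prglCounts = rglCounts;  /* prgl = pointer to an array of longs      */
    int   cItems     = 0;          /* c    = a count of items                  */
    int   fVisible   = 1;          /* f    = a boolean flag                    */

    int main(void)
    {
        printf("%s %s %d %d\n", szName, pszTitle, cItems, fVisible);
        (void)prglCounts;          /* silence the unused-variable warning      */
        return 0;
    }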

Posted by: J Bowen on August 1, 2004 04:16 PM

Couldn't you have just used "N" instead of "I"?

Posted by: Dodd on August 3, 2004 02:28 PM