I regularly read Joel On Software because Joel is somebody who, for a change, has a brain of his own and who isn't afraid to bitch about certain programming conventions even when the rest of the world thinks they're the Ultimate Solution. I usually tend to agree with Joel.
This time he explains how coding conventions can make your wrong code look wrong.
He also explains the difference between 'Apps Hungarian notation' and 'Systems Hungarian notation'. Systems Hungarian notation is the notation we all know and love to hate. Apps Hungarian notation is the way Hungarian notation was actually meant to be, and the one that makes some sense. Apps Hungarian notation is the one where you add useful information to your variable names: the prefix tells you what *kind* of thing the variable holds (e.g. cBeers, where c stands for 'count of', or bufRead and bufWrite for read and write buffers). Systems Hungarian notation is the one where you add totally useless information to your variables, where the prefix merely repeats the compiler type: li for long int, as in long int liSubTotal.
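The payoff of Apps Hungarian is that wrong code starts to *look* wrong. Joel's article uses safe versus unsafe strings as the running example; here's a minimal sketch of that idea in Python (the us/s prefixes and the `encode` helper are illustrative names, not anybody's real API):

```python
# A sketch of Apps Hungarian: the prefix encodes the *kind* of data,
# not the compiler type. 'us' = unsafe string (raw user input),
# 's' = safe string (already HTML-encoded).
import html

def encode(us_input):
    """Turn an unsafe (us) string into a safe (s) string."""
    return html.escape(us_input)

us_comment = "<script>alert('pwned')</script>"  # us: straight from the user
s_comment = encode(us_comment)                  # s: safe to write to a page

# Now the convention does the work: a line that writes a bare 'us' value
# to output jars the eye, because 'us' should never appear without a
# surrounding encode(). The wrong code looks wrong at a glance.
```

Contrast that with Systems Hungarian, where the same variable would be called szComment ('zero-terminated string') whether it was dangerous or not, telling you nothing you couldn't already see from the declaration.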
In college, we were taught to use the (Systems) Hungarian notation and I always found it a completely braindead way of naming variables. But college professors don't really care, because they have no brain of their own, so they just do what all the other educational institutions do: teach you bullshit. To quote Mark Twain: "I have never let my schooling interfere with my education."
Anyway, the original paper on Hungarian notation (clearer, shorter version here) is what later got bastardized into Systems Hungarian notation. The original paper still has some idiotic things, IMHO, like prefixing a character variable with ch, but it's a whole lot more useful than that daft Systems Hungarian notation.
Joel also bashes Exceptions again, which I love (the fact that he bashes them, not Exceptions themselves), and then links to this blog post on 'Exceptions: Cleaner, more elegant, and harder to recognize'.
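The "harder to recognize" part is that exceptions turn almost every line into a potential exit point, invisibly. A minimal sketch of the kind of bug Chen's post is about (all names here are made up for illustration):

```python
# Exceptions create invisible exit points: this function looks like it
# always closes its resource, but parse() can raise and skip close().
class Resource:
    open_count = 0  # tracks leaked handles for demonstration

    def __init__(self):
        Resource.open_count += 1

    def close(self):
        Resource.open_count -= 1

def parse(data):
    if not data:
        raise ValueError("empty input")
    return data.upper()

def handle(data):
    r = Resource()
    result = parse(data)  # hidden exit point: may raise, skipping close()
    r.close()
    return result

try:
    handle("")
except ValueError:
    pass

# The Resource was never closed; the leak is invisible in the source.
# Resource.open_count == 1 at this point.
```

With error-code returns, the early exit would at least be a visible `return` you could spot in a code review; with exceptions you have to mentally simulate every call that might throw.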
Clearly, Joel is a Practical Programmer and you should read the stuff he writes about, because he's always right. Don't go arguing now! He's just right.