Unexpected 'free' day (More of an "I want to do something else" day...)


I got up this morning fully intending to work on billing/provisioning stuff for ACI all day. In the end I only did about 2 hours of work for them (well, so far): I exhausted the tasks that can be done during the day (most of the changes have to happen after close of business) and decided to brush up on my C++ for a contract next week. It's been 9 years since I did any paid C++ work (garn, I'm getting old); I normally use C when I need to drop down for low-level work.

I read Scott Meyers' "Effective C++ (Second Edition)", which was exactly the kind of book I was looking for: a guide for C programmers that focuses on the things that are different from C and problematic. It's a bit slow in places for someone who eats and sleeps OO (e.g. Python users), but that's fine; you just don't need to spend as much time there. Mostly it outlines the warts that make C++ different from C, reminds you of the anti-patterns of the implementation(s), and basically assumes you know what you're doing for the rest.

Interestingly, as I was going through it, I frequently found myself thinking that the reasons given for the existence of any given wart were pretty trivial. Meyers is pretty anti-multiple-inheritance, for instance. I'm not a huge user of MI, but C++'s broken/unfinished MI semantics seem a rather unconvincing reason to tar-and-feather the idea itself.
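To make the complaint concrete, here's a minimal sketch of the classic "diamond" wart (the class names are made up for illustration): without virtual inheritance, the most-derived class silently ends up with two copies of the shared base.

    #include <iostream>

    // The classic "diamond": B and C each inherit from A.
    struct A { int value; A() : value(0) {} };
    struct B : A {};
    struct C : A {};
    struct D : B, C {};      // D contains *two* A subobjects

    int main() {
        D d;
        // d.value = 1;      // error: ambiguous -- which A?
        d.B::value = 1;      // must disambiguate by inheritance path
        d.C::value = 2;
        std::cout << d.B::value << " " << d.C::value << std::endl;
        return 0;
    }

Declaring B and C with virtual inheritance (struct B : virtual A) merges the duplicated subobjects, but that fix has to be applied in B and C rather than in D, which is part of what makes the semantics feel unfinished.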

Similarly, the mess around implicitly generated copy constructors/destructors and const-ness, and the resulting headaches with memory leaks, all seem pretty dumb. Of course, I'm sure most people use smart-pointer libraries or some other abstraction to sidestep the problem, but making something that fragile the fundamental operation-set of your language seems a little... questionable. You get a 5% (or whatever) performance improvement... until the memory leaks bring your process to its knees and you spend 6 months trying to track down the error.
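A minimal sketch (my own toy class, not an example from the book) of the kind of fragility I mean: rely on the compiler-generated copy constructor for a class holding a raw pointer and you get a double free, er, for free.

    #include <cstring>

    // Manages a raw buffer, but defines no copy constructor or
    // assignment operator; the implicitly generated ones copy the
    // pointer, not the buffer it points to.
    class Buffer {
        char* data;
    public:
        explicit Buffer(const char* s) : data(new char[std::strlen(s) + 1]) {
            std::strcpy(data, s);
        }
        ~Buffer() { delete[] data; }
    };

    int main() {
        Buffer a("hello");
        Buffer b = a;    // implicit copy: b.data == a.data
        return 0;
    }   // both destructors delete[] the same pointer: undefined behavior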

From Stroustrup's lecture, it really seems to me that C++ has jumped the shark; attempting to create a system with an infinite series of template-matched patterns that are intended to automatically jump into line with a few lines of final code, with all "real" code in infinitely large series of code-books that are all created by perfect coding gods, never needing debugging or inspection. The nasty syntax that results just emphasizes how ugly the result is... most of it seems to be a ridiculously involved attempt to avoid having first-class classes.

What is a template but a mechanism for saying "apply this pattern to (instances of) a given class"? The overarching commitment to compile-time static typing turns what could be a simple run-time check (are all objects of the same type/class?) into a requirement that all programmers be meta-programmers, understanding the template-expanding pattern-matching machinery of the STL and related libraries.
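For instance (my own toy example, nothing from the lecture), the "pattern applied to a class" gets stamped out per element type at compile time, and the "are all objects the same type?" question never becomes a run-time check, because heterogeneous input simply won't compile:

    #include <numeric>
    #include <vector>

    // The template "pattern": sum a vector of any single element type.
    template <typename T>
    T sum(const std::vector<T>& values) {
        return std::accumulate(values.begin(), values.end(), T());
    }

    int main() {
        std::vector<int> xs;
        xs.push_back(1); xs.push_back(2); xs.push_back(3);
        int total = sum(xs);   // instantiates sum<int> at compile time
        // A vector mixing ints and strings can't even be declared,
        // so there is nothing left to check at run time.
        return total == 6 ? 0 : 1;
    }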

Oh, I understand the efficiency arguments, and yes, I saw some elegance in the model Stroustrup was outlining, but it was elegance in the sense of "wow, look at how sharp and pointy that crystalline structure seems"... not really something you want to embrace or play with :)

Anyway, I'll probably play with building a Python extension this weekend to practice Boost usage, though bjam seems unhappy with me, and the missing example code (404 on the web site) doesn't help. Maybe the elegance and majesty will yet win me over...
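For reference, what I'm aiming at (module and function names here are placeholders, not working code from the Boost docs) is a minimal Boost.Python extension along these lines:

    // hello_ext.cpp -- hypothetical module name
    #include <boost/python.hpp>
    #include <string>

    std::string greet() {
        return "hello from C++";
    }

    // Exposes greet() to Python as hello_ext.greet()
    BOOST_PYTHON_MODULE(hello_ext)
    {
        boost::python::def("greet", greet);
    }

If bjam ever cooperates, "import hello_ext; hello_ext.greet()" should then work from the Python side.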

Comments

  1. Mike Fletcher on 03/03/2008 9:27 a.m.

    If anything, the weekend has cemented my impressions. All of the "new" features (templates, STL, etcetera) seem to have been produced by a group unwilling to admit that the core of their language was broken and to fix it.

    Both of the books I've been reading on how to write bullet-proof C++ are pretty consistent: the built-in primitives in C++ are fundamentally and irretrievably broken. To write safe, reliable C++ code you must never use them. Instead, you must use the STL/Boost-style abstractions that make finalization semantics comparatively understandable and predictable (sketched below).

    So why didn't the language designers recognize this and fix the language's core semantics? I know, I know, backward compatibility... still, it seems like there should have been some work at the language level to provide the reliable semantics by default and let those needing backward compatibility use a flag to enable the old, broken semantics.
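
    By way of illustration (my own sketch, not an example from either book), the contrast between the raw primitives and the Boost-style abstraction:

        #include <boost/shared_ptr.hpp>

        struct Widget { /* ... */ };

        void risky() {
            Widget* w = new Widget;  // raw primitive: if anything below
            // ... throws or returns early, w leaks ...
            delete w;                // must run exactly once, on every path
        }

        void safer() {
            boost::shared_ptr<Widget> w(new Widget);
            // ... exceptions and early returns are fine ...
        }   // finalization is deterministic: the Widget is deleted here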

