and the geek shall inherit the earth

I don’t expect non-programmers to get much out of this list of the top ten things that annoy programmers, but if you’re into free thinking and science, you might like this bit — it’s the number one annoyance, “Your own code, six months later”:

Ever look back at some of your old code and grimace in pain? How stupid you were! How could you, who know so much now, have written that? Burn it! Burn it with fire!

Well, good news. You’re not alone.

The truth is, the programming world is one that is constantly changing. What we regard as a best practice today can be obsolete tomorrow. It’s simply not possible to write perfect code, because the standards by which our code is judged are evolving every day. It’s tough to cope with the fact that your work, as beautiful as it may be now, is probably going to be ridiculed later. It’s frustrating because no matter how much research we do into the latest and greatest tools, designs, frameworks, and best practices, there’s always the sense that what we’re truly after is slightly out of reach. For me, this is the most annoying thing about being a programmer. The fragility of what we do is necessary to facilitate improvement, but I can’t help feeling like I’m one of those sand-painting monks.

Frustrating? Sure, sure. But then again, it is utterly delightful to be able to look back on yourself six months ago and think, what was I thinking? I know so much better now! Fixed, unchanging knowledge is one of the most devastating intellectual afflictions known to humankind. Do you really want to be absolutely sure of everything, and stay that way? What happens to your poor brain when all it can do is mull over the same things, over and over and over? It atrophies into so much grey jell-o, that’s what happens.

If you aren’t regularly stricken by how much more you know now than you did six months ago, what’s the point? If your life isn’t a constant series of epiphanies followed by epiphanies that make those prior epiphanies seem quaint and lame by comparison … well, why bother?

Eleven years or so ago, I got my hands on a computer that was connected to the internet for the first time. I had to know how this thing worked, so I began obsessively taking things apart and breaking them until they worked again. Six months later, in spite of the fact that I was just a dangerous newbie with a copy of FrontPage and a lot of nerve, I had a job with the title of webmaster, and I never looked back. I have no formal education in this, of course. I can’t be taught; I can only learn, and not from people who are involved in the process of teaching, unless they are also involved in the process of being and doing. Even then, I don’t learn from the instructions; I learn from the stuff the instructions are about. I take it apart, I beat on it incessantly (I am known at work for wearing out keyboards), and eventually, I understand. Then I go find something else I don’t know, and bang on it until I do.

Nothing will make you feel stupider on a regular basis than web programming, since no matter how much you know at any point, new stuff needing to be known makes it impossible to ever be complacent — but that feeling of stupid is actually the thing that makes you smarter.

Geek out on whatever knowledge makes your synapses zing, is my advice. The more you know how little you know, the more you are driven to learn.

9 thoughts on “and the geek shall inherit the earth”

  1. Most fields of knowledge don't change anything like as rapidly as computer technology does, but I guess I have an analogous impulse. Every few years I get an obsessive interest in some particular area of knowledge and feel driven to learn as much as possible about it. It lasts a few years and then a different one grabs me and takes over. I always remember what I learned, though.

    1. I'm sure there are many fields of science which offer a similar rate of advancing knowledge — and it's all cumulative, even programming. The things you learn always serve you well, even when they are not replaced with new things to learn at the same rate as internet tech.

      This is of course in the same vein as my previous post on lazy-mindedness. The “debate” we had with Oyvind was a particularly vivid illustration of the importance of keeping an open mind and being willing to admit there are new things we don't know yet — without that, it's all about parroting the same things over and over, recitation and memorization vs. learning and growing.

      Infidel, you always make me think :) It's why your feed is on my Google home page and not just in the reader.

      1. I'm sure there are many fields of science which offer a similar rate of advancing knowledge

        This is actually one of the fascinating things about modern technology — computer data processing is merging into everything else, and as a result everything else is accelerating its rate of development because of the growth of data-processing power. Medical technology, a special interest of mine, is a good example. The analysis of genes, the development of new strategies for fighting cancer (subject of a recent post of mine) and viruses (subject of an upcoming one) — all are moving faster and faster all the time because of the progress in computers.

        It's really an enhancement to our ability to think, beyond what unaided brains can do (computers and organic brains are good at different things — at least for now), as earlier generations of machines extended our ability to manipulate the physical world beyond what unaided muscles could do.

  2. It looks like being self-taught works well, especially in tech. I learned quite a while back that some things you learn in a classroom have little or no relevance in the real world. For example, I took an abnormal psychology class while working at the county mental health hospital with real live patients. It took about 30 seconds into day one, lecture one, to figure out that the professor had no experience working with people with psychiatric disorders; everything she knew was theory from a book. After the first class she relied on me heavily to describe the types of behaviors one might expect from (insert psychiatric disorder). As you pointed out, tech is constantly evolving. One study (on tech ed) pointed out that if students go through a 3-year program like ITT or one of the other schools, by the time they graduate everything they learned in years one and two, and most of what they learned in year three, will be obsolete.

  3. Pingback: PABlo Bley
  4. Pingback: The 3rd Party
