Mimsy Were the Borogoves

Editorials: Where I rant to the wall about politics. And sometimes the wall rants back.

IT’s rarefied view of obsolescence

Jerry Stratton, May 19, 2005

Tech geeks always like having the best, and we often look askance at hardware or software that is more than a year or two old. In the real world, people treat their computers and software just like anything else they buy: they keep using them until they stop working or they need more features. But to tech geeks, using anything other than the latest versions of software and hardware is a sin. There is no end of parent and grandparent jokes about how non-techs are willing to use completely obsolete hardware.

What’s worse is that these non-tech parents and grandparents actually get work done with them, just not the work tech geeks want them to be doing. I think it is important to remember that, for most people who use computers, using computers is not the purpose of having them. Normal people have some other purpose, such as writing, balancing budgets, browsing the web, or sending e-mail. As long as the computer they have still works for their task, that computer is not obsolete. Not for that consumer.

I understand this view. I love having the newest and strangest toys to play with. But I’m also cheap. I don’t like wasting money replacing what is already working and working well. I still have a seven-year-old computer at home, and I use it every day. I will someday, probably very soon, replace it, but this will be in order to do something I can’t do now, not just because it is “old”.

Old is useless?

The tech geek’s strange view of obsolescence, that something is obsolete merely because it is old, permeates corporate and educational IT. Here at the University where I work, we no longer even donate computers because any computers we’re getting rid of are “old computers that are useless to us” and “we shouldn’t be dumping useless computers out to other people.” After all, we don’t need them, so nobody else could either.

At one time, this was almost true: we kept computers forever, and only got rid of them when they no longer worked well for us. We kept our Macintosh lab computers for five years, and more.

Even then, however, while we had a donation policy in effect, we easily found high schools and non-profit organizations that wanted and could use those computers. We are near Mexico; several computers ended up in labs there once we were done with them.

Today, however, we have a three-year replacement policy. We’ve just replaced a lab full of computers that are three years old, and are about to replace another lab full of computers that are three years old. My home computer is a grandfather by this metric.

These are three- and four-year-old Macintoshes. They are all well within their useful lifespan. For most people’s uses, they are practically new. But we are going to keep them in storage here, or possibly try to sell them to employees after warning them (I’m not making this up) that the computers are too old for them to use and that we recommend against buying them. If the past is any guide, most employees won’t.

Why are we afraid to donate them or sell them? Not just because the computers are so old as to be worthless, but also because “the operating system is too old”. We can’t reasonably let these incredibly old operating systems out in the wild. Like the incredibly old computers, they’re useless with such an old operating system.

That “old” operating system is Mac OS X 10.2. This operating system is, as I write this, still supported by Apple with security updates. It is (barely) two revisions behind. The first lab was only one revision behind when we replaced it and put all of those computers into storage.

This is a great example of how bureaucratic resistance to change and a rarefied view of obsolescence in IT have combined to make poor policy. I have two labs’ worth of computers that are all much better than the computer I use every day at home, that are still supported by their manufacturer, that continue to have regular updates from their manufacturer, and we recommend that nobody use them, because they’re “so old that they’re worthless.”

When did geeks become the ultimate consumer?

It’s bad enough that IT geeks try to spread their rarefied definition of obsolescence in all of their relations with non-geeks. But when did geeks become the ultimate consumer?

It didn’t use to be this way. Geeks used to build everything on the cheap in their basements. That’s pretty far in the other direction. Nowadays, even geeks who build their own do so only to get the latest and fastest in a configuration that is unavailable off the shelf.

In a sense, computers are following the path that cars followed. For most people, there is no need to replace their computer any more, not until it breaks down. Unless someone is doing video work or playing the latest games, not only will last year’s computer work for them, but last century’s computer is also perfectly fine for writing letters and books and doing taxes.

Car manufacturers have reached the point where they need to convince people to buy new cars more often, because no one needs a new car every few years. Even there, however, the person in the heartland who buys a new car more often than every several years is known as someone who buys cars too often, whereas the person who doesn’t buy a new computer every one to three years is known as someone who doesn’t buy computers often enough.

Features vs. functionality

Geeks do not have the same concerns as the average consumer. Geeks like features, the more the better, even if they do no good; the geek will make them do good. At the recent Emerging Technologies conference, Nokia showed off a cell phone that can be programmed in the Python scripting language. I want one of those phones. I’m sure I could justify it with all the cool things I could make such a phone do, but the real reason I want it is not to do those cool things. It is to be able to program my phone in Python. The journey is its own reward.
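To make that concrete, here is the flavor of what Python on those phones looked like. This is a minimal sketch from memory of the appuifw and telephone modules that shipped with Nokia’s Python for Series 60 (PyS60); treat the exact calls and behavior as assumptions rather than a tested program.

    # A minimal PyS60 sketch: pop up a native dialog asking for a
    # phone number, then place the call from a script. Untested;
    # module details are as I recall PyS60's appuifw and telephone.
    import appuifw
    import telephone

    number = appuifw.query(u"Number to dial?", "text")
    if number:
        telephone.dial(str(number))
        appuifw.note(u"Dialing " + number, "info")

A few lines of script and the phone dials itself: that, not any particular application, is the appeal.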

Most people, of course, want phones that they can use to make phone calls with. Extra features are useful only insofar as they make it easier to make phone calls, or are useful in the course of making a phone call. Limits on how much stuff people can carry may eventually merge multiple devices into one, but one of those devices will not be a portable Python programming tool.

Betamax

The war between geeks and non-geeks has entered our technological mythology. The common geek meme that “Betamax died even though it was better than VHS” is an example of this mythology. It is a myth; it speaks to our need for tragedy; but it is wrong. It completely ignores the advantages that VHS had for the average consumer.

  1. Compact movie collections: with Beta tapes holding only one hour during the VHS/Betamax wars, it took two or even three tapes to hold a normal movie. Consumers found it easier to deal with one two-hour VHS tape per movie rather than multiple Beta tapes.
  2. Timeshifting: one hour is not really enough to timeshift even a one-hour television show, given the vagaries of clock mismatch, let alone a two- or three-hour movie or a night’s programming lineup. Two hours, however, could easily capture any hour-and-a-half program, or the prime-time 8 to 10 PM lineup.
  3. Comparable quality: at 250 lines of resolution for Beta and 240 for VHS, the difference in quality was not easily visible on the televisions of that period, or for a long time afterwards.

To tech geeks, the problems with Betamax are challenges that can be overcome by fiddling with the device or waiting until the next revision; to consumers, those are problems that shouldn’t have to be overcome, certainly not on a daily basis. Beta had its uses and its market; the home consumer was not one of those markets.

In the war between Betamax and VHS in the consumer market, the best technology won, but it wasn’t the technology that tech geeks would consider the best. It was the technology that consumers considered the best. And they were right.

iPod

Today, we need only look at geek arguments against the iPod. Geeks complain about its popularity; they don’t understand why the iPod is popular when it doesn’t do what geeks want it to do. Some of their complaints are the very things consumers prefer about the iPod: it comes with its own rechargeable battery; it is small; and it does only one thing, play music, but does that one thing well. It is easy to use, and not filled with unnecessary features that only confuse what the device is supposed to do: play their music.

When the iPod first came out, one complaint was that it had a smaller drive than other models, such as the Nomad. But those other models had only slow USB, taking hours to fill their slightly larger hard drives, whereas the iPod had FireWire and could fill its hard drive in minutes. Those other models also used older hard drives that made them bulky. Geeks looked at hard disk space. The average consumer looked at the overall functionality. The iPod won because it was easier to use, easier to carry, and wasted less of their time.
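The arithmetic bears this out. Here is a rough back-of-the-envelope sketch; the 5 GB library size and the effective throughput figures are my own illustrative assumptions, not period spec-sheet numbers.

    # Rough transfer times to fill a 5 GB music player over USB 1.1
    # (assume ~8 Mbit/s effective) versus FireWire 400 (~400 Mbit/s).
    # All figures are illustrative assumptions.
    library_bits = 5 * 8 * 1000**3  # 5 GB expressed in bits

    for name, mbit_per_s in (("USB 1.1", 8), ("FireWire 400", 400)):
        seconds = library_bits / (mbit_per_s * 1000.0**2)
        print("%s: about %.0f minutes" % (name, seconds / 60))

    # Prints: USB 1.1: about 83 minutes
    #         FireWire 400: about 2 minutes
    # Real-world USB players were slower still.

Over an hour at a theoretical best, against about two minutes: that is the difference consumers were actually paying for.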

That the iPod cost $100 more than its competition at the time didn’t matter except to tech geeks; tech geeks were willing to put up with poor interfaces that made it a challenge to choose from gigabytes of music. Tech geeks enjoy challenges. The average consumer took a look at the iPod’s far easier interface, far faster transfer speeds, more compact form factor, and slightly smaller disk, and was more than willing to pony up that extra $100.

The quintessential anti-iPod reaction when the iPod came out: “For $399 they should have made it a full-blown PDA as well, there is no way I can see this succeeding.”

iTunes Music Store

Meanwhile, industry leaders complain that the iPod doesn’t play licensed and rented music files. But consumers, so far, prefer to play music that they own, which the iPod plays fine. Consumers prefer not to have their music under the control of someone else. They want CDs, MP3 files, or if the music has to have digital restrictions, those restrictions should not significantly impede their ownership of the files. In other words, if their music files absolutely must be restricted, they want iTunes Music Store music files.

The iPod’s digital restriction management operates under an ownership paradigm, where most other players use restriction management that operates under a license or lease paradigm: the music can always be turned off from elsewhere. No iPod or iTunes software in use right now will ever refuse to play for consumers “until they upgrade to a newer client”, as WMA, the restriction management in use by other music players, promises to do. Labels are desperate to convert the music industry from a consumer-friendly ownership model to a label-friendly rental model.

Age is morally wrong

People were willing to buy a VCR without deferring to tech geeks; they are willing to buy music players without deferring to tech geeks; but when it comes to computers, they defer to tech geeks, and so they get computers that only a tech geek could love: computers that require too much work to keep working, that require a tech geek to keep them working. They’re new and expensive, filled with gimmicks, high-priced special effects, and esoteric lights, dials, and meters that assure their owners that this is a superior machine.

Sometimes it seems that we think it is morally wrong to use older computer hardware or software. Few people driving a three-year-old car talk self-deprecatingly about it and themselves just because their car is “old”. But computer owners will laugh at people who don’t upgrade Word because Word 5.1 does what they need and an upgrade would mean buying a new computer. Why is that laughable? Why should someone spend at least several hundred dollars just to get the latest Word features that they don’t want?

We in IT need to be more considerate of other people’s needs, not just manufacturers’ needs and our own preference for things that need to be tinkered with. We forget that our concerns are not the same as everyone else’s. Normal people have specific reasons for wanting a new computer. Without those reasons, not only do they not want one, they don’t need one, and there is no reason for us to convince them otherwise.

We in corporate and educational IT environments need to be more considerate of what we throw away, and also what we recommend that people throw away. We should not become so disconnected from the real world that we forget why people own computers in the first place: to make their lives easier.

And consumers need to stop listening to us when we recommend that they buy what we want, rather than what they want.

November 29, 2023: The Parable of the Soldered RAM
Elixir 512 MB DDR desktop RAM (M2U51264DS8HC3G-5T), circa 2006.

Just looking at this brings back my very reasonable fear of static electricity.

I’m not a big fan of the continued miniaturization of computer parts, nor of the transformation of computers into appliances. These trends make weekend tinkering a thing of the past, much the same way that modern cars have made the garage mechanic’s work impossibly difficult. Surface-mount devices are far more difficult to solder, and require special tools beyond merely a soldering gun.

But my dislike of them doesn’t change the fact that they increase reliability and decrease cost. Everything in life is a trade-off. One of my long-standing complaints about tech bloggers, from the introduction of the iPod to the expectation that no one has a computer more than a year or two old, is their inability to see that time saved is a very important feature.

Last October, one of the bloggers I follow snark-announced the next Apple CPUs. His biggest complaint:

With all variants of Apple’s CPUs the memory is soldered onto the CPU module. You can’t upgrade it, ever.

Two days later, his Top Story was:

Seems to be a loose connection inside my laptop. That’s why the problem is intermittent. It can probably be fixed, I think I’ll just turn it into a Linux server and stick it on a shelf, and take the other laptop (better screen and CPU but 16GB of soldered RAM) and use that as my Windows system.

I’m not a big fan of soldered RAM either; but I’m comfortable—mostly, I still have an irrational fear of static electricity—opening up my computer and replacing user-replaceable parts. Most people are not, and for good reason: every time you open it up, you increase the chance of screwing something else up.

But everything is a trade-off, and soldered RAM is a good example of that. Loose connections have plagued home computing since the beginning, and soldered parts vastly increase connection reliability. This is especially true for any device that’s going to be dragged around a lot, such as a laptop. Soldered RAM that comes loose is a manufacturing defect that justifies a replacement computer. User-serviceable RAM that comes loose is… just what user-serviceable RAM does. That’s the whole point, that the RAM can be removed.
