Bloviations of an
Opinionated Programmer

The New MacBook is Deliberately Ahead of its Time

I'm over a week late on this, but I thought I would wait to hear what all the biggest names in blogging had to say about it first.

Apple launched a new MacBook last Monday. It's the thinnest notebook they've ever built, and it's packed with new technology. The Force Touch trackpad looks genuinely cool, the retina display is a nice addition, and USB Type-C is an important advancement, but none of those things are what I find most interesting about the machine. The thing I find most interesting about this device is its name.

When Mark Gurman caught wind of this machine back in January, he called it a MacBook Air. That name makes a lot of sense, considering that it's thinner than the current MacBook Air, and just as light. It's perhaps more "Airy" than the current models. Yet for some reason, Apple did not call this product a MacBook Air when they launched it — they just called it "MacBook". That strikes me as a touch peculiar.

On the latest episode of his podcast, John Gruber speculated that Apple didn't name the product "MacBook Air" because they wanted to price it out of the MacBook Air's price range (the MacBook Air currently starts at $899). I disagree with him for two reasons:

  1. When the "MacBook" (sans qualifier) has been part of the lineup, it has always been the lowest-cost option. Choosing the plain "MacBook" name for a notebook specifically because it's more expensive than the current MacBook Airs is backwards.
  2. If price were truly the issue, surely Apple would have named it "MacBook Air with Retina Display" to justify the higher price, à la "MacBook Pro with Retina Display" and "iMac with Retina Display".

My theory: Apple's choice to name this product "MacBook" has nothing to do with its price. I think it's far more likely that Apple is working on an even thinner computer which will eventually become the new MacBook Air (to be available in silver, space gray, and gold), and this current MacBook model will come down in price once the new MacBook Air launches, probably in Spring 2016. The problem is that the current MacBook is about as thin as it can get without sacrificing ports. Would Apple really release a notebook with zero ports?

John Siracusa thinks they should. On episode 108 of the Accidental Tech Podcast (appropriately titled "Zero is Better Than One"), he points out that it's extremely odd that Apple would use only one USB Type-C port instead of either zero or two. Indeed, this does seem like an odd choice if you only look at the computer in the context of its current lineup, but this decision makes a lot of sense if you imagine where this product would fit a year from now in a hypothetical next-generation lineup of MacBooks. I think the honor of being Apple's first zero-port MacBook is being reserved for an ultra-thin MacBook Air, a machine so thin that they couldn't fit any ports on it even if they wanted to (and a computer that's exclusively wireless would certainly give the "Air" branding new meaning). But as Siracusa points out, they could have probably tweaked the design of the current MacBook ever-so-slightly to make room for a second port. So why not?

The lack of a second USB port supports the perception that the usefulness of Apple's notebooks is trending downward, which is something that Merlin Mann and Dan Benjamin grumbled about during episode 212 of the Back to Work podcast. In particular, Merlin highlights the apparent untimely death of the Thunderbolt connector as a point of irritation. These frustrations wouldn't be misplaced if you consider the MacBook lineup as it currently is. The Thunderbolt connector in particular has only been around for 3-or-so years, and when it was introduced, it was promised to be the one-size-fits-all solution for all of your needs. Yet, we now have a brand new MacBook that curiously left it out (even the current MacBook Airs support Thunderbolt).

But the new MacBook is supposed to be a computer for the masses, and I don't think that description ever befit Thunderbolt. I sincerely doubt many people use the Thunderbolt connector in a non-professional context, which is why I think the future next-generation MacBook Pro (to be available in silver, space gray, and gold) will keep the Thunderbolt connector alive, along with 2-3 USB Type-C ports in an all-around slimmer package. I think the reason the current MacBook only has one USB Type-C port is because the number of ports will serve as a point of differentiation between models, just as it always has.

All of these guys are trying to make sense of the new MacBook in the context of the current lineup, which is a mistake. This MacBook doesn't actually belong in the current lineup, which is why it didn't displace any of the existing models. This is just the first leg of Apple's all-new lineup of next-generation notebooks, which will all be available in three different finishes, come with retina displays by default (note: this MacBook is Apple's first retina-display computer not to include "with Retina Display" in its official name), and include a dramatically simplified port arrangement. The way I see it, this product will only make sense a year from now, once the MacBook Pro and MacBook Air tiers fall into place. This was just a signal of what's to come. The new MacBook is deliberately ahead of its time.

Open Standards are the Worst! (Sometimes)

(Attention grabbing headline: check)

There's a common belief in programming circles that open standards and open-source software will always defeat proprietary solutions, even if it takes an eternity. Some think that the success of Microsoft Windows, for example, is just an anomaly. One of these days, an open-source Linux distribution will rise in popularity and overthrow it.

In fact, as I type this, an article titled Your App Is Not Better than an Open Protocol is trending on Hacker News. In it, the author argues that compatibility with open standards like SMS, RSS, and email is what transformed services like Twitter, WhatsApp, and GitHub into household names.

But at least with proprietary solutions, you have control over the entire platform. You can rest assured that there aren't any issues caused by poorly implemented software on the other end of the wire.

Consider Google Reader: Its shutdown in 2013 delivered a huge blow to the community. Not necessarily because it was a good RSS reader on its own (which it was), but because it became the de facto hub of RSS clients everywhere. Most of the popular RSS readers in use at the time weren't truly RSS readers at all — they were merely Google Reader clients. App developers were drawn to Google Reader over pure RSS for a variety of reasons, all of which stem from the fact that RSS is an open — uncontrolled — standard.

Standards-compliant applications still have to handle non-compliance gracefully

There is no central authority that sanitizes RSS feeds. If a user of your RSS application subscribes to a feed that gives you tag soup, you're expected to handle it in a reasonable way, and that's not always easy. So you can either try to build your own sloppy RSS parser, or rely on someone smarter than you to do it on your behalf. For most people, that "someone smarter" happened to be Google Reader. Google is a trusted, well-established company, and most people already have a Google account they can use with your app, so it was a good fit. It was easier to build an app that consumed Google's well-formed output than to parse arbitrary RSS yourself.
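To make the problem concrete, here's a minimal sketch (in Python, with a made-up feed snippet) of the kind of fallback a standalone reader ends up writing: try a strict XML parse first, and when the feed turns out to be tag soup, fall back to a forgiving scrape for whatever titles can be salvaged:

```python
import re
import xml.etree.ElementTree as ET

def extract_titles(feed_text):
    """Try a strict XML parse; fall back to a sloppy regex scrape."""
    try:
        root = ET.fromstring(feed_text)
        return [el.text for el in root.iter("title")]
    except ET.ParseError:
        # Tag soup: salvage whatever <title> pairs we can still find.
        return re.findall(r"<title>(.*?)</title>", feed_text)

# A malformed feed: the first <item> is never closed.
soup = """<rss><channel>
<title>Example Feed</title>
<item><title>First post</title>
<item><title>Second post</title></item>
</channel></rss>"""

print(extract_titles(soup))  # ['Example Feed', 'First post', 'Second post']
```

Every real-world parser ends up accumulating dozens of these heuristics, and no two parsers accumulate the same ones — which is exactly why outsourcing the mess to one central service was so appealing.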

Also consider Gmail: it adds a lot of features that aren't present in any commonly used email standard. Labels, colored stars, and the "priority inbox" are all built on top of standards like IMAP and POP, but aren't actually part of any commonly used spec. This makes things difficult for mail client developers, who are now expected to go beyond the spec to support Gmail's unique features. It shouldn't come as any surprise that we've already begun to see a rise of Gmail clients instead of true IMAP clients. As Gmail pulls away from the herd, it becomes more difficult to build a universal mail client.
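Labels are a good example: Gmail exposes them through its proprietary X-GM-LABELS IMAP extension, an attribute a standards-only IMAP client has never heard of. Here's a sketch of what a "Gmail client" has to do — parse that nonstandard attribute out of a FETCH response. The response line below is a plausible hand-written example, not captured from a real server:

```python
import re

def parse_gm_labels(fetch_line):
    """Pull Gmail's proprietary X-GM-LABELS attribute out of a raw
    IMAP FETCH response line. A spec-only IMAP client has no notion
    of this attribute at all."""
    match = re.search(r"X-GM-LABELS \(([^)]*)\)", fetch_line)
    if match is None:
        return []
    # Labels are space-separated, and quoted if they contain spaces.
    return [label.strip('"')
            for label in re.findall(r'"[^"]*"|\S+', match.group(1))]

# A plausible response to: a FETCH 1 (X-GM-LABELS)
line = r'* 1 FETCH (X-GM-LABELS (\Inbox \Starred "Project X") UID 42)'
print(parse_gm_labels(line))  # ['\\Inbox', '\\Starred', 'Project X']
```

Multiply this by every Gmail-only feature, and "support IMAP" quietly turns into "support Gmail's dialect of IMAP".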

Consider the Web: I cannot even fathom how difficult it must be to write a web rendering engine that's capable of taking a hot mess of poorly-formed CMS garbage, and figuring out what the developer was trying to accomplish. It's not enough to simply meet the spec, you almost have to read the developer's mind to figure out exactly what they were trying to achieve. We even tried solving this once (sorta) by creating the XHTML standard, where well-formed markup was a requirement. There was just one problem: people didn't use it because adhering to a stricter spec was inconvenient, and it didn't offer enough advantages.
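The XHTML experiment's failure is easy to demonstrate with Python's two stdlib parsers: the strict XML parser (the XHTML model, with its draconian error handling) rejects outright the kind of everyday markup that a lenient HTML parser (the model browsers actually follow) shrugs off:

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

# Perfectly ordinary "real-world" markup: an unclosed <br>, implicit </li>s.
markup = "<ul><li>one<br><li>two</ul>"

# The XHTML model: any well-formedness error kills the whole parse.
try:
    ET.fromstring(markup)
    strict_ok = True
except ET.ParseError:
    strict_ok = False

# The browser model: a lenient parser just keeps going.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

collector = TagCollector()
collector.feed(markup)

print(strict_ok)       # False: strict parsing rejects the page entirely
print(collector.tags)  # ['ul', 'li', 'br', 'li']: lenient parsing recovers it all
```

Given the choice between a parser that renders their sloppy page and one that throws an error at the first missing slash, authors predictably chose the forgiving one.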

Updating standards is a long and painful process

Even if you're smart enough to roll your own clever RSS parser, you still have another problem to face: syncing which articles have been "read". If someone wants to read an article on their phone, and have it appear as "read" on their computer, (or vice versa), you get the exciting opportunity to invent your very own proprietary solution. Or, you can go where your customers already are and just become a Google Reader client. As a bonus: the "read" state will be synced across every other Google Reader client that your user uses, making it even easier for users to switch to your platform (or away from it, as the case may be).
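As a sketch of what "invent your very own proprietary solution" entails, here's a minimal last-write-wins sync for read state. Every name here is hypothetical — a real service would also need authentication, a wire format, and smarter conflict handling:

```python
import time

class ReadStateStore:
    """Minimal proprietary read-state sync: last write wins per article."""

    def __init__(self):
        self._state = {}  # article_guid -> (is_read, timestamp)

    def mark(self, guid, is_read, timestamp=None):
        timestamp = time.time() if timestamp is None else timestamp
        current = self._state.get(guid)
        if current is None or timestamp >= current[1]:
            self._state[guid] = (is_read, timestamp)

    def merge(self, other):
        """Pull another device's state into this one (last write wins)."""
        for guid, (is_read, ts) in other._state.items():
            self.mark(guid, is_read, ts)

    def is_read(self, guid):
        state = self._state.get(guid)
        return state is not None and state[0]

# Phone marks an article read; the laptop later marks it unread.
phone, laptop = ReadStateStore(), ReadStateStore()
phone.mark("post-1", True, timestamp=100)
laptop.mark("post-1", False, timestamp=200)
phone.merge(laptop)
print(phone.is_read("post-1"))  # False: the laptop's newer write wins
```

Trivial to sketch, but none of it is in the RSS spec — so every vendor who builds it builds it incompatibly, and Google Reader's version won by default.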

Wouldn't it be great if the RSS spec were updated with a feature to keep track of this data? The problem is, even if the spec were updated, we would still need to update all the RSS generators and all the RSS clients. It would be years, if not decades, before the new version of RSS became commonplace enough to drop support for the current one.

If you don't believe me, consider email: it faced a similar problem early in its lifetime. POP3 didn't have any way of tracking which messages had been read. In fact, it was partially designed around the assumption that people would download their messages to their mail client and expect them to be immediately deleted from the server, so that only one copy of each message exists — just like real mail. IMAP was invented in the 1980s, but it wasn't until the mid-2000s that you could safely assume your mail service of choice supported it.

Consider — once again — the Web: Every time a new feature is considered, we have to use a host of browser prefixes to ensure cross-browser compatibility, and sometimes those browser prefixes stick around for years before it's safe to use the standard, W3C-blessed syntax. The fact that we rely on enormous tables detailing which browsers support certain features should be damning evidence that these "standards" are anything but.

Standards-compliance comes at the cost of either accessibility or performance

In my opinion, most of the "modern web" is an abuse of the HTML/CSS/JS standard. HTML is a great way to define static documents, perhaps with minimal interactivity. It is not a good way to create elaborate interactive applications. Rendering the layout of a complex HTML web page is a computationally expensive process. Flipboard realized this, and decided to move to a strictly <canvas> based approach for their latest "2.0" release of their mobile web client. The results are, unsurprisingly, very fluid. John Gruber put it a little more bluntly: "To me, that Flipboard went this route is a scathing condemnation of the DOM/CSS web standards stack."

I believe Flipboard chose the correct solution. The DOM/CSS stack was not designed to handle such complex animations so fluidly. The <canvas> element, on the other hand, was designed specifically for complex animations that couldn't be efficiently achieved using the DOM. However, there's a fairly vocal crowd opposed to the change, mainly due to accessibility concerns. This isn't a bad reason to be opposed to the change, but everyone seems to be ignoring a pretty substantial issue: we shouldn't be forced to choose between performance and accessibility just to achieve standards-compliance.