The reports of the iPad’s death may be premature

I’ve seen a number of articles recently declaring the iPad to be, in effect, a failed experiment: not powerful enough to replace a full computer for “power user” productivity tasks, and with little utility to separate it from phones with oversized screens. Yet two or three years ago many of these same writers were declaring the iPad—and tablets in general, but really just the iPad—to be the next evolution in personal computing, as big a leap forward as the GUI and mouse.

I was a little contrarian about that, writing this time last year that Steve Jobs’s famous analogy of PCs being trucks and tablets being cars might be wrong:

While each new iteration of iOS […] will give us more theoretical reasons to leave our laptops at home or the office, I’m not convinced this is solely a matter of missing operating system functionality. It may be that some tasks just don’t map that well to a user experience designed around direct object manipulation.

While I still think this is true, I’m feeling a little contrarian again, because I believe the first sentence in that paragraph is still true, too. Tasks that do map well to a user experience designed around direct object manipulation are likely to get easier with each new release of iOS (and Android and Windows Whatever), and each new release will make possible some tasks that weren’t doable before.

I’ve also felt for some time—and I believe I’ve written this in the past, although I’m not going to dig for a link—that ideas can and will migrate from OS X to iOS over time, not just the reverse. And it’s always struck me as a little bonkers when people say that Apple’s “post-PC vision” will never include better communication between apps, smarter document sharing, and other power user features. OS X is friendlier than Windows and Linux, but it’s also a far better OS for power users than either. While I think it’s true that Apple has no intention of exposing the file system on iOS—let alone exposing shell scripting—that doesn’t mean that their answer to all shortcomings of iOS for power users is going to be “That’s not what iOS devices are for.”

So, this brings us around to Mark Gurman’s report that iOS 8 will have split-screen multitasking. I don’t know whether it’s true (although Gurman’s track record has by and large been pretty good), but this would address one of the biggest shortcomings for power users on an iPad—as much as we may celebrate the wonders of focus that one app at a time gives us, in real-world practice, working in one window while referring to the contents of another is something that happens all the time.

The big question for me is the feasibility of the “hybrid” model: tablets with hardware keyboards or laptops with touch screens. We don’t have the latter in the Mac world, but in the Windows world they not only exist, touch has become pretty common on new and not-too-expensive models. Anecdotally, people like them. To me, a tablet with a hardware keyboard makes more sense than the touch laptop, simply because you can take the keyboard off and not bother with it most of the time. While I’ve joined in mocking the Surface’s implementation of this, it may truly be Microsoft’s implementation that’s at fault, not the whole concept.

At any rate, I’m not only not expecting the iPad to be subsumed by “phablets,” I’m expecting iOS to start delivering on distinctly “power user” features over the next few versions. I’m happy with my MacBook Pro in a way I wouldn’t be with only an iPad, and there will always be people who can say that. But the future of computing is a trend toward computing as appliance. Right now, there’s a measurable gap between what computing appliances do and what general computers do. And it sells Apple short to suggest that they never plan to address that gap with anything more than “we don’t think our customers need to do any of that.”

Pampero Manhattan

There’s an argument to be made that if you know a few basic cocktails, you actually know a lot of cocktails. There’s a whole family of drinks that stem from the Martinez: one part each gin and sweet vermouth, with a dash of maraschino liqueur (a clear, dry cherry concoction). If you look at that as a template—base spirit, vermouth, and an optional dash of bitters or liqueur—you can immediately see both the martini and the Manhattan there.

The Manhattan itself—classically two parts rye whiskey to one part sweet vermouth with a dash of bitters—can be squished into all sorts of interesting shapes. You can use dry vermouth with it and have a Dry Manhattan, or use a half-part each of both dry and sweet vermouth and have a “Perfect” Manhattan. Swap the rye for scotch and you have a Rob Roy. Swap the rye for Canadian whiskey, as many bars will when you ask for a Manhattan, and you have a bad Manhattan.

There’s a variant, introduced to me by Singlebarrel, that I love: the Cuban Manhattan, which is a Perfect Manhattan made with dark rum. (To live up to the name, I’d suggest Flor de Caña Grand Reserve.) This is my take on it, though, using one of my favorite dark rums: a Venezuelan rum called Pampero Aniversario.

Pampero Manhattan

  • 2 oz. Pampero Aniversario rum
  • 1 oz. Carpano Antica sweet vermouth
  • 1–2 dashes chocolate bitters

Stir with ice and strain into a chilled cocktail glass. Garnish with a twist of orange peel.

(I used Fee’s Aztec Chocolate Bitters because that’s what I happen to have, although the general consensus seems to be that Bittermens Xocolatl Mole Bitters are the best of the lot.)

Private clubs and open bars: on App.net vs. Twitter

App.net is an ambitious experiment to create a for-profit “social infrastructure” system for the Internet: you could use it to build something like Twitter, or a file syncing or storage service, or a notification service. In practice the only one that got any attention, about a year ago, was App.net’s vision of a better Twitter. This came shortly after it became clear that Twitter had decided it was in their own best interest to knife their development community.

Yet like it or not, App.net is still competing against Twitter, and Twitter is “free.” (The air quotes are important.) App.net started out at a price of $50 a year, then dropped to $36, then added a free tier of deliberately minimal utility. $36/year isn’t an outrageous fee, but it’s infinitely higher than free. It’s too high to even be a credible impulse buy.

Even if you didn’t see the stories yesterday, you can write the next part. App.net discovered this model wasn’t working after all, and they’re keeping the service open but laying off everyone. This seemingly crazy move may work, but it reduces App.net to a side project.

Maybe App.net needed cheaper pricing, or maybe it just couldn’t work competing with “free” no matter what. Maybe focusing on the kinds of people who give a poop about Twitter’s relationship to its development community wasn’t the right tack. Maybe its problem is that it was a private haven for rich white tech dudes, as some critics snarked.

Maybe, although I’ll admit the last one grates on me. We’re mocking App.net for having a cover charge—the only legitimate beef, as there’s no evidence they discriminate based on gender, ethnicity, or bank balance—and then going on to the huge club down the street, the one with ads on every available bit of wall, because they give us cheap beer for free.

But maybe App.net has misdiagnosed the problem.

What’s Twitter’s functionality? Broadcast notifications sent to people who specifically subscribe to them, with simple mechanisms for setting them private or public (i.e., direct messaging) and replying. That mechanism can act as text messaging, group chat, link sharing, image sharing, status updates, site update notifications, news alerts, and more. It’s a terrific concept.
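To make that mechanism concrete, here is a minimal sketch, in Python, of the broadcast/subscribe model described above. All of the names in it (Notification, Network, subscribe, post) are hypothetical; it illustrates the concept, not Twitter’s or App.net’s actual APIs.

    from dataclasses import dataclass, field

    @dataclass
    class Notification:
        sender: str
        text: str
        public: bool = True           # False behaves like a direct message
        reply_to: int | None = None   # id of the notification being replied to

    @dataclass
    class Network:
        # sender -> the set of users who subscribed to that sender's notifications
        followers: dict[str, set[str]] = field(default_factory=dict)
        # per-user inboxes, in arrival order
        inboxes: dict[str, list[Notification]] = field(default_factory=dict)
        timeline: list[Notification] = field(default_factory=list)

        def subscribe(self, user: str, sender: str) -> None:
            self.followers.setdefault(sender, set()).add(user)

        def post(self, note: Notification, to: str | None = None) -> int:
            """Broadcast to subscribers, or deliver privately to one recipient."""
            note_id = len(self.timeline)
            self.timeline.append(note)
            if not note.public and to:
                recipients = {to}                        # direct message
            else:
                recipients = self.followers.get(note.sender, set())
            for user in recipients:
                self.inboxes.setdefault(user, []).append(note)
            return note_id

Text messaging, group chat, link sharing, status updates, and news alerts are all just different callers of subscribe() and post(); the argument that follows is about who gets to operate that plumbing, and on what terms.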

App.net’s best insight was that making notifications “infrastructure” the way email and HTTP are has amazing potential. Twitter has no interest in letting other people use their infrastructure except under the strictest terms. That’s the problem App.net’s model solves. Good for them.

But as much as this is anathema to the Valley’s technolibertarian mindset, infrastructure only works as a common good. Suppose CERN had spun off WebCorp to monetize HTTP. They could offer a “free” web with tightly dictated terms on how we interact with their ecosystem, or they could be liberal with those terms and exact a connection toll. But neither of those scenarios would get us to where we are now. The Internet is the Internet because it’s built on protocols that are free. Not free-with-air-quotes, just free period.

"Wait, but if there was free infrastructure to do what Twitter does, how would anyone make money?" At first, by selling commercial clients and servers, although that market would be likely to decline over time. Some companies could run commercial notification networks with access charges to operators. (This vision needs a decentralized network like email to spread operational costs out as broadly as possible, but that network must deliver notifications in fairly close to real time and in the proper order, which is a huge problem.)

In the long run, broadcast notification services only survive if they do become like email services. App.net isn’t making enough money to sustain a full-time business, but so far Twitter isn’t either. They both believe the value is in the infrastructure, and they’re both wrong. The value comes from making the infrastructure free.

Meanwhile, I find the bits of schadenfreude I’m seeing on this—not the “I didn’t expect this to work and I’m disappointed I was right” posts, but the “ha ha, private club goes under” posts—to be a little disheartening. I like free beer, too, but the managers at Club Tweet are starting to look a bit desperate.

Foursquare de-gamifies →

As Foursquare added more Yelp-like features and kept de-emphasizing the game aspects that it had always been built around, I asked somewhat sardonically:

Turns out the answer is: next month. Ben Popper and Ellis Hamburger, The Verge:

Today, the company is announcing a brand new app called Swarm that will exist alongside the current Foursquare app. Swarm will be a social heat map, helping users find friends nearby and check in to share their location. A completely rewritten Foursquare app will launch in a month or so. The new Foursquare will ditch the check-in and focus solely on exploration and discovery, finally positioning itself as a true Yelp-killer in the battle to provide great local search.

Foursquare’s CEO has been saying for the past year that the check-ins and points and badges and all were never their real goal, but I’m dubious. The big question ahead is whether the “new” Foursquare is really competitive with Yelp. Foursquare has listings for just as many businesses as Yelp, but it doesn’t have ratings for nearly as many. Worse, for an app that wants to focus on “exploration and discovery,” the algorithm that drives Foursquare’s ratings shows a noticeable bias toward places that are both popular and get a lot of regulars. That sounds reasonable, but a lot of times you’re a regular at a place because it’s close to your office. Both Yelp and Foursquare are going to be able to tell you about that huge Mexican place that’s popular with all the tech workers for business lunches, but Yelp’s a lot more likely to tell you about the dive taqueria worth going out of your way for.

An insider-based tech bubble? →

Noam Scheiber writing in the New Republic:

The great bubbles in history, right up through the dotcom fiasco and last decade’s real estate unpleasantness, have typically been mass phenomena. […] But just because bubbles typically don’t inflate until the small-timers get involved doesn’t mean you can’t have a bubble without them.

A short but interesting piece which fits with my own somewhat cynical musings on this topic. Also:

The good news is that most of the money lost when the bubble bursts will come from private investors, not people invested in the stock market. The only average folks likely to suffer are those who make their living in the [San Francisco] Bay Area.

Awesome.

No Exit →

Gideon Lewis-Kraus writes a brilliant piece of long-form journalism in Wired this month, “No Exit,” following the path of a more typical startup in San Francisco than the ones that get huge payoffs.

As a self-taught programmer with no college degree at all, this passage particularly resonated with me:

It’s extremely difficult to hire talented engineers in the Valley unless you’ve got incredible PR, can pay a fortune, or are offering the chance to work on an unusually difficult problem. Nobody was buzzing about them, and they had no money, but the upside of having a business that relied on serious machine learning was that they had worthy challenges on the table. On January 4, they made an offer to exactly the sort of engineer they needed, Tevye. He had a PhD in AI from MIT. Just to contextualize what that means in Silicon Valley, an MIT AI PhD can generally walk alone into an investor meeting wearing a coconut-shell bra, perform a series of improvised birdcalls, and walk out with $1 million. Nick and Chris had gone to good schools of modest profile—Nick to the University of Puget Sound, Chris to the University of Vermont—and while Nick also had a Harvard business degree, both were skeptical about the credential fetish of the Valley. They were happy to play the game when they could, though.

The assumption has clearly become that it’s easier to teach good coding practices to people who already know all the algorithms than vice versa. I can’t definitively say that’s the wrong approach, although I have my doubts. I can definitively say that it’s pushing me further toward pursuing technical writing than I was even considering a half-year ago.

In any case, while my own experience with startups hasn’t been nearly as stressful as what these guys are going through, I’ve seen just enough of that angle to make me wonder whether SF/Silicon Valley’s “rock stars only” mindset is healthy in the long run. (Ironically, I’m getting more contacts than ever from people inexplicably convinced I’m a Rock Star Ninja Brogrammer looking to wrangle Big Data High Scalability DevOps Buzzword Bleepbloop problems the likes of which the world has never seen. Sorry. I know enough of the words you are using to help you document your brilliant stuff, though.)

Brendan Eich Steps Down as Mozilla CEO →

For those not keeping score, Brendan Eich is a long-time Netscape/Mozilla employee, the inventor of JavaScript, and a guy who gave $1000 a few years ago to the campaign for California’s “Proposition 8,” the one that banned gay marriage. One of the previous facts about Eich caused a loud outcry against his recent appointment as CEO. Not the JavaScript one.

Eich’s defense — from himself and others — largely boiled down to “well, you have to include all viewpoints if you’re going to call yourself inclusive,” which is a fine argument until one considers that including the side that says you can’t include the other side is the rhetorical equivalent of a division-by-zero error. Given how strongly Eich had been digging in his heels, one suspects he didn’t so much step down as have the ladder yanked out from under him.

And the guy invented JavaScript. Isn’t that damage enough?

Facebooking the future

The net is abruptly abuzz with news that Facebook bought Oculus VR, the partially-Kickstarted virtual reality company backed by game engine wunderkind John Carmack. And, it’s abuzz with a lot of fairly predictable hair-pulling and shirt-rending.

It’s certainly interesting news, and on the surface bemusing—although no more so than half of what Google buys these days. Facebook seems to be pretty interested in keeping abreast of or ahead of Google, too. Hmm. Does Google have any VR product that’s sort of like the Oculus Rift? Something that rhymes with “crass”? I’m sure there’s something along those lines I’ve been hearing about.

Frankly, despite all the hair-pulling I don’t think this is going to make a lot of difference to Oculus Rift users. When it comes to handling acquisitions, Facebook seems to be more like Microsoft than Google or Apple. And that’s actually a good thing. Microsoft has certainly done in good products through acquisitions, but look at Bungie and, before them, Softimage, maker of one of the leading high-end 3D animation programs of the 1990s. In both cases, the companies were given a great deal of autonomy—what Microsoft wanted them for actually required that. Bungie’s Halo essentially defined the Xbox gaming platform, and Softimage got NT into animation and movie studios. (I suspect this was a much bigger nail in SGI’s coffin than it’s usually given credit for.)

Facebook needs the Rift to be a successful product, and for it to be a successful product they have to not screw with it. They don’t want to take it away from gamers—what they want is, well, pretty much what Zuckerberg wrote:

Imagine enjoying a court side seat at a game, studying in a classroom of students and teachers all over the world or consulting with a doctor face-to-face — just by putting on goggles in your home.

What Facebook wants, ultimately, is to build the kind of communication device that until now has existed only in science fiction. I’m not sure any company can actually pull that off, but I can’t think of another company that genuinely has a better shot. And as much as many things about Facebook continue to exasperate me, I’ve been coming to a somewhat reluctant admiration not only of their ambition but of their engineering.

I can’t say this purchase makes me more likely to use either Oculus products or Facebook, but it’s a very interesting milepost. I’ve thought for years that Facebook the product has a limited lifespan, but Facebook the company may have a much longer—and far more interesting—one than I would ever have guessed.

If only GnuTLS had been open source! Wait.

Dan Goodin, Ars Technica:

Hundreds of open source packages, including the Red Hat, Ubuntu, and Debian distributions of Linux, are susceptible to attacks that circumvent the most widely used technology to prevent eavesdropping on the Internet, thanks to an extremely critical vulnerability in a widely used cryptographic code library.

Goodin argues that the bug is worse than Apple’s highly publicized “goto fail” bug, as it appears that it may have gone undetected since 2005.

I’d like to pretend I’m above feeling a bit of schadenfreude given that some of the loudest critics of Apple tend to be the open source zealots, but I’m not. One of the more religious aspects of Open Sourcitude is the insistence that all software ills stem from proprietary, closed source code. If the code is open, then in theory there should be fewer bugs, more opportunity for new features, and a lower chance of the software dying due to abandonment by its original developers for whatever reason.

But in actual, real world practice, software with few users tends to stagnate; software that becomes popular tends to keep being developed. This holds true regardless of the license and access to the source code. There are a lot of fossilized open source projects out there, and a lot of commercial products with vibrant communities. Being open source helps create such communities for certain kinds of applications (mostly developer tools), but it’s neither necessary nor, in and of itself, sufficient. And no one—not even the most passionate open source developer—ever says something like, “You know what I’d like to do tonight? Give GnuTLS a code security audit.”

Editorially shutting down →

While I only used the service briefly, this is still unfortunate news. Editorially is (was) a collaborative online writing tool using Markdown that I was planning to use for—yes—editorial work.

I’m not sure what to take away from this; Editorially was one of those free services whose makers planned to figure out how to make money later, and that’s always a dicey business. Yet as Everpix’s failure demonstrated, being a paid service from the get-go isn’t automatically a win. Editorially’s management concluded that “even if all our users paid up, it wouldn’t be enough”; would it have been enough if they’d been a paid service from the beginning? Maybe, but of course they’d have had fewer users—probably far fewer—and so that might not have been sustainable, either.