Private clubs and open bars: on App.net vs. Twitter

App.net is an ambitious experiment to create a for-profit “social infrastructure” system for the Internet: you could use it to build something like Twitter, or a file syncing or storage service, or a notification service. In practice the only one that got any attention, about a year ago, was App.net’s vision of a better Twitter. This came shortly after it became clear that Twitter had decided it was in their own best interest to knife their development community.

Yet like it or not they’re still competing against Twitter, and Twitter is “free.” (The air quotes are important.) App.net started out at a price of $50 a year, then dropped to $36, then added a free tier of deliberately minimal utility. $36/year isn’t an outrageous fee, but it’s infinitely higher than free. It’s too high to even be a credible impulse buy.

Even if you didn’t see the stories yesterday you can write the next part. App.net discovered this model isn’t working well after all, and they’re keeping the service open but laying off everyone. This seemingly crazy move may work, but it reduces App.net to a side project.

Maybe App.net needed cheaper pricing, or maybe it just couldn’t work competing with “free” no matter what. Maybe focusing on the kinds of people who give a poop about Twitter’s relationship to its development community wasn’t the right tack. Maybe its problem is that it was a private haven for rich white tech dudes, as some critics snarked.

Maybe, although I’ll admit the last one grates on me. We’re mocking App.net for having a cover charge—the only legitimate beef, as there’s no evidence they discriminate based on gender, ethnicity or bank statements—then going on to the huge club down the street, the one with ads on every available bit of wall, because they give us cheap beer for free.

But maybe App.net has misdiagnosed the problem.

What’s Twitter’s functionality? Broadcast notifications sent to people who specifically subscribe to them, with simple mechanisms for making them public or private (private being, in effect, direct messaging) and for replying. That mechanism can act as text messaging, group chat, link sharing, image sharing, status updates, site update notifications, news alerts, and more. It’s a terrific concept.
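To make that concrete, here’s a minimal sketch of that model in Python. The names and structure are mine for illustration, not Twitter’s or App.net’s actual API; the point is how little machinery the core concept needs.

    # A toy model of broadcast notifications: subscribers, public posts,
    # private (direct) posts, and replies. Purely illustrative.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Post:
        author: str
        text: str
        to: Optional[str] = None        # set for a private (direct) message
        reply_to: Optional[int] = None  # index of the post being replied to

    @dataclass
    class Network:
        posts: list = field(default_factory=list)
        followers: dict = field(default_factory=dict)  # author -> set of subscribers

        def follow(self, who: str, target: str) -> None:
            self.followers.setdefault(target, set()).add(who)

        def publish(self, post: Post) -> set:
            """Deliver a post: notify one recipient if direct, else all subscribers."""
            self.posts.append(post)
            if post.to is not None:
                return {post.to}
            return self.followers.get(post.author, set())

Status updates, link sharing, news alerts, and site notifications all reduce to calls against that one publish-and-subscribe primitive, which is a big part of why the concept is so versatile.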

App.net’s best insight was that making notifications “infrastructure” the way email and HTTP are has amazing potential. Twitter has no interest in letting other people use their infrastructure except under the strictest terms. That’s the problem App.net’s model solves. Good for them.

But as much as this is anathema to the Valley’s technolibertarian mindset, infrastructure only works as a common good. Suppose CERN had spun off WebCorp to monetize HTTP. They could offer a “free” web with tightly dictated terms on how we interact with their ecosystem, or they could be liberal with those terms and exact a connection toll. But neither of those scenarios would get us to where we are now. The Internet is the Internet because it’s built on protocols that are free. Not free-with-air-quotes, just free period.

"Wait, but if there was free infrastructure to do what Twitter does, how would anyone make money?" At first, by selling commercial clients and servers, although that market would be likely to decline over time. Some companies could run commercial notification networks with access charges to operators. (This vision needs a decentralized network like email to spread operational costs out as broadly as possible, but that network must deliver notifications in fairly close to real time and in the proper order, which is a huge problem.)

In the long run, broadcast notification services only survive if they do become like email services. App.net isn’t making enough money to sustain a full-time business, but so far Twitter isn’t either. They both believe the value is in the infrastructure, and they’re both wrong. The value comes from making the infrastructure free.

Meanwhile, I find the bits of schadenfreude I’m seeing on this—not the “I didn’t expect this to work and I’m disappointed I was right” posts, but the “ha ha, private club goes under” posts—to be a little disheartening. I like free beer, too, but the managers at Club Tweet are starting to look a bit desperate.

Foursquare de-gamifies →

As Foursquare added more Yelp-like features and kept de-emphasizing the game aspects it had always been built around, I asked somewhat sardonically how long it would be before the check-in itself went away.

Turns out the answer is: next month. Ben Popper and Ellis Hamburger, The Verge:

Today, the company is announcing a brand new app called Swarm that will exist alongside the current Foursquare app. Swarm will be a social heat map, helping users find friends nearby and check in to share their location. A completely rewritten Foursquare app will launch in a month or so. The new Foursquare will ditch the check-in and focus solely on exploration and discovery, finally positioning itself as a true Yelp-killer in the battle to provide great local search.

Foursquare’s CEO has been saying for the past year that the check-ins and points and badges and all were never their real goal, but I’m dubious. The big question ahead is whether the “new” Foursquare is really competitive with Yelp. Foursquare has listings for just as many businesses as Yelp, but it doesn’t have ratings for nearly as many. But worse, for an app that wants to focus on “exploration and discovery,” the algorithm that drives Foursquare’s ratings shows a noticeable bias toward places that are both popular and get a lot of regulars. That sounds reasonable, but a lot of times you’re a regular at a place because it’s close to your office. Both Yelp and Foursquare are going to be able to tell you about that huge Mexican place that’s popular with all the tech workers for business lunches, but Yelp’s a lot more likely to tell you about the dive taqueria worth going out of your way for.
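I don’t know what Foursquare’s actual algorithm looks like, but here’s a hypothetical scoring function showing how that bias falls out almost automatically once you reward both raw popularity and repeat visits. The venues and numbers are made up.

    import math

    def score(total_checkins: int, repeat_visitors: int) -> float:
        # Hypothetical venue score: reward raw traffic plus visitor loyalty.
        # Not Foursquare's real formula; just an illustration of the bias.
        loyalty = repeat_visitors / max(total_checkins, 1)
        return math.log10(total_checkins + 1) * (1 + loyalty)

    office_lunch_spot = score(total_checkins=5000, repeat_visitors=3500)  # nearby and habitual
    dive_taqueria     = score(total_checkins=400,  repeat_visitors=80)    # worth a detour, few regulars

    print(round(office_lunch_spot, 2), round(dive_taqueria, 2))  # ~6.29 vs. ~3.12

Any weighting along those lines surfaces the big business-lunch place every time and buries the taqueria, which is fine for answering “where does everyone go?” and not so fine for “exploration and discovery.”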

An insider-based tech bubble? →

Noam Scheiber writing in the New Republic:

The great bubbles in history, right up through the dotcom fiasco and last decade’s real estate unpleasantness, have typically been mass phenomena. […] But just because bubbles typically don’t inflate until the small-timers get involved doesn’t mean you can’t have a bubble without them.

A short but interesting piece which fits with my own somewhat cynical musings on this topic. Also:

The good news is that most of the money lost when the bubble bursts will come from private investors, not people invested in the stock market. The only average folks likely to suffer are those who make their living in the [San Francisco] Bay Area.

Awesome.

No Exit →

Gideon Lewis-Kraus writes a brilliant piece of long-form journalism in Wired this month, “No Exit,” following a San Francisco startup far more typical than the ones that get huge payoffs.

As a self-taught programmer with no college degree at all, I found this passage particularly resonant:

It’s extremely difficult to hire talented engineers in the Valley unless you’ve got incredible PR, can pay a fortune, or are offering the chance to work on an unusually difficult problem. Nobody was buzzing about them, and they had no money, but the upside of having a business that relied on serious machine learning was that they had worthy challenges on the table. On January 4, they made an offer to exactly the sort of engineer they needed, Tevye. He had a PhD in AI from MIT. Just to contextualize what that means in Silicon Valley, an MIT AI PhD can generally walk alone into an investor meeting wearing a coconut-shell bra, perform a series of improvised birdcalls, and walk out with $1 million. Nick and Chris had gone to good schools of modest profile—Nick to the University of Puget Sound, Chris to the University of Vermont—and while Nick also had a Harvard business degree, both were skeptical about the credential fetish of the Valley. They were happy to play the game when they could, though.

The assumption has clearly become that it’s easier to teach good coding practices to people who already know all the algorithms than vice versa. I can’t definitively say that’s the wrong approach, although I have my doubts. I can definitively say that it’s pushing me further toward technical writing than I’d have expected even half a year ago.

In any case, while my own experience with startups hasn’t been nearly as stressful as what these guys are going through, I’ve seen just enough of that angle to make me wonder whether SF/Silicon Valley’s “rock stars only” mindset is healthy in the long run. (Ironically, I’m getting more contacts than ever from people inexplicably convinced I’m a Rock Star Ninja Brogrammer looking to wrangle Big Data High Scalability DevOps Buzzword Bleepbloop problems the likes of which the world has never seen. Sorry. I know enough of the words you are using to help you document your brilliant stuff, though.)

Brendan Eich Steps Down as Mozilla CEO →

For those not keeping score, Brendan Eich is a long-time Netscape/Mozilla employee, the inventor of JavaScript, and a guy who gave $1000 to California’s “Proposition 8” a few years ago, the one that banned gay marriage. One of the previous facts about Eich caused a loud outcry against his recent appointment as CEO. Not the JavaScript one.

Eich’s defense — from himself and others — largely boiled down to “well, you have to include all viewpoints if you’re going to call yourself inclusive,” which is a fine argument until one considers that including the side that says you can’t include the other side is the rhetorical equivalent of a division-by-zero error. Given how strongly Eich had been digging in his heels, one suspects he didn’t so much step down as have the ladder yanked out from under him.

And the guy invented JavaScript. Isn’t that damage enough?

Facebooking the future

The net is abruptly abuzz with news that Facebook bought Oculus VR, the partially Kickstarted virtual reality company that brought on game-engine wunderkind John Carmack as its CTO. And, it’s abuzz with a lot of fairly predictable hair-pulling and shirt-rending.

It’s certainly interesting news, and on the surface bemusing—although no more so than half of what Google buys these days. Facebook seems to be pretty interested in keeping abreast of or ahead of Google, too. Hmm. Does Google have any product that’s sort of like the Oculus Rift? Something that rhymes with “crass”? I’m sure there’s something along those lines I’ve been hearing about.

Frankly, despite all the hair-pulling I don’t think this is going to make a lot of difference to Oculus Rift users. When it comes to handling acquisitions, Facebook seems to be more like Microsoft than like Google or Apple. And that’s actually a good thing. Microsoft has certainly done in good products through acquisitions, but look at Bungie and, before them, Softimage, maker of one of the leading high-end 3D animation packages of the 1990s. In both cases, the companies were given a great deal of autonomy—what Microsoft wanted them for actually required that. Bungie’s Halo essentially defined the Xbox gaming platform, and Softimage got NT into animation and movie studios. (I suspect this was a much bigger nail in SGI’s coffin than it’s usually given credit for.)

Facebook needs the Rift to be a successful product, and for it to be a successful product they have to not screw with it. They don’t want to take it away from gamers—what they want is, well, pretty much what Zuckerberg wrote:

Imagine enjoying a court side seat at a game, studying in a classroom of students and teachers all over the world or consulting with a doctor face-to-face — just by putting on goggles in your home.

What Facebook wants, ultimately, is to build the kind of communication device that until now has only existed in science fiction. I’m not sure any company can actually pull that off, but I can’t think of another company that genuinely has a better shot. And as much as many things about Facebook continue to exasperate me, I’ve been coming to a somewhat reluctant admiration not only of their ambition but of their engineering.

I can’t say this purchase makes me more likely to use either Oculus products or Facebook, but it’s a very interesting milepost. I’ve thought for years that Facebook the product has a limited lifespan, but Facebook the company may have a much longer—and far more interesting—one than I would ever have guessed.

If only GnuTLS had been open source! Wait.

Dan Goodin, Ars Technica:

Hundreds of open source packages, including the Red Hat, Ubuntu, and Debian distributions of Linux, are susceptible to attacks that circumvent the most widely used technology to prevent eavesdropping on the Internet, thanks to an extremely critical vulnerability in a widely used cryptographic code library.

Goodin argues that the bug is worse than Apple’s highly publicized “goto fail” bug, as it appears that it may have gone undetected since 2005.

I’d like to pretend I’m above feeling a bit of schadenfreude given that some of the loudest critics of Apple tend to be the open source zealots, but I’m not. One of the more religious aspects of Open Sourcitude is the insistence that all software ills stem from proprietary, closed source code. If the code is open, then in theory there should be fewer bugs, more opportunity for new features, and a lower chance of the software dying due to abandonment by its original developers for whatever reason.

But in actual, real world practice, software with few users tends to stagnate; software that becomes popular tends to keep being developed. This holds true regardless of the license and access to the source code. There are a lot of fossilized open source projects out there, and a lot of commercial products with vibrant communities. Being open source helps create such communities for certain kinds of applications (mostly developer tools), but it’s neither necessary nor, in and of itself, sufficient. And no one—not even the most passionate open source developer—ever says something like, “You know what I’d like to do tonight? Give GnuTLS a code security audit.”

Editorially shutting down →

While I only used the service briefly, this is still unfortunate news. Editorially is (was) a collaborative online writing tool using Markdown that I was planning to use for—yes—editorial work.

I’m not sure what to take away from this; Editorially was one of those free services that planned to figure out how to make money later, and that’s always a dicey business. Yet as Everpix’s failure demonstrated, being a paid service from the get-go isn’t automatically a win. Editorially’s management concluded that “even if all our users paid up, it wouldn’t be enough”; would it have been enough if they’d been a paid service from the beginning? Maybe, but of course they’d have had fewer users—probably far fewer—and so that might not have been sustainable, either.

“Apple in Talks to Revamp Set-Top Box” →

I was going to express some dubiousness about the headline of this WSJ article—at first glance it suggests revamping a mythical set-top box that Apple has yet to produce, rather than the one they’ve been selling for years—but that probably isn’t fair. The headline could just as easily refer to the actual Apple TV.

So instead, knock them for the first sentence.

"Apple Inc. appears to be scaling back its lofty TV industry plans."

Um, okay.

In theory what this means is that Apple “envisages working with cable companies, rather than competing against them.” Well, yes. As much as some may dream of Apple displacing cable companies, that’s going to be pretty hard to do in the near-to-mid term for financial reasons alone. Netflix and Amazon are starting to compete with cable networks by paying for original content, but they established themselves first. Apple will need to do the same thing. (If they even want to get into that side of the business.)

If the WSJ report is correct, Apple’s service sounds like it will be nerfed to the point of bringing nothing new to the table—but that doesn’t mean it will remain nerfed forever. Even so, for right now they have the same problem that always faced Hulu: a profound disconnect between what their customers want to get, and what the media companies are comfortable giving. Apple got nearly everything they wanted when they launched the iPhone because Cingular was desperate enough to agree to their demands. As much as techies believe Hollywood should be that desperate, they’re not.

Did Apple ban a Bitcoin wallet because they fear competition?

(Spoiler: no.)

Apple pulled Blockchain, a Bitcoin “wallet” app, from their store, and Blockchain’s CEO is sure this is because Apple sees them as a potential competitor. “I think that Apple is positioning itself to take on mobile payments in a way they haven’t described to the public and they’re being anti-competitive,” he told Wired.

As usual, though, this comes up because it fits into the default narrative about What Apple Is Like, not because there’s either direct or even circumstantial evidence to support it. While Apple may not welcome competition with open arms, there are dozens of applications on both the iOS and Mac app stores that directly compete with Apple’s products and services. Pandora, Netflix, Hulu, just about every app Amazon makes—by this logic, shouldn’t these all be gone?

For that matter, there are already mobile payment applications in the store that are far more direct competition for this still-hypothetical Apple payment system than Blockchain is. I can walk into (some) stores and pay directly with the Square or PayPal apps, but I can’t walk into a store and pay directly with Blockchain; it has no infrastructure specifically designed for point-of-sale, and even if the store took Bitcoin, I’d rather just swipe a credit card than diddle around with paying by SMS.

So why did Apple pull Blockchain? As usual, we don’t know, because Apple is about as communicative as a sack of flour. But given that they pulled Coinbase a couple months ago, the signs point to this having little to do with competition and a lot to do with Bitcoin. Apple has a rapidly expanding presence in China—whose antipathy toward Bitcoin is crystal clear—and they’re a strikingly conservative company when it comes to financial and legal matters; Bitcoin’s reputation as a haven for illegal activity may not sit well with them.