I just found out that Apple is rejecting my new manifesto Stop Stealing Dreams and won’t carry it in their store because inside the manifesto are links to buy the books I mention in the bibliography. Quoting here from their note to me, rejecting the book: “Multiple links to Amazon store. IE page 35, David Weinberger link.”
Godin allows that private merchants should certainly get to carry what they want, but he considers this a “content screen”:
I think that Amazon and Apple and B&N need to take a deep breath and make a decision on principle: what’s inside the book shouldn’t be of concern to a bookstore with a substantial choke on the marketplace. If it’s legal, they ought to let people read it if they choose to. A small bookstore doesn’t have that obligation, but if they’re seeking to be the one and only, if they have a big share of the market, then they do, particularly if they’re integrating the device into the store. I also think that if any of these companies publish a book, they ought to think really hard before they refuse to let the others sell it.
While I’m sympathetic, I think this is a murkier case than Godin makes it out to be. In a technical sense, you can argue that links to a competing storefront are “content,” but they’re not facts, theories or ideas—they’re not what we normally mean when we think of a book’s contents. Godin’s book isn’t being suppressed by Apple because they want to stifle the availability of his radical thoughts. I agree with Godin that Apple should let people read any book they choose to, but that’s not the same argument as “Apple should publish books with links to Amazon’s storefront.” Perhaps it’s petty for Apple to disallow that, but even if every electronic book store on the market disallowed links to other book stores in their ebooks, the free speech of authors would hardly be irreparably harmed.
Godin notes that he has an easy workaround, since he can just sell the EPUB on his web site. He refrains from pointing out the other easy workaround: keeping all the actual text intact, giving readers all the same information, and simply de-linking the titles in his bibliography.
Compare the way Microsoft is bringing Metro elements to Windows to the way Apple is bringing iOS elements to OS X. For all the grumping about OS X’s “iOSification,” OS X 10.7/10.8 still look definitively like desktop operating systems, adding new features that are inspired by iOS but unique to the mouse-and-windows environment. With Windows 8, Microsoft is bringing the actual touch interface over, mapping each and every touch gesture to a mouse and keyboard equivalent. While Microsoft is getting a lot of kudos for their boldness, this doesn’t strike me as the right way to go. I’d like to use Windows 8 on a tablet or a phone, but would I want to use it on a 1920×1200 monitor with a mouse or trackpad?
A few days ago I made a snarky comment here about how I was perfectly willing to peddle crappy products for money, which makes it sound like I’m in the “we’re in a new Internet bubble” camp. I’m not really willing to peddle crappy products, of course—and I’m actually in the “we’re always in a bubble” camp. Some bubbles are just more tulip-shaped than others.
I’ve been thinking about contributing factors for tech bubbles, and again, there are a lot of cynical answers to that: greed (of course), marketing-driven hype obfuscating questionable value propositions (“it’s social!”), the inexperience of ever-younger entrepreneurs, technorati myopia. Yet I spent 2011 working with a couple of startups and interacting in glancing ways with others.1 Most seemed to genuinely believe that they were going to change the field they were in, maybe radically. They had a mission. Many of them had funding, which they saw—not irrationally—as validation.
Yet not entirely rationally, either. Buzz Andersen quipped that San Francisco is practically founded on an unwillingness to call bullshit—but that’s the culture of the tech startup everywhere. We may have our own specific flavor of it out here, but some of the biggest crashes in the original dotcom era came down in Boston, Switzerland, Bellevue, and New York City.
Analyst Horace Dediu often looks at products by asking the question, “What are we hiring this product to do?” We need (or want) something done, and we “hire” the product to help us do that: we need (or want) to be able to do tasks when we’re away from our computer that require the Internet, or to have an entertainment center in the living room, or to publish our blatherings on the web. So we get the iPhone, a 42″ plasma TV connected to a Mac mini and a PlayStation 3, or Tumblr.
But the chances are we don’t get three or four phones or television sets. We may “hire” multiples of the same things for aesthetic reasons (clothes), but not for functional reasons: I might want both a PS3 and an Xbox 360 because they have different functionality, but I don’t want two televisions in the same room.2
But what about Tumblr? It’s certainly possible to have a Tumblr and a LiveJournal and a WordPress and a few more bloggish-type things. Right?
Yes, but unless you’re Robert Scoble you probably won’t use all of them. Exhibit A: LiveJournal. Followers, lists, photo posting, polls—all those cool new features the other services have been adding over the last few years are things LJ has had since, roughly, the late 1800s. But LJ has been in slow but visible decline since Twitter and Tumblr took off.3 Why?
For broadcasting the minutiae of everyday life, Twitter absolutely kicks its ass, and for sharing links, photos, quotes, and pithy commentary, Tumblr kicks its ass. For longer-form posts like this, nothing’s markedly better than LJ, but multiple options provide parity. The biggest thing LJ had going for it was the social aspect of the “Friends List,” but again, both Twitter and Tumblr handle that with less friction.
It turns out that just like most people don’t hire multiple television sets or smartphones or word processors, most people don’t hire multiple blogging services, or multiple microposting services, or photo sharing services or event invite services or calendars or review sites or check-in services or etc. Once they find something that’s Good Enough For Them, they’re not going to switch unless the Good Enough product declines over time or something new really kicks its ass. Merely being better isn’t good enough to beat good enough.
All those four-person companies looking for Rockstar! Developers! to work on their Gamified! Social! Sharing! Service! (Perks include Free! Red Bull! For! Everyone!) creating applications whose values are derived primarily from social connections made on them? If they’re competing directly with an app that already has a million users, let alone a hundred million or more, they’re toast. Sure, something can disrupt that existing application, like Twitter and Tumblr did to LiveJournal—and more famously, Facebook did to MySpace—but ask Gowalla how “like Foursquare, but prettier” worked out for them.
If I had one bit of advice to someone thinking of a startup—including myself, at times—it would be this. Solve a genuine problem, even a trivial one, that you actually have, and that isn’t being adequately solved by an existing solution. Then think about how you can get money for solving that problem. Be wary of scenarios in which your revenue base and your customer base have no overlap.
If I had a second bit of advice, it would be this. Is the elevator pitch for your new startup—no matter how sincerely you believe in its fantastic future—at its heart a variant of, “Think [well-known service name] but with [added feature or new twist]”? If it is, you’d better know somebody willing to call bullshit.
Mostly as a prospective consultant or employee. I keep thinking about writing a “lessons learned” post about that, but haven’t figured out how to do so with sufficient tact and abstraction. ↩
Yes, I’m sure you can think of edge cases where that would be useful, but you know what I mean. ↩
Except in Russia, where LJ is literally synonymous with blogging. ↩
The somewhat boggling 41-megapixel sensor in the PureView 808 is not designed to take 41-megapixel photos, but rather to oversample pixels and come up with nearly noiseless shots at 5 or 8 megapixels. As for the phone,
If you’re thinking that [the 640×360] display won’t cut it in the modern smartphone world, things get worse once you look at the operating system: it’s Symbian Belle. Nokia can say as much as it wants about the steady rate of improvement in Symbian, it’s still not an OS we’d recommend any sane person use for extended periods of time.
I’m seeing this called a “niche device,” but I’m not sure what that niche is. People who want a terrific point-and-shoot camera that they can also use as a smartphone in a pinch? All the remaining Symbian fanatics? (They’re lining up now, waiting for the moment they can rush home and sync their new PureViews with their Amigas.)
I’d like to see that oversampling trick make its way into other cameras, though—I’m not sure there are even any DSLRs doing that.
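For the curious, the back-of-the-envelope math on why oversampling works, assuming the noise at each photosite is roughly independent: producing a 5-megapixel image from a 41-megapixel sensor averages about eight sensor pixels into each output pixel, and averaging n independent noisy samples cuts the random noise by a factor of √n. That works out to roughly √8 ≈ 2.8 times less noise, essentially for free.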
In light of the ongoing controversy about Silicon Valley tech bloggers having conflicts of interest, I would just like to note that I am totally down with plugging your crappy product if you give me enough money. (Don’t offer equity, though, unless you think you can sell out in 12 months, ’cause this bubble ain’t gonna make it through 2013.)
…is that Apple is really starting to reach for big cat names. Can you guys just give up and go for OS X Timber Wolf next time? (I would suggest “OS X Coyote,” but if we want our computers to play tricks on us and generally be randomly infuriating, we already have Adobe products.)
I believe we are in the beginning of an era where new products with a great UX will rapidly take down established, but outdated players. You could make a valid argument that this is the normal lifecycle of tech, but there’s something different here. New entrepreneurs have never been so acutely aware of the importance of UX, and I think a lot of the credit goes to Apple.
At first read I agreed with Cross completely, but you know, there are still a lot of successful products out there that don’t provide a good user experience. Apple is successful by (mostly) providing very good UX, and that’s certainly influencing the current generation of designers, but it’s not the only path to success—and there’s an awful lot of work influenced only by Apple’s superficial aesthetics.
Apple doesn’t get a full pass on this. While I frequently find myself agreeing with the colorful and not very shy John Welch, I only mostly agree with his take on last week’s Path flap. He responds to a post from Dustin Curtis, who argues that (in Welch’s words) “it’s up to Apple to make sure developers who have an ethics problem can’t do this”:
It is not Apple’s job to design iOS to prevent a bunch of shady dipshits from being shady, it’s the fault of the shady dipshits who should stop being shady. The fact you have the moral code of a pit viper is not Apple’s fault.
Absolutely. And I’d agree that’s true for anything shady that developers put in their code: malware, sending any kind of tracking information without user consent, reading data from other applications by breaking the “sandbox,” stuff that’s generally illegal. You know, a lot of stuff that Apple explicitly polices the App Store for, and in a couple of cases explicitly does design iOS to make difficult.
Which is, of course, the catch. Apple has explicitly made the case that a platform advantage of iOS is that Apple does verify that developers aren’t being shady dipshits. Isn’t that supposed to be at least part of why the iOS App Store is better than the Android Marketplace? Once you’re pitching that as an essential platform differentiator—and I think it’d be hard to argue Apple doesn’t make that pitch—then “is it Apple’s job to keep developers from being shady dipshits” is not the right question. “Why do apps only have to inform you of some potential privacy issues, not all of them?” is the right question.
This wouldn’t have helped in the case of Path. This is a point that I haven’t seen made often enough in the discussion. If iOS was, in fact, designed to pop up a dialog box saying “SomeApp is trying to access your address book. Allow or Deny?” like it does for the GPS, that might be a minor improvement—but it doesn’t tell you what the application in question is doing with the address book data. If Path had been asking for address book access without sharing the little detail that it was about to send it all merrily off to its company’s servers, we’d have gone through the exact same brouhaha.
In a sense this brings us back to Welch’s colorfully-phrased argument: Path would need to have explained what they were doing so the user could make an informed decision, and it’s unlikely Apple could easily audit every program to make sure it isn’t doing something unduly surprising. No matter how much Apple does to help, ultimately the last line of defense really is developers who aren’t being dipshits. In a comment, John refers to the idea of the “allow access to the address book?” dialog box option as “security theater,” and it is—to the same degree the one for location services is. In both cases, it’s telling you something you probably should have known already (“what do you mean the Yelp app needs location services to figure out what businesses are around me?”), and none of us have any blessed idea what data any of those apps are sending back to their servers even with those dialog boxes.
Yet the widespread framing that Path could only have done this if they’re horrible people who would sell off your grandma if you left them alone in the room with her for five minutes seems a little histrionic. “Never attribute to malice what can be adequately explained by stupidity” is closer, if you replace stupidity with blind spot. As an engineer, you look at a problem and think of the easiest way to solve it, and even if the privacy implications occur to you—or they’re pointed out to you—you object that your code isn’t actually doing anything bad with the data because, well, you’re not a shady dipshit, and the feature you’re coding gets a significant performance gain this way. I’ve had variants of this discussion with coworkers before. The problem is that “am I a shady dipshit” is not the right question. “How would this look to a customer who has no reason to trust me” is the right question.
At least, that’s the way CNET’s Stephen Shankland reports it:
The problem right now, [the W3C’s Daniel] Glazman said, is that programmers use -webkit prefixed features without including -o for Opera, -ms for Microsoft IE, or -moz for Mozilla’s Firefox. That happens even when those other browsers support the CSS features in question. “I am asking all the Web authors community to stop designing web sites for WebKit only, in particular when adding support for other browsers is only a matter of adding a few extra prefixed CSS properties,” Glazman said.
What Glazman is talking about here, essentially, is that there are a bunch of newer CSS properties in the CSS3 standard that browser manufacturers all started implementing behind vendor-specific prefixes: the -o and -moz and -webkit mentioned above. Instead of writing “border-radius: 5px” to get rounded corners, you had to write “-webkit-border-radius: 5px”. And repeat that exact same line with every vendor prefix, plus the line without the prefix for good measure.
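To make the repetition concrete, here’s a sketch of what that looks like in practice as of this writing. I’m using transform rather than border-radius, since transform is one of the properties that genuinely shipped behind all four major vendor prefixes:

```css
/* One visual effect, five declarations. The unprefixed form goes
   last, so it takes over once browsers support the standard name. */
.tilted {
  -webkit-transform: rotate(2deg); /* Safari, Chrome */
  -moz-transform: rotate(2deg);    /* Firefox */
  -ms-transform: rotate(2deg);     /* Internet Explorer 9 */
  -o-transform: rotate(2deg);      /* Opera */
  transform: rotate(2deg);         /* the CSS3 standard */
}
```

Multiply that by every rounded corner, box shadow, gradient, and transition on a site, and the bookkeeping adds up fast.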
Shankland compares this to Microsoft’s position with IE, when “programmers would write pages that looked good in it regardless of whether they followed standards or not.” It’s an understandable comparison on a surface level, but I’d argue the bigger problem here is the W3C working group itself. The first CSS3 drafts were published in June 1999. While parts of what they’re working on—the advanced layout syntax, for instance—are pretty complex, it should not take twelve years to agree on border-radius. Many of the CSS3 “modules” haven’t changed substantively for nearly a decade. Vendor prefixes are a good thing for implementations of still-in-flux proposals, but the majority of CSS3 proposals haven’t fluxed for years.
Glazman is right that web developers shouldn’t be targeting WebKit only by using just the WebKit CSS3 prefixes and ignoring the others, but the majority of the CSS3 in use on the web right now is not in the “wildly experimental weird shit” category, and as web developers we shouldn’t have to write four or five nearly identical lines to get bog-simple effects. The right solution isn’t for Mozilla and Opera to adopt the -webkit prefix—it’s for the W3C to stop “recommending” that browsers make web developers treat features that every current major browser implements identically as experimental.
The company, which generates three-quarters of its revenue from digital, plans to instead focus on seeking licensees to expand its brand licensing program. It plans to continue to offer online and retail photo printing, and desktop printers. In addition to its consumer businesses segment, Kodak has a commercial segment that includes enterprise services, graphics, entertainment, and commercial films units. The company’s remaining consumer services will also include retail-based photo kiosks and dry lab systems.
So, if I’m following: Kodak, driven into bankruptcy by the rise of digital photography, is taking decisive action by dropping all their business lines relating to digital photography so they can focus on: photo printing.
I rarely quote memes, but WHAT IS THIS I DON’T EVEN
Shockingly, an Apple television may require television service
Canada’s Globe and Mail has a story that’s been getting a bit of buzz, as it relates to the television the world is (somewhat inexplicably) waiting for Apple to produce:
Rogers Communications and BCE [Bell] are in talks with Apple to become Canadian launch partners for its much-hyped Apple iTV, a product that has the potential to revolutionize TV viewing by turning conventional televisions into gigantic iPads.
So, two immediate thoughts:
1. Notice that the “much-hyped Apple iTV” hasn’t been mentioned once, ever, by Apple? Not only is the name “iTV” speculation, the entire product is speculation. The Globe has a “source” that says “Rogers and Bell already have the product in their labs,” which may indeed be true—but again, this is speculation. And there’s nothing that suggests that the product, if it exists, isn’t an update to the existing Apple TV.
2. What the hell does “gigantic iPad” mean? Seriously, stop it.
Paul Graham may want to kill Hollywood by funding “startups that will compete with movies and TV” because he believes “Hollywood is dying,” but his belief is wrong—what’s against the wall is the modern incarnation of the studio system. By and large, we still want “Parks and Recreation” and “Game of Thrones”; the sticking point is getting them. It’s not the production system that needs to be disrupted. It’s the distribution system.
The original model of selling shows to networks which distribute through local affiliates that play the shows at appointed times survived unchanged for decades, and cable-only channels only cut out the affiliates. From the perspective of the networks, the rise of the TiVo and the DVR didn’t appreciably change that, but from the perspective of DVR users it changed everything. Once you start watching shows on your own schedule, it’s just about impossible to go back. And thanks to streaming services and iTunes, we all understand full well that the Internet breaks the remaining chain to the networks entirely: we can subscribe to shows and get new episodes as they’re released, whether it’s through an a la carte system like iTunes or a flat-rate system like Hulu Plus or Netflix.
That system has a lot of holes in it right now, though. Most current shows aren’t available on streaming services—or if they are, they have strange, arbitrary limitations—and while many more of them are available on iTunes, even there it’s far from complete, and we have a tendency to balk at the up-front cost of buying a dozen Season Passes for shows that we like.1 Our computers or appliances like the (current) Apple TV should be kick-ass replacements for set top boxes, but they’re not. There are many reasons why, but very few of those reasons are technological.
Yet those non-technological reasons are the ones that need solving. If Apple offered an “iTV” that you had to plug a cable box into, what’s the point? If the cable box still provides the user interface, nothing changes. If the deal Apple is working out boils down to Apple providing a great user experience for a set top box that otherwise works just like all the other ones, well—that changes some things, but only incrementally. Maybe incremental change is better than no change.
The assumption has generally been—and I think this is correct—that Apple would prefer to cut the cable companies out of the distribution model entirely. They’d be happy cutting everybody out of the distribution model entirely, and having studios just sell directly to them. But just like Apple realized they needed to work with carriers rather than compete with them, for the foreseeable future, they may be better off working with cable companies rather than against them. Talking about the iTV is still talking about unicorn-chasing, but at least if we’re talking about the distribution model rather than how wonderful it would be to say “Siri, record ‘The Jersey Shore’ for me,” we’re talking about the right unicorn.
Of course, if you bought 12 season passes at an average of $40 a show, that’s $480 a year, still only half of what a year of an $80/month cable bill costs. ↩
I have a more general skepticism about the utility of the “social layer” that Facebook wants to build under the entire economy. In a letter to potential investors, Zuckerberg argues that most products and services can be improved by making them “social.” This has become received wisdom in the Silicon Valley; nowadays every site, app, game, and store plugs into some kind of social network, often Facebook.
The clearest example is social gaming. The S-1 notes that Zynga, the company behind Facebook games like FarmVille, accounts for 12 percent of Facebook’s revenues. Most of the money comes from Facebook’s cut on the sale of virtual in-game items like “zebra unicorns.” The S-1 notes that there’s something dangerous in Facebook’s reliance on a single company for such a significant source of its revenue, but the real problem here isn’t Zynga. It’s the zebra unicorn.
Manjoo thinks it’s “crazy talk” that we’ll all abandon Facebook for some other social platform, and he’s right in that it’s unlikely something will come along that can do to Facebook what Facebook did to MySpace. But over the long term—and in the tech world, “the long term” is in the range of 15 years—I’m not sure that Facebook won’t suffer the same fate that AOL did: not being toppled by a single competitor, but a decline into irrelevance brought about by the advance of technology.
Facebook offers nothing that you can’t get elsewhere on the internet in terms of building an online presence, sharing with friends and family, and keeping up with people in an extended circle—it’s simply that right now, many people find them the most convenient way to do that. A long-term bet on Facebook is a bet that this will still be the case 15 years from now—or a bet that, 15 years from now, Facebook will have pivoted in a way that still keeps them just as relevant. That’s a very tall order.
Okay, Apple nerds: for years I had a MacBook Pro and used either its internal display or an external monitor with the MBP’s lid closed. If I put the MBP to sleep, I could connect the external display and USB keyboard, press a button on the keyboard, and it would wake up, with the internal monitor off. That’s pretty much what I wanted. (I’ve never had any desire to run the MBP with the lid open and the internal monitor off.)
Over the last six months or so I’ve been using a new, Lion-only 13” MacBook Air. I love it, mostly, but it exhibits an infuriating quirk with respect to this: I can’t make it shut off the damn internal display. Ever.
When I do what I described in the first paragraph, the Air will turn on both displays. If I open the lid and close it, then the internal display will go off—but Lion will still think both displays are active, so the mouse pointer will zip off into an imaginary display if it goes off the side of the big monitor. Running “Detect Displays” in Display Preferences doesn’t help—it still finds both monitors.
So: is this a weird problem with this Air, or is this the same experience other Lion/Air users have with this? And more importantly, is there a way to get the older behavior I described back?
It used to be one of the biggest pains of web development. Juggling different browser versions and wasting endless hours coming up with workarounds and hacks. Thankfully, those troubles are now largely optional for many developers of the web.
Hansson takes the now-obligatory shot at IE (“the favorite punching bag of web developers everywhere for a very good reason”). This makes me recall one of the weirder reversals in net history: while everyone remembers IE as beating Netscape Navigator just because IE was bundled with Windows, back in the days of IE 4 versus Navigator 4, Explorer was the standards-compliant one that web developers were pushing users toward.
I don’t agree with 37signals’ decision to make IE 9 the earliest version they support—IE 8 isn’t that old, and businesses tend to keep software in service a long time. But any company that tells you that you have to support IE 6 is a company you don’t want to work at.