My friend Sheila and I were having brunch at a neighborhood breakfast spot in Inglewood called Tals' when she said, "Girl, I know that you know that Prince is performing at the Forum - the Welcome 2 America Tour, 21 Night Stand - and I love Prince." I said, "Me too!" Now, you know there is nothing better than living in the present. You know that symbolism means so much to me, and Prince is very esoteric - well, so am I!
Fast forward: it is 7:30 pm, and we are sitting in front-row seats, Lodge, Section 10 - the very first row up front, seats 5, 6, 7, and 8. Yep. Can I tell you something? The concert was AMAZING! The show opened with the incomparable Cassandra Wilson, whose mellifluous voice was as soothing as a mother's love. Ledisi also got up on stage and did her thing - her voice is powerful! Oh, baby, but no one put it down like Prince - Prince = BRILLIANT! The band is so entertaining and makes the show POP!
I don't know about you, but I'm going to catch another show. The tickets are almost sold out - 85% of them are $25.00 - you know, a sign of the economic times, right? ~ Go hear some music that will lift your soul ~ I got me some, go get you some! From this old system of things into the new system ~ I will always have much love for you, PRINCE!!!
Tickets are available through Ticketmaster or at The Forum box office.
Love & light,
C
The photo above is the sole property of CreativeCommons.org - Prince, 2009, in Paris, France.
Friday, April 22, 2011
Goodbye, IBM. Seriously.
For those of us who worked at Apple in earlier days, the company's current success is sometimes surreal. I had one of those moments today. Back in the mid 1990s, we were struggling to get to $10 billion in revenue per year, a figure that seemed ridiculously high. This week, Apple reported quarterly revenue of $27 billion. Apple is almost certainly now a $100 billion a year company.
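If you want to check my math, here's the back-of-the-envelope version as a minimal Python sketch, assuming the quarter just repeats (in reality the run rate is higher, because Apple is still growing):

# Back-of-the-envelope annualization of the quarterly figure quoted above.
# Assumes the quarter simply repeats; with growth, the real run rate is higher.
quarterly_revenue_billions = 27.0

annual_run_rate = quarterly_revenue_billions * 4
print(f"Annualized run rate: ~${annual_run_rate:.0f} billion per year")  # ~$108 billion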
To put that in perspective, Apple is now larger than companies like Honda, Sony, Deutsche Telekom, Procter & Gamble, Vodafone -- and IBM. Apple is very close to passing Samsung and HP, which would make it the world's largest computing company.
In 1981, when IBM entered the PC business, Apple ran a big ad in the Wall Street Journal saying "Welcome, IBM. Seriously." At the time, everyone thought it was a very cheeky move by a tiny upstart company. No one -- and I mean absolutely no one -- would have believed that 30 years later Apple would be looking at IBM in the rear view mirror.
The spookiest thing is that Apple may still have a lot of room to grow in both mobile phones and tablets. There's no way the company can keep growing like this indefinitely, but it's very hard to predict exactly when it'll slow down.
To put that in perspective, Apple is now larger than companies like Honda, Sony, Deutsche Telekom, Procter & Gamble, Vodafone -- and IBM. Apple is very close to passing Samsung and HP, which would make it the world's largest computing company.
In 1981, when IBM entered the PC business, Apple ran a big ad in the Wall Street Journal saying "Welcome, IBM. Seriously." At the time, everyone thought it was a very cheeky move by a tiny upstart company. No one -- and I mean absolutely no one -- would have believed that 30 years later Apple would be looking at IBM in the rear view mirror.
The spookiest thing is that Apple may still have a lot of room to grow in both mobile phones and tablets. There's no way the company can keep growing like this indefinitely, but it's very hard to predict exactly when it'll slow down.
Thursday, April 21, 2011
Quick Takes: The RIM Tragedy, Lame Market Research, Ebooks Closer to Tipping, Flip vs. Cisco, Google as Microsoft, Nokia and the Word "Primary"
Short thoughts on recent tech news...
RIM as Greek tragedy
I wrote last fall that I was worried about RIM's financial stability (link), but I never expected the company to start inflicting damage on itself. RIM has always come across as a calm, dependable company. Maybe not as flashy as some other firms, but reliable and smart. But in the run-up to the PlayBook launch, the company has started to look like its own worst enemy.
It's clear that the PlayBook was designed initially as a companion device for people who have BlackBerry phones, and only those people. That's an interesting choice -- not one I would have made, but I can see RIM's logic. But apparently RIM decided late in the game that it needed to market the tablet to a broader range of customers. It started talking up the features those users would need, without making clear that the features would not be included in the device at launch. Many of the things the company has been touting -- such as Android app compatibility and the ability to check e-mail messages independently of a BlackBerry -- were not available when the device shipped. RIM has been marketing vaporware. That guarantees disappointed reviews that focus on what the device doesn't do, rather than what it does. Check out Walt Mossberg's write-up (link).
While this has been going on, RIM co-CEO Mike Lazaridis has been compounding the problem by creating a personal reputation as a loose cannon. His latest escapade was ending a TV interview with the BBC when they asked about security issues. The use of the word "security" was mildly provocative, but if you've ever dealt with the British press, you know they specialize in goading people to get an interesting reaction. The more senior your title, the more they'll poke at you, to see if you can take the heat.
The way this game works, there are several techniques you can use to deal with an aggressive question. You can laugh at it, you can calmly point out the flaw in the question, you can answer it earnestly and patiently, and you can even pretend not to understand it (I did that once on a UK TV show and it drove the interviewer crazy because he didn't have time to rephrase the question). But the one thing you can't do is stop the interview. If you do that, the BBC will post a clip of you online that makes you look like a gimlet-eyed prima donna (link).
The fact that Lazaridis did this means either he's losing personal control under pressure, or he's not being properly briefed by his press people, or both. Whatever the cause, it is unprofessional, and it's making RIM's challenges harder.
If you want to understand the damage being done, you can read the forward-looking obituary of RIM that Slate just ran (link). Or check out this column by Rob Pegoraro of the Washington Post (link). Rob's a very fair-minded, professional journalist who isn't given to hyperbole. But he called Lazaridis' actions "profoundly foolish from any sane marketing perspective...Seriously, does RIM not realize whom it’s competing with? The company is all but begging to get crushed by Apple."
I haven't written off RIM by any means. They have a huge customer base, a great brand, and a long history of overcoming skepticism from people like me. I hope they can do it again. But at a minimum, RIM's management needs to recognize that they do not have the marketing skills needed to play in the world of increased smartphone competition. They need professional help, immediately. And I worry that the marketing problems are actually symptoms of much deeper disorder within the company.
The lamest market research study of the year
It's still early in the year, but I think someone's going to have to work pretty hard to do a lamer market research study than Harris Interactive's EquiTrends survey of mobile phone brands in the US. Harris says the survey indicated that Motorola has the most "brand equity" of mobile phone brands in the US, followed by HTC, Sony Ericsson, Nokia, and Apple. Harris also provided a nice chart of the results (link):
There are a couple of problems here. The first is that the reportedly best-selling mobile phone brand in the US, Samsung, was not included in the results (link). Oops.
The second problem is that Harris doesn't directly measure brand equity (which is a pretty fuzzy concept anyway). What it measures is "Familiarity, Quality, and Purchase Consideration." Those three ratings were combined into an overall brand equity score.
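To make that concrete, here's a minimal sketch of how a composite score like this is usually assembled. Harris hasn't published its formula, so the weights below are purely hypothetical:

# Hypothetical composite "brand equity" score. Harris has not disclosed
# its actual formula, so these weights are invented for illustration only.
def brand_equity(familiarity, quality, purchase_consideration,
                 weights=(0.30, 0.35, 0.35)):
    # Weighted average of three 0-100 survey ratings.
    components = (familiarity, quality, purchase_consideration)
    return sum(w * c for w, c in zip(weights, components))

# Two made-up brands: shifting the weights, not consumer behavior,
# is enough to flip which one "wins."
print(brand_equity(90, 70, 60))  # widely known, middling ratings -> ~72.5
print(brand_equity(60, 90, 85))  # less known, well regarded      -> ~79.3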
So this is a made-up rating created through a mathematical formula that Harris hasn't shared with the public, as far as I can tell. But Harris assures us that it's meaningful: "Those companies with high brand equity are able to avoid switching behaviors of those brands that lack brand equity." (link). So, according to Harris's research, people in the US should be switching from other phone brands to Motorola.
But in the real world, the exact opposite has been happening. Motorola has been losing share. The number three rated brand, Sony Ericsson, has barely any distribution in the US, so it doesn't have much share to lose. The number four brand, Nokia, has lost most of its US share.
Harris argues that Apple's mediocre score is driven by the sophistication of the iPhone: "There is still a large audience of consumers that aren’t interested in a smartphone running their life, and Apple doesn’t have a product to meet that need." I think that's correct, but HTC also sells only smartphones, and it was ranked number two.
And oh by the way, what's the margin of error in Harris's survey? I can't find it disclosed anywhere, but my guess is that it's several points plus or minus, in which case everyone except Motorola is in a statistical tie. That wouldn't have made for a cool looking marketing chart, though.
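For what it's worth, here's the standard margin-of-error arithmetic for a simple random sample. The sample sizes are guesses, since Harris doesn't say how many respondents rated each brand:

import math

# 95% margin of error for a proportion from a simple random sample.
# The sample sizes below are hypothetical; Harris doesn't publish them.
def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

for n in (400, 1000, 2000):
    print(f"n={n}: +/- {margin_of_error(n) * 100:.1f} points")
# n=400: +/- 4.9, n=1000: +/- 3.1, n=2000: +/- 2.2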
It's been distressing to see websites pick up the Harris story and repeat it without questioning the results. PC Magazine swallowed it whole (link), as did MocoNews (link). A lot of other sites reprinted the Harris press release verbatim. Even if you didn't dig into the flaws, the study ought to fail the basic sniff test of credibility -- does anyone really believe that HTC has a stronger brand in the US than Apple?
When I worked at Apple and Palm, we hated synthetic brand rating studies like this one (and the JD Power ratings, which are similar) because the results depend more on the secret formula used by the polling company than on the actual behavior of customers. The polling companies construct these special methodologies because they can then sell long reports to the companies surveyed explaining the results, and also charge the winners for the right to quote the results in their marketing. Check out the fine print at the bottom of the Harris press release: "The EquiTrend® study results disclosed in this release may not be used for advertising, marketing or promotional purposes without the prior written consent of Harris Interactive." I don't know for sure that Harris charges to quote the survey, but that's the usual procedure.
The lesson for all of us is that you should never accept any market research study without looking into its background, even if it comes from a famous research company.
Ebooks: Here comes the tipping point
The continued strong sales of iPad, Kindle, and Nook in the US are bringing us steadily closer to the tipping point where it will pay an author to bypass paper publishing and sell direct to ebooks. The latest evidence is from the Association of American Publishers, which reported that ebooks made up 27% of all book revenue in the US in January-February 2011 (link). AAP correctly pointed out that the ebooks share was raised temporarily by people buying ebooks to read on all of the e-readers they got for Christmas. The share will go down later in the year.
Still, at any share over about 20%, it will be more economical for an established author to self-publish through ebooks (where they can retain 70% of sales revenue) rather than working through a paper publisher (where they get at most 15% of revenue). When we hit that point on a sustained basis, I expect that a lot of authors will move to electronic publishing quickly.
It looks like we'll hit that point sometime this year or next.
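If you want the arithmetic behind that threshold, here's a small sketch using the round royalty figures above, with total potential sales normalized to 100 units of revenue (advances, pricing differences, and hybrid strategies are all simplified away):

# Author take-home under the two models described above. Royalty rates are
# the round figures from the post; total potential sales = 100 revenue units.
def author_revenue(ebook_share, ebook_royalty=0.70, publisher_royalty=0.15):
    self_published = ebook_royalty * ebook_share * 100  # ebook sales only
    traditional = publisher_royalty * 100               # royalties on all formats
    return self_published, traditional

for share in (0.15, 0.20, 0.25, 0.30):
    self_pub, trad = author_revenue(share)
    print(f"ebook share {share:.0%}: self-publish {self_pub:.1f} vs. publisher {trad:.1f}")
# Self-publishing pulls ahead once the ebook share passes about 0.15 / 0.70, roughly 21%.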
Flip aftershocks
Silicon Valley has the attention span of a toddler in a candy store, but it was interesting to see how people around here lingered on the story of Flip's demise several days after the announcement. There were dark suggestions of ulterior motives at Cisco -- that they had bought the company to strip it of its intellectual property (link), or that they shut down a viable company only so they could look decisive to Wall Street (link). And that was just the stuff in the press. I've heard even more pointed speculation from people working in Silicon Valley.
My guess is the real story is a lot more complicated and nuanced, but at this point it doesn't matter. Killing Flip may have helped Cisco with Wall Street analysts, but the sequence of buying Flip and then shutting it down has seriously damaged the company's image in Silicon Valley as a leader and a partner. Silicon Valley is a very forgiving place. You can make huge strategic mistakes, and waste billions of dollars, and still you'll be forgiven as long as you did it in sincere pursuit of a reasonable business idea. But Cisco's senior management is now viewed as either overconfident to the point of stupidity, or as the deliberate torture-murderer of a beloved consumer brand. I've rarely seen this level of hostility toward a management team, and I don't think they will be forgiven anytime soon, if ever.
Does that have any practical impact on Cisco's business? Not immediately; business is business. But it will probably be a little harder for Cisco to make alliances and hire ambitious people in the future.
Google 2011 = Microsoft 2000?
It's spooky how Google is sometimes starting to remind me of Microsoft circa 2000.
The latest incident was a quote from a Google executive saying that the company wants iPhone to grow because Google makes a lot of money from it (link). Microsoft used to say the same sort of thing about Apple, claiming that it made more money when a Mac was sold than when a Windows PC was sold (link). (The idea was that many Microsoft apps were bundled with Windows at low cost, whereas Mac customers bought Microsoft apps at retail.)
In both cases, the statements may be technically true, but what they really point out is that the company has deep internal conflicts between its various business units. Yes, part of Microsoft wanted to make Macintosh successful, but another part of Microsoft wanted to kill Macintosh. Microsoft as a whole wanted to do both at the same time, which created internal confusion. Add in antitrust lawsuits by governments and Wall Street pressure for quarterly growth, and Microsoft quickly became distracted, inwardly focused, and slow-moving.
Parts of Google, I'm sure, think iPhone is great and want it to grow. But I guarantee that the Android team is trying to kill iPhone (and Nokia, and HP/Palm). Google has its own set of government distractions, plus a big old lawsuit from Oracle, plus legal action by Microsoft and Apple against Android licensees.
There are huge differences between Google and Microsoft, of course. Google is not under the same sort of Wall Street pressure that was applied to Microsoft, and Google's founders have not lost interest in running the company.
But it's disturbing to see how quickly some of Microsoft's symptoms are showing up at Google.
Hey Nokia, how do you define "primary"?
Microsoft and Nokia said they have finalized the contract for their alliance. There were a couple of interesting tidbits in the announcement:
--Both companies said they completed the negotiations sooner than they expected. Usually that sort of statement is hype, but for an agreement of this size, it actually was a pretty fast turnaround.
--They went out of their way to say that Nokia will be paying royalties for Windows Phone similar to what other companies pay. That's important legally and for regulators, so companies like Samsung can't complain that Microsoft is giving discriminatory pricing. At the same time, the announcement also made it clear that Microsoft will be passing a ton of money to Nokia for various services and IP, which Nokia wanted on the record to help with its investors. I think the net effect will be that Nokia gets a free Windows Phone license for a long time. That will not please Samsung, HTC, and the other Windows Phone licensees, because it puts them at a price disadvantage.
--The companies are apparently cross-licensing a lot of patents. I wonder if this will help Nokia with its IP warfare against Apple.
--In an interview with AllThingsD (link), Microsoft and Nokia said Windows Phone was Nokia's "primary smartphone operating system." That leaves open the door for Nokia to play with other smartphone operating systems, and it leaves completely unanswered the question of tablets. I'm sure the Symbian/MeeGo fans will be all over that as a ray of hope for their platforms, but to me it just leaves some prudent wiggle room for Nokia in the future. I'd love to know how the agreement defines the words "smartphone" and "primary" -- or if it even has definitions for them.
(Note: Edited on April 22 to fix an embarrassing typo.)
Tuesday, April 12, 2011
The Real Lesson of Cisco's Billion-Dollar Flip Debacle
Cisco announced that it's closing down the Flip camera business and revisiting its other consumer products. With a purchase cost for Pure Digital (maker of Flip) of over $600 million, and now restructuring charges of $300 million (link), the total cost of Cisco's failed consumer experiment is probably north of a billion dollars, making it one of the larger business debacles in Silicon Valley in the last few years.
Most online analysis of the announcement doesn't really explain what happened. The consensus is that Flip was doomed by competition with smartphones, but that says more about the mindset of the tech media than it does about Cisco's actual decisions. I think the reality is that Cisco just doesn't know how to manage a consumer business.
There are important lessons in that for all tech companies.
Here are some samples from today's online commentary:
Gizmodo (link): "The Flip Camera Is Finally Dead—Your Smartphone’s Got Blood on Its Hands."
Engadget (link): "Cisco CEO John Chambers says the brand is being dispatched as the company refocuses, done in by the proliferation of high-definition sensors into smartphones and PMPs and the like."
ReadWriteWeb (link): "Single-purpose gadgetry has no place in today's smartphone-obsessed world."
ArsTechnica (link): "Flip can't be faring well against the growing number of smartphones with built-in HD cameras. The quality of your typical smartphone video camera is comparable to the Flip, and people have their phones on them all the time."
Computerworld (link): "More and more people are using their smartphones to take lower-quality video...the market for low-cost small video cameras that produce quick-and-easy videos is dead."
There's an old saying that when all you have is a hammer, every problem looks like a nail. We need a similar proverb for news analysis -- when you're obsessed with smartphones, every market change looks like it was caused by them.
But did smartphones alone kill Pure Digital? Two years ago, it was the most promising consumer hardware startup in Silicon Valley. It had excellent products and a rabid customer base. Two years later, it's completely dead. That's a lot to blame on phones. Plus, Cisco appears to be moving away from driving consumer markets in general. The Umi videoconferencing system is being refocused on business, and Cisco CEO John Chambers said, "our consumer efforts will focus on how we help our enterprise and service provider customers optimize and expand their offerings for consumers, and help ensure the network's ability to deliver on those offerings." In other words, we'll be working through partners rather than creating demand on our own (link).
Smartphones didn't cause all of that. But they did play a supporting role in the drama. They commoditized Flip's original features, putting the onus on Cisco to give it new features and innovations. As Rachel King at ZDNet pointed out (link), Cisco failed to respond:
"The technology of Flip never really evolved since then, making it a very stale gadget. Sure, even once Cisco picked up Flip, new models continued to come out each year. Yet Cisco dropped the ball by never pushing further with Flip. It never moved beyond 720p HD video quality, and it never got HDMI connectivity."
Presenting a stationary target is enough to doom any consumer electronics product. For example, what would have happened if Apple had stopped evolving the iPhone after version 1? You'd have no app store, no 3G. Today we'd be talking about iPhone as a cute idea that was fated to be crushed by commodity competition from Android.
Just the way we're talking about Flip.
The important question is why Cisco failed to rise to the challenge. Why didn't it innovate faster? I don't know, because I wasn't there, but I'm sure the transition to Cisco ownership didn't help. It was not a simple acquisition. Cisco didn't just buy Pure Digital and keep it intact; it merged the company into its existing consumer business unit, which was populated by consumer people Cisco had picked up from various Valley companies in the previous few years. Some of the key Flip managers were given new roles reaching beyond cameras, and there must have been intense politics as the various players jockeyed for influence.
Then there was the matter of Cisco's culture. I had a great meeting at Pure Digital several years ago, prior to the merger. They were housed above a department store in San Francisco, in a weird funky space with lots of consumer atmosphere. The office was surrounded by restaurants and shops.
In contrast, visiting Cisco is like visiting a factory. Every building on their massive campus looks the same, with an abstract fountain out front, the walls painted in muted tans and other muddy colors. The buildings are surrounded by an ocean of cars. The lobbies are lined with plaques of the company's patents, and the corridors inside have blown-up photographs of Cisco microprocessors. In the stairwells you'll usually see a couple of crates of networking equipment, shoved under the stairs. And all of the cubicles look the same.
The Cisco campus.
A typical Cisco building.
Cisco is an outstanding company, and an excellent place to work. But it screams respectable enterprise hardware supplier. To someone from a funky consumer company, going there would feel like having your heart ripped out and replaced with a brick.
Then there were the business practices to contend with. As an enterprise company, Cisco is used to long product development cycles, direct sales, and high margins to support all of its infrastructure. A consumer business thrives on fast product cycles, sales through retailers, and low margins used to drive volume. Almost nothing in Cisco's existing business practices maps well to a consumer company. But it's not clear that Cisco understood any of that.
The transition to Cisco management happened at a terrible time for Flip. Just when the company's best people should have been focused obsessively on their next generation of camera goodness, their management was given new responsibilities, and Cisco started "helping out" with ideas like using Flip cameras for videoconferencing -- something that had nothing to do with Flip's original customers and mission.
If Pure Digital had remained independent, would it have innovated quickly enough? Maybe not; it's very hard for a young company to think beyond the product that made it successful. But merging with Cisco, and going through all of the associated disruptions, probably made the task almost impossible.
I'm sure that as the Flip team members get their layoff notices, we'll start to hear a lot more inside scoop. But in the meantime, this announcement by Cisco looks like a classic case of an enterprise company that thought it knew how to make consumer products, and turned out to be utterly wrong.
That's not an unusual story. It's almost impossible for any enterprise company to be successful in consumer, just as successful consumer companies usually fail in enterprise. The habits and business practices that make them a winner in one market doom them in the other.
The lesson in all of this: If you're at an enterprise company that wants to enter the consumer market, or vice-versa, you need to wall off the new business completely from your existing company. Different management, different financial model, different HR and legal.
You might ask, if the businesses need to be separated so thoroughly, why even try to mix them? Which is the real point.
The other lesson of the Flip failure is that we should all be very skeptical when a big enterprise company says it's going consumer. Hey Intel, do you really think you can design phones? (link) Have you already forgotten Intel Play? (link)
I'll give the final word to Harry McCracken (link): "You can be one of the most successful maker of enterprise technology products the world has ever known, but that doesn’t mean your instincts will carry over to the consumer market. They’re really different, and few companies have ever been successful in both."
Right on.
Friday, April 1, 2011
The Five Most Colossal Tech Industry Failures You've Never Heard Of
The tech industry is famous for forgetting its own history. We're so focused on what's next that we often forget what came before. Sometimes that's useful, because we're not held back by old assumptions. But sometimes it's harmful, when we repeat over and over and over and over the mistakes that have already been made by previous generations of innovators.
In the spirit of preventing those repeated failures, I spent time researching some of the biggest, but most forgotten, failures in technology history. I was shocked by how much we've forgotten -- and by how much we can learn from our own past.
5. Atari Suitmaster 5200
Video console manufacturer Atari was notorious for its boom-and-bust growth in the 1980s. The company's best-known failure was probably the game cartridge E.T. the Extra-Terrestrial, which Atari over-ordered massively in anticipation of hot Christmas sales that never materialized. Legend says that truckloads of E.T. cartridges were secretly crushed and buried in a New Mexico landfill.
What's much less well known is that Atari was also involved in the creation of an early motion-controller for home videogames, a predecessor of Microsoft's Kinect. Since video detection technology was not sufficiently advanced at the time, the Suitmaster motion controller consisted of a bodysuit with 38 relays sewn into the lining at the joints, plus 20 mercury switches for sensing changes in position. The suit was to be bundled with the home cartridge version of Krull, a videogame based on the science fiction movie of the same name.
A massive copromotion was arranged with the producers of Krull, and Atari made a huge advance purchase of Suitmaster bodysuits and cartridges. Unfortunately, development was rushed, and late testing revealed two difficulties. The first was that the suit's electromechanical components consumed about 200 watts of power, much of which was dissipated as heat. That may not sound like much, but imagine jamming two incandescent light bulbs under your armpits and you'll get the picture. There were also allegedly several unfortunate incidents involving mercury leaks from broken switches, but the resulting lawsuits were settled out of court and the records were sealed, so the reports cannot be verified.
The Christmas promotion was canceled, but Atari didn't give up on the Suitmaster immediately. The next year, it was repurposed as a coin-op game accessory, allowing the user to control a game of Dig Dug through gestures. Sadly, Atari's rushed development caught up with it again. Due to a programming error in the port to Dig Dug, under certain obscure circumstances when Dig Dug got flamed by a Fygar, the suit would electrocute the player. (The bug was discovered by an arcade operator trying out the game after hours, in what is now memorialized in coin-op gaming circles as The Paramus Incident.) That was the last straw for Atari's corporate parent, Warner Communications. To limit its potential liability if a Suitmaster were to fall into public hands, Warner arranged to have the entire inventory chopped up and mixed into concrete poured into a sub-basement of the Sears Tower in Chicago, which was then undergoing renovation. A small bronze plaque in the third sub-basement of the Sears Tower is the Suitmaster's only memorial.
4. eSocialSite.com
Before Facebook, before MySpace, even before Friendster, the most successful social networking site on the web was eSocial. Largely forgotten today, eSocial thrived in the late 1990s as usage of web browsers took off on PCs. By 1998, it had reached more than 50 million users worldwide, an unheard-of success at the time. Its Series A fundraising in 1999 raised more than $132 million from a consortium of VCs led by Sequoia Capital. Many people still cite eSocial's Super Bowl ad in January 2000, which featured a singing yak puppet, as a classic of the dot-com bubble era. When the company went public in February 2000, its stock price made it the 23rd most valuable company in North America.
Unfortunately, just two months later, it was revealed that 99.999974% of eSocial's registered users were fake people simulated algorithmically by a rogue eSocial programmer. The other 13 were middle school students from Connecticut who were technically too young to sign up for the service. eSocial was sued for allowing underage users, which delayed critical service upgrades for several months. By the time the litigation was resolved, Friendster had seized the initiative, and eSocial was quickly forgotten.
eSocial found a second life overseas, though, and today it is still the leading social site in several former Soviet republics in Central Asia. The founders of eSocial have long since left the company, and today are active in Wikidoctor.org, a promising new site that enables people to crowdsource the diagnosis of diseases and other chronic health problems.
3. The cardboard aeroplane
It's an unfortunate fact that wartime is a great stimulator of innovation. Desperation leads countries to try all sorts of crazy ideas. The successful ones become famous, while the failures are usually forgotten. For example, you don't hear much today about Britain's World War II plan to turn icebergs into aircraft carriers (link).
Even more obscure was the effort to create an aircraft from cardboard. One of the greatest bottlenecks in aircraft construction during the war was the shortage of aluminum feedstock. Britain could not expand aluminum production quickly enough to meet its needs, so it attempted to substitute the output from the Empire's massive Canadian paper mills. The idea of a cardboard airplane sounds crazy at first, but cardboard can be incredibly rigid in some directions (as you've found if you've ever tried to smash a box for recycling). Through the proper use of corrugation in multiple directions, the British found that they could create a material with the same tensile characteristics as aluminum, with only slightly greater weight.
Early flight tests of the cardboard aircraft were not encouraging, as the first two test planes broke up suddenly in mid-flight. Subsequent investigation revealed that water was infiltrating the corrugations, and then freezing when the plane reached altitude. The expansion of the ice caused the cardboard to delaminate, resulting in failure of the airframe.
But the engineers persevered, sealing the cardboard with paraffin wax to waterproof it. These new models successfully completed flight tests in the UK, and were demonstrated for Winston Churchill in 1943, who endorsed them enthusiastically.
The new aircraft were deployed to North Africa, where another unfortunate problem appeared: the paraffin melted in the desert heat, causing the planes to wilt on the tarmac. Needless to say, this limited their effectiveness. The British engineers persevered, eventually creating a new waterproofing scheme utilizing used cooking oil. This not only waterproofed the planes, but also made them smell like fish & chips, a definite plus to homesick British airmen. Unfortunately, wartime supplies of cooking oil in Britain were limited, and by the time alternate supplies could be imported from the American South, the war was nearly over.
The cardboard airplane disappeared into history, but its spirit lives on (link).
2. The microwave hairdryer
The 1950s and 1960s were the golden age of innovation in electronics. Companies like HP, Varian, and Raytheon created amazing new devices, often adapted from wartime technologies. One example was the microwave oven, which was derived from radar.
But microwaves were once used for a lot more than cooking food. My dad worked in the electronics industry at the time, and he often told me stories about the remarkable new product ideas he worked on. One was the microwave hairdryer.
Today we're frightened of microwaves because they're "radiation," and that's assumed to be bad. But in the 1960s people understood that microwaves had nothing to do with nuclear radiation. They were just another tool that you could use to get things done, like arsenic or high voltage electronics. Engineers at my dad's employer (which he asked me not to name) were looking for new ways to use microwaves to solve everyday problems. Someone noted the number of hours women spent under rigid-hood hairdryers, used to finish the elaborate hairdos that were prevalent in the 1960s, and realized that a microwave hairdrying helmet could do the same job in just 45 seconds -- creating a massive increase in national productivity.
Unfortunately, the microwave hairdryer ran into a series of technical problems. The first was that the microwaves caused metal bobby pins and hair clips to arc, which frightened customers and gave their hair an unattractive burned smell. That was solved by substituting plastic clips. The second problem was that the microwave frequency that couples best with wet hair is very close to the frequency that couples best with blood plasma. This required some precise adjustments to the three-foot-long Klystron tubes that powered the hairdryers. If they were jostled there was a very slight risk of causing the client's blood to boil (although this never actually happened in practice).
The technical problems were eventually resolved, but the death knell to the microwave hairdryer was something no engineer could fix: a sudden change in hairstyles in the late 1960s. The move toward long straight hair, frequently unwashed among younger people, caused a collapse in the hairdryer market, from which it has never recovered.
There was an abortive attempt to create a microwave blow dryer in the 1970s, but it was pulled from the market when it caused LED watches to burst into flame.
1. Apple Gravenstein
During the Dark Years when Steve Jobs was away, a rudderless and confused Apple Computer churned out a long series of failed initiatives. Their names echo faintly in tech industry history: CyberDog, Taligent, Kaleida, OpenDoc, HyperCard, Pippin, eWorld, eMate, A/UX, the 20th Anniversary Macintosh, Macintosh Portable, QuickTake, the G4 Cube (oh, wait, Steve did that one), Newton, and so on.
But the most catastrophic failure was the one Apple worked hardest to hush up, the project called Gravenstein. Simply put, Gravenstein was Apple's secret project to produce an electric automobile.
In the late 1980s, Apple was growing like a weed, but the driver of its growth was the Macintosh product line initiated under Steve Jobs. John Sculley and the rest of Apple's senior management team were concerned with securing their historical legacy by doing something completely different. Sculley, noting the chaos caused in the world economy by the oil embargo of the 1970s, chose to focus on the creation of an all-electric car. Michael Spindler, ironically nicknamed "Diesel," was chosen to manage the production of the vehicle. Bob Brunner drove the overall design, but Jean-Louis Gassee was asked to do the interior, on account of he's French and has good taste.
Apple used its Cray supercomputer to craft a unique teardrop aerodynamic shape for the car. Apple purchased all the needed parts, and planned to begin production in its Fremont, California factory. To prepare the market for the car, Sculley started working automobile references into Apple's advertising. The most famous of these was the "Helocar" advertisement (link). If you watch the ad closely, you can see actual diagrams of the Gravenstein's design and aerodynamic shape, although of course the first version of the car was not intended to fly.
Unfortunately, the public response to the Helocar ad was so overwhelmingly negative that it frightened Apple's Board of Directors. Sculley was ordered to scrap the Gravenstein project, and all documents related to it were shredded and then burned. Although Gravenstein never came to market, its legacy affected Apple's products for decades to come. The Macintosh Portable, for example, used bulky lead-acid batteries that were originally intended to power Gravenstein. And many years later, Jonathan Ive reused the Helocar's aerodynamic shape in the design of the original iMac.
Those are my five top little-known tech failures of all time. What are yours? There are many other candidates. Honorable mentions should include Leonardo da Vinci's steam-powered snail killer, Thomas Alva Edison's notorious electric bunion trimmer, spitr.com, and of course Microsoft Bob.
You can draw many lessons from these failures, but to me the most important lesson of all is that you can't trust blog posts written on this particular date.
In the spirit of preventing those repeated failures, I spent time researching some of the biggest, but least remembered, failures in technology history. I was shocked by how much we've forgotten -- and by how much we can learn from our own past.
5. Atari Suitmaster 5200
Video console manufacturer Atari was notorious for its boom-and-bust growth in the 1980s. The company's best-known failure was probably the game cartridge E.T. the Extra-Terrestrial, which Atari overproduced massively in anticipation of hot Christmas sales that never materialized. Legend says that truckloads of E.T. cartridges were secretly crushed and buried in a New Mexico landfill.
What's much less well known is that Atari was also involved in the creation of an early motion-controller for home videogames, a predecessor of Microsoft's Kinect. Since video detection technology was not sufficiently advanced at the time, the Suitmaster motion controller consisted of a bodysuit with 38 relays sewn into the lining at the joints, plus 20 mercury switches for sensing changes in position. The suit was to be bundled with the home cartridge version of Krull, a videogame based on the science fiction movie of the same name.
A massive co-promotion was arranged with the producers of Krull, and Atari made a huge advance purchase of Suitmaster bodysuits and cartridges. Unfortunately, development was rushed, and late testing revealed two difficulties. The first was that the suit's electromechanical components consumed about 200 watts of power, much of which was dissipated as heat. That may not sound like much, but imagine jamming two incandescent light bulbs under your armpits and you'll get the picture. There were also allegedly several unfortunate incidents involving mercury leaks from broken switches, but the resulting lawsuits were settled out of court and the records were sealed, so the reports cannot be verified.
The Christmas promotion was canceled, but Atari didn't give up on the Suitmaster immediately. The next year, it was repurposed as a coin-op game accessory, allowing the user to control a game of Dig Dug through gestures. Sadly, Atari's rushed development caught up with it again. Due to a programming error in the Dig Dug port, under certain obscure circumstances the suit would electrocute the player when Dig Dug got flamed by a Fygar. (The bug was discovered by an arcade operator trying out the game after hours, in what is now memorialized in coin-op gaming circles as The Paramus Incident.) That was the last straw for Atari's corporate parent, Warner Communications. To limit its potential liability if a Suitmaster were to fall into public hands, Warner arranged to have the entire inventory chopped up and mixed into concrete poured into a sub-basement of the Sears Tower in Chicago, which was then undergoing renovation. A small bronze plaque in the third sub-basement of the Sears Tower is the Suitmaster's only memorial.
4. eSocialSite.com
Before Facebook, before MySpace, even before Friendster, the most successful social networking site on the web was eSocial. Largely forgotten today, eSocial thrived in the late 1990s as usage of web browsers took off on PCs. By 1998, it had reached more than 50 million users worldwide, an unheard-of success at the time. Its Series A fundraising in 1999 raised more than $132 million from a consortium of VCs led by Sequoia Capital. Many people still cite eSocial's Super Bowl ad in January 2000, which featured a singing yak puppet, as a classic of the dot-com bubble era. When the company went public in February 2000, its stock price made it the 23rd most valuable company in North America.
Unfortunately, just two months later, it was revealed that 99.999974% of eSocial's registered users were fake people simulated algorithmically by a rogue eSocial programmer. The other 13 were middle school students from Connecticut who were technically too young to sign up for the service. eSocial was sued for allowing underage users, which delayed critical service upgrades for several months. By the time the litigation was resolved, Friendster had seized the initiative, and eSocial was quickly forgotten.
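Incidentally, the arithmetic holds up: thirteen genuine users out of roughly fifty million works out to exactly the fake-user percentage that was reported. Here's a quick back-of-the-envelope check, written in Python purely as an illustration (the numbers come from the figures above, not from any eSocial filing):

# Illustrative sanity check on eSocial's reported user numbers
total_users = 50_000_000   # "more than 50 million users worldwide"
real_users = 13            # the Connecticut middle schoolers
fake_share = 1 - real_users / total_users
print(f"{fake_share:.6%}")  # prints 99.999974%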
eSocial found a second life overseas, though, and today it is still the leading social site in several former Soviet republics in Central Asia. The founders of eSocial have long since left the company, and today are active in Wikidoctor.org, a promising new site that enables people to crowdsource the diagnosis of diseases and other chronic health problems.
3. The cardboard aeroplane
It's an unfortunate fact that wartime is a great stimulator of innovation. Desperation leads countries to try all sorts of crazy ideas. The successful ones become famous, while the failures are usually forgotten. For example, you don't hear much today about Britain's World War II plan to turn icebergs into aircraft carriers (link).
Even more obscure was the effort to create an aircraft from cardboard. One of the greatest bottlenecks in aircraft construction during the war was the shortage of aluminum feedstock. Britain could not expand aluminum production quickly enough to meet its needs, so it attempted to substitute the output from the Empire's massive Canadian paper mills. The idea of a cardboard airplane sounds crazy at first, but cardboard can be incredibly rigid in some directions (as you know if you've ever tried to flatten a box for recycling). Through the proper use of corrugation in multiple directions, the British found that they could create a material with the same tensile characteristics as aluminum, with only slightly greater weight.
Early flight tests of the cardboard aircraft were not encouraging, as the first two test planes broke up suddenly in mid-flight. Subsequent investigation revealed that water was infiltrating the corrugations, and then freezing when the plane reached altitude. The expansion of the ice caused the cardboard to delaminate, resulting in failure of the airframe.
But the engineers persevered, sealing the cardboard with paraffin wax to waterproof it. These new models successfully completed flight tests in the UK and were demonstrated in 1943 for Winston Churchill, who endorsed them enthusiastically.
The new aircraft were deployed to North Africa, where another unfortunate problem appeared: the paraffin melted in the desert heat, causing the planes to wilt on the tarmac. Needless to say, this limited their effectiveness. The British engineers pressed on, eventually creating a new waterproofing scheme utilizing used cooking oil. This not only waterproofed the planes, but also made them smell like fish & chips, a definite plus for homesick British airmen. Unfortunately, wartime supplies of cooking oil in Britain were limited, and by the time alternate supplies could be imported from the American South, the war was nearly over.
The cardboard airplane disappeared into history, but its spirit lives on (link).
2. The microwave hairdryer
The 1950s and 1960s were the golden age of innovation in electronics. Companies like HP, Varian, and Raytheon created amazing new devices, often adapted from wartime technologies. One example was the microwave oven, which was derived from radar.
But microwaves were once used for a lot more than cooking food. My dad worked in the electronics industry at the time, and he often told me stories about the remarkable new product ideas he worked on. One was the microwave hairdryer.
Today we're frightened of microwaves because they're "radiation," and that's assumed to be bad. But in the 1960s people understood that microwaves had nothing to do with nuclear radiation. They were just another tool that you could use to get things done, like arsenic or high-voltage electronics. Engineers at my dad's employer (which he asked me not to name) were looking for new ways to use microwaves to solve everyday problems. Someone noted the number of hours women spent under rigid-hood hairdryers, used to finish the elaborate hairdos that were prevalent in the 1960s, and realized that a microwave hairdrying helmet could do the same job in just 45 seconds -- creating a massive increase in national productivity.
Unfortunately, the microwave hairdryer ran into a series of technical problems. The first was that the microwaves caused metal bobby pins and hair clips to arc, which frightened customers and gave their hair an unattractive burned smell. That was solved by substituting plastic clips. The second problem was that the microwave frequency that couples best with wet hair is very close to the frequency that couples best with blood plasma. This required some precise adjustments to the three-foot-long klystron tubes that powered the hairdryers. If they were jostled, there was a very slight risk of causing the client's blood to boil (although this never actually happened in practice).
The technical problems were eventually resolved, but the death knell for the microwave hairdryer was something no engineer could fix: a sudden change in hairstyles in the late 1960s. The move toward long straight hair, frequently unwashed among younger people, caused a collapse in the hairdryer market, from which it has never recovered.
There was an abortive attempt to create a microwave blow dryer in the 1970s, but it was pulled from the market when it caused LED watches to burst into flame.
1. Apple Gravenstein
During the Dark Years when Steve Jobs was away, a rudderless and confused Apple Computer churned out a long series of failed initiatives. Their names echo faintly in tech industry history: CyberDog, Taligent, Kaleida, OpenDoc, HyperCard, Pippin, eWorld, eMate, A/UX, the 20th Anniversary Macintosh, Macintosh Portable, QuickTake, the G4 Cube (oh, wait, Steve did that one), Newton, and so on.
But the most catastrophic failure was the one Apple worked hardest to hush up, the project called Gravenstein. Simply put, Gravenstein was Apple's secret project to produce an electric automobile.
In the late 1980s, Apple was growing like a weed, but the driver of its growth was the Macintosh product line initiated under Steve Jobs. John Sculley and the rest of Apple's senior management team were concerned with securing their historical legacy by doing something completely different. Sculley, noting the chaos caused in the world economy by the oil embargo of the 1970s, chose to focus on the creation of an all-electric car. Michael Spindler, ironically nicknamed "Diesel," was chosen to manage the production of the vehicle. Bob Brunner drove the overall design, but Jean-Louis Gassee was asked to do the interior, because he's French and has good taste.
Apple used its Cray supercomputer to craft a unique teardrop aerodynamic shape for the car. Apple purchased all the needed parts, and planned to begin production in its Fremont, California factory. To prepare the market for the car, Sculley started working automobile references into Apple's advertising. The most famous of these was the "Helocar" advertisement (link). If you watch the ad closely, you can see actual diagrams of the Gravenstein's design and aerodynamic shape, although of course the first version of the car was not intended to fly.
Unfortunately, the public response to the Helocar ad was so overwhelmingly negative that it frightened Apple's Board of Directors. Sculley was ordered to scrap the Gravenstein project, and all documents related to it were shredded and then burned. Although Gravenstein never came to market, its legacy affected Apple's products for decades to come. The Macintosh Portable, for example, used bulky lead-acid batteries that were originally intended to power Gravenstein. And many years later, Jonathan Ive reused the Helocar's aerodynamic shape in the design of the original iMac.
Those are my top five little-known tech failures of all time. What are yours? There are many other candidates. Honorable mentions should include Leonardo da Vinci's steam-powered snail killer, Thomas Alva Edison's notorious electric bunion trimmer, spitr.com, and of course Microsoft Bob.
You can draw many lessons from these failures, but to me the most important lesson of all is that you can't trust blog posts written on this particular date.
Posted April 1, 2011