Monthly Archives: November 2015

Ten Ideas to Innovate in Uncertain Times

Following yesterday's post about Invention, Entrepreneurship and Innovation, here is a short presentation I gave about the culture of innovation. I had already mentioned it in a previous post (without the slides) entitled Can the next Google come from Europe? An answer by Fathi Derder. Derder, a Swiss politician, has written a book explaining what Switzerland needs to change in its general framework conditions. It is an important book. When I talk to students and young entrepreneurs, I focus more on the importance of culture, which is what you can read in the slides below. Enjoy!

Invention, Entrepreneurship and Innovation

“Anything that won’t sell, I don’t want to invent.” – Thomas Edison

This article arose from a discussion with colleagues about what innovation really is. I have to admit the conversation helped me clarify and correct a few misconceptions I had. So let me try to explain how the three concepts of Invention, Entrepreneurship and Innovation differ and how they are related. At least, these are my views.

Invention - Entrepreneurship - Innovation

So let me begin with definitions:

Invention: something new, that did not exist previously and that is recognized as the product of some unique intuition or genius. A product of the imagination. Something that has never been made before. "Something new under the sun". A discovery pre-exists its discoverer, as opposed to an inventor and her/his invention.

Innovation: the successful implementation and adoption by society of something new. So an innovation is the successful commercialization or use (if non-profit) of an invention.

Entrepreneurship: the process of designing a new business (Wikipedia). The entrepreneur perceives a (new) business opportunity and gathers the resources to implement it, ideally successfully. When the entrepreneur succeeds in implementing something new, (s)he is an innovator. But (s)he does not need to be an innovator; (s)he can also be an imitator.

So this makes a clear difference between an invention and an innovation. There is always an invention before an innovation, but an innovator does not have to be an inventor. It also shows that an entrepreneur does not have to invent, nor to innovate.

My biggest mistake was to say "big companies do not innovate anymore". I was wrong. Though most established companies imitate, many do innovate. They rarely invent and not many are entrepreneurial. But in order to innovate, it is better to be established. Let me clarify.

Let me come back to my favorite topic: "A start-up is an organization formed to search for a repeatable and scalable business model." This is the best definition I have found so far, and it comes from Steve Blank. It beautifully explains that not all companies are start-ups (for example when they have a clear business model from day one and/or when they do not try to scale). It also explains when a company is no longer a start-up. Then it can innovate.

Another misconception is to confuse Research and Development (R&D) with innovation. Research deals with inventing or discovering. Development follows. Innovation comes afterwards. Patenting belongs more to the invention side than to the innovation side of the equation. All this also explains why I have so many doubts about innovation metrics: they measure inputs (such as inventions or R&D) more than what innovation really is, an output.

Invention - Innovation

So how are these three concepts related? Read Edison's quote above again. In the past, large innovative firms such as IBM or Bell Labs were inventing. They had big R&D labs. Xerox was famous for its inventive capability and low innovation output. So Apple "stole" many of its inventions and innovated instead. Today, many established companies go to universities to find inventions they license. Or they collaborate with partners (i.e. "open innovation"). However, the risk and uncertainty linked to inventing, as well as to finding a market for new things, make innovation difficult without entrepreneurship…

Entrepreneurship - Innovation

Entrepreneurship is a great way to enable innovation. Entrepreneurs see an opportunity and accept the uncertainty and the risk. When it is done in-house, it is called intrapreneurship. Nespresso is one example (even if Nestlé did not initially encourage its intrapreneur – who, by the way, was also the inventor). Indeed, because of the definition given above, start-ups stop being start-ups when they innovate! They are often acquired (M&A) by big, established companies who know better how to commercialize – that is, to innovate.

Invention - Entrepreneurship

I had to add the intersection between invention and entrepreneurship. But does it make sense? I am not sure. There is, however, one industry which has combined both without a real need for innovation: biotechnology is mostly an entrepreneurial activity which develops inventions through clinical trials. Biotechnology firms seldom innovate (Genentech and Amgen were exceptions, with a few other firms which managed to commercialize their molecules) because they are often acquired by large pharmaceutical firms, or at best license their products to the bigger players. In fact, many start-ups are in the same situation. But the truth is that companies very seldom invent. Inventions occur before firms are established, at least in the high-tech field.

The following extract from Science Lessons: What the Business of Biotech Taught Me About Management, by Gordon Binder, former CEO of Amgen, is interesting:
Biotech Model

Inventors, Entrepreneurs and Innovators

Inventor - Entrepreneur - Innovator

For the same reasons as explained above, individuals seldom have all three attributes. At Apple, Wozniak was an inventor; Jobs was an entrepreneur and an innovator. But Bill Gates, or Larry Page and Sergey Brin, the Google founders, were rare cases of inventor, entrepreneur and innovator combined. Brin and Page invented at Stanford and then created Google to successfully implement their invention.

So let me finish with a great definition of innovation given in How Google Works [page 206]: “To us, innovation entails both the production and implementation of novel and useful ideas. Since “novel” is often just a fancy synonym for “new”, we should also clarify that for something to be innovative, it needs to offer new functionality, but it also has to be surprising. If your customers are asking for it, you aren’t being innovative when you give them what they want; you are just being responsive. That’s a good thing, but it’s not innovative. Finally “useful” is a rather underwhelming adjective to describe that innovation hottie, so let’s add an adverb and make it radically useful. Voilà: For something to be innovative, it needs to be new, surprising, and radically useful.” […] “But Google also releases over five hundred improvements to its search every year. Is that innovative? Or incremental? They are new and surprising, for sure, but while each one of them, by itself, is useful, it may be a stretch to call it radically useful. Put them all together, though, and they are. […] This more inclusive definition – innovation isn’t just about the really new, really big things – matters because it affords everyone the opportunity to innovate, rather than keeping it to the exclusive realm of these few people in that off-campus building [Google[x]] whose job is to innovate.”

Innovation is complex. Do I need to remind you of the challenges that Clayton Christensen – The Innovator’s Dilemma – Geoffrey Moore – Crossing the Chasm – or Steve Blank – The Four Steps to the Epiphany – have brilliantly described to explain why innovation remains somewhat magical…

Innovation Challenges

PS: can you be an entrepreneur without inventing and innovating? Sure! Not just small companies and craftsmen who use their know-how for a decent living. You just need to imitate. Telecom operators such as Vodafone or Bouygues Telecom compete without a need to invent or innovate: they copy other telecom operators. (OK, sometimes they innovate, too.) In the start-up world, the Samwer brothers have been famous for copying American success stories and adapting them to the European market. You can find many references about this online; the clones they created include Alando (eBay), Zalando (Zappos), EasyTaxi (Uber), Pinspire (Pinterest), StudiVZ (Facebook), CityDeal (acquired by Groupon), Plinga (Zynga), and Wimdu (Airbnb). See also When Samwer was not Samwer yet but was writing a book – way before Rocket Internet and its clones.

Art as an Answer to the Tragedy of Life

This blog is about start-ups. But from time to time, I take the liberty of touching on other topics, often about Art. It will again be the case here, and also when tragic events occur. Last Friday, Paris was struck again. And I do not have any answer but to say that I believe in Peace and Love, not in war and hate. My brother sent me two beautiful pictures he took in Paris recently. They mean a lot.

WP_20151115_15_29_30_Pro

WP_20151114_16_30_20_Pro

My friend Dominique sent me a quote from René Girard in Achever Clausewitz (2007, Champs Gallimard, pages 57-58), which translates as follows: "These failures of [conflict] resolution are frequent when two groups 'escalate to extremes': we saw it in the Yugoslav drama, we saw it in Rwanda. We have much to fear today from the confrontation between Shiites and Sunnis in Iraq and Lebanon. The hanging of Saddam Hussein could only accelerate it. Bush is, from this point of view, the very caricature of what the politician lacks, incapable of thinking in an apocalyptic way. He succeeded in only one thing: breaking a coexistence maintained, as best it could be, between these eternal enemy brothers. The worst is now probable in the Middle East, where Shiites and Sunnis are escalating to extremes. This escalation may just as well take place between the Arab countries and the Western world. It has already begun: this back-and-forth of attacks and American 'interventions' can only accelerate, each responding to the other. And violence will continue on its way. The Sino-American confrontation will follow…"

Girard-Clausewitz

So my reaction goes elsewhere. I received emails on Friday night and Saturday from the Space Invader community checking that everyone was all right. People who love street art go out to find the works and take pictures. Indeed, Invader was in the same mood last January. He is now invading New York City with a fifth wave. By following his work in progress, I see art as an answer to the tragedy of life.

SI est charlie

Here is my own work in progress:

as well as my map (if I gave you access – Invader asks people not to give indications to people who destroy his work).

Marcel Salathé’s Creating a European Culture of Innovation

Regularly, though not often enough, I read about people calling on Europe to wake up and react. Recently it was Nicolas Colin's What makes an entrepreneurial ecosystem. But now I also remember Risto Siilasmaa's "Entrepreneurship should be cherished" and I had my own Europe, Wake Up!. The latest of these is Marcel Salathé's Creating a European Culture of Innovation. Another must-read. Thanks Marcel! So let me just quote him.

Salathe-Blog

At stake is the future of Europe. And we, the innovators, entrepreneurs, scientists, activists, and artists, need to step up and take ownership of this future. Because if we don't, Europe will continue on the downward trajectory it's currently on, and become what in many places it has already transformed into – a museum of history.
[…]
The information and communication technology sector is now the dominant economic driver of growth. Think Apple, Google, Facebook, Amazon, Uber. Noticed something? Not a single European company. Only 1 out of 4 dollars in this sector are made by European companies, and all the indicators for the future are pointing down. Some numbers are even more dire: when you list the top 20 global leaders of internet companies that are public, you know how many are European? Zero. And among all publicly listed companies in the digital economy, 83% are American, and a mere 2% are European. 2%!
[…]
So where’s the problem? Some say it’s VC funding, which is only partially true. Yes, the culture of VC funding is probably less mature in Europe than it is in the US, especially for stage A, B and C funding. But money will find its way into good ideas and market opportunities one way or another. Others say it’s simply the European market, and European regulation. I think that is an illusion. Look at AirBnB, the US startup that now has a valuation of over 25 Billion dollars. It was started as a three person startup in California’s Y Combinator, but it now gets over half(!) of its revenues from within Europe. And by the way, San Francisco is probably one of the worst regulatory environments you can find yourself in. AirBnB is currently facing huge battles in San Francisco, and a Californian judge recently ruled Uber drivers employees, causing a minor earthquake in the booming sharing economy. Indeed, California is probably one of the most regulated of the American States, and yet it does exceedingly well.
I think that the problem is actually quite simple. But it’s harder to fix. It’s simply us. We, the people. We, the entrepreneurs. We, the consumers. I have lived in the San Francisco Bay area for more than three years. What’s remarkable about the area is not its laws, or its regulations, or its market, or its infrastructure. What’s truly remarkable is that almost everyone is building a company in one way or another. Almost everyone wants to be an entrepreneur, or supports them. Almost everyone is busy building the future. Indeed, you can almost physically feel that the environment demands it from you. When someone asks you about what it is you are doing professionally, and you don’t respond by saying that you’re building a company, they look at you funny, as if to say, “then what the hell are you doing here”?

[…]
It’s not a trivial point I think. The other day, I was in Turin in Italy, and I desperately needed a coffee. I walked into the next random coffeeshop, where I was served a heavenly cappuccino, with a chocolate croissant that still makes my mouth wet when I just think about it. Was I just lucky? No – all the coffee shops there are that good. Because the environment demands it. Sure, you can open a low-quality coffee shop in Turin if you want to, but you’ll probably have to file for bankruptcy before you have the time to say buongiorno. The environment will simply not accept bad quality. In another domain, I had the same personal experience when I was a postdoc at Stanford. Looking back, all of my best and most cited papers I wrote there. I don’t think it’s coincidence. Every morning, as I was walking across campus to my office, I could sense the environment demanding that I do the most innovative work – if I didn’t, then what the hell was I doing there?
So this is my message to you. I'm asking you to create those environments, both by doing the best and most innovative work you can, but also by demanding the same from everyone else around you. These two things go together; they create a virtuous circle.

[…]
Don’t ask for permission, ask for forgiveness if necessary. If you are waiting for permission, you will wait for the rest of your life. Most rules exist for a simple reason: to protect incumbents. Don’t ask for permission, just go and do it.
[…]
Orson Welles was best at describing why asking for permission is deadly…

[…]
So please, let us all live in the future and build what’s missing – here in Europe. I am worried sick that the easiest way for me to live in the future is to buy a ticket to San Francisco. Just like the easiest way for Americans to relive the past is to buy a ticket to Europe, rich in history. I’m asking you to become even more ambitious, more daring, and more demanding, both of yourself, but most importantly also of your environment.

Salathé also talks about role models. His was the founder of Day Interactive, a Swiss start-up which went public in 2000 before being acquired by Adobe for $250M in 2010. So coming next… its cap. table.

DayInteractiveIPO

Isaacson’s The Innovators (final thoughts) – is the future about thinking machines?

It is always very sad to finish reading a great book, but Isaacson beautifully ends his with Ada Lovelace's considerations (from the 19th century!) about the role of computers. "Ada might also be justified in boasting that she was correct, at least thus far, in her more controversial contention that no computer, no matter how powerful, would ever truly be a 'thinking' machine. A century after she died, Alan Turing dubbed this 'Lady Lovelace's Objection' and tried to dismiss it by providing an operational definition of a thinking machine. […] But it's now been more than sixty years, and the machines that attempt to fool people on the test are at best engaging in lame conversation tricks rather than actual thinking. Certainly none has cleared Ada's higher bar of being able to 'originate' any thoughts of its own. […] Artificial intelligence enthusiasts have long been promising, or threatening, that machines like HAL would soon emerge and prove Ada wrong. Such was the promise of the 1956 conference at Dartmouth organized by John McCarthy and Marvin Minsky, where the field of artificial intelligence was launched. The conference concluded that a breakthrough was about twenty years away. It wasn't." [Page 468]

800px-Ada_Lovelace_portrait
Ada, Countess of Lovelace, 1840

John von Neumann realized that the architecture of the human brain is fundamentally different. Digital computers deal in precise units, whereas the brain, to the extent we understand it, is also partly an analog system which deals with a continuum of possibilities, […] not just binary yes-no data but also answers such as “maybe” and “probably” and infinite other nuances, including occasional bafflement. Von Neumann suggested that the future of intelligent computing might require abandoning the purely digital approach and creating “mixed procedures”. [Page 469]

“Artificial Intelligence”

Discussion about artificial intelligence flared up a bit, at least in the popular press, after IBM's Deep Blue, a chess-playing machine, beat the world champion Garry Kasparov in 1997, and then Watson, its natural-language question-answering computer, won at Jeopardy! But […] these were not breakthroughs of human-like artificial intelligence, as IBM's CEO was the first to admit. Deep Blue won by brute force. […To] one question about the "anatomical oddity" of the former Olympic gymnast George Eyser, Watson answered "What is a leg?" The correct answer was that Eyser was missing a leg. The problem was understanding "oddity", explained David Ferrucci, who ran the Watson project at IBM. "The computer wouldn't know that a missing leg is odder than anything else." […]
Watson "did not understand the questions, nor its answers, nor that some of its answers were right and some wrong, nor that it was playing a game, nor that it won – because it doesn't understand anything," according to John Searle [a Berkeley philosophy professor]. "Computers today are brilliant idiots," said John E. Kelly III, IBM's director of research. "These recent achievements have, ironically, underscored the limitations of computer science and artificial intelligence," added Professor Tomaso Poggio, director of the Center for Brains, Minds, and Machines at MIT. "We do not yet understand how the brain gives rise to intelligence, nor do we know how to build machines that are as broadly intelligent as we are." Ask Google "Can a crocodile play basketball?" and it will have no clue, even though a toddler could tell you, after a bit of giggling. [Pages 470-71]

I tried the question on Google and guess what? It gave me the extract by Isaacson…

The human brain not only combines analog and digital processes, it also is a distributed system, like the Internet, rather than a centralized one. […] It took scientists forty years to map the neurological activity of the one-millimeter-long roundworm, which has 302 neurons and 8,000 synapses. The human brain has 86 billion neurons and up to 150 trillion synapses. […] IBM and Qualcomm each disclosed plans to build "neuromorphic", or brain-like, computer processors, and a European research consortium called the Human Brain Project announced that it had built a neuromorphic microchip that incorporated "fifty million plastic synapses and 200,000 biologically realistic neuron models on a single 8-inch silicon wafer". […] These latest advances may even lead to the "Singularity", a term that von Neumann coined and that the futurist Ray Kurzweil and the science fiction writer Vernor Vinge popularized, which is sometimes used to describe the moment when computers are not only smarter than humans but also can design themselves to be even supersmarter, and will thus no longer need us mortals. Isaacson is wiser than I am (as I feel that these ideas are stupid) when he adds: "We can leave the debate to the futurists. Indeed, depending on your definition of consciousness, it may never happen. We can leave 'that' debate to the philosophers and theologians. 'Human ingenuity,' wrote Leonardo da Vinci, 'will never devise any inventions more beautiful, nor more simple, nor more to the purpose than Nature does.'" [Pages 472-74]

Computers as a Complement to Humans

Isaacson adds: "There is, however, another possibility, one that Ada Lovelace would like. Machines would not replace humans but would instead become their partners. What humans would bring is originality and creativity" [page 475]. After explaining that in a 2005 chess tournament "the final winner was not a grandmaster nor a state-of-the-art computer, not even a combination of both, but two American amateurs who used three computers at the same time and knew how to manage the process of collaborating with their machines" (page 476), and that "in order to be useful, the IBM team realized, [Watson] needed to interact [with humans] in a manner that made collaboration pleasant" (page 477), Isaacson further speculates:

Let us assume, for example, that a machine someday exhibits all of the mental capabilities of a human: giving the outward appearance of recognizing patterns, perceiving emotions, appreciating beauty, creating art, having desires, forming moral values, and pursuing goals. Such a machine might be able to pass a Turing Test. It might even pass what we could call the Ada Test, which is that it could appear to "originate" its own thoughts that go beyond what we humans program it to do. There would, however, be still another hurdle before we could say that artificial intelligence has triumphed over augmented intelligence. We can call it the Licklider Test. It would go beyond asking whether a machine could replicate all the components of human intelligence to ask whether the machine accomplishes these tasks better when whirring away completely on its own or when working in conjunction with humans. In other words, is it possible that humans and machines working in partnership will be indefinitely more powerful than an artificial intelligence machine working alone? If so, then "man-computer symbiosis," as Licklider called it, will remain triumphant. Artificial intelligence need not be the holy grail of computing. The goal instead could be to find ways to optimize the collaboration between human and machine capabilities – to forge a partnership in which we let the machines do what they do best, and they let us do what we do best. [Pages 478-79]

Ada’s Poetical Science

At his last such appearance, for the iPad 2 in 2011, Steve Jobs declared: "It's in Apple's DNA that technology alone is not enough – that it's technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing." The converse to this paean to the humanities, however, is also true. People who love the arts and humanities should endeavor to appreciate the beauties of math and physics, just as Ada did. Otherwise they will be left as bystanders at the intersection of arts and science, where most digital-age creativity will occur. They will surrender control of that territory to the engineers. Many people who celebrate the arts and the humanities, who applaud vigorously the tributes to their importance in our schools, will proclaim without shame (and sometimes even joke) that they don't understand math or physics. They extoll the virtues of learning Latin, but they are clueless about how to write an algorithm or tell BASIC from C++, Python from Pascal. They consider people who don't know Hamlet from Macbeth to be Philistines, yet they might merrily admit that they don't know the difference between a gene and a chromosome, or a transistor and a capacitor, or an integral and a differential equation. These concepts may seem difficult. Yes, but so, too, is Hamlet. And like Hamlet, each of these concepts is beautiful. Like an elegant mathematical equation, they are expressions of the glories of the universe. [Pages 486-87]

leonardo-da-vinci-vitruvian-man
The last page of Isaacson’s book presents da Vinci’s Vitruvian Man, 1492

Halt and Catch Fire – the TV series about innovation (without Silicon Valley and start-ups)

I will always remember the day when one of my former bosses told me I should focus on (watching, making) videos rather than (reading, writing) books. I am a book person, so I will probably not follow his advice! Still, from time to time I discover movies about high-tech innovation, entrepreneurship and start-ups.

Halt and Catch Fire is not precisely about start-ups; it is not a documentary; it is not a movie. It is a TV series that is certainly more serious (and less fun) than HBO's Silicon Valley. It is an interesting accident that I began watching it while reading Isaacson's The Innovators. Both talk about the early days of personal computers in a (rather) dramatic manner.

I am still at the beginning of Season 1, so my comments come as much from what I read as from what I saw! Halt and Catch Fire takes place in Texas (not in Silicon Valley), in an established company, Cardiff Electric (not a start-up), where three individuals who should probably never have met – a salesman, an engineer and a geek (not entrepreneurs) – will try to prove to the world that they can change it. So why Texas? According to the French Wikipedia: "Season 1 (which takes place in 1983-1984) is inspired by the creation of Compaq, launched in 1982 to develop the first IBM-compatible portable PC. Compaq engineers had to reverse-engineer the IBM BIOS by disassembling it, then have a compatible version rewritten by people who had never seen the IBM code, so as not to violate copyright." (My Compaq cap. table below.)
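As an aside, here is a minimal Python sketch of that clean-room idea (a hypothetical toy example of mine, nothing to do with the actual BIOS code): one team runs the original routine as a black box and writes down only what it does; a second team, which never sees the original source, reimplements it from that written spec; a conformance test then compares the two.

# Toy illustration of clean-room reverse engineering (hypothetical example).
# Team A may observe the original as a black box and document WHAT it does;
# Team B rewrites it from that written spec alone, never seeing the source.

def original_checksum(data: bytes) -> int:
    # Stand-in for the proprietary routine (off-limits to Team B).
    total = 0
    for b in data:
        total = (total + b) & 0xFF
    return (-total) & 0xFF

# Team A's spec: "returns the byte that makes the sum of all input bytes,
# plus the result, equal to 0 modulo 256".

def cleanroom_checksum(data: bytes) -> int:
    # Team B's independent rewrite, derived only from the spec above.
    return (0x100 - (sum(data) & 0xFF)) & 0xFF

if __name__ == "__main__":
    import random
    for _ in range(1000):
        sample = bytes(random.randrange(256) for _ in range(random.randrange(64)))
        assert original_checksum(sample) == cleanroom_checksum(sample)
    print("clean-room rewrite matches the observed behavior")

The point of this legal firewall is that the second team can show its code derives from the specification, not from the copyrighted source.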


Scoot McNairy as Gordon Clark, Mackenzie Davis as Cameron Howe and Lee Pace as Joe MacMillan – Halt and Catch Fire _ Season 1, Gallery – Photo Credit: James Minchin III/AMC

I should credit Marc Andreessen for helping me discover this new AMC TV series. In a long portrait by the New Yorker, the Netscape founder mentions the series: “He pushed a button to unroll the wall screen, then called up Apple TV. We were going to watch the final two episodes of the first season of the AMC drama “Halt and Catch Fire,” about a fictional company called Cardiff, which enters the personal-computer wars of the early eighties. The show’s resonance for Andreessen was plain. In 1983, he said, “I was twelve, and I didn’t know anything about startups or venture capital, but I knew all the products.” He used the school library’s Radio Shack TRS-80 to build a calculator for math homework.” […] “The best scenes with Cameron were when she was alone in the basement, coding.” I said I felt that she was the least satisfactory character: underwritten, inconsistent, lacking in plausible motivation. He smiled and replied, “Because she’s the future.”

According to Wikipedia's article about the series, "the show's title refers to the computer machine code instruction HCF, the execution of which would cause the computer's central processing unit to stop working ('catch fire' was a humorous exaggeration)." If the series is not about entrepreneurship and start-ups so far, it is about rebellion, mutiny. There is a beautiful moment when one of the heroes convinces his two colleagues to follow him when they are about to give up. They are on a quest.
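Coming back to the title, here is a toy Python sketch of what such a lockup looks like (a made-up mini-CPU of my own, purely illustrative, not a real emulator; on early Motorola chips the undocumented opcode reportedly left the processor cycling through addresses until a hardware reset):

# Toy model of a "halt and catch fire" opcode (hypothetical mini-CPU).
# Once the rogue opcode is fetched, normal execution never resumes:
# the processor just walks the address bus until an external reset.

PROGRAM = [0x01, 0x01, 0xDD]          # two NOPs, then the rogue opcode

def run(program, reset_after=8):
    pc = 0                            # program counter
    cycles = 0
    while pc < len(program) and cycles < reset_after:
        opcode = program[pc]
        cycles += 1
        if opcode == 0x01:            # NOP: advance normally
            pc += 1
        elif opcode == 0xDD:          # "HCF": the fetch-execute loop is gone
            address = pc
            while cycles < reset_after:   # only a reset ends this
                address = (address + 1) & 0xFFFF
                print(f"cycle {cycles}: address bus at {address:#06x}, CPU unresponsive")
                cycles += 1
            return "locked up, reset required"
    return "program finished normally"

if __name__ == "__main__":
    print(run(PROGRAM))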

amc_haltandcatchfire_mutiny

I haven’t seen many movies and videos about my favorite topic so let me try and recapitulate:
– I began with Something Ventured, a documentary about the early days of Silicon Valley entrepreneurs and venture capitalists.
The Startup Kids is another documentary about young (mostly) web entrepreneurs. Often very moving.
HBO’s Silicon Valley is funnier than HCF but maybe not as good. Only time will tell.
– I saw The Social Network which seems to remain the best fiction movie about all this, but
– I have not seen the two movies about Steve Jobs. It’s apparently not worth watching Jobs (2013), but I will probably try not to miss Steve Jobs (2015).

So as a conclusion, watch the trailer.

The Compaq Capitalization Table at IPO

Compaq_Cap_Table

Walter Isaacson’s The Innovators (part 4) – Steal… or Share?

How many times will I say what a great book Walter Isaacson's The Innovators – How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution is? And how many posts will I write about it? This is now the 4th part! Isaacson shows how collaboration in software contributed to a unique value creation. This may mean sharing, but also stealing!

Gates complained to the members of the Homebrew Computer Club about this: “Two surprising things are apparent, however, 1) Most of these “users” never bought BASIC (less than 10% of all Altair owners have bought BASIC), and 2) The amount of royalties we have received from sales to hobbyists makes the time spent on Altair BASIC worth less than $2 an hour. Why is this? As the majority of hobbyists must be aware, most of you steal your software. Hardware must be paid for, but software is something to share. Who cares if the people who worked on it get paid? Is this fair? One thing you don’t do by stealing software is get back at MITS for some problem you may have had. MITS doesn’t make money selling software. […] The thing you do is theft. I would appreciate anyone who wants to pay.” [Page 342 and http://www.digibarn.com/collections/newsletters/homebrew/V2_01/gatesletter.html]

homebrew_V2_01_p2

But Isaacson adds nuance: "Still there was a certain audacity to the letter. Gates was, after all, a serial stealer of computer time, and he had manipulated passwords to hack into accounts from eighth grade through his sophomore year at Harvard. Indeed, when he claimed in his letter that he and Allen had used more than $40,000 worth of computer time to make BASIC, he omitted the fact that he had never actually paid for that time. […] Also, though Gates did not appreciate it at the time, the widespread pirating of Microsoft BASIC helped his fledgling company in the long run. By spreading so fast, Microsoft BASIC became a standard, and other computer makers had to license it." [Page 343]

And what about Jobs and Wozniak? Everyone knows how phone phreaks had created a device that emitted just the right tone chirps to fool the Bell System and cadge free long-distance calls. […] "I have never designed a circuit I was prouder of. I still think it was incredible," said Wozniak. They tested it by calling the Vatican, with Wozniak pretending to be Henry Kissinger needing to speak to the pope; it took a while, but the officials at the Vatican finally realized it was a prank before they woke up the pontiff. [Page 346]

Gates, Jobs and the GUI

And the greatest robbery may have been the GUI – the Graphical User Interface. But who stole it? Later, when he was challenged about pilfering Xerox's ideas, Jobs quoted Picasso: "Good artists copy, great artists steal. And we have always been shameless about stealing great ideas. They were copier-heads who had no clue about what a computer could do." [Page 365]

However when Microsoft copied Apple for Windows, it was a different story… “In the early 1980s, before the introduction of the Macintosh, Microsoft had a good relationship with Apple. In fact on the day that IBM launched its PC in August 1981, Gates was visiting Jobs at Apple, which was a regular occurrence since Microsoft was making most of its revenue writing software for the Apple II. Gates was still the supplicant in the relationship. In 1981, Apple had $334 million in revenue, compared to Microsoft’s $15 million. […] Jobs had one major worry about Microsoft: he didn’t want it to copy the graphical user interface. […] His fear that Gates would steal the idea was somewhat ironic, since Jobs himself had filched the concept from Xerox.” [Pages 366-67]

Things would get worse… "Well, Steve, I think there's more than one way of looking at it. I think it's more like we both had this rich neighbor named Xerox and I broke into his house to steal the TV set and found out that you had already stolen it." [Page 368]

Stallman, Torvalds, free- and open-source

There would be other disputes. The hacker corps that grew up around GNU [Stallman's free software] and Linux [Torvalds' open-source software] showed that emotional incentives, beyond financial rewards, can motivate voluntary collaboration. "Money is not the greatest of motivations," Torvalds said. "Folks do their best work when they are driven by passion. When they are having fun. This is as true for playwrights and sculptors and entrepreneurs as it is for software engineers." There is also, intended or not, some self-interest involved: hackers are also motivated, in large part, by the esteem they can gain in the eyes of their peers, improving their reputation and elevating their social status. "Open source development gives programmers the chance." Gates's "Letter to Hobbyists", complaining about the unauthorized sharing of Microsoft BASIC, asked in a chiding way, "Who can afford to do professional work for nothing?" Torvalds found that an odd outlook. He and Gates were from two very different cultures, the communist-tinged radical academia of Helsinki versus the corporate elite of Seattle. Gates may have ended up with the bigger house, but Torvalds reaped anti-establishment adulation. "Journalists seemed to love the fact that, while Gates lived in a high-tech lakeside mansion, I was tripping over my daughter's playthings in a three-bedroom ranch house with bad plumbing in boring Santa Clara," he said with ironic self-awareness. "And that I drove a boring Pontiac. And answered my own phone. Who wouldn't love me?" [Pages 378-79]

Which does not make "open" a friend of "free". The disputes went beyond mere substance and became, in some ways, ideological. Stallman was possessed by a moral clarity and unyielding aura, and he lamented that "anyone encouraging idealism today faces a great obstacle: the prevailing ideology encourages people to dismiss idealism as 'impractical'". Torvalds, on the contrary, was unabashedly practical, like an engineer. "I led the pragmatists," he said. "I have always thought that idealistic people are interesting, but kind of boring and scary." Torvalds admitted to "not exactly being a huge fan" of Stallman, explaining, "I don't like single-issue people, nor do I think that people who turn the world into black and white are very nice or ultimately very useful. The fact is, there aren't just two sides to any issue, there's almost always a range of responses, and 'it depends' is almost always the right answer to any big question." He also believed it should be permissible to make money from open-source software. "Open source is about letting everybody play. Why should business, which fuels so much of society's technological advancement, be excluded?" Software may want to be free, but the people who write it may want to feed their kids and reward their investors. [Page 380]