Monday, January 15, 2018
Apple, Xerox, IBM, And Fumbling The Future
Thirty-five years ago this week, Apple introduced a computer that changed the way people communicated with their electronic devices, using graphical icons and visual indicators rather than punched cards or text-based commands.

On January 19, 1983, Apple introduced Lisa, a $9,995 PC for business users. Many of its innovations, such as the graphical user interface, the mouse, and document-centric computing, were taken from the Alto computer developed at Xerox PARC, whose technology Xerox had commercialized as the $16,595 Xerox Star in 1981.
Steve Jobs recalled (in Walter Isaacson’s Steve Jobs) that he and the Lisa team were very relieved when they saw the Xerox Star: “We knew they hadn’t done it right and that we could–at a fraction of the price.” Isaacson says that “The Apple raid on Xerox PARC is sometimes described as one of the biggest heists in the chronicles of industry” and quotes Jobs on the subject: “Picasso had a saying–‘good artists copy, great artists steal’—and we have always been shameless about stealing great ideas… They [Xerox management] were copier-heads who had no clue about what a computer could do… Xerox could have owned the entire computer industry.”
The story of how Xerox invented the future but failed to realize it has become a popular urban legend in tech circles, especially after the publication in 1998 of Fumbling the Future: How Xerox Invented, Then Ignored, the First Personal Computer by D.K. Smith and R.C. Alexander. Moshe Vardi, the former Editor-in-Chief of the Communications of the ACM (CACM), recounted his own story of fumbling the future a few years ago, as a member of a 1989 IBM research team that produced a report envisioning an “information-technology-enabled 21st-century future.”
The IBM report got some of the implications of its vision of a “global, multi-media, videotext-like utility” right. For example, it predicted a reduced need for travel agents, a flood of worthless information, and that “fast dissemination of information through a global information utility” would increase the volatility of politics, diplomacy, and “other aspects of life.”

Vardi also brought to his readers’ attention a video on the future produced by AT&T in 1993 “with a rather clear vision of the future, predicting what was then revolutionary technology, such as paying tolls without stopping and reading books on computers.”
What can we learn from these futures of yesterday? Vardi correctly concluded that “The future looks clear only in hindsight. It is rather easy to practically stare at it and not see it. It follows that those who did make the future happen deserve double and triple credit. They not only saw the future, but also trusted their vision to follow through, and translated vision to execution.”
But what exactly did those who “fumbled the future” fail to see? More important, what is it that we should understand now about how their future has evolved?
The IBM report and the AT&T video look prescient today, but they repeated many predictions that had been made years before 1989 and 1993. These predictions eventually became reality; what these descriptions of the future missed was how we got there. To paraphrase Lewis Carroll, if you know where you are going, it matters a lot which road you take.

The IBM report says: “In some sense, the proposed vision may not appear to be revolutionary: the envisioned system might be dismissed as a safely predictable extrapolation from and merging of existing information tools that it may complement or even replace.” I would argue that the vision, for both IBM and AT&T, was not just an “extrapolation of existing information tools,” but also an extrapolation of their existing businesses—what they wanted the future to be. Their vision was based on the following assumptions:
The business/enterprise market will be the first to adopt and use the global information utility; the consumer/home market will follow. IBM: “the private home consumer market would probably be the last to join the system because of yet unclear needs for such services and the initial high costs involved.” And: “An important vehicle to spur the development of home applications will be business applications.”
The global information utility will consist of a “global communications network” and “information services” riding on top of it. It will be costly to construct, and the performance and availability requirements will be very high. IBM: “Once an information utility is meant to be used and depended on as a ‘multi-media telephone’ system, it must live up to the telephone system RAS [Reliability, Availability, and Serviceability] requirements, which go far beyond most of today’s information systems.” And: “Without 24-hour availability and low MTTR [Mean Time To Repair/Restore] figures, no subscriber will want to rely on such a utility.” (A back-of-the-envelope illustration of what low MTTR implies follows this list.)
Information will come from centralized databases developed by established information providers (companies) and will be pushed over the network to the users when they request it on a “pay-as-you-go” basis.
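As an aside, here is the back-of-the-envelope illustration promised above of why the report harps on low MTTR (mine, with hypothetical numbers, not anything from the IBM report): steady-state availability is MTBF / (MTBF + MTTR), so for a given failure rate, cutting repair time is what buys telephone-grade “nines.” A quick sketch in Python:

    # Hypothetical numbers for illustration only; not from the IBM report.
    def availability(mtbf_hours, mttr_hours):
        """Steady-state availability: MTBF / (MTBF + MTTR)."""
        return mtbf_hours / (mtbf_hours + mttr_hours)

    MTBF = 720.0  # assume one failure a month, on average
    for mttr in (8.0, 1.0, 0.1):  # repair times in hours
        a = availability(MTBF, mttr)
        downtime = (1 - a) * 365 * 24 * 60  # minutes of downtime per year
        print(f"MTTR {mttr:>4} h -> availability {a:.5f}, ~{downtime:.0f} min down/year")

With a failure a month, an eight-hour repair time means roughly four days of downtime a year; a six-minute repair time gets you down to a little over an hour. That is the gap between “most of today’s information systems” and the telephone network the report had in mind.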

When Vardi wrote that it is “rather easy to practically stare at [the future] and not see it,” he probably meant the Internet, which no doubt all of the authors of the IBM report were familiar with (in 1989, a 20-year-old open network connecting academic institutions, government agencies, and some large corporations). But neither IBM nor AT&T (nor other established IT companies) cared much about it, because it was not “robust” enough and would not meet the enterprise-level requirements of their existing customers. Moreover, they did not control it, as they controlled their own private networks.
Now, before you say “innovator’s dilemma,” let me remind you (and Professor Christensen) that there were many innovators outside the established IT companies in the 1980s and early 1990s that were pursuing the vision that is articulated so beautifully in the IBM report. The most prominent examples – and for a while, successful – were CompuServe and AOL. A third, Prodigy, was a joint venture of IBM, CBS, and Sears.
So, as a matter of fact, even the established players were trying to innovate along these lines, and they even followed Christensen’s advice (which he gave about a decade later) to do it outside their “stifling” corporate walls. Another innovator who followed the same vision, previously successful and destined to be very successful again, was the aforementioned Steve Jobs, who launched his biggest failure, the NeXT Workstation, in 1988 (the IBM report talks about NeXT-like workstations as the only access devices to the global information utility, never mentioning PCs, laptops, or mobile phones).
The vision of “let’s-use-a-heavy-duty-access-device-to-find-or-get-costly-information-from-centralized-databases-running-on-top-of-an-expensive-network” was thwarted by one man, Tim Berners-Lee, and his 1989 invention, the World Wide Web.

Berners-Lee put the lipstick on the pig, lighting up with information the standardized, open, “non-robust,” and cheap Internet (which was – and still is – piggybacking on the “robust” global telephone network). The network and its specifications were absent from his vision, which focused on information, on what the end results of the IBM and AT&T visions were all about: providing people with easy-to-use tools for creating, sharing, and organizing information. As it turned out, the road to letting people plan their travel on their own ran not through an expensive, pay-as-you-go information utility, but through a hypermedia browser and an open network that only scientists (and other geeks, such as IBM researchers) knew about in 1989.
The amazing thing is that the IBM researchers understood well the importance of hypermedia. The only computer company mentioned by name in the report is Apple, with its HyperCard. IBM: “In the context of a global multi-media information utility, the hypermedia concept takes on an enhanced significance in that global hypermedia links may be created to allow users to navigate through and create new views and relations from separate, distributed data bases. A professional composing a hyper-document would imbed in it direct hyperlinks to the works he means to cite, rather than painfully typing in references. ‘Readers’ would then be able to directly select these links and see the real things instead of having to chase them through references. The set of all databases maintained on-line would thus form a hypernet of information on which the user’s workstation would be a powerful window.”
Compare this to Tim Berners-Lee writing in Weaving the Web: “The research community has used links between paper documents for ages: Tables of content, indexes, bibliographies and reference sections… On the Web… scientists could escape from the sequential organization of each paper and bibliography, to pick and choose a path of references that served their own interest.” There is no doubt that recognizing the future significance of hypermedia was an insanely great insight by the IBM researchers in 1989, one that even hints at Berners-Lee’s great breakthrough, which was to escape from (in his words) “the straitjacket of hierarchical documentation systems.”
But it was Berners-Lee, not IBM, who successfully translated his vision into a viable product (or, more accurately, three standards that spawned millions of successful products). Why? Because he looked at the future through a different lens than IBM’s (or AOL’s).
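Those three standards are the URL (which names a document), HTTP (which fetches it), and HTML (which marks up the hyperlinks a reader can follow directly). A minimal sketch of them working together (my illustration, not anything from Berners-Lee’s code, using example.com as a stand-in URL):

    # Fetch a page by URL over HTTP and list the hyperlinks embedded in its HTML.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class LinkExtractor(HTMLParser):
        """Collects the href target of every <a> tag in an HTML document."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    url = "https://example.com/"        # a URL identifies the document
    with urlopen(url) as response:      # HTTP retrieves it
        page = response.read().decode("utf-8", errors="replace")

    parser = LinkExtractor()
    parser.feed(page)                   # HTML encodes the embedded hyperlinks
    for link in parser.links:
        print(link)

Twenty-odd lines of standard-library code can traverse the “hypernet of information” the IBM researchers described, precisely because the three standards are open and cheap to implement, the opposite of a telephone-grade utility.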

Berners-Lee’s vision did not focus on the question of how you deliver information – the network – but on the question of how you organize and share it. This, as it turned out, was the right path to realizing the visions of e-books, a flood of worthless information, and the elimination of all kinds of intermediaries. And because this road was taken by Berners-Lee and others, starting with Mosaic (the first successful browser), information became free and its creation shifted in a big way from large, established media companies to individuals and small “new media” ventures. Because this road was taken, IT innovation in the last thirty years has been mainly in the consumer space, and the market for information services has been almost entirely consumer-oriented.
I’m intimately familiar with IBM-type visions of the late 1980s because I was developing similar ones for my employer at the time, Digital Equipment Corporation, most famously (inside DEC) my 1989 report, “Computing in the 1990s.” I predicted that the 1990s would give rise to “a distributed network of data centers, servers and desktop devices, able to provide adequate solutions (i.e., mix and match various configurations of systems and staff) to business problems and needs.” Believe me, this was quite visionary for people used to talking only about “systems.” (My report was incorporated into the annual business plan for the VAX line of minicomputers, which referred to them as “servers” for the first time.)
In another report, on “Enterprise Integration,” I wrote: “Successful integration of the business environment, coupled with a successful integration of the computing environment, may lead to data overload. With the destruction of both human and systems barriers to access, users may find themselves facing an overwhelming amount of data, without any means of sorting it and capturing only what they need at a particular point in time. It is the means of sorting through the data that carry the potential for true Enterprise Integration in the 1990s.” Not too bad, if I may say so myself, predicting Google when Larry Page and Sergey Brin were still in high school.
And I was truly prescient in a series of presentations and reports in the early 1990s, arguing that the coming digitization of all information (most of it in analog form at the time) was going to blur the then-rigid boundaries between the computer, consumer electronics, and media “industries.”

But I never mentioned the Internet in any of these reports. Why pay attention to an obscure network, which I had used a few times to respond to questions about my reports from some geeks at places with names like “Argonne National Laboratory,” when Digital had at the time the largest private network in the world, Easynet, and more than 10,000 VAX Notes communities (electronic bulletin boards through which DEC employees – and authorized partners and customers and friendly geeks – collaborated and shared information)?
Of course, the only future possible was that of a large, expensive, global, multi-media, high-speed, robust network. Just like Easynet. Or IBM’s and AT&T’s private networks, or the visions from other large companies of how the future of computing and telecommunications would be a nice and comforting extrapolation of their existing businesses.
The exception to these late-1980s and early-1990s visions of the future of computing was the one produced by Apple in 1987, The Knowledge Navigator. It was also an extrapolation of the company’s existing business, and because of that, it portrayed a different future.
In contrast to IBM’s and AT&T’s (and DEC’s), it was focused on information and individuals, not on communication utilities and commercial enterprises. It featured a university professor conducting his research work: investigating data and collaborating with a remote colleague, assisted by a talking, all-knowing “smart agent.” The global network was there in the background, but the emphasis was on navigating knowledge and a whole new way of interacting with computers by simply talking to them, as if they were human.

We are not there yet, but Steve Jobs and Apple moved us closer in 2007 by introducing a new way (touch) of interacting with computers, packaged as phones, which also turned out to be the perfect access devices (much better than NeXT Workstations) to the global knowledge network, the Web.
Back in 1983, the Lisa failed to become a commercial success, the second such failure in a row for Apple. The innovative, visionary company almost fumbled the future. But then it found the right packaging, the right market, and the right pricing for its breakthrough human-computer interface: the Macintosh.