|Gregory Gromov, 1995.
Updated: 1998, 2005, 2012
A comprehensive and fascinating
overview of the philosophy and history of the Internet.
Table of Contents:
The Atlantic cable of 1858 was established to carry instantaneous communications across the ocean for the first time.
The manufacture of the cable started in early 1857 and was completed in June. Before the end of July it was stowed on the American "Niagara" and the British "Agamemnon" -- both naval vessels lent by their respective governments for the task.
Although the laying of this first cable was seen as a landmark event in society, it was a technical failure: it remained in service only a few days. Subsequent cables laid in 1866 were completely successful, comparable to events like the moon landing of a century later... the cable ... remained in use for almost 100 years.
1958 - February 7th - In response to the launch of Sputnik, the US Department of Defense issues directive 5105.15 establishing the Advanced Research Projects Agency (ARPA).
The organization united some of America's most brilliant
people, who developed the United States' first successful satellite in 18 months. Several
years later ARPA began to focus on computer networking and communications technology.
In 1962, Dr. J.C.R. Licklider was chosen to head ARPA's research in improving the military's use of computer technology. Licklider was a visionary who sought to make the government's use of computers more interactive. To quickly expand technology, Licklider saw the need to move ARPA's contracts from the private sector to universities and laid the foundations for what would become the ARPANET.
The Atlantic cable of 1858 and Sputnik of 1957 were two basic milestones of the Internet's prehistory. See also: Telecommunications and Computers preHistory
To appreciate the importance the new computer-aided communication can have, one must consider the dynamics of "critical mass," as it applies to cooperation in creative endeavor. Take any problem worthy of the name, and you find only a few people who can contribute effectively to its solution. Those people must be brought into close intellectual partnership so that their ideas can come into contact with one another. But bring these people together physically in one place to form a team, and you have trouble, for the most creative people are often not the best team players, and there are not enough top positions in a single organization to keep them all happy. Let them go their separate ways, and each creates his own empire, large or small, and devotes more time to the role of emperor than to the role of problem solver. The principals still get together at meetings. They still visit one another. But the time scale of their communication stretches out, and the correlations among mental models degenerate between meetings so that it may take a year to do a week's communicating. There has to be some way of facilitating communication among people without bringing them together in one place.
The visible results of Licklider's approach came shortly ...
Around Labor Day in 1969, BBN delivered an Interface Message Processor (IMP) to UCLA that was based on a Honeywell DDP 516, and when they turned it on, it just started running. It was hooked by 50 Kbps circuits to two other sites (SRI and UCSB) in the four-node network: UCLA, Stanford Research Institute (SRI), UC Santa Barbara (UCSB), and the University of Utah in Salt Lake City.
The plan was unprecedented: Kleinrock, a pioneering computer science professor at UCLA, and his small group of graduate students hoped to log onto the Stanford computer and try to send it some data. They would start by typing "login," and seeing if the letters appeared on the far-off monitor:
1972: First public demonstration of ARPANET
It took Bob about a year to get everybody far enough along to demonstrate a bunch of applications on the ARPANET. The idea was that we would install a packet switch and a Terminal Interface Processor or TIP in the basement of the Washington Hilton Hotel, and actually let the public come in and use the ARPANET, running applications all over the U.S ....
The demo was a roaring success, much to the surprise of the people at AT&T who were skeptical about whether it would work.
By Vinton Cerf
About a year or two after the first online demo that would "actually let the public come in and use the ARPANET, running applications all over the U.S ...." (Vinton Cerf), the Net became really busy, especially "every Friday night":
Around about 1973 - 1975 I maintained PDP 10 hardware at SRI.
By Bob Bell, DEC Field Service.
Logical map of the ARPANET, April 1971
By Charles Babbage Institute, Center For the History of Information Processing, University of Minnesota
The Internet has
changed the way we currently communicate... But could the Internet have performed the function it was originally designed for?
CNN: Would the internet survive nuclear war?
The Internet Post-Apocalypse There's a common myth that the Internet could survive a nuclear attack. If the Internet, or pieces of it, did withstand such a war, how would it be used post-apocalypse? Would the Internet itself be used to wage war? Would it become a sole source of information for the surviving masses?
Or would it be too cluttered with dead sites and falsehoods to be worth anything?
The point that I do want to dust off and raise again is that ARPA wouldn't have happened, if what used to be the Soviet Union hadn't shaken complacent U.S. awake with a tin can in the sky, Sputnik.
Wars do wonders for the advancement of technology, and the Cold one was certainly no exception. The way to get a technology advanced is to gather a lot of really smart people under one roof and get them to concentrate on a single project. Of course, that takes some organization and money. Where does that come from? But that's another can of worms - to be opened with relish at a later date. In this case, it was the only body that had a stake in making sure the Net worked - the government.
What with the Cold War in full swing and all, the military, specifically its think tank the Rand Corporation, was concerned that if the war ever got hot and large chunks of the country were vaporized, those phone lines (not to mention considerable segments of the population) would be radioactive dust. And the top brass wouldn't be able to get in touch and carry on. Thus the packets bouncing from node to node, each of those nodes able to send, receive and pass on data with the same authority as any other. It was anarchy that worked, and on a technical level, it still does, obviously.
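The node behavior Hudson describes, every node able to send, receive, and pass on data with the same authority so that traffic can route around vaporized nodes, can be sketched as follows. This is a hypothetical toy model for illustration only, not ARPANET code; the network, node names, and search strategy are all assumptions.

```python
# Toy model of a peer-to-peer packet network: every node can send,
# receive, and forward with equal authority, so a message can route
# around destroyed nodes. (Illustrative sketch, not ARPANET code.)
from collections import deque

def route(links, src, dst, destroyed=frozenset()):
    """Breadth-first search for a path from src to dst over surviving nodes."""
    if src in destroyed or dst in destroyed:
        return None
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], ()):
            if nxt not in seen and nxt not in destroyed:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no surviving route

# A small mesh with two independent routes between A and D.
links = {
    "A": ["B", "C"], "B": ["A", "D"],
    "C": ["A", "D"], "D": ["B", "C"],
}
print(route(links, "A", "D"))                   # ['A', 'B', 'D']
print(route(links, "A", "D", destroyed={"B"}))  # ['A', 'C', 'D']
```

With node B "vaporized," the message still arrives via C; only when every intermediate node is gone does delivery fail, which is the resilience the Rand design aimed for.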
"REWIRED" by David Hudson, JOURNAL OF A STRAINED NET, August 9th, 1996
Many people don't realize that there is more than a metaphor which connects the "Information Superhighway" with the Interstate Highway System as the Roads That Were Built By Ike. "I like Ike" was an irresistible slogan in 1952.
In Europe: A term often used by the media to describe the Internet. "The Internet Dictionary", Bradford, England
In USA: Information Superhighway / Infobahn: The terms were coined to describe a possible upgrade to the existing Internet through the use of fiber optic and/or coaxial cable to allow for high speed data transmission. This highway does not exist - the Internet of today is not an information superhighway. "Internet Glossary", SquareOne Technology
Gore has become the point man in the Clinton administration's effort to build a national information highway much as his father, former Senator Albert Gore, was a principal architect of the interstate highway system a generation or more earlier.
"Principal Figures in the Development of the Internet".
24 Jun 1986: Albert Gore (D-TN) introduces
21 March 1994:
This was not the observation of a physicist--or a neurologist. Instead, these visionary words were written in 1851 by Nathaniel Hawthorne, one of my country's greatest writers, who was inspired by the development of the telegraph. Much as Jules Verne foresaw submarines and moon landings, Hawthorne foresaw what we are now poised to bring into being...
... I opened by quoting Nathaniel Hawthorne, inspired by Samuel Morse's invention of the
telegraph. Morse was also a famous portrait artist in the U.S.--his portrait of President
James Monroe hangs today in the White House. While Morse was working on a portrait of
General Lafayette in Washington, his wife, who lived about 500 kilometers away, grew ill
and died. But it took seven days for the news to reach him.
The history of every great invention is based on a lot of pre-history. In the case of the World-Wide Web, there are two lines to be traced: the development of hypertext, or the computer-aided reading of electronic documents, and the development of the Internet protocols which made the global network possible.
By Robert Cailliau, Text of a speech delivered at the launching of the European branch of the W3 Consortium, Paris, November 1995
See also: Robert Cailliau: "How It Really Happened "
As usual... in the beginning there was chaos and ...
The Stage is Set - early 1980's.
To my knowledge, the first time any "Internet Protocol" was used at CERN was during the second phase of the STELLA Satellite Communication Project, from 1981-83, when a satellite channel was used to link remote segments of two early local area networks (namely "CERNET", running between CERN and Pisa, and a Cambridge Ring network running between CERN and Rutherford Laboratory). This was certainly inspired by the ARPA IP model, known to the Italian members of the STELLA collaboration (CNUCE, Pisa) who had ARPA connections...
TCP/IP Introduced at CERN.
In August, 1984 I wrote a proposal to the SW Group Leader, Les Robertson, for the establishment of a pilot project to install and evaluate TCP/IP protocols on some key non-Unix machines at CERN including the central IBM-VM mainframe and a VAX VMS system....
By 1990 CERN had become the largest Internet site in Europe and this fact, as mentioned above, positively influenced the acceptance and spread of Internet techniques both in Europe and elsewhere...
The Web Materializes.
A key result of all these happenings was that by 1989 CERN's Internet facility was ready to become the medium within which Tim Berners-Lee would create the World-Wide Web.
By Ben M. Segal / CERN PDP-NS / April, 1995
Ben Segal: I'm British, graduated in Physics and Mathematics in 1958 from Imperial College, London, then worked for the UK Atomic Energy Authority and later in the USA for the Detroit Edison Company on fast breeder reactor development. I've been at CERN since 1971, after finishing my Ph.D. at Stanford University in Mechanical and Nuclear Engineering...
Except for a sabbatical in 1977, when I worked at Bell Northern Research in Palo Alto on a PABX development project (and encountered UNIX for the first time), CERN has kept me pretty busy on five main projects, including the coordinated introduction of the Internet Protocols at CERN beginning in 1985.
What does it mean: CERN?
We've received this question from one of our readers:
forwarded it to Ben and have got the following answer :
... the acronym "CERN" stands for "Centre Europeen pour la Recherche Nucleaire", the original French name of the organization. More recently it was felt that "Nucleaire" implied reactor or even military applications, so the name of the organization was changed to the "European Laboratory for Particle Physics" but the acronym was left as it was. Confusing, isn't it?
~ Ben Segal
Why the WWW was born in CERN:
CERN is now the world's largest research laboratory, with over 50% of all the active particle physicists in the world taking part in over 120 different research projects. 3000 staff members, 420 young students and fellows supported by the Organization, and 5000 visiting physicists, engineers, computer experts and scientists specializing in a variety of front-line technologies are collaborating with CERN from 40 countries and 371 scientific institutions.
Highlights of CERN History: 1949 - 1994
Below is our email exchange with Ben Segal:
The first web client and server -- built with NEXTSTEP. The WWW project was originally developed to provide a distributed hypermedia system which could easily access -- from any desktop computer -- information spread across the world. The web includes standard formats for text, graphics, sound, and video which can be indexed easily and searched by all networked machines. Using NeXT's object-oriented technology, the first Web server and client machines were built by CERN -- the European Laboratory for Particle Physics in November 1990. Since then the Web has truly encompassed the globe and access has proliferated across all computer platforms in both the corporate and home markets.
Source: NeXT Software, Inc., 1996
The Web as a NextStep of PC Revolution.
On the road to the World Wide Web's development, the baton was thus passed from Steven Paul Jobs, co-founder of Apple, the company that ignited the PC revolution, to Tim Berners-Lee, co-inventor of the WWW.
The following text is quoted from the "Steve Paul Jobs" biography by Lee Angelelli, Undergraduate Student, Department of Computer Science, Virginia Tech, Fall 1994 (assignment as part of the requirements for the course "Professionalism in Computing", CS 3604), very lightly edited by J.A.N. Lee.
Comments to: "...NeXT closed down its hardware division in 1993. Jobs realized that he was not going to revolutionize the ..."
During summer 1998 some of the Web surfing people, who usually visited the NeXT Software, Inc. Web site: http://www.next.com , began to receive the following message: "The site has moved to http://www.apple.com/enterprise/ "
Success is never final,
and failure is never fatal.
The Web reminds me of
early days of the PC industry.
~ Steve Jobs, Wired, February 1996
Cc: R. Brun/CN, K. Gieselmann/ECP, R. Jones/ECP, T. Osborne/CN, P. Palazzi/ECP, N. Pellow/CN, B. Pollermann/CN, E.M. Rimmer/ECP
From: T. Berners-Lee/CN, R. Cailliau/ECP
Date: 12 November 1990
... HyperText is a way to link and access information of various kinds as a web of nodes in which the user can browse at will. It provides a single user-interface to large classes of information (reports, notes, data-bases, computer documentation and on-line help). We propose a simple scheme incorporating servers already available at CERN.
The project has two phases: firstly we make use of existing software and hardware as well as implementing simple browsers for the user's workstations, based on an analysis of the requirements for information access needs by experiments. Secondly, we extend the application area by also allowing the users to add new material.
Phase one should take 3 months with the full manpower complement, phase two a further 3 months, but this phase is more open-ended, and a review of needs and wishes will be incorporated into it.
The manpower required is 4 software engineers and a programmer, (one of which could be a Fellow). Each person works on a specific part (eg. specific platform support).
Each person will require a state-of-the-art workstation, but there must be one of each of the supported types. These will cost from 10 to 20k each, totalling 50k. In addition, we would like to use commercially available software as much as possible, and foresee an expense of 30k during development for one-user licences, visits to existing installations and consultancy.
We will assume that the project can rely on some computing support at no cost: development file space on existing development systems, installation and system manager support for daemon software.
. . . . . . .Abstract: HyperText is a way to link and access information of various kinds as a web of nodes in which the user can browse at will. Potentially, HyperText provides a single user-interface to many large classes of stored information such as reports, notes, data-bases, computer documentation and on-line systems help. We propose the implementation of a simple scheme to incorporate several different servers of machine-stored information already available at CERN, including an analysis of the requirements for information access needs by experiments.
Introduction: The current incompatibilities of the platforms and tools make it impossible to access existing information through a common interface, leading to waste of time, frustration and obsolete answers to simple data lookup. There is a potential large benefit from the integration of a variety of systems in a way which allows a user to follow links pointing from one piece of information to another one. This forming of a web of information nodes rather than a hierarchical tree or an ordered list is the basic concept behind HyperText.
At CERN, a variety of data is already available: reports, experiment data, personnel data, electronic mail address lists, computer documentation, experiment documentation, and many other sets of data are spinning around on computer discs continuously. It is however impossible to "jump" from one set to another in an automatic way: once you found out that the name of Joe Bloggs is listed in an incomplete description of some on-line software, it is not straightforward to find his current electronic mail address. Usually, you will have to use a different lookup-method on a different computer with a different user interface. Once you have located information, it is hard to keep a link to it or to make a private note about it that you will later be able to find quickly.
Hypertext concepts: ... A program which provides access to the hypertext world we call a browser. When starting a hypertext browser on your workstation, you will first be presented with a hypertext page which is personal to you : your personal notes, if you like. A hypertext page has pieces of text which refer to other texts. Such references are highlighted and can be selected with a mouse (on dumb terminals, they would appear in a numbered list and selection would be done by entering a number). When you select a reference, the browser presents you with the text which is referenced: you have made the browser follow a hypertext link ... T. Berners-Lee, R. Cailliau
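The "web of nodes" that the proposal describes, pages holding text plus highlighted references, with a browser that "follows" a link when one is selected, can be sketched in a few lines. This is a hypothetical illustration of the concept only, not the original CERN code; the page names and contents are invented.

```python
# Minimal sketch of the 1990 proposal's model: each node holds text
# plus named links, and the browser follows a link by jumping to the
# referenced node. (Hypothetical illustration, not CERN's code.)
pages = {
    "home":      {"text": "My personal notes.",
                  "links": {"phones": "phonebook"}},
    "phonebook": {"text": "Joe Bloggs: 5005",
                  "links": {"back": "home"}},
}

def follow(pages, current, link_name):
    """Follow a highlighted reference: return the target page and its text."""
    target = pages[current]["links"][link_name]
    return target, pages[target]["text"]

page, text = follow(pages, "home", "phones")
print(page, "->", text)  # phonebook -> Joe Bloggs: 5005
```

The essential point of the proposal survives even in this toy: information forms a graph the user traverses link by link, rather than a hierarchical tree or an ordered list.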
WWW: Why are they green? "Because I see all "W"s as green..."
Robert Cailliau: Recently I discovered that I'm a synaesthetic. Well, I've known it for a long time, but I did not realise that there was a name for it. I'm one of those people who combine two senses: for me, letters have colours.
Only about one in 25'000 have this condition, which is perfectly harmless and actually quite useful. Whenever I think of words, they have colour patterns. For example, the word "CERN" is yellow, green, red and brown; my internal telephone number, "5005", is black, white, white, black. The effect sometimes works like a spelling checker: I know I've got the right or the wrong number because the colour pattern is what I remember or not...
The Web timeline according to R. Cailliau:
CERN: A Joint proposal for a hypertext system is presented to the management.
Mike Sendall buys a NeXT cube for evaluation, and gives it to Tim. Tim's prototype implementation on NeXTStep is made in the space of a few months, thanks to the qualities of the NeXTStep software development system. This prototype offers WYSIWYG browsing/authoring! Current Web browsers used in "surfing the Internet" are mere passive windows, depriving the user of the possibility to contribute.
During some sessions in the CERN cafeteria, Tim and I try to find a catching name for the system. I was determined that the name should not yet again be taken from Greek mythology. Tim proposes "World-Wide Web". I like this very much, except that it is difficult to pronounce in French...
The prototype is very impressive, but the NeXTStep system is not widely spread. A simplified, stripped-down version (with no editing facilities) that can be easily adapted to any computer is constructed: the Portable "Line-Mode Browser".
SLAC, the Stanford Linear Accelerator Center in California, hosts the first Web server in the USA.
It serves the contents of an existing, large data base of abstracts of physics papers.
Distribution of software over the Internet starts.
The Hypertext'91 conference (San Antonio) allows us a "poster" presentation (but does not see any use of discussing large, networked hypertext systems...).
The portable browser is released by CERN as freeware.
Many HEP laboratories now join with servers: DESY (Hamburg), NIKHEF (Amsterdam), FNAL (Chicago).
Interest in the Internet population picks up.
The Gopher system from the University of Minnesota, also networked, simpler to install, but with no hypertext links, spreads rapidly.
We need to make a Web browser for the X system, but have no in-house expertise. However, Viola (O'Reilly Assoc., California) and Midas (SLAC) are wysiwyg implementations that create great interest.
The world has 50 Web servers!
Some of the other viewpoints on the first 5 years of the Web:
... as Tim Berners-Lee and other Web developers enriched the standard for structuring data, programmers around the world began to enrich the browsers.
In the Web's first generation, Tim Berners-Lee launched the Uniform Resource Locator (URL), Hypertext Transfer Protocol (HTTP), and HTML standards with prototype Unix-based servers and browsers. A few people noticed that the Web might be better than Gopher.
Meanwhile -- between all these three generations -- many events of historic scale took place.
I started work on Spyglass Mosaic on April 5th, 1994. The demo for our first prospective customer was already on the calendar in May. ... Yes, we licensed the technology and trademarks from NCSA (at the University of Illinois), but we never used any of the code. We wrote our browser implementations completely from scratch, on Windows, MacOS, and Unix.
... Netscape didn't even exist yet, but things happened fast. Just a few weeks after I started coding, Jim Clark rode into town and gathered a select group of programmers from NCSA. Mosaic Communications Corporation was born. It was interesting to note that certain people on the NCSA browser team were not invited to the special meeting. I can still remember hearing about how ticked off they were to be excluded. Champaign-Urbana is a very small town.
Spyglass had the legal right to the "Mosaic" trademark. A few tantrums and lots of lawyering later, MCC changed its name to Netscape.
We thought we had a nice head start on Netscape. We had a really top-notch team and we moved the rest of our developers over to browser work quickly. We were ready to compete with anybody. But Jim Clark was, after all, Jim Clark. His SGI-ness knew how to work the advantages of being in Silicon Valley. He provided his young company with lots of press coverage and very deep pockets.
We decided to approach this market with an OEM business model. Instead of selling a browser to end users we developed core technology and sold it to corporations who in turn provided it to their end users. We considered ourselves to be the arms dealer for the browser wars. Over 120 companies licensed Spyglass Mosaic so they could bundle it into their product. Our stuff ended up in books, operating systems, ATM machines, set-top boxes, help systems, and kiosks. It was an extremely profitable business. The company grew fast and ours was one of the first Internet IPOs.
Along the way, we got involved in the standards process. In fact, I became the chair of the IETF HTML Working Group for the standardization of HTML 2.0. I learned a lot through this experience. In May 1994 I went to the first WWW conference in Geneva, Tim Berners-Lee took me aside and shared his plans for a World-Wide Web Consortium. It didn't take too long for the W3C to become the venue for HTML standards discussions. Eventually this was A Good Thing. Both Netscape and Microsoft became active participants in the W3C HTML Working Group. Any group which didn't have their involvement was doomed to irrelevance.
For much of 1994, it seemed like we were ahead of Netscape. Shortly after we released our 2.0 version, I remember one of the Netscape developers griping about how their schedule had been moved up by six months. We smiled because we knew we were the reason. They had not been taking us seriously and they were being forced to do so.
But Netscape was running at a much faster pace. They got ahead of us on features and they began to give their browser away at no cost to end users. This made Netscape the standard by which all other browsers were judged. If our browser didn't render something exactly like Netscape, it was considered a bug. I hated fixing our browser to make it bug-compatible with Netscape even though we had already coded it to "the standard". Life's not fair sometimes.
We won the Microsoft deal. I suppose only the higher echelons of Spyglass management really know the gory details of this negotiation. I was asked to be the primary technical contact for Microsoft and their effort to integrate our browser into Windows 95. I went to Redmond and worked there for a couple of weeks as part of the "Chicago" team. It was fun, but weird. They gave me my own office. At dinner time, everyone went to the cafeteria for food and then went back to work. On my first night, I went back to my hotel at 11:30pm. I was one of the first to leave.
Internet Explorer 2.0 was basically Spyglass Mosaic with not too many changes. IE 3.0 was a major upgrade, but still largely based on our code. IE 4.0 was closer to a rewrite, but our code was still lingering around -- we could tell by the presence of certain esoteric bugs that were specific to our layout engine.
Licensing our browser was a huge win for Spyglass. And it was a huge loss. We got a loud wake-up call when we tried to schedule our second conference for our OEM browser customers. Our customers told us they weren't coming because Microsoft was beating them up. The message became clear: We sold our browser technology to 120 companies, but one of them slaughtered the other 119.
The time between IE 3 and IE 4 was a defining period for Spyglass. It was clear that the browser war had become a two-player race.
- Even with our IPO stash, we didn't have the funding to keep up with Netscape.
For the development of IE 4.0, a new Program Manager appeared. His name was Scott Isaacs and I started seeing him at the HTML standards group meetings. At one of those meetings we sat down for a talk which was a major turning point for me and for Spyglass. Scott told me that the IE team had over 1,000 people.
I was stunned. That was 50 times the size of the Spyglass browser team. It was almost as many people as Netscape had in their whole company. I could have written the rest of the history of web browsers on that day -- no other outcomes were possible ...
According to Gary Wolf, "Andreessen also left the NCSA, departing in December 1993 with the intention of abandoning Mosaic development altogether. He moved to California and took a position with a small software company. But within a few months he had quit his new job and formed a partnership with SGI founder Jim Clark."
The (Second Phase of the) Revolution Has Begun,
There are two ages of the Internet - before Mosaic, and after. The combination of Tim Berners-Lee's Web protocols, which provided connectivity, and Marc Andreessen's browser, which provided a great interface, proved explosive. In twenty-four months, the Web has gone from being unknown to absolutely ubiquitous.
A Brief History of Cyberspace, by Mark Pesce, ZDNet, October 15, 1995
Bill Gates, "...an Internet browser is a trivial piece of software. There are at least 30 companies that have written very credible Internet browsers, so that's nothing..."
The world according to Gates By Don Tennant, InfoWorld Electric, Jan 4, 1996.
"The most important thing for the Web is stay ahead of Microsoft."
still be No. 2 in the Internet race, but it's rapidly closing the gap. What's more, Microsoft has forgotten more about PR and marketing than Netscape ever learned.
Of Silicon Valley and Sominex, by Charles Cooper, PC Week, June 5, 1996.
Magazine, June 26, 1996: Is Microsoft Evil?
December, 1995: i-Pearl Harbor
"Pearl Harbor Day." Time Magazine reported it when Bill Gates declared war on December 7, 1995... By Jeff Sutherland
February, 1996: 2-year Prediction
Steve Jobs: We have a two-year window. If the Web doesn't reach ubiquity in the next two years, Microsoft will own it. And that will be the end of it.
Wired, February 1996, p.162
June, 1996: "How many ...?"
: Netscape has certainly come on awfully strong.
The world according to Gates By Don Tennant, InfoWorld Electric, Jan 4, 1996
The Web browser market share changed dramatically within a couple of months:
October, 1996: How much?
. . . . .
Bob Ney, QuikNet, Inc. Sacramento CA
Netscape Navigator market-share historical trend:
The First 15 Years of the Browser Wars:
years of HYPERTEXT concept's EVOLUTION
1945: Vannevar Bush (Science Advisor to President Roosevelt during WW2) proposes the Memex -- a conceptual machine that can store vast amounts of information, in which users have the ability to create information trails, links of related texts and illustrations, which can be stored and used for future reference.
"As We May Think " This article was originally published in the July 1945 issue of The Atlantic Monthly... Like Emerson's famous address of 1837 on ``The American Scholar,'' this paper by Vannevar Bush calls for a new relationship between thinking man and the sum of our knowledge.
The human mind does not work that way. It operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain.

The first idea, however, to be drawn from the analogy concerns selection. Selection by association, rather than by indexing, may yet be mechanized. One cannot hope thus to equal the speed and flexibility with which the mind follows an associative trail, but it should be possible to beat the mind decisively in regard to the permanence and clarity of the items resurrected from storage.
By Vannevar Bush - As We May Think - The Atlantic Monthly, July 1945
1965: Ted Nelson coins the word Hypertext
Ted Nelson, Literary Machines
1967: Andy van Dam and others build the Hypertext Editing System ...
The first working hypertext system was developed at Brown University, by a team led by Andries van Dam. The Hypertext Editing System ran in 128K memory on an IBM/360 mainframe and was funded by IBM, who later sold it to the Houston Manned Spacecraft Center, where it was used to produce documentation for the Apollo space program.
1981: Ted Nelson conceptualizes "Xanadu", a central, pay-per-document hypertext database encompassing all written information ...
The words "hypertext" and "hypermedia" were coined by my friend Ted Nelson in a paper to the ACM 20th national conference in 1965, before I (Andrew Pam) was even born! Although I had come across occasional articles Ted had written for Creative Computing magazine, my first exposure to his legendary Xanadu project did not occur until 1987, when I purchased the Microsoft Press second edition of his classic book Computer Lib / Dream Machines..., which outlined his idea of a "docuverse" or universal library of multimedia.
By Andrew Pam, Xanadu Australia
All the children of Nelson's imagination do not have equal stature. Each is derived from the one, great, unfinished project for which he has finally achieved the fame he has pursued since his boyhood. During one of our (Gary Wolf's) many conversations, Nelson explained that he never succeeded as a filmmaker or businessman because "the first step to anything I ever wanted to do was Xanadu."
The Curse of Xanadu, by Gary Wolf, Wired 3.06
If you think you're living in a revolutionary period now, wait till you start getting unsolicited e-mail from the Bolsheviks or Mao, or find yourself on Catherine the Great's home page. World Wide Web will sound like an awfully modest enterprise. You laugh? Go ahead. They laughed at Galileo... Not to mention the Internet.
Philadelphia Online: Philadelphia Inquirer: Books, November 1996
1960. It occurs to me that the future of humanity is at the interactive computer screen, that the new writing and movies will be interactive and interlinked. It will be united by bridges of transclusion (see below) and we need a world-wide network to deliver it with royalty. I begin.
. . . . .
February, 1988. Autodesk buys the Xanadu project, which has been bundled into XOC, Inc. Nelson gives up the trademark.
Late 1988. The program designed in 1981 is finished (and dubbed 88.1), then set aside, to begin work on a MUCH FINER design.
August, 1992. Autodesk drops the project and gives us carfare. Our heroes find themselves out in the street.
Interesting Times Number Three, October
Nelson's life is so full of unfinished projects that it might fairly be said to be built from them, much as lace is built from holes or Philip Johnson's glass house from windows... He has been at work on an overarching philosophy of everything called General Schematics, but the text remains in thousands of pieces, scattered on sheets of paper, file cards, and sticky notes.
Curse of Xanadu, by Gary Wolf, Wired
Theodor Holm Nelson's response to the Web was "nice try".
"...after the Advisory Committee meeting of the WWW Consortium, in Tokyo, June 1997. This one [photo] was made by Hakon Lie at dinner. It shows me [Robert Cailliau], sitting between Tim Berners-Lee and Ted Nelson. Tim and Ted are clearly engaged in a serious debate about some hypertext phenomenon behind my back, while I'm discussing philosophy with Hakon, who was sitting opposite me and took the photo."
By R. Cailliau: "Tim, Robert and Ted"
The picture was taken by me [Hakon Wium Lie] in June 97 in the top-floor restaurant of the Hotel Shinagawa Prince in Tokyo. The table had just finished a "Hokkaido wedding dinner" when this amazing scene revealed itself in front of me. Thankfully there was one last picture left in my "film with lens".
Tim and Ted are clearly talking behind Robert's back, but Robert doesn't seem to mind. Maybe because he had just presented his latest theory about religion in Europe, and -- given that scale -- even hypertext theories fall short.
By Hakon Wium Lie: "Tim, Robert and Ted"
"...I was right for some wrong reasons, or whether I was right, ..."
"The best way to predict the future is to invent it."
"...transparent support for mirroring..."
"Trying to fix HTML is like trying to graft arms and legs onto hamburger."
Hypertext Guru Has New Spin on Old Plans, Wired, 17 Apr. 1998, by James ...
1960 Ted Nelson's designs showed two screen windows connected by visible lines, pointing from parts of an object in one window to corresponding parts of an object in another window. No existing windowing software provides this facility even today.
1965 Nelson's design concentrated on the single-user system and was based on "zipper lists", sequential lists of elements which could be linked sideways to other zipper lists for large non-sequential text structures.
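Nelson's published descriptions give only the outline of the zipper-list structure; the core idea of sequential lists joined by sideways links can be sketched roughly as follows. All class and method names here are illustrative assumptions, not part of the original 1965 design:

```python
# A rough sketch of the 1965 "zipper list" idea: each list is strictly
# sequential, but any element may be linked "sideways" to an element of
# another list, so large non-sequential structures emerge from many
# small sequential ones.

class Element:
    def __init__(self, text):
        self.text = text
        self.sideways = []          # links to Elements in other zipper lists

    def link(self, other):
        """Create a sideways link to an element of another list."""
        self.sideways.append(other)

class ZipperList:
    def __init__(self, name):
        self.name = name
        self.elements = []          # strictly sequential within one list

    def append(self, text):
        e = Element(text)
        self.elements.append(e)
        return e

# Two sequential lists, joined sideways into a non-sequential structure:
outline = ZipperList("outline")
draft = ZipperList("draft")

heading = outline.append("1. Hypertext")
body = draft.append("Text that branches and allows choices to the reader.")
heading.link(body)                  # jump sideways from outline into draft

print(heading.sideways[0].text)
```

The point of the sketch is that each list stays a simple sequence; all the non-sequential richness lives in the sideways links between lists.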
1970 Nelson invented certain data structures and algorithms called the "enfilade" which became the basis for much later work (still proprietary to Xanadu Operating Company, Inc.)
1972 Implementations ran in both Algol and Fortran.
1974 William Barus extended the enfilade concept to handle interconnection.
1979 Nelson assembled a new team (Roger Gregory, Mark Miller, Stuart Greene, Roland King and Eric Hill) to redesign the system.
1981 K. Eric Drexler created a new data structure and algorithms for complex versioning and connection management.
The Project Xanadu team completed the design of a universal networking server for Xanadu, described in various editions of Ted Nelson's book "Literary Machines" ..
1983 Xanadu Operating Company, Inc. (XOC, Inc.) was formed to complete development of the 1981 design.
1988 XOC, Inc. was acquired by Autodesk, Inc. and amply funded, with offices in Palo Alto and later Mountain View, California. Work continued with Mark Miller as chief designer. ..
1992 Autodesk entered into the throes of an organisational shakeup and dropped the project, after expenditures on the order of five million US dollars. Rights to continued development of the XOC server were licensed to Memex, Inc. of Palo Alto, California and the trademark "Xanadu" was re-assigned to Nelson.
1993 Nelson re-thought the whole thing and respecified Xanadu publishing as a system of business arrangements. Minimal specifications for a publishing system were created under the name "Xanadu Light", and Andrew Pam of Serious Cybernetics in Melbourne, Australia was licensed to continue development as Xanadu Australia.
1994 Nelson was invited to Japan and founded the Sapporo HyperLab...
By Andrew Pam, Xanadu Australia
The Xanadu database makes it possible to address any substring of any document from any other document. ...the most radical computer dream of the...
Ted Nelson, Wired, 3.09
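The addressing idea behind that claim can be illustrated with a toy model: if a document's stored content is stable, another document can name a span of it by offsets, and a "transclusion" is resolved by fetching that span rather than copying it. Everything below (the `documents` store, the `transclude` helper, the tuple format) is an illustrative sketch, not the actual Xanadu data model:

```python
# Toy model of span addressing: a document refers to a substring of
# another document by (doc_id, start, end) instead of copying the text.

documents = {
    "bush1945": "Selection by association, rather than by indexing, may yet be mechanized.",
}

def transclude(doc_id, start, end):
    """Resolve a span reference against the stored original."""
    return documents[doc_id][start:end]

# A new document quotes Bush by reference, not by copy:
quoting_doc = [
    "Bush wrote that ",
    ("bush1945", 0, 24),        # span reference into the original
    " long before the Web.",
]

rendered = "".join(
    part if isinstance(part, str) else transclude(*part)
    for part in quoting_doc
)
print(rendered)
```

Because the quoting document holds only a reference, a correction to the original would be visible everywhere the span is transcluded, which is the property Nelson wanted for royalties and for keeping quotations connected to their context.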
...his colleagues of Project Xanadu pioneered in issues of distributed hypermedia, distributed documents and evolving edit systems. It can be argued that HyperCard, World Wide Web, Lotus Notes and much of "multimedia" all derive from this work.
Ted Nelson, Be-In, 1996
[I] continue to hold exactly to my original vision, that transclusive hypermedia will be the publishing medium of the future, under whatever brand...
Ted Nelson, Wired, 3.09
One profound insight can be extracted from the long and sometimes painful Xanadu story: the most powerful results often come from constraining ambition and designing only microstandards on top of which a rich exploration of applications and concepts can be supported. That's what has driven the Web and its underlying infrastructure, the Internet.
Xanks and No, Xanks, Wired, 3.09 , by Vint Cerf
Chapter 8: Growth of the Internet - Statistics
The First Decade of Internet History: A Brief Stats Story
The total number of all types of domains (commercial -- .com; non-profit organizations -- .org; educational -- .edu; ...
Growth of the Internet: Statistics
Compiled from: Nua Internet Surveys
Source: Global Internet Statistics (by Language)
ISP Sources of Revenue: ... early beginning ...
[Fragmentary excerpt: more data than voice traffic takes place daily; telephone calls were being replaced by e-mail; increased use of data services, in addition to ..., would double the size of the British market from its ... to ... billion within five years; ... total 23 million ...]
So, according to "Irresponsible Internet Statistics...", there is no absolute way to measure any statistic regarding the growth of the Internet. As John Quarterman of MIDS says:
So, "the Internet is getting big,..." Is it always good?
... The Kittridge Street Elementary School, in Los Angeles, killed its music program last year to hire a technology coordinator; ... Mansfield, Massachusetts, administrators dropped proposed teaching positions in art, music, and physical education, and then spent $333,000 on computers; in one Virginia school the art room was turned into a computer laboratory. (Ironically, a half dozen preliminary studies recently suggested that music and art classes may build the physical size of a child's brain, and its powers for subjects such as language, math, science, and engineering -- in one case far more than computer work did.) ...