THE EBOOK IS 40 (1971-2011)
Marie Lebert
Project Gutenberg News, 2011
INTRODUCTION
The ebook (electronic book) is 40 years old. After humble beginnings, it now stands firmly alongside the print book. We read ebooks on our computers, PDAs, mobile phones, smartphones and ebook readers.
“The ebook is 40” is a chronology in 44 episodes from 1971 to 2011. Unless specified otherwise, the quotes are excerpts from the NEF Interviews <www.etudes-francaises.net/entretiens/>, University of Toronto, and from the follow-up interviews that complemented them. Many thanks to all those who are quoted here, for their time and their friendship.
Part of this book was published as a series of articles in Project Gutenberg News <www.gutenbergnews.org> in July 2011, to celebrate the 40th anniversary of Project Gutenberg on 4 July 2011.
This book marks the end of a 12-year research project that involved 100 participants worldwide.
Marie Lebert is a researcher and journalist specializing in technology for books and languages. Her books are freely available in Project Gutenberg <www.gutenberg.org>, in various formats for any electronic device.
Copyright © 2011 Marie Lebert
TABLE OF CONTENTS
1971 > Project Gutenberg, a visionary project
1974 > The internet “took off”
1990 > The invention of the web
1991 > From ASCII to Unicode
1992 > Homes for electronic texts
1993 > The Online Books Page
1993 > PDF, from past to present
1994 > The internet as a marketing tool
1995 > The print press went online
1995 > Amazon, a pioneer in cybercommerce
1996 > The Internet Archive, for future generations
1996 > Libraries launched websites
1996 > Towards a digital knowledge
1996 > The @folio project, a mobile device for texts
1997 > Multimedia convergence
1997 > A portal for European national libraries
1997 > E Ink, an electronic ink technology
1998 > The Electronic Beowulf Project
1998 > Web-extended commercial books
1998 > A more restrictive copyright law
1998 > The first ebook readers
1999 > Librarians in cyberspace
1999 > The Ulysses Bookstore on the web
1999 > The internet as a novel character
2000 > Encyclopedias and dictionaries
2000 > The web portal yourDictionary.com
2000 > A standard format for ebooks
2000 > Experiments by best-selling authors
2000 > Cotres.net, works of digital literature
2000 > The Gutenberg Bible online
2001 > Broadband became the norm
2001 > Wikipedia, a collaborative encyclopedia
2001 > The Creative Commons license
2003 > Handicapzéro, the internet for everyone
2003 > The Public Library of Science
2004 > The web 2.0, community and sharing
2005 > From PDAs to smartphones
2005 > From Google Print to Google Books
2005 > The Open Content Alliance, a universal library
2006 > The union catalog WorldCat on the web
2007 > The Encyclopedia of Life, a global effort
2007 > The future of ebooks seen from France
2010 > From the Librié to the iPad
2011 > The ebook in ten points
1971 > PROJECT GUTENBERG, A VISIONARY PROJECT
[Summary] The first ebook was available in July 1971, as eText #1 of Project Gutenberg, a visionary project launched by Michael Hart to create free electronic versions of literary works and disseminate them worldwide. In the 15th century, Gutenberg allowed anyone to have print books for a small cost. In the 21st century, Project Gutenberg would allow anyone to have a digital library at no cost. First considered totally unrealistic, the project got its first boost with the invention of the web in 1990, which made it easier to distribute ebooks and recruit volunteers, and its second boost with the creation of Distributed Proofreaders in 2000, to share the proofreading of ebooks among thousands of volunteers. In 2011, for its 40th anniversary, Project Gutenberg offered 36,000 ebooks, downloaded by the tens of thousands every day, with websites in the United States, Australia, Europe and Canada, and 40 mirror websites worldwide.
***
The first ebook was available in July 1971, as eText #1 of Project Gutenberg, a visionary project launched by Michael Hart to create free electronic versions of literary works and disseminate them worldwide.
In the 15th century, Gutenberg allowed anyone to have print books for a small cost. In the 21st century, Project Gutenberg would allow anyone to have a digital library at no cost.
# Beginning
As recalled by Michael Hart in January 2009 in an email interview: "On July 4, 1971, while still a freshman at the University of Illinois (UI), I decided to spend the night at the Xerox Sigma V mainframe at the UI Materials Research Lab, rather than walk miles home in the summer heat, only to come back hours later to start another day of school. I stopped on the way to do a little grocery shopping to get through the night, and day, and along with the groceries they put in the faux parchment copy of 'The U.S. Declaration of Independence' that became quite literally the cornerstone of Project Gutenberg. That night, as it turned out, I received my first computer account — I had been hitchhiking on my brother's best friend's name, who ran the computer on the night shift. When I got a first look at the huge amount of computer money I was given, I decided I had to do something extremely worthwhile to do justice to what I had been given. (…) As I emptied out groceries, the faux parchment ‘Declaration of Independence’ fell out, and the light literally went on over my head like in the cartoons and comics… I knew what the future of computing, and the internet, was going to be… 'The Information Age.' The rest, as they say, is history."
Michael typed in the “U.S. Declaration of Independence” in upper case, because there was no lower case yet. He mentioned to the 100 users of the embryonic internet of the time where the 5K file was stored, though without a hypertext link, because the web was still 20 years away. The file was downloaded by six users.
Michael decided to search for public domain books available in libraries, to digitize them, and to store their electronic versions. Project Gutenberg's mission would be the following: to put at everyone's disposal, for free, as many public domain literary works as possible in electronic form.
First considered totally unrealistic, the project got its first boost with the invention of the web in 1990, which made it easier to distribute ebooks and recruit volunteers.
Years later, in August 1998, Michael wrote in an email interview: "We consider etext to be a new medium, with no real relationship to paper, other than presenting the same material, but I don't see how paper can possibly compete once people each find their own comfortable way to etexts, especially in schools."
A book became a continuous text file instead of a set of pages, using the low set of ASCII, called Plain Vanilla ASCII, with capital letters for the terms that appeared in italics, bold or underlined in the print version, so that the file could be read on any hardware with any software. As a text file, a book could easily be copied, indexed, searched, analyzed and compared with other books.
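As a present-day aside, and not anything Project Gutenberg itself prescribed: a few lines of Python are enough to show what a Plain Vanilla ASCII file makes possible, assuming a hypothetical local copy of an etext named "pg1.txt".

    # A minimal sketch, assuming a hypothetical local etext "pg1.txt".
    from collections import Counter

    # encoding="ascii" makes the 7-bit assumption explicit: reading
    # fails if any byte falls outside the 128-character ASCII set.
    with open("pg1.txt", "r", encoding="ascii") as f:
        text = f.read().lower()

    # Indexing and searching: a word-frequency table built in one pass.
    words = Counter(w.strip('.,;:!?"()') for w in text.split())
    print(words["independence"])

    # Comparing with another etext: shared vocabulary is a set operation.
    # with open("pg2.txt", "r", encoding="ascii") as f:
    #     shared = set(words) & set(f.read().lower().split())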
# Distributed Proofreaders
The project got its second boost with the creation of Distributed Proofreaders in 2000, to share the proofreading of ebooks among thousands of volunteers.
Distributed Proofreaders was launched in October 2000 by Charles Franks to support the digitization of public domain books and assist Project Gutenberg in its efforts to offer free electronic versions of literary works. The books are scanned from a print version and converted into a text version using OCR software, which is 99% reliable at best and leaves a few errors per page. Volunteers choose one of the books available on the site and proofread a given page. They are encouraged to do a page per day if possible.
Distributed Proofreaders became the main source of Project Gutenberg's ebooks, and an official Project Gutenberg site in 2002. Distributed Proofreaders became a separate legal entity in May 2006 and continues to maintain a strong relationship with Project Gutenberg. 10,000 books were digitized, proofread, and "preserved for the world" in December 2006, and 20,000 ebooks in April 2011, as “unique titles [sent] to the bookshelves of Project Gutenberg, free to enjoy for everybody. (…) Distributed Proofreaders is a truly international community. People from over the world contribute.” Distributed Proofreaders Europe (DP Europe) began production in early 2004. Distributed Proofreaders Canada (DP Canada) began production in December 2007.
# “Less is more”
Project Gutenberg keeps its administrative and financial structure to the bare minimum. Its motto fits into three words: "Less is more." The minimal rules give much space to volunteers and to new ideas. The goal is to ensure its independence from loans and other funding and from ephemeral cultural priorities, to avoid pressure from politicians and others. The aim is also to ensure respect for the volunteers, who can be confident their work will be used not just for a few years but for generations. Volunteers can network through mailing lists, weekly or monthly newsletters, discussion lists, forums, wikis and blogs.
In July 2011, for its 40th anniversary, Project Gutenberg offered 36,000 ebooks, downloaded by the tens of thousands every day, with websites in the United States, Australia, Europe and Canada, and 40 mirror websites worldwide.
40 years after the beginning of Project Gutenberg, Michael Hart describes himself as a workaholic who has devoted his entire life to his project. He considers himself a pragmatic and farsighted altruist. For years he was regarded as a nut, but now he is respected.
Michael has often stated in his writings that, just as Gutenberg allowed anyone to have their own print books for a small cost, Project Gutenberg would allow anyone to have a library at no cost, stored in a pocket device. The collection of Project Gutenberg is the size of a local public library, but this time available on the web to be downloaded for free. The project's goal is to change the world through freely available ebooks that can be used and copied endlessly, and to bring reading and culture to everyone at minimal cost.
1974 > THE INTERNET “TOOK OFF”
[Summary] The internet “took off” in 1974 with the creation of TCP/IP (Transmission Control Protocol / Internet Protocol) by Vinton Cerf and Bob Kahn, fifteen years before the invention of the web. The internet expanded as a network linking U.S. governmental agencies, universities and research centers, before spreading worldwide in 1983. The internet got its first boost in 1990 with the invention of the web by Tim Berners-Lee, and its second boost in 1993 with the release of Mosaic, the first browser for the general public. The Internet Society (ISOC) was founded in 1992 by Vinton Cerf to promote the development of the internet as a medium that was becoming part of our lives. There were 100 million internet users in December 1997, with one million new users per month, and 300 million users in December 2000.
***
The internet “took off” in 1974 with the creation of TCP/IP (Transmission Control Protocol / Internet Protocol) by Vinton Cerf and Bob Kahn, fifteen years before the invention of the web.
# A new medium
The internet expanded as a network linking U.S. governmental agencies, universities and research centers, before spreading worldwide in 1983.
The internet got its first boost in 1990 with the invention of the web by Tim Berners-Lee, and its second boost in 1993 with the release of Mosaic, the first browser for the general public.
Vinton Cerf founded the Internet Society (ISOC) in 1992 to promote the development of the internet as a medium that was becoming part of our lives. When interviewed by the French daily Libération on 16 January 1998, he explained that the internet was doing two things. Like books, it could accumulate knowledge. But, more importantly, it presented knowledge in a way that connected it with other information whereas, in a book, information stayed isolated.
Because the web was easy to use with hyperlinks going from one document to the next, the internet could now be used by anyone, and not only by computer literate users. There were 100 million internet users in December 1997, with one million new users per month, and 300 million users in December 2000.
# A worldwide expansion
North America was leading the way in computer science and communication technology, with significant funding and computers that were cheap compared to those in Europe. A connection to the internet was much cheaper too.
In some European countries, internet users (including the author of these lines) needed to surf the web at night, when per-minute phone rates were cheaper, to cut their expenses. In late 1998 and early 1999, some users in France, Germany and Italy launched a movement to boycott the internet one day per week, as a way to force internet providers and phone companies to set up a special monthly rate. This action paid off, and providers began to offer "internet rates".
In summer 1999, the number of internet users living outside the U.S. reached 50%.
In summer 2000, the number of internet users having a mother tongue other than English also reached 50%, and went on steadily increasing. According to statistics regularly published on the website of Global Reach, a marketing consultancy promoting internationalization and localization, non-native English speakers were 52.5% of all internet users in summer 2001, 57% in December 2001, 59.8% in April 2002, 64.4% in September 2003 (including 34.9% non-English-speaking Europeans and 29.4% Asians), and 64.2% in March 2004 (including 37.9% non-English-speaking Europeans and 33% Asians).
Broadband became the norm over the years. Jean-Paul, webmaster of the hypermedia website cotres.net, summarized things in January 2007: “I feel that we are experiencing a ‘floating’ period between the heroic ages, when we were moving forward while waiting for the technology to catch up, and the future, when high-speed broadband will unleash forces that just begin to move, for now only in games.”
# The internet of the future
The internet of the future could be “pervasive”, allowing us to connect in any place, at any time and on any device, through a single omnipresent network.
The concept of a “pervasive” network was developed by Rafi Haladjian, founder of the European company Ozone, who explained on the company's website in 2007 that “the new wave would affect the physical world, our real environment, our daily life in every moment. We will not access the network any more, we will live in it. The future components of this network (wired parts, non wired parts, operators) will be transparent to the final user. The network will always be open, providing a permanent connection anywhere. It will also be agnostic in terms of applications, as a network based on the internet protocols themselves.” We do look forward to this.
As for the content of the internet, Timothy Leary, a visionary writer, described it in 1994 in his book “Chaos & Cyber Culture” as gigantic glass towers containing all the world's information, with free access through cyberspace not only to all books, but also to all pictures, all movies, all TV shows, and all other data. In 2011, we are not there yet, but we are getting there.
1990 > THE INVENTION OF THE WEB
[Summary] The World Wide Web was invented in 1990 by Tim Berners-Lee at CERN (European Center for Nuclear Research, that later became the European Organization for Nuclear Research), Geneva, Switzerland. In 1989, Tim Berners-Lee networked documents using hypertext. In 1990, he developed the first HTTP (HyperText Transfer Protocol) server and the first web browser. In 1991, the web was operational and radically changed the way people were using the internet. Hypertext links allowed us to move from one textual or visual document to another with a simple click of the mouse. Information became interactive, thus more attractive to many users. Later on, this interactivity was further enhanced with hypermedia links that could link texts and images with video and sound. The World Wide Web Consortium (W3C) was founded in October 1994 to develop protocols for the web.
***
The World Wide Web was invented in 1990 by Tim Berners-Lee, a researcher at CERN (European Center for Nuclear Research), Geneva, Switzerland, who made the internet accessible to all.
# How the web started
In 1989, Tim Berners-Lee networked documents using hypertext. In 1990, he developed the first HTTP (HyperText Transfer Protocol) server and the first web browser. In 1991, the web was operational and made the internet accessible to all. Hypertext links allowed us to move from one textual or visual document to another with a simple click of the mouse. Information became interactive, thus more attractive to many users. Later on, this interactivity was further enhanced with hypermedia links that could link texts and images with video and sound.
Developed by NCSA (National Center for Supercomputing Applications) at the University of Illinois (USA) and distributed free of charge in November 1993, Mosaic was the first browser for the general public, and contributed greatly to the development of the web. In early 1994, part of the Mosaic team migrated to the Netscape Communications Corporation to develop a new browser called Netscape Navigator. In 1995, Microsoft launched its own browser, Internet Explorer. Other browsers were launched later, like Opera and Apple's Safari.
The World Wide Web Consortium (W3C) was founded in October 1994 to develop interoperable technologies (specifications, guidelines, software, other tools) for the web, for example specifications for markup languages (HTML, XML and others). It also acted as a forum for information, commerce, communication and collective understanding. In 1998, the section Internationalization/Localization gave access to some protocols for creating a multilingual website: HTML, base character set, new tags and attributes, HTTP, language negotiation, URLs and other identifiers including non-ASCII characters, etc.
# Tim Berners-Lee’s dream
Pierre Ruetschi, a journalist for the Swiss daily “Tribune de Genève”, asked Tim Berners-Lee on 20 December 1997: "Seven years later, are you satisfied with the way the web has evolved?". He answered that, while he was pleased with the richness and diversity of information, the web still lacked the power intended in its original design. He would like "the web to be more interactive, and people to be able to create information together", and not only to be information consumers. The web was supposed to become a "medium for collaboration, a world of knowledge that we share."
In an essay posted on his webpage, Tim Berners-Lee wrote in May 1998: "The dream behind the web is of a common information space in which we communicate by sharing information. Its universality is essential: the fact that a hypertext link can point to anything, be it personal, local or global, be it draft or highly polished. There was a second part of the dream, too, dependent on the web being so generally used that it became a realistic mirror (or in fact the primary embodiment) of the ways in which we work and play and socialize. That was that once the state of our interactions was online, we could then use computers to help us analyze it, make sense of what we are doing, where we individually fit in, and how we can better work together." (excerpt from "The World Wide Web: A very short personal history")
# The web 2.0
According to Netcraft, a company tracking data on the internet, the number of websites went from one million (April 1997) to 10 million (February 2000), 20 million (September 2000), 30 million (July 2001), 40 million (April 2003), 50 million (May 2004), 60 million (March 2005), 70 million (August 2005), 80 million (April 2006), 90 million (August 2006) and 100 million (November 2006), with a growing number of personal websites and blogs.
The term “web 2.0” was invented in 2004 by Tim O’Reilly, a publisher of computer books, as a title for a series of conferences he was organizing. The web 2.0 may have begun to fulfill Tim Berners-Lee’s dream of a web based on community and sharing, with many collaborative projects across borders and languages.
Fifteen years after the invention of the web, Wired stated in its August 2005 issue that less than half of the web was commercial, with the other half being run by passion. As for the internet, according to the French daily Le Monde dated 19 August 2005, its three powers — ubiquity, variety and interactivity — made its potential use quasi infinite.
Robert Beard, a language teacher at Bucknell University, Pennsylvania, and the founder of A Web of Online Dictionaries in 1995, wrote as early as September 1998: "The web will be an encyclopedia of the world by the world for the world. There will be no information or knowledge that anyone needs that will not be available. The major hindrance to international and interpersonal understanding, personal and institutional enhancement, will be removed. It would take a wilder imagination than mine to predict the effect of this development on the nature of humankind."
1991 > FROM ASCII TO UNICODE
[Summary] Used since the beginning of computing, ASCII (American Standard Code for Information Interchange) is a 7-bit coded character set for information interchange in English. It was published in 1963 by the American Standards Association (ASA), which later became ANSI. With the internet spreading worldwide, communicating in English (and the Latin alphabet) was not enough anymore. The accented characters of several European languages and the characters of some other languages were taken into account from 1986 onwards with 8-bit variants of ASCII, also called extended ASCII, that provided sets of 256 characters. But problems were not over until the publication of Unicode in January 1991 as a new universal encoding system. Unicode provided "a unique number for every character, no matter what the platform, no matter what the program, no matter what the language", and could handle over 65,000 characters or ideograms.
***
With the internet spreading worldwide, the use of ASCII and extended ASCII was not enough anymore, hence the need for Unicode, a system that could take all languages into account, whose first version was published in January 1991.
Used since the beginning of computing, ASCII (American Standard Code for Information Interchange) is a 7-bit coded character set for information interchange in English (and the Latin alphabet). It was published in 1963 by the American Standards Association (ASA), which later became ANSI. The 7-bit plain ASCII, also called Plain Vanilla ASCII, is a set of 128 characters with 95 printable unaccented characters (A-Z, a-z, numbers, punctuation and basic symbols), the ones that are available on the American/English keyboard.
With computer technology spreading outside North America, the accented characters of several European languages and characters of some other languages were taken into account from 1986 onwards with 8-bit variants of ASCII, also called extended ASCII, that provided sets of 256 characters.
Brian King, director of the WorldWide Language Institute (WWLI), explained in September 1998: “Computer technology has traditionally been the sole domain of a 'techie' elite, fluent in both complex programming languages and in English — the universal language of science and technology. Computers were never designed to handle writing systems that couldn't be translated into ASCII. There wasn't much room for anything other than the 26 letters of the English alphabet in a coding system that originally couldn't even recognize acute accents and umlauts — not to mention non-alphabetic systems like Chinese. But tradition has been turned upside down. Technology has been popularized. (…)
An extension of (local) popularization is the export of information technology around the world. Popularization has now occurred on a global scale and English is no longer necessarily the lingua franca of the user. Perhaps there is no true lingua franca, but only the individual languages of the users. One thing is certain — it is no longer necessary to understand English to use a computer, nor is it necessary to have a degree in computer science. A pull from non-English-speaking computer users and a push from technology companies competing for global markets has made localization a fast-growing area in software and hardware development. This development has not been as fast as it could have been. The first step was for ASCII to become extended ASCII. This meant that computers could begin to start recognizing the accents and symbols used in variants of the English alphabet — mostly used by European languages. But only one language could be displayed on a page at a time. (…)
The most recent development [in 1998] is Unicode. Although still evolving and only just being incorporated into the latest software, this new coding system translates each character into 16 bits. Whereas 8-bit extended ASCII could only handle a maximum of 256 characters, Unicode can handle over 65,000 unique characters and therefore potentially accommodate all of the world's writing systems on the computer. So now the tools are more or less in place. They are still not perfect, but at last we can surf the web in Chinese, Japanese, Korean, and numerous other languages that don't use the Western alphabet. As the internet spreads to parts of the world where English is rarely used — such as China, for example, it is natural that Chinese, and not English, will be the preferred choice for interacting with it. For the majority of the users in China, their mother tongue will be the only choice."
First published in January 1991, Unicode "provides a unique number for every character, no matter what the platform, no matter what the program, no matter what the language" (excerpt from the website). This double-byte platform-independent encoding provides a basis for the processing, storage and interchange of text data in any language. Unicode is maintained by the Unicode Consortium, with its variants UTF-8, UTF-16 and UTF-32 (UTF: Unicode Transformation Format), and is a component of the specifications of the World Wide Web Consortium (W3C). Unicode has replaced ASCII for text files on Windows platforms since 1998. Unicode surpassed ASCII on the internet in December 2007.
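As a present-day illustration rather than anything from 1991, a short Python 3 sketch (Python 3 strings are Unicode natively) shows what the change means in practice: every character has a unique code point, UTF-8 stores it in one to four bytes, and 7-bit ASCII simply cannot represent anything beyond its 128-character set.

    # Code points and UTF-8 bytes for an ASCII letter, an accented
    # letter and an ideogram.
    for ch in ("a", "é", "既"):
        print(ch, hex(ord(ch)), ch.encode("utf-8"))
    # a   0x61    b'a'                (1 byte, identical to ASCII)
    # é   0xe9    b'\xc3\xa9'         (2 bytes)
    # 既  0x65e2  b'\xe6\x97\xa2'     (3 bytes)

    # 7-bit ASCII stops at code point 127, so 'é' cannot be encoded.
    try:
        "é".encode("ascii")
    except UnicodeEncodeError:
        print("'é' is outside the 128-character ASCII set")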
1992 > HOMES FOR ELECTRONIC TEXTS
[Summary] The first homes for electronic texts were the Etext Archives, founded in 1992 by Paul Southworth, and the E-Zine-List, founded in 1993 by John Labovitz, among others. The first electronic texts were mostly political. They were followed by electronic zines also covering cultural topics, and not targeted towards a mass audience, at least during the first years. The Etext Archives, hosted on the website of the University of Michigan, were "home to electronic texts of all kinds, from the sacred to the profane, and from the political to the personal", without judging their content. The E-Zine-List was a directory of e-zines around the world, accessible via FTP, gopher, email, the web and other services. The list was updated monthly. 3,045 zines were listed in November 1998. John wrote on his website: "Now the e-zine world is different. (…) Even the term 'e-zine' has been co-opted by the commercial world, and has come to mean nearly any type of publication distributed electronically. Yet there is still the original, independent fringe, who continue to publish from their heart, or push the boundaries of what we call a 'zine'."
***
The first homes for electronic texts were the Etext Archives, founded in 1992 by Paul Southworth, and the E-Zine-List, founded in 1993 by John Labovitz, among others.
The first electronic texts were mostly political. They were followed by electronic zines, which also covered cultural topics.
What exactly is a zine? John Labovitz explained on his website: "For those of you not acquainted with the zine world, 'zine' is short for either 'fanzine' or 'magazine', depending on your point of view. Zines are generally produced by one person or a small group of people, done often for fun or personal reasons, and tend to be irreverent, bizarre, and/or esoteric. Zines are not 'mainstream' publications — they generally do not contain advertisements (except, sometimes, advertisements for other zines), are not targeted towards a mass audience, and are generally not produced to make a profit. An 'e-zine' is a zine that is distributed partially or solely on electronic networks like the internet."
# The Etext Archives
The Etext Archives were founded in 1992 by Paul Southworth, and hosted on the website of the University of Michigan. They were "home to electronic texts of all kinds, from the sacred to the profane, and from the political to the personal", without judging their content.
There were six sections in 1998: (a) "E-zines": electronic periodicals from the professional to the personal; (b) "Politics": political zines, essays, and home pages of political groups; (c) "Fiction": publications of amateur authors; (d) "Religion": mainstream and off-beat religious texts; (e) "Poetry": an eclectic mix of mostly amateur poetry; and (f) "Quartz": the archive formerly hosted at quartz.rutgers.edu.
As recalled on the website the same year: "The web was just a glimmer [in 1992], gopher was the new hot technology, and FTP was still the standard information retrieval protocol for the vast majority of users. The origin of the project has caused numerous people to associate it with the University of Michigan, although in fact there has never been an official relationship and the project is supported entirely by volunteer labor and contributions. The equipment is wholly owned by the project maintainers. The project was started in response to the lack of organized archiving of political documents, periodicals and discussions disseminated via Usenet on newsgroups such as alt.activism, misc.activism.progressive, and alt.society.anarchy. The alt.politics.radical-left group came later and was also a substantial source of both materials and regular contributors. Not long thereafter, electronic zines (e-zines) began their rapid proliferation on the internet, and it was clear that these materials suffered from the same lack of coordinated collection and preservation, not to mention the fact that the lines between e-zines (which at the time were mostly related to hacking, phreaking, and internet anarchism) and political materials on the internet were fuzzy enough that most e-zines fit the original mission of The Etext Archives. One thing led to another, and e-zines of all kinds — many on various cultural topics unrelated to politics — invaded the archives in significant volume."
# The E-Zine-List
The E-Zine-List was founded by John Labovitz in summer 1993 as a directory of e-zines around the world, accessible via FTP, gopher, email, the web, and other services. The list was updated monthly.
How did the E-Zine-List begin? On his website, John explained he originally wanted to publicize the print zine Crash by making an electronic version of it. Looking for directories, he only found the discussion group alt.zines and archives like The Well and The Etext Archives. Then came the idea of an organized directory. He began with twelve titles listed manually in a word processor. Then he wrote his own database.
3,045 zines were listed in November 1998. John wrote on the website: "Now the e-zine world is different. The number of e-zines has increased a hundredfold, crawling out of the FTP and gopher woodworks to declaring themselves worthy of their own domain name, even asking for financial support through advertising. Even the term 'e-zine' has been co-opted by the commercial world, and has come to mean nearly any type of publication distributed electronically. Yet there is still the original, independent fringe, who continue to publish from their heart, or push the boundaries of what we call a 'zine'."
After maintaining the list for years, John passed the torch to others.
1993 > THE ONLINE BOOKS PAGE
[Summary] Founded in 1993 by John Mark Ockerbloom when he was a student at Carnegie Mellon University (CMU, Pittsburgh, Pennsylvania), the Online Books Page is "a website that facilitates access to books that are freely readable over the internet." John Mark Ockerbloom first maintained this page on the website of the School of Computer Science of Carnegie Mellon University. In 1999, he moved it to the University of Pennsylvania Library, after being hired as a digital library planner and researcher. The Online Books Page offered links to 12,000 books in 1999, 20,000 books in 2003 (including 4,000 books published by women), 25,000 books in 2006, 30,000 books in 2008 (including 7,000 books from Project Gutenberg) and 35,000 books in 2010.
***
In 1993, John Mark Ockerbloom created The Online Books Page as “a website that facilitates access to books that are freely readable over the internet.”
The web was still in its infancy, with Mosaic as its first browser for the general public.
John Mark Ockerbloom was a graduate student at the School of Computer Science (CS) of Carnegie Mellon University (CMU, Pittsburgh, Pennsylvania).
Five years later, in September 1998, John Mark wrote: "I was the original webmaster here at CMU CS, and started our local web in 1993. The local web included pages pointing to various locally developed resources, and originally The Online Books Page was just one of these pages, containing pointers to some books put online by some of the people in our department. (Robert Stockton had made web versions of some of Project Gutenberg's texts.) After a while, people started asking about books at other sites, and I noticed that a number of sites (not just Gutenberg, but also Wiretap and some other places) had books online, and that it would be useful to have some listing of all of them, so that you could go to one place to download or view books from all over the net. So that's how my index got started.
I eventually gave up the webmaster job in 1996, but kept The Online Books Page, since by then I'd gotten very interested in the great potential the net had for making literature available to a wide audience. At this point there are so many books going online that I have a hard time keeping up. But I hope to keep up my online books works in some form or another. I am very excited about the potential of the internet as a mass communication medium in the coming years. I'd also like to stay involved, one way or another, in making books available to a wide audience for free via the net, whether I make this explicitly part of my professional career, or whether I just do it as a spare-time volunteer."
In 1998, there was an index of 7,000 books that could be browsed by author, title or subject. There were also pointers to significant directories and archives of online texts, and to special exhibits.
As stated on the website at the time: "Along with books, The Online Books Page is also now listing major archives of serials (such as magazines, published journals, and newspapers) (…). Serials can be at least as important as books in library research. Serials are often the first places that new research and scholarship appear. They are sources for firsthand accounts of contemporary events and commentary. They are also often the first (and sometimes the only) place that quality literature appears. (For those who might still quibble about serials being listed on a 'books page', back issues of serials are often bound and reissued as hardbound 'books'.)"
In 1999, after graduating from Carnegie Mellon with a Ph.D. in computer science, John Mark was hired as a digital library planner and researcher at the University of Pennsylvania Library. He also moved The Online Books Page there, kept it clear and simple, and went on expanding it.
The Online Books Page offered links to 12,000 books in 1999, 20,000 books in 2003 (including 4,000 books published by women), 25,000 books in 2006, 30,000 books in 2008 (including 7,000 books from Project Gutenberg) and 35,000 books in 2010. The books "have been authored, placed online, and hosted by a wide variety of individuals and groups throughout the world". The FAQ listed copyright information about most countries in the world, with links to further reading.
1993 > PDF, FROM PAST TO PRESENT
[Summary] From California, Adobe launched PDF (Portable Document Format) in June 1993, along with Acrobat Reader (free, to read PDFs) and Adobe Acrobat (for a fee, to create PDFs). As stated on the website, PDF "lets you capture and view robust information from any application, on any computer system and share it with anyone around the world.” As the "veteran" format, PDF was perfected over the years as a global standard for distribution and viewing of information. Acrobat Reader was available in several languages, for various platforms (Windows, Mac, Linux, Palm OS, Pocket PC, Symbian OS, etc.), and for various devices (computer, PDA, smartphone). In May 2003, Acrobat Reader (5th version) merged with Acrobat eBook Reader (2nd version) to become Adobe Reader, starting with version 6, which could read both standard PDF files and secure PDF files of copyrighted books.
***
From California, Adobe launched PDF (Portable Document Format) in June 1993, along with Acrobat Reader (free, to read PDFs) and Adobe Acrobat (for a fee, to make PDFs).
As stated on the website, PDF "lets you capture and view robust information from any application, on any computer system and share it with anyone around the world. Individuals, businesses, and government agencies everywhere trust and rely on Adobe PDF to communicate their ideas and vision.”
As the "veteran" format, PDF was perfected over the years as a global standard for distribution and viewing of information. Acrobat Reader and Adobe Acrobat gave the tools to create and view PDF files in several languages and for several platforms (Windows, Mac, Linux).
In August 2000, Adobe bought Glassbook, a software company serving publishers, booksellers, distributors and libraries. Adobe also partnered with Amazon.com and Barnes & Noble.com to offer ebooks for Acrobat Reader and Glassbook Reader.
# Two new software products
In January 2001, Adobe launched Acrobat eBook Reader (free) and the Adobe Content Server (for a fee).
Acrobat eBook Reader was meant to read PDF files of copyrighted books, while adding notes and bookmarks, visualizing the book covers in a personal library, and browsing a dictionary.
The Adobe Content Server was intended for publishers and distributors, for the packaging, protection, distribution and sale of PDF copyrighted books, while managing their access with DRM according to the copyright holder’s instructions, for example allowing or forbidding the printing and lending of a book. The Adobe Content Server was replaced with the Adobe LiveCycle Policy Server in November 2004.
In April 2001, Adobe partnered with Amazon, for Amazon’s eBookStore to include 2,000 copyrighted books for Acrobat eBook Reader. These were titles from major publishers, travel guides and children’s books.
Acrobat Reader was then available for PDAs, beginning with the Palm Pilot in May 2001 and the Pocket PC in December 2001.
# Adobe Reader
From 1993 to 2003, according to Adobe’s website, over 500 million copies of Acrobat Reader were downloaded worldwide. In 2003, Acrobat Reader was available in many languages and for many platforms (Windows, Mac, Linux, Palm OS, Pocket PC, Symbian OS, etc.). Approximately 10% of the documents on the internet were available in PDF. PDF was also the main format for ebooks.
In May 2003, Acrobat Reader (5th version) merged with Acrobat eBook Reader (2nd version) to become Adobe Reader, starting with version 6, which could read both standard PDF files and secure PDF files of copyrighted books.
In late 2003, Adobe opened its own online bookstore, the Digital Media Store, with PDF titles from major publishers, for example HarperCollins, Random House and Simon & Schuster, and electronic versions of newspapers and magazines, for example The New York Times or Popular Science. Adobe also launched Adobe eBooks Central as a service to read, publish, sell and lend ebooks, and Adobe eBook Library as a prototype digital library.
After being a proprietary format, PDF was officially released as an open standard in July 2008, and published by the International Organization for Standardization (ISO) as ISO 32000-1:2008.
1994 > THE INTERNET AS A MARKETING TOOL
[Summary] Some publishers decided to use the web as a marketing tool to promote their books among the 50,000 new books published per year in the U.S. NAP (National Academy Press) was the first publisher in 1994 to post the full text of some books on its website, for free, with the authors’ consent. It was followed by MIT Press (MIT: Massachusetts Institute of Technology) in 1995. Oddly enough, there was no drop in sales. On the contrary, sales increased. These initiatives were praised by a number of other publishers, who were reluctant to do the same, for three reasons: the cost of posting thousands of pages online, problems linked to copyright, and what they saw as a “competition” between digital versions for free and print versions for a fee.
***
Some publishers decided to post the full text of some books for free on their websites, and to use the web as a marketing tool to sell the print versions.
NAP (National Academy Press) was the first publisher in 1994 to post the full text of some books, with the authors’ consent, as a way to promote their books among the 50,000 new books published per year in the U.S. NAP was followed by the MIT Press (MIT: Massachusetts Institute of Technology) in 1995.
NAP was created by the National Academy of Sciences to publish its own reports and those of the National Academy of Engineering, the Institute of Medicine, and the National Research Council. In 1994, NAP was publishing 200 new books a year in science, engineering and health. The publisher began posting full books online for free, at the suggestion of the authors themselves, for people to browse them on the website before buying the print versions. Oddly enough, there was no drop in sales. On the contrary, sales increased. Print books ordered online were 20% cheaper. There were also more sales by phone. In 1998, the NAP Reading Room offered 1,000 entire books, available online for free in "image" format, HTML and PDF.
In 1995, MIT Press was publishing 200 new books per year and 40 journals, in science and technology, architecture, social theory, economics, cognitive science, and computational science. MIT Press also decided to put a number of books online for free, as "a long-term commitment to the efficient and creative use of new technologies". Sales of print books with a free online version increased as well.
These initiatives were praised by a number of other publishers, who were reluctant to do the same, for three reasons: the cost of posting thousands of pages online, problems linked to copyright, and what they saw as a “competition” between digital versions for free and print versions for a fee.
1995 > THE PRINT PRESS WENT ONLINE
[Summary] The print press going online in the 1990s led the way to print books going online a few years later, thus the need for this chapter. The first electronic versions of print newspapers were available in the early 1990s through commercial services like America Online and CompuServe. In 1995, major newspapers like The New York Times, The Washington Post or The Wall Street Journal began offering websites with a partial or full version of their latest issue, as well as online archives. In the United Kingdom, the daily Times and the Sunday Times set up a common website called Times Online, with a way to create a personalized edition. The weekly publication The Economist went online too, as well as the daily Le Monde and Libération in France, the daily El País in Spain, and the weekly Focus and Der Spiegel in Germany.
***
The print press going online in the 1990s led the way to print books going online a few years later, thus the need for this chapter.
The first electronic versions of print newspapers were available in the early 1990s through commercial services like America Online and CompuServe.
In 1995, newspapers began offering websites with a partial or full version of their latest issue, available freely or through subscription (free or paid), as well as online archives.
For example, The New York Times site could be accessed free of charge, with articles from the print daily, breaking news updated every ten minutes, and original reporting available only online. The site of The Washington Post gave the daily news online, with a full database of articles including images, sound and video. The site of The Wall Street Journal required a paid subscription, with 100,000 subscribers in 1998.
In the United Kingdom, the daily Times and the Sunday Times set up a common website called Times Online, with a way to create a personalized edition. The weekly publication The Economist went online too, as well as the daily Le Monde and Libération in France, the daily El País in Spain, and the weekly Focus and Der Spiegel in Germany.
"More than 3,600 newspapers now publish on the internet", Eric K. Meyer stated in an essay published in late 1997 on the website of AJR/NewsLink. "A full 43% of all online newspapers now are based outside the United States. A year ago, only 29% of online newspapers were located abroad. Rapid growth, primarily in Canada, the United Kingdom, Norway, Brazil and Germany, has pushed the total number of non-U.S. online newspapers to 1,563. The number of U.S. newspapers online also has grown markedly, from 745 a year ago to 1,290 six months ago to 2,059 today. Outside the United States, the United Kingdom, with 294 online newspapers, and Canada, with 230, lead the way. In Canada, every province or territory now has at least one online newspaper. Ontario leads the way with 91, Alberta has 44, and British Columbia has 43. Elsewhere in North America, Mexico has 51 online newspapers, 23 newspapers are online in Central America and 36 are online in the Caribbean. Europe is the next most wired continent for newspapers, with 728 online newspaper sites. After the United Kingdom, Norway has the next most — 53 — and Germany has 43. Asia (led by India) has 223 online newspapers, South America (led by Bolivia) has 161 and Africa (led by South Africa) has 53. Australia and other islands have 64 online newspapers."
The online versions of newspapers brought a wealth of information. The web provided readers not only with news available online, but also with a whole encyclopedia to help understand them. The reader could click on hyperlinks to get maps, biographies, official texts, political and economic data, photographs, as well as the first attempts in audio and video coverage. The reader could also easily access other articles on the same topic, with search engines sorting out articles by date, author, title or subject.
1995 > AMAZON, A PIONEER IN CYBERCOMMERCE
[Summary] Jeff Bezos launched Amazon.com in July 1995 in Seattle, on the West Coast, after a market study which led him to conclude that books were the best products to sell on the internet. The online bookstore started with 10 employees and a catalog of 3 million books. Unlike traditional bookstores, Amazon's windows were its webpages, with transactions made through the internet. Books were stored in huge storage facilities before being put into boxes and sent by mail. In November 2000, Amazon had 7,500 employees, a catalog of 28 million items, 23 million clients worldwide and four subsidiaries in the United Kingdom (launched in August 1998), Germany (August 1998), France (August 2000), and Japan (November 2000). A fifth subsidiary opened in Canada in June 2002. A sixth subsidiary, named Joyo, opened in China in September 2004. In July 2005, for its 10th anniversary, Amazon had 41 million clients and 9,000 employees.
***
Jeff Bezos launched Amazon.com in July 1995 in Seattle, on the West Coast, after a market study which led him to conclude that books were the best products to sell on the internet.
The online bookstore started with 10 employees and a catalog of 3 million books, i.e. the catalog of books available for sale in the U.S. Unlike traditional bookstores, Amazon’s windows were its webpages, with transactions made through the internet. Books were stored in huge storage facilities before being put into boxes and sent by mail.
What exactly was the idea behind Amazon.com? In spring 1994, Jeff Bezos drew up a list of twenty products that could be sold online, ranging from clothing to gardening tools, and then researched the top five, which were CDs, videos, computer hardware, computer software and books.
As recalled by Jeff Bezos in 1997 in Amazon's press kit: "I used a whole bunch of criteria to evaluate the potential of each product, but among the main criteria was the size of the relative markets. Books, I found out, were an $82 billion market worldwide. The price point was another major criterion: I wanted a low-priced product. I reasoned that since this was the first purchase many people would make online, it had to be non-threatening in size. A third criterion was the range of choice: there were 3 million items in the book category and only a tenth of that in CDs, for example. This was important because the wider the choice, the more the organizing and selection capabilities of the computer could be put in good use."
In the wake of the Internet Bookstore in the United Kingdom, which was the largest online bookstore in Europe, Amazon.com launched its own Associates Program in spring 1997. There were 30,000 associates in spring 1998, and 60,000 associates in June 1998.
As stated in a press release dated 8 June 1998 to promote the program: "The Amazon.com Associates Program allows website owners to easily participate in hassle-free electronic commerce by recommending books on their site and referring visitors to Amazon.com. In return, participants earn referral fees of up to 15 percent of the sales they generate. Amazon.com handles the secure online ordering, customer service, and shipping and sends weekly email sales reports. Enrollment in the program is free, and participants can be up and running the same day. Associates range from large and small businesses to nonprofits, authors, publishers, personal home pages, and more. The popularity of the program is reflected in the range of additions to the Associates Community in the past few months: Adobe, InfoBeat, Kemper Funds, PR Newswire, Travelocity, Virtual Vineyards, and Xoom."
People could search Amazon’s online catalog by author, title, subject, date or ISBN. The website offered excerpts from books, book reviews, customer reviews, and author interviews. People could "leaf" through extracts and reviews, order some books online, and pay with their credit card. Books arrived within a week at their doorstep. As an online retailer, Amazon could offer lower prices than local bookstores, a larger selection, and a wealth of product information. Customers could subscribe to a mailing list to get reviews of new books by their favorite authors, or new books in their favorite topics, with 44 topics to choose from. In 1998, Amazon was also selling CDs, DVDs, audio books and computer games, with 3 million clients in 160 countries.
Amazon’s main competitor was the online bookstore of Barnes & Noble, a major bookseller with 481 stores nationwide in 1997, in 48 states out of 50, as well as 520 B. Dalton stores in shopping malls. Barnes & Noble first launched its America Online (AOL) website in March 1997, as the exclusive bookseller for the 12 million AOL customers, before launching its own website barnesandnoble.com in May 1997 in partnership with Bertelsmann. (Barnes & Noble bought back Bertelsmann’s 36.8% share for 164 million dollars in July 2003.)
Barnes & Noble’s site offered significant discounts: 30% off all in-stock hardcovers, 20% off all in-stock paperbacks, 40% off select titles, and up to 90% off bargain books. Its Affiliate Network spread quickly, with 12,000 affiliate websites in May 1998, including CNN Interactive, Lycos and ZDNet. One year later, Barnes & Noble.com launched a revamped website with a better design, Express Lane one-click ordering, improved book search capabilities, and a new software "superstore". A fierce price war began with Amazon for the best book discounts, and for a while Amazon.com came to be known as Amazon.toast. With a two-year head start, Amazon stayed ahead of the competition.
Amazon launched its eBookStore in November 2000, three months after Barnes & Noble, after partnering in August 2000 with Microsoft to sell ebooks for the Microsoft Reader, and with Adobe to offer ebooks for the Acrobat Reader and the Glassbook Reader — Adobe had just bought Glassbook, its reader and its digital bookstore. In April 2001, Amazon.com partnered again with Adobe to include 2,000 copyrighted books for the Acrobat eBook Reader, mainly titles from major publishers, travel guides and children’s books.
In November 2000, Amazon had 7,500 employees, a catalog of 28 million items, 23 million clients worldwide and four subsidiaries in the United Kingdom (launched in August 1998), Germany (August 1998), France (August 2000) and Japan (November 2000). A fifth subsidiary opened in Canada in June 2002, and a sixth subsidiary, named Joyo, opened in China in September 2004. In July 2005, for its 10th anniversary, Amazon had 9,000 employees and 41 million clients.
1996 > THE INTERNET ARCHIVE, FOR FUTURE GENERATIONS
[Summary] Founded in April 1996 by Brewster Kahle in San Francisco, California, the Internet Archive wanted to offer permanent access to the web “through the ages” for present and future generations. In October 2001, with 30 billion stored webpages, the Internet Archive launched the Wayback Machine, for internet users throughout the world to be able to surf the archive of a given website by date. In 2004, there were 300 terabytes of data, with a growth of 12 terabytes per month. There were 65 billion webpages (from 50 million websites) in 2006, 85 billion webpages in 2008, and 150 billion webpages in March 2010. The Internet Archive has also defined itself as "a nonprofit digital library dedicated to providing universal access to human knowledge", building up an online library of text, audio, software, image and video content. In October 2005, it launched the Open Content Alliance (OCA) with a number of partner organizations to build a universal digital library of multilingual digitized text and multimedia content.
***
Founded in April 1996 by Brewster Kahle, the Internet Archive wanted to offer permanent access to the web “through the ages” for present and future generations.
As explained on the website at the time: throughout history, societies have sought to preserve their culture and heritage for present and future generations, and libraries were created to preserve the paper trail of that culture and legacy, and to facilitate access to it for the general public and researchers. It therefore seemed essential to extend their mission to new technology. Paradoxically, this was done poorly in the early 20th century: many movies were recycled (and thus lost forever) to retrieve the silver layer, and many radio and TV programs were not saved. It was important not to repeat the same mistakes with the internet, especially the web, a new medium whose extent was still unknown in 1996. This is the raison d’être of the Internet Archive, which has defined itself as "a nonprofit digital library dedicated to providing universal access to human knowledge."
The whole web was stored every two months or so on the servers of the Internet Archive in San Francisco, California, for researchers, historians and scholars to be able to access it.
In October 2001, with 30 billion stored webpages, the Internet Archive launched the Wayback Machine, for internet users throughout the world to be able to surf the archive of a given website by date.
In 2004, there were 300 terabytes of data, with a growth of 12 terabytes per month. There were 65 billion pages (from 50 million websites) in 2006, 85 billion pages in 2008, and 150 billion pages in March 2010.
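As an aside for readers who want to look up archived pages programmatically: the Wayback Machine later added a public "availability" endpoint that returns JSON. The Python sketch below assumes that endpoint and its documented response layout; it illustrates the surf-by-date idea and was not part of the 2001 service.

    # A minimal sketch: find the archived snapshot of a website closest
    # to a given date, via the Wayback Machine availability endpoint.
    import json
    import urllib.request

    url = ("https://archive.org/wayback/available"
           "?url=example.com&timestamp=20010601")
    with urllib.request.urlopen(url) as response:
        data = json.load(response)

    closest = data.get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        print("snapshot from", closest["timestamp"], "at", closest["url"])
    else:
        print("no archived snapshot found near that date")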
In late 1999, the Internet Archive also became an online library of text, audio, software, image and video content, for example some books of the Million Book Project (10,520 books in April 2005), films for the period 1903-1973, live concerts, software, sites about September 11, sites about elections, and sites about the web pioneers, with all collections freely available on the web.
As a side remark, the Million Book Project, also called the Universal Library or Universal Digital Library (UDL), was launched in January 2000 by Carnegie Mellon University (Pennsylvania) with the aim of digitizing one million books in a number of languages, in partnership with libraries in India and China. The project was completed in 2007, with one million books available on the university website, as image files in DjVu and TIFF formats, and three mirror sites (India, Northern China, Southern China).
In October 2005, the Internet Archive launched the Open Content Alliance (OCA) as a group of cultural, technology, nonprofit, and governmental organizations, with the aim of building a worldwide public permanent archive of multilingual digitized text and multimedia content. The OCA started digitizing public domain books around the world, and including them in the collection of the Internet Archive.
1996 > LIBRARIES LAUNCHED WEBSITES
[Summary] The Helsinki City Library in Finland was the first library to launch a website, which went live in February 1994. Two years later, more and more libraries started their own websites as a virtual window for their patrons and beyond. Patrons could check opening hours, browse the online catalog, and surf a broad selection of websites on various topics. Libraries developed digital libraries alongside their standard collections, so that anyone could access their specialized, old, local and regional collections, including images and sound. Librarians could finally fulfill two goals that used to be in contradiction: preservation (on shelves) and communication (on the internet). Debates were fierce about the respective merits of the print book and the digital book.
***
In the mid-1990s, libraries started their own websites as a virtual window for their patrons and beyond, with an online catalog and a digital library.
In his book “Books in My Life”, published by the Library of Congress in 1985, Robert Downs, a librarian, wrote: "My lifelong love affair with books and reading continues unaffected by automation, computers, and all other forms of the twentieth-century gadgetry."
Automation and computers were followed by the internet (1974) and the web (1990), which further eased the work of librarians.
The Helsinki City Library in Finland was the first library to launch a website, which went live in February 1994. Other libraries started their own websites as a virtual window for their patrons and beyond. Patrons could check opening hours, browse the online catalog, and surf a broad selection of websites on various topics.
Libraries also developed digital libraries alongside their standard collections, so that anyone could access their specialized, old, local and regional collections, including images and sound. Librarians could finally fulfill two goals that used to be in contradiction: preservation (on shelves) and communication (on the internet). People could now leaf through digital facsimiles, and access the original books only if necessary.
# At the British Library
In “Information Systems Strategy”, a document posted on the British Library’s website in 1997, Brian Lang, chief executive of the library, stated: "We do not envisage an exclusively digital library. We are aware that some people feel that digital materials will predominate in libraries of the future. Others anticipate that the impact will be slight. In the context of the British Library, printed books, manuscripts, maps, music, sound recordings and all the other existing materials in the collection will always retain their central importance, and we are committed to continuing to provide, and to improve, access to these in our reading rooms. The importance of digital materials will, however, increase. We recognize that network infrastructure is at present most strongly developed in the higher education sector, but there are signs that similar facilities will also be available elsewhere, particularly in the industrial and commercial sector, and for public libraries. Our vision of network access encompasses all these."
An extensive Digital Library Program was expected to begin in 1999. As explained by Brian Lang: "The development of the Digital Library will enable the British Library to embrace the digital information age. Digital technology will be used to preserve and extend the Library's unparalleled collection. Access to the collection will become boundless with users from all over the world, at any time, having simple, fast access to digitized materials using computer networks, particularly the internet."
# Print book vs. digital book
Debates were fierce about the respective merits of the print book and the digital book.
Roberto Hernández Montoya, an editor of Venezuela Analítica, an electronic magazine offering a small digital library, wrote in September 1998: "The printed text can't be replaced, at least not for the foreseeable future. The paper book is a tremendous 'machine'. We can't leaf through an electronic book in the same way as a paper book. On the other hand, electronic use allows us to locate text chains more quickly. In a certain way we can more intensively read the electronic text, even with the inconvenience of reading on the screen. The electronic book is less expensive and can be more easily distributed worldwide (if we don't count the cost of the computer and the internet connection)."
In the February 1996 issue of the Swiss computer magazine "Informatique-Informations", Pierre Perroud, founder of the digital library Athena, explained that "electronic texts represent an encouragement to reading and a convivial participation in the dissemination of culture", particularly for textual research and text study. These texts are "a good complement to the print book, which remains irreplaceable for 'true' reading. (…) The book remains a mysteriously holy companion with profound symbolism for us: we grip it in our hands, we hold it against us, we look at it with admiration; its small size comforts us and its content impresses us; its fragility contains a density we are fascinated by; like man it fears water and fire, but it has the power to shelter man's thoughts from time."
1996 > TOWARDS A DIGITAL KNOWLEDGE
[Summary] The information available in books stays “static”, whereas the information available on the internet is regularly updated, hence the need to rethink our relationship to knowledge. In 1996, more and more computers connected to the internet were available in schools and at home. Teachers began exploring new ways of teaching. Going from print culture to digital culture was changing the way both teachers and students were seeing teaching and learning. Print culture provided “stable” information whereas digital culture provided "moving", regularly updated information. During a conference organized by the International Federation of Information Processing (IFIP) in September 1996, Dale Spender, a professor and researcher, gave a lecture on "Creativity and the Computer Education Industry", with insightful comments on forthcoming trends.
***
The information available in books stays “static”, whereas the information available on the internet is regularly updated, hence the need to rethink our relationship to knowledge.
In 1996, more and more computers connected to the internet were available in schools and at home. Teachers began exploring new ways of teaching. Going from print culture to digital culture was changing the way both teachers and students were seeing teaching and learning. Print culture provided “stable” information whereas digital culture provided "moving" information.
During a conference organized by the International Federation of Information Processing (IFIP) in September 1996, Dale Spender, a professor and researcher, gave a lecture on "Creativity and the Computer Education Industry", with insightful comments on forthcoming trends. Here are some excerpts:
"Throughout print culture, information has been contained in books — and this has helped to shape our notion of information. For the information in books stays the same — it endures. And this has encouraged us to think of information as stable — as a body of knowledge which can be acquired, taught, passed on, memorized, and tested of course. The very nature of print itself has fostered a sense of truth; truth too is something which stays the same, which endures. And there is no doubt that this stability, this orderliness, has been a major contributor to the huge successes of the industrial age and the scientific revolution. (…)
But the digital revolution changes all this. Suddenly it is not the oldest information — the longest lasting information that is the most reliable and useful. It is the very latest information that we now put the most faith in — and which we will pay the most for. (…)
Education will be about participating in the production of the latest information. This is why education will have to be ongoing throughout life and work. Every day there will be something new that we will all have to learn. To keep up. To be in the know. To do our jobs. To be members of the digital community. And far from teaching a body of knowledge that will last for life, the new generation of information professionals will be required to search out, add to, critique, 'play with', and daily update information, and to make available the constant changes that are occurring."
Robert Beard, a professor at Bucknell University, in Lewisburg, Pennsylvania, wrote in September 1998: "As a language teacher, the web represents a plethora of new resources produced by the target culture, new tools for delivering lessons (interactive Java and Shockwave exercises) and testing, which are available to students any time they have the time or interest — 24 hours a day, 7 days a week. It is also an almost limitless publication outlet for my colleagues and I, not to mention my institution. (…) Ultimately all course materials, including lecture notes, exercises, moot and credit testing, grading, and interactive exercises will be far more effective in conveying concepts that we have not even dreamed of yet.”
Russon Wooldridge, a professor at the Department of French Studies, University of Toronto, Canada, wrote in February 2001: "My research, conducted once in an ivory tower, is now almost exclusively done through local or remote collaborations. (…) All my teaching makes the most of internet resources (web and email): the two common places for a course are the classroom and the website of the course, where I put all course materials. I have published all my research data of the last 20 years on the web (re-edition of books, articles, texts of old dictionaries as interactive databases, treaties from the 16th century, etc.). I publish proceedings of symposiums, I publish a journal, I collaborate with French colleagues by publishing online in Toronto what they can't publish online at home. In May 2000, I organized an international symposium in Toronto about French studies enhanced by new technologies. (…) I realize that without the internet I wouldn't have as many activities, or at least they would be very different from the ones I have today. So I don't see the future without them."
The Massachusetts Institute of Technology (MIT) officially launched its OpenCourseWare (OCW) in September 2003 to offer its course materials for free on the web, as a way to promote open dissemination of knowledge. In September 2002, a pilot version had gone online with 32 course materials. 500 course materials were available in March 2004, 1,400 in May 2006, and all 1,800 in November 2007, regularly updated thereafter, with some of them translated into Spanish, Portuguese and Chinese with the help of other organizations. MIT also launched the OpenCourseWare Consortium (OCW Consortium) in November 2005, as a common project for educational institutions willing to offer free online course materials, with the course materials of 100 universities worldwide available one year later.
1996 > THE @FOLIO PROJECT, A MOBILE DEVICE FOR TEXTS
[Summary] The @folio project is a mobile device for texts designed as early as October 1996 by Pierre Schweitzer, an architect-designer living in Strasbourg, France. It was meant to download and read any text and/or illustrations from the web or a hard disk, in any format, with no proprietary format and no DRM. The technology of @folio was novel and simple, inspired by fax machines and tab file folders. The flash memory was "printed" like Gutenberg printed his books. The facsimile mode was readable as is for any content, from sheet music to mathematical or chemical formulas, with no conversion necessary, whether handwritten text, calligraphy, freehand drawing or non-alphabetical writing. An international patent was filed in April 2001. The French start-up iCodex was created in July 2002 to develop and promote the @folio project.
***
The @folio project is a mobile device for texts designed as early as October 1996 by Pierre Schweitzer, an architect-designer living in Strasbourg, France.
It was meant to download and read any text and/or illustrations from the web or a hard disk, in any format, with no proprietary format and no DRM.
The technology of @folio was novel and simple, inspired by fax machines and tab file folders. The flash memory was "printed" like Gutenberg printed his books. The facsimile mode was readable as is for any content, from sheet music to mathematical or chemical formulas, with no conversion necessary, whether handwritten text, calligraphy, freehand drawing or non-alphabetical writing. All this was difficult if not impossible on a computer or ebook reader of the late 1990s and early 2000s.
The screen of the lightweight prototype took up 80% of the total surface and had low power consumption. It was surrounded by a translucent and flexible frame that folded to protect the screen when not in use. @folio could be sold for US $100 for the basic standard version, with various combinations of screen sizes and flash memory to fit different needs.
Intuitive navigation allowed readers to "turn" pages as easily as in a print book, to sort and search documents as easily as with a tab file folder, and to choose their own preferences for margins, paragraphs, font selection and character size. There were no buttons, only a round trackball adorned with a black and white world map. The trackball could be replaced with a long and narrow tactile pad on either side of the frame.
The flash memory allowed the downloading of thousands of hypertext pages, with links either set before the download or created during it. @folio provided instant automatic reformatting of documents to fit the size of the screen. For "text" files, no software was necessary. For "image" files, Pierre Schweitzer designed a reformatting software called Mot@Mot (French for "Word@Word"), which could be used on any other device. This software received much attention from the French National Library (BNF: Bibliothèque Nationale de France), especially for its old books (published before 1812) and illustrated manuscripts.
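The instant reformatting of "text" files boils down to reflowing a stream of words to the width of the screen. As a simple illustration of the principle (not @folio’s actual software), Python’s standard "textwrap" module does exactly that:

    # Illustration of the principle only, not @folio's actual code:
    # reflow a paragraph so that each line fits a given screen width.
    import textwrap

    def reflow(text, screen_width):
        # Collapse the original line breaks, then rewrap to the device width.
        return textwrap.fill(" ".join(text.split()), width=screen_width)

    sample = ("The flash memory allowed the downloading of thousands "
              "of hypertext pages, reformatted to fit the screen.")
    print(reflow(sample, screen_width=30))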
An international patent was filed in April 2001. The French startup iCodex was created in July 2002 to develop and promote the @folio project.
To this day, @folio has stayed a prototype, because of lack of funding and because of the language barrier, with only two articles in English in 2007 — one in Project Gutenberg News and one in TeleRead about Pierre Schweitzer’s dream — compared with dozens of articles in French.
Even the best researchers can’t do much without support, funding, or an interpreter (from French to English) to help them get through the language barrier.
1997 > MULTIMEDIA CONVERGENCE
[Summary] Previously distinct information-based industries, such as printing, publishing, graphic design, media, sound recording and film making, were converging into one industry, with information as a common product. This trend was named "multimedia convergence", and it caused a massive loss of jobs, a serious enough issue to be tackled by the ILO (International Labor Organization). The first ILO Symposium on Multimedia Convergence was held in January 1997 at the ILO headquarters in Geneva, Switzerland, with employers, unionists and government representatives from all over the world. Some participants, mostly employers, demonstrated that the information society was generating or would generate jobs. Other participants, mostly unionists, demonstrated that there was a rise in unemployment worldwide, which should be addressed right away through investment, innovation, vocational training, computer literacy, retraining and fair labor rights, including for teleworkers.
***
Previously distinct information-based industries, such as printing, publishing, graphic design, media, sound recording and film making, were converging into one industry, with information as a common product.
This trend was named "multimedia convergence", and it caused a massive loss of jobs, a serious enough issue to be tackled by the International Labor Organization (ILO).
# A symposium
The first ILO Symposium on Multimedia Convergence was held in January 1997 at the ILO headquarters in Geneva, Switzerland. Employers, unionists and government representatives from all over the world came to discuss the Information Society, the impact of the convergence process on employment and work, and labor relations in the information age. The purpose of these debates was "to stimulate reflection on the policies and approaches most apt to prepare our societies and especially our workforces for the turbulent transition towards an information economy."
As stated in the introduction to the symposium’s proceedings: "Today all forms of information — whether based in text, sound or images — can be converted into bits and bytes for handling by computer. Digitalization has made it possible to create, record, manipulate, combine, store, retrieve and transmit information and information-based products in ways which magnetic tape, celluloid and paper did not permit. Digitalization thus allows music, cinema and the written word to be recorded and transformed through similar processes and without distinct material supports. Previously dissimilar industries, such as publishing and sound recording, now both produce CD-ROMs rather than simply books and records."
Multimedia convergence was “creating new configurations among an ever-widening range of industries. The digitalization of information processing and delivery is transforming the way financial systems operate, the way enterprises exchange information internally and externally, and the way individuals work in an increasingly electronic environment."
In the book industry, traditional printing was first disrupted by new photocomposition machines, with lower costs. Text and image processing began to be handed over to desktop publishing and graphic art studios. Printing costs kept decreasing with photocopiers, color photocopiers and digital printing. Digitization sped up the editorial process, which used to be sequential, by allowing the copy editor, the image editor and the layout staff to work on the same book at the same time.
In the press industry, journalists and editors could now type their articles online. These articles went directly from text to layout, without having to be keyed in again by the production staff.
# Some contributions
One of the participants of the symposium, Peter Leisink, an associate professor of labor studies at Utrecht University, Netherlands, explained: "A survey of the United Kingdom book publishing industry showed that proofreaders and editors have been externalized and now work as home-based teleworkers. The vast majority of them had entered self-employment, not as a first-choice option, but as a result of industry mergers, relocations and redundancies. These people should actually be regarded as casualized workers, rather than as self-employed, since they have little autonomy and tend to depend on only one publishing house for their work."
Another participant, Michel Muller, secretary-general of the French Federation of Book, Paper and Communication Industry (FILPAC: Fédération des Industries du Livre, du Papier et de la Communication), stated that, in France, jobs in this industry fell from 110,000 to 90,000 in ten years, from 1987 to 1996, with expensive social plans to re-train and re-employ the 20,000 people who lost their jobs.
He explained that, "if the technological developments really created new jobs, as had been suggested, then it might have been better to invest the money in reliable studies about what jobs were being created and which ones were being lost, rather than in social plans which often created artificial jobs. These studies should highlight the new skills and qualifications in demand as the technological convergence process broke down the barriers between the printing industry, journalism and other vehicles of information. Another problem caused by convergence was the trend towards ownership concentration. A few big groups controlled not only the bulk of the print media, but a wide range of other media, and thus posed a threat to pluralism in expression. Various tax advantages enjoyed by the press today should be re-examined and adapted to the new realities facing the press and multimedia enterprises. Managing all the social and societal issues raised by new technologies required widespread agreement and consensus. Collective agreements were vital, since neither individual negotiations nor the market alone could sufficiently settle these matters."
A third participant, Walter Durling, director of AT&T Global Information Solutions in the United States, took a more theoretical view of the matter: "Technology would not change the core of human relations. More sophisticated means of communicating, new mechanisms for negotiating, and new types of conflicts would all arise, but the relationships between workers and employers themselves would continue to be the same. When film was invented, people had been afraid that it could bring theatre to an end. That has not happened. When television was developed, people had feared that it would do away with cinemas, but it had not. One should not be afraid of the future. Fear of the future should not lead us to stifle creativity with regulations. Creativity was needed to generate new employment. The spirit of enterprise had to be reinforced with the new technology in order to create jobs for those who had been displaced. Problems should not be anticipated, but tackled when they arose." In short, humanity shouldn't fear technology.
# Job creation vs. lay-off
In fact, employees were not so much afraid of technology as they were afraid of losing their jobs. In 1996, unemployment was already significant in every field, which had not been the case when film and television were invented.
What would be the balance between job creation and lay-off in the near future? Unions were struggling worldwide to promote the creation of jobs through investment, innovation, vocational training, computer literacy, retraining for new jobs in digital technology, fair conditions for labor contracts and collective agreements, defense of copyright for the re-use of articles from the print media to the web, protection of workers in the artistic field, and defense of teleworkers as workers having full rights.
Despite unions' efforts, would the situation become as tragic as suggested in a note of the symposium's proceedings? "Some fear a future in which individuals will be forced to struggle for survival in an electronic jungle. And the survival mechanisms which have been developed in recent decades, such as relatively stable employment relations, collective agreements, employee representation, employer-provided job training, and jointly funded social security schemes, may be sorely tested in a world where work crosses borders at the speed of light."
1997 > A PORTAL FOR EUROPEAN NATIONAL LIBRARIES
[Summary] Gabriel — an acronym for "Gateway and Bridge to Europe's National Libraries" — was launched as a common portal giving access to the internet services of participating libraries. The Gabriel project was conceived during the 1994 CENL (Conference of European National Librarians) meeting in Oslo, Norway, as a common electronic board with updates about ongoing internet projects. Another meeting took place in March 1995 with representatives from the national libraries in the Netherlands, United Kingdom and Finland, who launched a pilot project and were then joined by the national libraries in Germany, France and Poland. A first Gabriel website was launched in September 1995. During the 1996 CENL meeting in Lisbon, Portugal, Gabriel became an official CENL website, with a new trilingual (English, French, German) portal launched in January 1997.
***
Gabriel — an acronym for "Gateway and Bridge to Europe's National Libraries" — was launched in January 1997 as a common portal giving access to the internet services of the participating libraries.
As stated on its website: "Gabriel also recalls Gabriel Naudé, whose 'Advis pour dresser une bibliothèque' (Paris, 1627) is one of the earliest theoretical works about libraries in any European language and provides a blueprint for the great modern research library. The name Gabriel is common to many European languages and is derived from the Old Testament, where Gabriel appears as one of the archangels or heavenly messengers. He also appears in a similar role in the New Testament and the Qur'an."
In 1998, Gabriel offered links to the internet services of 38 participating national libraries (Albania, Austria, Belgium, Bulgaria, Croatia, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Italy, Latvia, Liechtenstein, Lithuania, Luxembourg, Macedonia, Malta, Netherlands, Norway, Poland, Portugal, Romania, Russia, San Marino, Slovakia, Slovenia, Spain, Sweden, Switzerland, Turkey, United Kingdom, Vatican City). These links led to OPACs (Online Public Access Catalogs), national bibliographies, national union catalogs, indexes for periodicals, web servers and gophers, with a section for common European projects.
How did Gabriel begin? During the 1994 CENL annual meeting in Oslo, Norway, it was suggested that national libraries should set up a common electronic board with updates about their ongoing internet projects.
Representatives from the national libraries in the Netherlands (Koninklijke Bibliotheek), United Kingdom (British Library) and Finland (Helsinki University Library) met in March 1995 in The Hague, Netherlands, to launch the pilot Gabriel project. They were then joined by the national libraries in Germany (Deutsche Bibliothek), France (Bibliothèque Nationale de France) and Poland (Biblioteka Narodowa). Gabriel would describe their services and collections, while seeking to attract other national libraries into the project.
The original Gabriel website was launched in September 1995. It was maintained by the British Library Network Services and mirrored on the servers of the national libraries in the Netherlands and Finland. In November 1995, other national libraries were invited to submit entries describing their services and collections, after they launched their own websites and online catalogs. The number of participating libraries expanded.
During the 1996 CENL annual meeting in Lisbon, Portugal, it was decided that Gabriel would become an official CENL website in January 1997.
The new trilingual (English, French, German) Gabriel portal was maintained by the national library in the Netherlands (Koninklijke Bibliotheek), and mirrored on the servers of four other national libraries, in the United Kingdom, Finland, Germany and Slovenia.
What about public libraries? According to “Internet and the Library Sphere”, a document available on the website of the European Commission, 1,000 public libraries from 26 European countries had their own websites in December 1998. The websites ranged from one webpage with a postal address and opening hours to a full website with access to the library's OPAC.
The leading countries were Finland (247 libraries), Sweden (132 libraries), United Kingdom (112 libraries), Denmark (107 libraries), Germany (102 libraries), Netherlands (72 libraries), Lithuania (51 libraries), Spain (56 libraries) and Norway (45 libraries). Russia had a common website for 26 public reference libraries. Newcomers were the Czech Republic (29 libraries) and Portugal (3 libraries).
As for Gabriel’s fate, the portal merged in summer 2005 with the European Library's website (created by CENL in January 2004) to offer a common portal for the 43 European national libraries. Europeana, the European digital library, was launched three years later, in November 2008, with two million documents. Europeana offered 6 million documents in March 2010, and 10 million documents on a revamped website in September 2010.
1997 > E INK, AN ELECTRONIC INK TECHNOLOGY
[Summary] In April 1997, researchers at the MIT Media Lab (MIT: Massachusetts Institute of Technology) founded the company E Ink to develop an electronic ink technology. Briefly explained, the technology is the following: caught between two sheets of flexible plastic, millions of microcapsules, each of them containing black and white particles, are suspended in a clear fluid. A positive or negative electric field brings the desired group of particles to the surface, to display, modify or delete data. The first screen using this technology was available as a prototype in July 2002, and marketed in 2004. Other screens followed for various ebook readers (Librié, Sony Reader, Cybook, Kindle, Nook, etc.), as well as prototypes of flexible displays announcing the forthcoming electronic paper.
***
In April 1997, researchers at the MIT Media Lab (MIT: Massachusetts Institute of Technology) founded the company E Ink to develop an electronic ink technology.
The first screen using this technology was available as a prototype in July 2002, and marketed in 2004. Other screens followed for various ebook readers (Librié, Sony Reader, Cybook, Kindle, Nook, etc.), as well as prototypes of flexible displays announcing the forthcoming electronic paper.
As explained on the company's website: "Electronic ink is a proprietary material that is processed into a film for integration into electronic displays. Although revolutionary in concept, electronic ink is a straightforward fusion of chemistry, physics and electronics to create this new material. The principal components of electronic ink are millions of tiny microcapsules, about the diameter of a human hair. In one incarnation, each microcapsule contains positively charged white particles and negatively charged black particles suspended in a clear fluid. When a negative electric field is applied, the white particles move to the top of the microcapsule where they become visible to the user. This makes the surface appear white at that spot. At the same time, an opposite electric field pulls the black particles to the bottom of the microcapsules where they are hidden. By reversing this process, the black particles appear at the top of the capsule, which now makes the surface appear dark at that spot. To form an E Ink electronic display, the ink is printed onto a sheet of plastic film that is laminated to a layer of circuitry. The circuitry forms a pattern of pixels that can then be controlled by a display driver. These microcapsules are suspended in a liquid 'carrier medium' allowing them to be printed using existing screen printing processes onto virtually any surface, including glass, plastic, fabric and even paper. Ultimately electronic ink will permit most any surface to become a display, bringing information out of the confines of traditional devices and into the world around us."
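To make the mechanism concrete, here is a toy model in Python, an illustration of the principle described above rather than of E Ink’s actual technology: each pixel is one microcapsule, and the polarity of the field applied to it decides whether the white or the black particles come to the surface.

    # Toy model of the microcapsule principle: a negative field surfaces
    # the white particles, a positive field surfaces the black ones.
    def render(fields):
        # fields: a grid of -1 (negative) or +1 (positive) polarities.
        return [["white" if f < 0 else "black" for f in row] for row in fields]

    # Drive a tiny 2 x 3 "display".
    for row in render([[-1, +1, -1],
                       [+1, -1, +1]]):
        print(" ".join(row))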
LCD screens of ebook readers were replaced by E Ink screens. Launched in April 2004 by Sony in Japan, the Librié was the first ebook reader with a 6-inch E Ink screen. Launched in October 2006 in the U.S., the Sony Reader had an E Ink screen that gave “an excellent reading experience very close to that of real paper, making it very easy going on the eyes" (Mike Cook, editor of epubBooks.com). The Sony Reader was then available in Canada, the United Kingdom, Germany and France, with various models. The Cybook Gen3 launched by Bookeen in July 2007, the Kindle launched by Amazon in November 2007, and the Nook launched by Barnes & Noble in November 2009 also had E Ink screens.
Another display technology was the gyricon, developed from 1997 onwards by PARC (Palo Alto Research Center), the Xerox center in Silicon Valley, California. In December 2000, some researchers at PARC founded the company Gyricon Media to market the SmartPaper, an electronic paper based on the gyricon technology. Briefly explained, the technology is the following: between two sheets of flexible plastic, millions of micro-cells contain two-tone (black and white) beads suspended in a clear liquid. Each bead carries an electric charge, and an external electrical pulse makes the beads rotate and change color, to display, modify or delete data. In 2004, Gyricon Media began marketing SmartPaper for commercial advertising, for example small posters running on batteries. The company ended its activities in 2005, with R&D activities going on at Xerox.
Another project was developed by the company Plastic Logic, this time using both proprietary plastic electronics and the E Ink technology. As explained on the company’s website in 2009: "Technology for plastic electronics on thin and flexible plastic substrates was developed at Cambridge University’s renowned Cavendish Laboratory in the 1990s. In 2000, Plastic Logic was spun out of Cavendish Laboratory to develop a broad range of products using the plastic electronics technology."
1998 > THE ELECTRONIC BEOWULF PROJECT
[Summary] Some digitized versions of treasures from the British Library were freely available online in the late 1990s. One of the first digitized treasures was Beowulf, the earliest known narrative poem in English, and one of the most famous works of Anglo-Saxon poetry. The British Library holds the only known manuscript of Beowulf, dated circa 1000. Brian Lang, chief executive of the library, explained on the website: "The Beowulf manuscript is a unique treasure and imposes on the Library a responsibility to scholars throughout the world. Digital photography offered for the first time the possibility of recording text concealed by early repairs, and a less expensive and safer way of recording readings under special light conditions. (…) This work has not only advanced scholarship; it has also captured the imagination of a wider public, engaging people (through press reports and the availability over computer networks of selected images and text) in the appreciation of one of the primary artefacts of our shared cultural heritage."
***
The British Library began offering digitized versions of its treasures, for example Beowulf, the earliest known narrative poem in English and one of the most famous works of Anglo-Saxon poetry.
The British Library holds the only known manuscript of Beowulf, dated circa 1000. The poem itself is much older than the manuscript — some historians believe it might have been written circa 750. The manuscript was badly damaged by fire in 1731. 18th-century transcripts recorded hundreds of words and characters that were then still visible along the charred edges but subsequently crumbled away over the years. To halt this process, each leaf was mounted on a paper frame in 1845.
As explained on the website of the British Library, scholarly discussions on the date of creation and provenance of the poem continued around the world, and researchers regularly required access to the manuscript. Taking Beowulf out of its display case for study not only raised conservation issues, it also made it unavailable for the many visitors who were coming to the British Library expecting to see this literary treasure on display. Digitization of the manuscript offered a solution to these problems, as well as providing new opportunities for researchers and readers worldwide.
The Electronic Beowulf Project was launched as a database of digital images of the Beowulf manuscript, as well as related manuscripts and printed texts. In 1998, the database included the fiber-optic readings of hidden characters and ultra-violet readings of erased text in the manuscript; the full electronic facsimiles of the 18th-century transcripts of the manuscript; and selections from the main 19th-century collations, editions and translations.
Major additions to the database were planned for the following years, such as images of contemporary manuscripts, links to the Toronto Dictionary of Old English Project, and links to the comprehensive Anglo-Saxon bibliographies of the Old English Newsletter.
The database project was developed in partnership with two leading experts in the United States, Kevin Kiernan, from the University of Kentucky, and Paul Szarmach, from the Medieval Institute at Western Michigan University. Kevin Kiernan edited the electronic archive and supervised the making of a CD-ROM with the main electronic images.
Brian Lang, chief executive of the British Library, explained on its website: "The Beowulf manuscript is a unique treasure and imposes on the Library a responsibility to scholars throughout the world. Digital photography offered for the first time the possibility of recording text concealed by early repairs, and a less expensive and safer way of recording readings under special light conditions. It also offers the prospect of using image enhancement technology to settle doubtful readings in the text. Network technology has facilitated direct collaboration with American scholars and makes it possible for scholars around the world to share in these discoveries. Curatorial and computing staff learned a great deal which will inform any future programmes of digitization and network service provision the Library may undertake, and our publishing department is considering the publication of an electronic scholarly edition of Beowulf. This work has not only advanced scholarship; it has also captured the imagination of a wider public, engaging people (through press reports and the availability over computer networks of selected images and text) in the appreciation of one of the primary artefacts of our shared cultural heritage."
# Other treasures of the British Library
Other digitized treasures of the British Library were available online as well, for example Magna Carta, the first English constitutional text, signed in 1215, with the Great Seal of King John; the Lindisfarne Gospels, dated 698; the Diamond Sutra, dated 868, sometimes referred to as the world's earliest print book; the Sforza Hours, a Renaissance treasure dated 1490-1520; the Codex Arundel, with notes by Leonardo Da Vinci from 1478 to 1518; and the Tyndale New Testament, the first English translation of the New Testament, printed in 1526 by Peter Schoeffer in Worms, Germany.
In November 2000, the British Library released a digitized version of the original Gutenberg Bible on its website. Gutenberg printed his Bible in 1454 in Mainz, Germany, perhaps printing 180 copies, with 48 copies still extant in 2000, including two full copies at the British Library. The two copies, slightly different from each other, were digitized in March 2000 by Japanese experts from Keio University in Tokyo and NTT (Nippon Telegraph and Telephone). The images were then processed, and a digitized version went online a few months later, for the world to enjoy.
# German rare prints
The Bielefeld University Library (Bibliothek der Universität Bielefeld) in Germany offered online versions of German rare prints. Michael Behrens, in charge of the digital library project, wrote in September 1998: "We started digitizing rare prints from our own library, and some rare prints which were sent in via library loan, in November 1996. (…) In that first phase of our attempts at digitization, starting November 1996 and ending June 1997, 38 rare prints were scanned as image files and made available via the web. (…) The next step, which is just being completed, is the digitization of the Berlinische Monatsschrift, a German periodical from the Enlightenment, comprising 58 volumes, and 2,574 articles on 30,626 pages. A somewhat bigger digitization project of German periodicals from the 18th and early 19th century is planned. The size will be about 1,000,000 pages. These periodicals will be not just from the holdings of this library, but the project would be coordinated here, and some of the technical work would be done here, also." (NEF Interview)
# The ARTFL Encyclopédie
The same year, the database of the first volume (1751) of the Encyclopédie by Diderot and d’Alembert went online as an experiment from ARTFL (American and French Research on the Treasury of the French Language), a joint project of the CNRS (Centre National de la Recherche Scientifique — National Scientific Research Center) in France and the University of Chicago in Illinois. This online experiment was a first step towards a full online version of the first edition (1751-1772) of the Encyclopédie, with 72,000 articles written by 140 contributors (Voltaire, Rousseau, Marmontel, d'Holbach, Turgot, and others), 17 volumes of text (18,000 pages and 21.7 million words) and 11 volumes of plates. Designed to collect and disseminate the entire knowledge of the time, the Encyclopédie reflected the intellectual and social currents of the Enlightenment, and helped disseminate the novel ideas that would inspire the French Revolution in 1789.
1998 > WEB-EXTENDED COMMERCIAL BOOKS
[Summary] Murray Suid is a writer of educational books and material living in Palo Alto, Silicon Valley, California. He has also written books for kids, multimedia scripts and screenplays. Murray was among the first authors to add a website to his books — an idea that many would soon adopt. He explained in September 1998: "If a book can be web-extended (living partly in cyberspace), then an author can easily update and correct it, whereas otherwise the author would have to wait a long time for the next edition, if indeed a next edition ever came out. (…) I do not know if I will publish books on the web — as opposed to publishing paper books. Probably that will happen when books become multimedia. (I currently am helping develop multimedia learning materials, and it is a form of teaching that I like a lot — blending text, movies, audio, graphics, and — when possible — interactivity)."
***
Murray Suid, a writer of educational books and material based in Palo Alto, California, was among the first authors to add a website to his books — an idea that many would soon adopt.
Murray has also written books for kids, multimedia scripts and screenplays. He explained in September 1998: "The internet has become my major research tool, largely — but not entirely — replacing the traditional library and even replacing person-to-person research. Now, instead of phoning people or interviewing them face to face, I do it via email. Because of speed, it has also enabled me to collaborate with people at a distance, particularly on screenplays. (I've worked with two producers in Germany.) Also, digital correspondence is so easy to store and organize, I find that I have easy access to information exchanged this way. Thus, emailing facilitates keeping track of ideas and materials. The internet has increased my correspondence dramatically. Like most people, I find that email works better than snail mail. My geographic range of correspondents has also increased — extending mainly to Europe. In the old days, I hardly ever did transatlantic penpalling. I also find that emailing is so easy, I am able to find more time to assist other writers with their work — a kind of a virtual writing group. This isn't merely altruistic. I gain a lot when I give feedback. But before the internet, doing so was more of an effort."
How about web-extended books? "If a book can be web-extended (living partly in cyberspace), then an author can easily update and correct it, whereas otherwise the author would have to wait a long time for the next edition, if indeed a next edition ever came out. (…) I do not know if I will publish books on the web — as opposed to publishing paper books. Probably that will happen when books become multimedia. (I currently am helping develop multimedia learning materials, and it is a form of teaching that I like a lot — blending text, movies, audio, graphics, and — when possible — interactivity)."
He added in August 1999: "In addition to 'web-extending' books, we are now web-extending our multimedia (CD-ROM) products — to update and enrich them."
He added in October 2000: "Our company — EDVantage Software — has become an internet company instead of a multimedia (CD-ROM) company. We deliver educational material online to students and teachers."
1998 > A MORE RESTRICTIVE COPYRIGHT LAW
[Summary] A major blow for digital libraries was the amendment to the 1976 U.S. Copyright Act signed on 27 October 1998, each new legislation being more restrictive than the previous one. As explained in July 1999 by Michael Hart, founder of Project Gutenberg: "Nothing will expire for another 20 years. We used to have to wait 75 years. Now it is 95 years. And it was 28 years (+ a possible 28-year extension, only on request) before that, and 14 years (+ a possible 14-year extension) before that. So, as you can see, this is a serious degrading of the public domain, as a matter of continuing policy." The copyright term went from an average of 30 years in 1909 to an average of 95 years in 1998, an extension of 65 years. Only a book published before 1923 could now be safely considered as belonging to the public domain in the U.S. The copyright legislation became more restrictive in the European Union too.
***
A major blow for digital libraries was the amendment to the 1976 U.S. Copyright Act signed on 27 October 1998, followed by more restrictive legislation in the European Union as well.
Each legislation was more restrictive than the previous one. As explained in July 1999 by Michael Hart, founder of Project Gutenberg: "Nothing will expire for another 20 years. We used to have to wait 75 years. Now it is 95 years. And it was 28 years (+ a possible 28-year extension, only on request) before that, and 14 years (+ a possible 14-year extension) before that. So, as you can see, this is a serious degrading of the public domain, as a matter of continuing policy. (…) No one has said more against copyright extensions than I have, but Hollywood and the big publishers have seen to it that our Congress won't even mention it in public. The kind of copyright debate going on is totally impractical. It is run by and for the 'Landed Gentry of the Information Age.' 'Information Age'? For whom?"
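The arithmetic behind Michael Hart's remark can be spelled out in a few lines. The sketch below is deliberately simplified, covering published U.S. works only and ignoring renewals and notice requirements:

    # Simplified sketch of the term arithmetic (published U.S. works only,
    # ignoring renewals and notice requirements).
    def expiry_year(publication_year, term):
        # First year the work is in the public domain.
        return publication_year + term + 1

    before = expiry_year(1923, 75)   # 1999 under the former 75-year rule
    after = expiry_year(1923, 95)    # 2019 under the 1998 extension
    print(after - before)            # the 20-year freeze Hart describes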
John Mark Ockerbloom, founder of The Online Books Page, wrote in August 1999: "I think it is important for people on the web to understand that copyright is a social contract that is designed for the public good — where the public includes both authors and readers. This means that authors should have the right to exclusive use of their creative works for limited times, as is expressed in current copyright law. But it also means that their readers have the right to copy and reuse the work at will once copyright expires. In the U.S. now, there are various efforts to take rights away from readers, by restricting fair use, lengthening copyright terms (even with some proposals to make them perpetual) and extending intellectual property to cover facts separate from creative works (such as found in the 'database copyright' proposals)."
The shrinking of the public domain also affected the European Union, where copyright terms went from "author's life + 50 years" to "author's life + 70 years", following pressure from content owners who successfully lobbied for a "harmonization" of national copyright laws as a response to the "globalization of the market".
To regulate the copyright of digital editions in the wake of the WIPO international treaties signed in 1996, the Digital Millennium Copyright Act (DMCA) was ratified in October 1998 in the United States, and the European Union Copyright Directive (EUCD) was ratified in May 2001 by the European Commission. Each country in the European Union was requested to draft and pass its own legislation within a given time frame. In France, the DADVSI law (Droit d'Auteur et Droits Voisins dans la Société de l'Information) passed in August 2006, to little enthusiasm from the general public.
1998 > THE FIRST EBOOK READERS
[Summary] How about a book-sized electronic device that could store many books at once? The first ebook readers were developed in Silicon Valley, California. The Rocket eBook was launched in 1998 in Palo Alto by NuvoMedia, whose investors were Barnes & Noble and Bertelsmann. Shortly afterwards, the SoftBook Reader was launched by SoftBook Press, whose investors were Random House and Simon & Schuster. These two ebook readers were the size of a (large and thick) book, with batteries and a black and white LCD screen. They could connect to the internet through a computer (for the Rocket eBook) or directly with a built-in modem (for the SoftBook Reader) to download books from the digital bookstores available on the companies’ websites. Other models followed in 1999, for example the EveryBook Reader, launched by EveryBook, and the Millennium eBook, launched by Librius. The Gemstar eBook was launched in the U.S. in November 2000. The Cybook (1st generation) was launched in Europe in January 2001.
***
How about a book-sized electronic device that could store many books at once? The first ebook readers were the Rocket eBook and the SoftBook Reader, launched in Silicon Valley in 1998.
These dedicated electronic readers were the size of a (large and thick) book, with a battery, a black and white LCD screen, and a storage capacity of ten books or so. They could connect to the internet through a computer (for the Rocket eBook) or directly with a built-in modem (for the SoftBook Reader).
They got much attention from book professionals and the general public, though few people actually bought one, because of the rocket-high price — several hundred dollars — and the small choice of books in the digital bookstores available on the companies’ websites. Publishers were just beginning to digitize their own books, still wondering how to market them, and worried about piracy.
# The Rocket eBook
The Rocket eBook was launched in 1998 as the first dedicated ebook reader by NuvoMedia, a company founded in 1997 in Palo Alto. The investors of NuvoMedia were Barnes & Noble and Bertelsmann. NuvoMedia wanted to become "the electronic book distribution solution, by providing a networking infrastructure for publishers, retailers and end users to publish, distribute, purchase and read electronic content securely and efficiently on the internet". The Rocket eBook could connect to a computer (PC or Macintosh) through the Rocket eBook Cradle, a device with two cables: one for power through a wall transformer, and a serial cable for the computer.
# The SoftBook Reader
Shortly afterwards, SoftBook Press launched the SoftBook Reader, along with the SoftBook Network, “an internet-based content delivery service”. The investors of SoftBook Press were Random House and Simon & Schuster. With the SoftBook Reader, "people could easily, quickly and securely download a wide selection of books and periodicals using its built-in internet connection". The device, "unlike a computer, was ergonomically designed for the reading of long documents and books."
# Other ebook readers
Other ebook readers were launched in 1999, for example the EveryBook Reader, launched by EveryBook, and the Millennium eBook, launched by Librius.
The EveryBook Reader was "a living library in a single book", with a "hidden" modem to dial into the EveryBook Store, for people “to browse, purchase, and receive full text books, magazines, and sheet music”.
The Millennium eBook was a "small low-cost" ebook reader launched by Librius, a "full service e-commerce company". On the company website, a World Bookstore "delivered digital copies of thousands of books" via the internet.
All these ebook readers didn’t last long. People would have to wait until the turn of the millennium to see the Gemstar eBook in the U.S. and the Cybook (1st generation) in Europe.
# The Gemstar eBook
The Gemstar eBook was launched in November 2000, after Gemstar bought NuvoMedia (maker of the Rocket eBook) and SoftBook Press (maker of the SoftBook Reader), the two companies that created the first ebook readers, in January 2000. Two versions of the Gemstar eBook were available for sale in the U.S.: the REB 1100 (successor of the Rocket eBook) with a black and white screen, and the REB 1200 (successor of the SoftBook Reader) with a color screen, both produced under the RCA label belonging to Thomson Multimedia. Gemstar tried to launch them in Europe too, beginning with Germany, while buying 00h00, a French publisher of ebooks, in September 2000. In fall 2002, cheaper models, the GEB 1150 and GEB 2150, were launched, produced by Gemstar instead of RCA. Sales were still far below expectations. The company stopped selling ebook readers in June 2003, and stopped selling ebooks the following month.
# The Cybook
The first European ebook reader didn’t fare well either. Developed by Cytale, a French company created by Olivier Pujol, the Cybook (21 x 16 cm, 1 kilo) was launched in January 2001. Its memory — 32 MB of SDRAM and 16 MB of flash memory — could store 15,000 pages, or 30 books of 500 pages each. Sales were far below expectations, and Cytale closed its doors in July 2002. The model was later renamed Cybook 1st generation, while waiting for more generations to come. The Cybook project was taken over by Bookeen, a company created in 2003 by Michael Dahan and Laurent Picard, two former engineers from Cytale. The Cybook 2nd generation was available in June 2004. The Cybook Gen3 (3rd generation) was available in July 2007, with a screen using the E Ink technology.
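As a back-of-the-envelope check of those figures (an estimate only, ignoring compression and any formatting overhead), 15,000 pages in 16 MB of flash memory works out to roughly one kilobyte of text per page:

    # Back-of-the-envelope check: bytes of flash memory per stored page.
    flash_bytes = 16 * 1024 * 1024   # 16 MB of flash memory
    pages = 15000                    # the advertised capacity
    print(flash_bytes // pages)      # ~1118 bytes, about 1 KB per page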