As Encarta was being developed (~1991 per another comment here), the Internet was still highly nascent. I was at uni during this period, and a major bragging right of the campus I was on was dedicated high-speed network connections to other schools within the university system ... over 56K leased lines. Those were shared amongst the 100k+ student, faculty, and staff population (though a very small fraction of those used it).
The takeaways I get from this:
1. Exponentially-developing technology can pass you by quickly. The Internet went from exceedingly obscure to global in the ten years of the 1990s. Broadband wasn't ubiquitous by 1999, but it was increasingly available.
2. Standards matter. Even constructing a workstation that could handle reading Encarta was a challenge, and the tools to compose, render, and especially, present multimedia content (images, audio, video) were not common. Microsoft went the closed-source proprietary route, dooming them to the dustbin (though pieces were salvaged).
3. Standards are hard. Re-read above.
4. There are thresholds of utility that make or break things. I've been around infotech long enough (somewhat pre-dating the periods discussed here) that I've seen numerous technologies go from extreme cutting edge to widely adopted to passé. (And quite a few proposed but never gaining critical mass.) The reasons why any given tech fails at any given moment are numerous. Luck plays a major role.
The present has exceptionally cheap bulk storage (my tablet has a 128GB removable microSD card for about $50), high-speed, ubiquitous, wireless networking, and tools for sharding and distributing updated documents and file formats (git, rsync, etc.). This makes distributed updatable large-scale works possible.
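To make the "distributed updatable works" point concrete, here's a minimal sketch using git (one of the tools named above). Everything here is a made-up local demo -- the file names and commit messages are illustrative, not from any real publishing workflow. A reader clones the work once, and thereafter fetches only the deltas when the publisher amends it, which is exactly the loose-leaf "replace the changed pages" model in digital form:

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Publisher side: a hypothetical "master copy" of the work under version control.
git init -q master
cd master
git config user.email demo@example.com
git config user.name demo
echo "Encyclopedia, 1st revision" > article.txt
git add article.txt
git commit -qm "initial revision"
cd ..

# Reader side: clone the whole work once.
git clone -q master reader

# Publisher issues an amendment to one article...
cd master
echo "Encyclopedia, 2nd revision" > article.txt
git commit -qam "amend article"
cd ..

# ...and the reader pulls just the changes, not the whole work again.
cd reader
git pull -q
cat article.txt
```

rsync achieves a similar effect at the file level (transferring only changed blocks), but git additionally keeps the full revision history, which matters for works where "what did this say last year?" is itself a question worth answering.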
The technology of written works has undergone several seismic shifts, from clay tablet to papyrus roll to codex to moveable type. Less well-known (but only slightly) are the updatable formats: loose-leaf, three-ring, and replaceable bindings, all introduced in the late 19th century, which enabled updatable works. These were true "periodicals", where sections could be updated with amendments or replacements as information changed.
The database, digital file, early version control, Wiki, and distributed version control are, IMO, all legitimately novel forms of written works, and should be recognised as such. They have changed, and will continue to change, how content is created and used, and how it affects and interacts with society.
Makes you wonder, will there ever be a replacement for the Internet?
What would that look like? An incompatible sister network that offers something the existing internet does not? A new network created after major disasters and worldwide nuclear war destroy all existing infrastructure? I can't imagine the existing internet would ever be replaced, certainly not by something offering privacy like Tor, because not enough people really care about privacy. The average person doesn't really give a damn.
Chances are, just like modern internet took advantage of old communication technology (telephone wires) to spread, the "next" internet will take advantage of the "current" one.
"Replacement in what sense?" probably deserves exploration.
The underlying fundamental concepts of the Internet are 1) packet-switched (as opposed to circuit-switched) communications, and 2) inter-connected networks, joined via BGP.
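The packet-switching half of that can be sketched in a few lines. This is a toy illustration, not how real routers or BGP work (the four-node topology and the routing tables below are entirely made up): the key property is that each packet is forwarded hop by hop using per-node next-hop tables, with no pre-established end-to-end circuit.

```python
# Hypothetical four-node network: A - B - C - D.
# Each node holds only a next-hop table, not the full path.
routing = {
    "A": {"D": "B"},   # from A, packets for D go via B
    "B": {"D": "C"},
    "C": {"D": "D"},
}

def forward(src, dst):
    """Route one packet independently; return the path it took."""
    path = [src]
    node = src
    while node != dst:
        node = routing[node][dst]   # each hop makes its own decision
        path.append(node)
    return path

print(forward("A", "D"))   # ['A', 'B', 'C', 'D']
```

In a circuit-switched network the path A-B-C-D would be reserved end-to-end before any data moved; here, two packets to the same destination are routed independently, and with richer tables they could even take different routes -- which is what makes the design resilient to link failures.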
There's a lot that's layered on top of this which is seen as fundamental, but is not entirely so, most especially end-to-end connectivity and universal point-to-point access. The Internet formed under tremendously different conditions than exist today, with only a handful of nodes through 1980, and even as late as the late 1980s only a few thousand.
Roughly, each order-of-magnitude increase since then seems to have come with its own set of additional headaches and concerns, mostly regarding management and mitigation of abuse. The idea of guarded borders has long been seen as anathema. I see it as all but inevitable, and the question is whether that's done well or poorly.
There are a number of earlier networking ideas which might resurface or be adapted for new use, and a survey of history might be useful here. (John S. Quarterman's The Matrix, 1990, is a fascinating time-capsule exploration of these just at the cusp of the modern Internet -- the World Wide Web doesn't even merit a mention, though "The Web", a Canadian national nonprofit conferencing system formed in 1987, does.)
I'd also look at uses of the Internet and user needs.
Fundamentally, digital networks serve as communications and control media. Whether or not these need to (or can be) segregated is an open question, but splitting off, say, the IoT from other communications, might make sense, along with SCADA and military communications -- all largely control networks.
Splitting out video and voice from text and data, likewise. Much of the high bandwidth demand comes from those.
As the Internet heads from the first billion or two users to the remaining five or so, questions of what technology, interfaces, and devices are appropriate for a set of digital newcomers may be worth consideration.
Or, looking at this differently: the degree to which those already here may be interested in maintaining a separate space for themselves. Not that this is necessarily equitable, but it may well prove attractive, in particular as a way to minimise fraud and other malicious use or attacks originating from the global poor, who are in many ways justified in wanting some of the pie that has been denied to, or taken from, them.
How these needs, wants, attractions, and aversions affect technology development and adoption remains to be seen.
I think the shift that usurped Encarta's position had more to do with good-quality web search, and to a lesser extent the crowdsourcing model of Wikipedia (run by a foundation rather than a for-profit megacorporation), than with content or data standards.
Search within an encyclopedia (e.g., Wikipedia) doesn't rely on Web search generally (e.g., DDG).
I can see your point that content outside a curated context replaces an encyclopedia to an extent, and up until about 5-10 years ago, when clickbait and black-hat SEO finally won out, much noncommercial Web content was at least informative, if of generally lower quality than traditional printed sources. Convenience has a huge edge over quality, though. These days, it's Wikipedia's curated content, especially for complex breaking news stories, that is my first go-to. After Idle Words' (Maciej Cegłowski's) piece on Hong Kong hit HN last week, I finally took a look at Wikipedia's article on the protests there. A full 73-page article, with references, on a protest movement only a couple of months old. That's staggering, and exceeds all but the very best news sources. (Another case I'd noted was the Oroville Dam, where Brad Plumer's article for Vox was the only traditional-media piece I could find even remotely approaching Wikipedia's article. The first instance of this I recall was the 2004 Boxing Day quake/tsunami in Indonesia and the Indian Ocean.)
Wikipedia, on the other hand, greatly enhances Web search, though it also benefits generally from the high profile resulting from that.
Wikipedia's crowdsourcing, its reliance on underlying technologies (Wiki, HTML, MediaWiki markup), and the huge set of organisational systems, standards, practices, and solutions Wikipedia and the Wikimedia Foundation have arrived at were transformational. Though those, too, strongly resemble the largely equivalent or analogous practices of earlier encyclopaedic efforts dating back at least to Diderot, as well as other reference works (the OED; see Simon Winchester's The Professor and the Madman).
(Note: updated to add 2nd 'graph beginning "I can see your point...")
The arrival of good internet search along with the ever increasing pool of web content doomed a curated content approach such as was used by Encarta (which could never aspire to be as broad or deep as the web). I can tell you, based on watching sales figures and talking to many customers, that web search was a much bigger factor for the business than was Wikipedia. In any case it would have been much trickier to get the level of crowd input given to Wikipedia by a for profit enterprise. Regardless of tools or formats used.
Btw despite the brief lifespan of the product, Encarta gave Microsoft reasonable return on investment. And arguably more importantly, it helped to quickly entrench the Multimedia PC standard, especially in homes. Which helped reduce Cost Of Goods Sold for all of the company's products, which netted a very nice return in cost savings.
> The technology of written works has undergone several seismic shifts, from clay tablet to papyrus roll to codex to moveable type.
This confuses a number of different types of technology. For example, we haven't undergone a shift from codices -- we still use that form for all our written works today.
Clay and papyrus are materials on which text is written. Scroll and codex are physical forms in which long texts are organized. Movable type is a technology for recycling printing blocks. None of those three types of things are related to the other two, except that it's impossible to store clay as scrolls because it will dry. Clay is stored looseleaf.
Interestingly, the history of writing in the Near East shows exactly the same confusion -- a big reason written Aramaic grew more popular than Akkadian was that it was drawn on paper and Akkadian was carved into clay, despite the fact that logically nothing's stopping you from carving letters into clay or drawing cuneiform on paper.
I'm compressing thoughts for brevity. The codex, handwritten, allowed random access, but was still phenomenally expensive to create. On the order of a million dollars per copy. They were chained, the language was standardised -- you brought readers to the work (Latin) rather than works to readers (vernacular).
Moveable type, cheap paper, mass literacy, vernacular language, typographic conventions, high-speed presses, and mass distribution, create a wholly different impact.
As Elizabeth Eisenstein noted, the printing press is an agent of social change. Generalised, all comms tech is.
So do scrolls. You're not supposed to open the entire scroll at once. The codex is better at this.
> but was still phenomenally expensive to create. On the order of a million dollars per copy.
This is nonsense. You can prepare a very expensive book, but you don't have to. The idea that books were necessarily earth-shakingly expensive to create conflicts with the known reality of commercial popular novels in the ancient world.