
As a long-term collector of old PC hardware, software and games, it's becoming pretty obvious that keeping aging hardware around comes with a very tangible expiration date. We are only able to keep this tech going because a lot of it was mass produced and we can find replacement parts, but we're already seeing massive failures in parts that render a device unusable no matter how many blown capacitors or resistors we change.

There comes a point when the only way to keep this stuff running is either through manufacturing replacement parts (prohibitively expensive) or through emulation. Emulation is cheap but completely unrepresentative of the original experience. For example, you can't emulate different displays properly. As anyone who enjoyed the Vectrex's beautiful vector graphics will agree, you simply cannot emulate this on a modern display. The bright phosphor coating with the smooth analog beam is not reproducible on something like an LCD.

So, I still collect this stuff and enjoy it. But I have given up my ambition for hardware preservation. It will take an organization much better funded than me to keep this stuff alive in the long term. And even then, I suspect a lot of it is going to be lost to time in the coming decades.



> manufacturing replacement parts (prohibitively expensive)

Some are more expensive than others. PALs and ASICs can be reimplemented in CPLDs and FPGAs. At some point it becomes Theseus' computer, but, if the goal is to preserve the boat, that works. If you want to preserve the wooden planks, then you need to keep it powered down in a controlled environment for future generations to eventually apply atomic-resolution sensing.
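
As a toy illustration of that reimplementation path - the equation, pin names, and polarity below are invented, not taken from any real PAL - a PAL is just sum-of-products logic, so its behavior can be captured as a truth table and checked against the original part before targeting a CPLD/FPGA:

```python
# Hypothetical sum-of-products equation from a (made-up) PAL dump:
#   /O1 = A*B + /C*D
# Modeling it in plain Python lets you verify the logic against the
# living chip before committing it to CPLD/FPGA equations.

def o1(a: bool, b: bool, c: bool, d: bool) -> bool:
    """Active-low output O1: pulled low when A*B or /C*D is true."""
    return not ((a and b) or ((not c) and d))

# Exhaustively tabulate all 16 input combinations -- the same table a
# logic analyzer capture of the real part should reproduce.
truth_table = {
    (a, b, c, d): o1(a, b, c, d)
    for a in (False, True)
    for b in (False, True)
    for c in (False, True)
    for d in (False, True)
}
```

Diffing a capture of the real chip against a table like this is one common sanity check before burning the replacement logic.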

> For example, you can't emulate different displays properly. As anyone who enjoyed the Vectrex's beautiful vector graphics will agree, you simply cannot emulate this on a modern display. The bright phosphor coating with the smooth analog beam is not reproducible on something like an LCD.

Not right now, but, soon-ish, we'll be able to do that very well with HDR displays. I already can't see pixels or jaggies on my LCDs. With proper computation, one could simulate the analog aspects of CRT tubes - it's a lot for GPUs to do now, but, in a decade or so, I assume a mobile GPU will be able to do that without breaking a sweat.

Not long ago I was thinking about a CRT+deflection replacement: something that would take the analog signals used in a CRT (from an adapter on the neck, where the pinouts vary a lot, plus the inputs to the deflection coils), maybe with an extra power supply input to run the electronics, and spit out an HDMI signal on the other end.

This should be possible with modern flat panels, to the point that the image is hard to distinguish for all but the most extreme cases (think X-Y displays and Tektronix DVSTs).

Curvature is an issue, but flat and Trinitron CRTs should be trivial.


That really depends on what you're trying to emulate about a display. You can see artifacts from how the electron beam on a CRT paints the image by holding your hand in front of the screen, fingers spread, and shaking your fingers back and forth. Emulating that well might take a ~10,000 fps display, which I doubt anyone is ever going to produce.


I suspect even most hardcore retrocomputer hobbyists care most about emulating the parts of a display that actually came up in use of the machine. If I were an eccentric billionaire who wanted a replica of the Mona Lisa to hang on my wall and enjoy regularly without the inconvenience of weekly flights to France, I'd care much more about my money going into making a product that got the visible details of the canvas right (https://hyper-resolution.org/view.html?pointer=0.055,0.021&i...) than one that spoofed the proper results if I carbon-dated it or something. I think the same concept applies here. I don't really care as much if the only thing an emulator can't replicate is a clever but (to an observer) comically specific physics-based test for authenticity, so long as I get everything I'd notice while using the computer correct at a fraction of the price. In the context of preservation, just knowing some other (far richer than me) person or org is keeping a single-digit number of the actual artifacts maintained for future reference is good enough for me.


I'm not sure I follow. CRTs draw the image to the screen in a fundamentally different way than modern displays, due to how the electron beam moves sequentially left to right and top to bottom. This analog process, with lines drawn at a ~15 kHz scan rate and frames at 25 or 30 Hz, is what gives authentic arcade machines their look and feel. Same for old computer terminals. My understanding is that to reproduce this effect on a modern display, you'd need an extremely high refresh rate. Properly replicating it requires addressing some pretty low-level aspects of the system. Hardware limitations are bound by the laws of physics, after all.

Beyond just the aesthetics, there are practical reasons why this is important, whether it be light gun idiosyncrasies or how the game "feels," which can affect timing and such for competitive players. There's a lot more to preserving the look, feel, and compatibility of displays for old computer systems than most realize, and the rabbit hole goes quite deep on this one.


    there are practical reasons why [how the electron gun works is]
    important, whether it be light gun idiosyncrasies or how the
    game "feels,"
This is always interesting to discuss because there are so many factors at play! To put it in less than a zillion words,

The way a game "feels" in this context is essentially a function of input latency. The old-style "chasing the beam" hardware, plus a CRT display, equals something very close to a true zero lag environment.

Here's a breakdown of the input lag in a modern environment, for contrast. These are all latencies that don't exist in something like, say, a Sega Genesis/Megadrive hooked up to a CRT: http://renderingpipeline.com/2013/09/measuring-input-latency...

In an ideal emulation situation, you could theoretically recreate something close to a zero-lag analog environment (in terms of latency) without necessarily simulating the path of the electron beam itself.

Although, as the linked article implies, there are a lot of bits in the emulation stack that would need to be optimized for low latency. High refresh rate displays get you part of the way there "for free."
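
To make the contrast concrete, here is a toy latency budget; the stage names and millisecond figures are invented, order-of-magnitude guesses for illustration, not measurements from the linked article:

```python
# Rough, made-up per-stage latencies (milliseconds) for a modern
# 60 Hz emulation stack. A Genesis on a CRT has essentially none of these.
stages_ms = {
    "USB controller polling": 8,   # 125 Hz polling, worst case
    "OS input processing":    5,
    "emulator frame queue":  16,   # one buffered frame at 60 Hz
    "GPU present/vsync":     16,
    "display processing":    10,   # scaler / frame buffer in the panel
    "pixel response":         5,
}

total_ms = sum(stages_ms.values())
print(f"total worst-case lag: ~{total_ms} ms")

# Doubling the refresh rate roughly halves the frame-granularity stages,
# which is the "for free" improvement mentioned above:
halved = total_ms - (stages_ms["emulator frame queue"]
                     + stages_ms["GPU present/vsync"]) // 2
```

The point of the sketch is only that many small stages, each reasonable on its own, add up to a very noticeable total.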


Not everything is games that require minimum latency, though. For, say, a terminal, or a CDC 6x00 console, some lag is perfectly acceptable.


Sure, and even many games don't particularly benefit from it. However, it's a really remarkable thing to play e.g. Mega Man or Smash Bros. in a true lag-free environment.


I wonder about that. Might a specialized display controller on, say, an OLED display be enough?

You could then have the controller artificially drive it line by line instead of refreshing the whole screen.


Perhaps. One issue I foresee is the way CRTs glow. The phosphor doesn't light up or dim immediately the way an LED does, so there's some amount of fade-in/fade-out that happens on a CRT as the beam moves across the screen. I imagine this could be difficult or impossible to reproduce with a traditional OLED screen. Some old games rely on this effect, along with the slow refresh rates, to create a sort of dithering/aliasing effect.


Phosphor decay is not terribly difficult to simulate to an acceptable degree. Doing it at the pixel level is pretty easy, doing it at the phosphor level is computationally harder but not much more complicated.
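
A minimal sketch of that pixel-level version, assuming a simple exponential fade (the time constant here is an arbitrary placeholder; real phosphors follow messier, non-linear curves):

```python
import numpy as np

def decay(brightness: np.ndarray, dt_ms: float, tau_ms: float = 1.5) -> np.ndarray:
    """Exponential fade of each pixel, dt_ms after the beam excited it."""
    return brightness * np.exp(-dt_ms / tau_ms)

# One simulated frame: every pixel freshly excited to full brightness,
# then faded ~half a 60 Hz frame later. The whole frame decays at once,
# which is what makes the problem embarrassingly parallel.
frame = np.ones((240, 320))
faded = decay(frame, dt_ms=8.0)
```

Swapping the exponential for a measured decay curve is the same per-pixel operation, just with a lookup table instead of `np.exp`.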

The larger issue w.r.t. this specific quirk of CRTs is that we're running out of human beings that are familiar with what this is "supposed" to look like, and actually care.

I care a lot, but I'm old.


I'm not aware of any cases where it's been emulated in any acceptable manner. I can't be bothered to do the math myself, but I imagine doing this well would be beyond the capabilities of modern displays (probably requiring refresh rates in the thousands of Hz). Maybe some special FPGA-based controller with an OLED, like was suggested above, could make it possible. I'm not sure.


Can you talk more about why you feel it would be infeasible? I'm a guy with a house full of CRTs so I am genuinely interested.

What sorts of things are advanced filters like CRT-Royale missing? https://www.google.com/search?q=crt-royale

Each individual phosphor dot on a CRT is not terribly tricky to emulate.

The brightness at any given moment is a fairly simple decay function based on how long it's been since you lit it up with the electron gun. On top of that, you would typically want to apply some level of bloom to simulate the way light is diffused by the glass. Sure, you've got a few million dots to simulate, but this is also an embarrassingly parallel problem.
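
The bloom half of that can be sketched as a separable Gaussian blur; the kernel radius, sigma, and glow fraction below are arbitrary choices for illustration, not derived from any real CRT glass:

```python
import numpy as np

def gaussian_kernel(radius: int = 2, sigma: float = 1.0) -> np.ndarray:
    """1-D Gaussian kernel, normalized to sum to 1."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def bloom(image: np.ndarray, radius: int = 2, sigma: float = 1.0) -> np.ndarray:
    """Blur rows then columns (separable, so cheap), then add back a
    fraction of the blurred light as glow around bright dots."""
    k = gaussian_kernel(radius, sigma)
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, image)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    return image + 0.3 * blurred
```

A single lit dot run through `bloom` ends up slightly brighter at its center with a soft halo around it, which is the visual effect the glass diffusion produces.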

Now of course, admittedly, you're only simulating that phosphor glow decay at the refresh rate of your monitor -- 60hz, 144hz, 240hz, whatever -- instead of an effectively infinite level of steps as would be the case in real life. However, I don't think that is a practical issue.

You're clearly thinking of factors I'm not and I'm genuinely interested. To my mind, the visual aspects of CRTs are pretty easy to simulate, but not the near-zero lag.


The thing you can't emulate is the phosphor coating. It simply looks different because the light isn't coming from a backlight; the front of the display is actually shining. And in vector graphics you don't have pixels at all; the light shines quite beautifully in a way I don't think is possible at all with backlit displays.


> The thing you can't emulate is the phosphor coating. It simply looks different because the light isn't coming from a backlight; the front of the display is actually shining.

I did say OLED, not LCD, precisely because of that.


Are you sure it's the decay they're using, and not the natural blurriness and texturing?

And some phosphors have a much longer decay than others, but you could easily emulate those long-term effects on a normal screen.


That, and the fade follows a non-linear curve. It's pretty cool, but it's quite a lot of math to match the physics going on.


It would need to redraw the whole screen to account for the phosphor decay. To do that with line resolution and an NTSC signal, you'd have to redraw it roughly 15,000 times per second (60 fields of about 250 lines). You'd draw the current line at full brightness and "decay" the rest of the frame according to that phosphor's persistence. Since there is some quantization, you could reduce the frequency of decays as the line gets older.
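
The back-of-the-envelope arithmetic for that line-resolution requirement, using the rounded NTSC numbers above:

```python
fields_per_second = 60          # NTSC field rate
visible_lines_per_field = 250   # rounded; NTSC is 262.5 lines total

# Decaying the frame at line granularity means one full-panel update
# per scanline drawn:
line_updates_per_second = fields_per_second * visible_lines_per_field
update_period_us = 1e6 / line_updates_per_second   # ~67 us per update
```

That works out to 15,000 full-panel updates per second, far beyond any shipping display.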


On the concept of these very weird displays, I remember an HP oscilloscope that had a monochrome CRT and a polarizer in front cycling between R, G, and B components on every refresh cycle. Overall, the screen resembled a DLP projection when you'd see different color frames when your eyes moved, but a stable color when you were looking at a part of the screen. A very neat way of producing crazy small color pixels on a 7"ish CRT.

And yes, that device cost about the same as my house back then (2002).

https://hackaday.com/2019/01/17/sharpest-color-crt-display-i...


I'll give you an example from the LCM. They had a PLATO terminal with its original plasma flat panel display. I'd been reading about PLATO for years and had even run it in emulation but I'd never seen actual hardware before visiting the LCM.

The experience on the original terminal was way different from emulation. The way the screen worked and the tactile feel of the keyboard were the core of the experience. Sitting at an actual terminal really changed my understanding of the system because it gave me a physical context that software emulation could not provide. You'd be hard pressed to emulate the eye-melting nature of the original plasma display or the stiffness of the keyboard.


The physical experience is a huge part of the overall thing. I have a C64 Maxi and it's absolutely amazing, exquisitely close to the original (but with an HDMI output and USB ports).


    I'd care much more about my money going into 
    making a product that got the visible details 
    of the canvas right than spoofed the proper 
    results if I carbon-dated it or something
You've inadvertently highlighted one of the challenges of preservation: identifying which aspects matter.

Does fooling a carbon dating test matter? This is purely subjective, but for most people surely not.

But interestingly you've linked to an ultra high resolution image viewer that lets the viewer drill down into a nearly microscopic view of the painting. If a person doesn't know much about art, they might think that if you could take something like this and hang it on your wall, it would be a pretty damn good replica of the real thing. It would certainly be cool, I have to admit. Hell, I'd love it on my wall.

And yet, it's utterly different than the real thing. Paintings in real life are three dimensional. Van Gogh in particular is one who used thick gobs of paint. Each fraction of a micron of the painting has height and its own reflective properties which interact with the light in the room as you walk around and observe it.

    if I get everything I'd notice while using the
    computer correct at a fraction of the price.
Well, that's the thing. It's certainly up to the individual whether or not they give a crap about any particular detail.

If you don't care about how oil paintings actually look in real life, or what video games actually looked and felt like, and you choose to brand all of the things you don't understand or don't care about as "comical", then... well, more power to you. That's your choice.

But some people choose to care.


    Not right now, but, soon-ish, we'll be able 
    to do that very well with HDR displays [...]

    flat and Trinitron CRTs should be trivial. 
Visually I think we're really close in most of the ways that matter, with advanced shaders like CRT-Royale.

However, there's an entire additional dimension missing from this discussion so far - latency. When paired with original game hardware up until the 16-bit era or so, CRTs offer close to a true zero latency experience.

It's not possible to recreate this with modern tech. When we add up everything in the stack (display, input drivers, everything) we're looking at over 100ms of lag.

http://renderingpipeline.com/2013/09/measuring-input-latency...

We're not totally without hope. As the (ancient!) article notes, higher refresh rate displays reduce many of these latencies proportionally. And for non-action games, latency doesn't matter too much in the first place.

    At some point it becomes Theseus' computer,
    but, if the goal is to preserve the boat,
    [CPLDs and FPGAs] work
Well, for some parts of the boat.


> However, there's an entire additional dimension missing from this discussion so far - latency. When paired with original game hardware up until the 16-bit era or so, CRTs offer close to a true zero latency experience.

It's hard to even match the input latency and general responsiveness of an NES hooked to a CRT TV with a composite cable with modern hardware, let alone something more-integrated.

My usual test is "is it very, very hard to get past Piston Honda on Punch Out?" Often, with an initial, naive set-up, it's nearly impossible. Get it dialed in a little and he becomes easy (if you spent like a billion hours playing that game as a kid, anyway). But with many display + computer + controller combos it's just impossible to get it close enough to right, no matter how you tune it.

That's my test because it's really easy to tell if the latency is bad. But if it is, I'll find myself falling off things constantly in Mario, too; it's just harder there to tell if I'm playing poorly or if the system's the problem. The NES is hard enough; add just a little latency and it's hellish.


Latency is one dimension. Another one is peripheral compatibility. Devices such as light pens and light guns are incompatible with LCDs. These peripherals depend on the timing information encoded by the raster scan of CRTs.

This means classic light gun games such as Duck Hunt for the NES are impossible to play without a CRT.


Doesn't Duck Hunt use an entire frame of black, then white? You don't need raster scan emulation for that, just very low latency.

Edit: Apparently there is a 15 kHz filter inside the light gun. So it's not really about beam accuracy or brightness; it's about pulsing at the right frequency.


LCDs aren't capable of pulsing at 15kHz. The twisting/untwisting of the liquid crystals is an electromechanical process and very slow (compared to a racing electron beam). Even though the fastest gaming LCD monitors claim a 360Hz refresh rate, they cannot get anywhere near this 2.8ms latency (implied) when going from black to white (0 to 100% brightness). Of course, the monitor manufacturers go to great lengths to avoid talking about it, so the whole industry is flooded with a bunch of marketing material to distract from the issue.
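
For a sense of scale, compare the implied timings (plain arithmetic from the figures in this thread, nothing vendor-specific):

```python
NTSC_LINE_RATE_HZ = 15_625   # CRT horizontal scan rate
LCD_REFRESH_HZ = 360         # fastest gaming LCDs on the market

crt_line_period_us = 1e6 / NTSC_LINE_RATE_HZ   # 64 us per scanline
lcd_frame_period_us = 1e6 / LCD_REFRESH_HZ     # ~2778 us per frame

# One 360 Hz LCD frame lasts roughly 43x longer than a single CRT
# scanline -- and that's before any pixel-response time is counted:
ratio = lcd_frame_period_us / crt_line_period_us
```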


> LCDs aren't capable of pulsing at 15kHz.

Yeah, I know. But you don't need the LCD to pulse, you need the backlight to pulse.

An LCD might still have issues switching fast enough, but an HDR OLED tuned to 15kHz PWM might be able to handle it. If it was designed with minimum latency in mind of course. Most screens buffer a full frame and that won't work. But playing duck hunt doesn't require timing the actual beam as it sweeps across the screen. You just need to be displaying rows with a buffer that's no more than a few rows high, and have the rows flicker. Also many third party controllers don't care that much about the flicker.


Oh, with OLED you could probably design it to mimic the raster scan of a CRT perfectly, cascading the row and column signals along at 15kHz. The issue is, who is going to build that? I don't think Duck Hunt is too high on the priority list for OLED panel makers.

The really sad thing is that some day all of the CRTs will be dead and all of the expertise to build them too. The tooling and factories are already gone, so it's unlikely new CRTs will ever be built, unless some Ben Krasnow-esque super-hobbyist gets really passionate about it.


> Oh, with OLED you could probably design it to mimic the raster scan of a CRT perfectly, cascading the row and column signals along at 15kHz.

You could but it wouldn't have the same brightness as it sweeps so I don't know if that's good enough to trick a light gun by itself.

But I still think you shouldn't discount LCD. If you can get an LCD to switch a good fraction of the way in half a millisecond, and use the right backlight, you could make duck hunt work.

> The issue is, who is going to build that? I don't think Duck Hunt is too high on the priority list for OLED panel makers.

To trick the frequency filter might be too much effort, but the latency of having each row come along instantly might get some effort into it. Marketing loves low latency.


The low latency claimed by LCD marketers concerns narrow grey-to-grey transitions. Black to white remains as slow as ever.

The other issue is all of the other causes of latency along the pipeline. The NES emits a composite video signal directly from its video chip, the PPU. This composite signal travels along a coax cable into the back of the TV, where it's split by the analogue circuitry into signals driving the luminance and colour, synchronized to the horizontal and vertical retrace. The whole process happens in less time than it takes to digitize that signal before it could even be sent to an LCD.

That is, before our LCD display even receives a frame it’s already on the screen of the CRT. The NES is explicitly designed around the NTSC timing structure, with the rendering in the PPU happening “just in time” to be sent to the CRT. There is no place in the NES to buffer a frame.


> The whole process happens in less time than it takes to convert that signal into digital before it could even be sent to an LCD.

While that's true, doing the conversion doesn't need to add more than a microsecond of latency.

> That is, before our LCD display even receives a frame it’s already on the screen of the CRT. The NES is explicitly designed around the NTSC timing structure, with the rendering in the PPU happening “just in time” to be sent to the CRT. There is no place in the NES to buffer a frame.

An LCD doesn't have to buffer a frame either. I believe there are models that don't. It can display as it receives, limited by the crystal speed which is still an order of magnitude improvement.


Current consumer LCDs, sure, but there's no reason a high-refresh-rate LCD couldn't emulate the flying spot of a CRT, and thus be compatible with light pens / guns.


The horizontal scan rate of a CRT TV is 15,625 Hz. Good luck getting an LCD to refresh that quickly.


In order for that to work, it’d need to be able to switch individual pixels in sequence, one at a time. The display panel would need to be designed for this - current panels aren’t, but as long as a screen position could switch to 100% in about 250ns, a sensor could tell precisely which pixel it’s looking at.


Liquid crystals cannot switch from 0 to 100% in less than 10ms, never mind 250ns. They’re electromechanical devices that need to physically twist/untwist to affect the polarization of light.

Contrast that with a CRT which uses a 25kV acceleration voltage to drive an electron beam up to 30% the speed of light (takes about 3.3 nanoseconds to travel 1 foot from the back of the CRT to the screen), which then strikes a phosphor that glows due to its valence electrons falling from excited states (which takes a few nanoseconds).
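
Those figures check out against the relativistic energy formula; this is just a sanity calculation, with the 25 kV value taken from the comment above:

```python
import math

C = 299_792_458.0            # speed of light, m/s
ELECTRON_REST_KEV = 511.0    # electron rest energy, keV
ACCEL_KEV = 25.0             # CRT anode acceleration voltage, in keV

# gamma = 1 + E_kinetic / E_rest, then beta = v/c from gamma:
gamma = 1 + ACCEL_KEV / ELECTRON_REST_KEV
beta = math.sqrt(1 - 1 / gamma**2)        # ~0.30, i.e. ~30% of c

FOOT_M = 0.3048
transit_ns = FOOT_M / (beta * C) * 1e9    # ~3.4 ns to cross one foot
```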


High-res OLED displays can do a pretty darn good imitation of a CRT if the software is tuned for it.


> ...then you need to keep it powered down in a controlled environment

Mold and water damage can be mitigated with environmental controls, but even then you're going to have decay issues, because so many components just break down over time. Many plastics become brittle and disintegrate with or without exposure to UV. Ten-year-old devices have their soft-touch rubber turning into a gooey, sticky, melting mess. Leaking electrolytic capacitors and batteries are commonly known problems, but lesser-known issues occur too. The MFM drive testing that Adrian's Digital Basement did recently comes to mind: 5 of 5 drives he tested were dead. One was a drive he'd validated a year prior and stored correctly.


I’ve never seen a Vectrex, but having seen other vector displays on old arcade machines, it’s truly mind blowing.

The warm glow of those smooth beams has no modern equivalent in consumer hardware. Despite being old technology - some machines are over four decades old - it can still blow young minds to see a way of rendering so different from what we accept as the norm today, and one that is in some aspects even more advanced than modern screens.

Knowing it's something that's gradually being lost with time is pretty sad. Previous human technology advancements would often be lost in a hole and still be recognizable for what they were centuries later when they were dug up. It's kind of strange that the tech we've made since the computer revolution basically dies along with its creators and maintainers and just turns into future door stops.


I’m not sure I follow your final conclusion… we mostly don’t use archeological artifacts do we? They are “recognizable” but so will old computers be 1000 years from now. With a good microscope you can still study the chips, etc.

If anything I expect computers from 1985 to be much better “preserved” than physical history from, say, 885, no?

Granted you have to resist the urge to use them as door stoppers and keep them in a controlled environment. But an archive is a lot cheaper/more stable than a living museum.


Just to use your example, you could dust off a 200 year old microscope and it’d be perfectly fine. If there’s a cracked lens, that’s an easy fix without any significant expertise.

Same with any old blacksmithing tools and so on. They’ll look old, but generally retain their functionality and a novice can repair them.

Old electronics require expert knowledge and specifically manufactured parts to fix. Knowledge of old electronics dies each year and the parts go away. A person who's never heard of a Vectrex or seen one in action would have no idea what it does. Even if they recognize that it has a screen, they wouldn't be aware that it renders vectors and not pixels like nearly every other screen. Another 1000 years removed from the computing standards of today, people might not even be able to imagine what a typical desktop computer does when they look at it. But they'll see tools from 885 and generally be able to guess what they're for.

High tech things are very abstracted tools that are just going to look like useless boxes to future people. Most won't know or care what they did. It's weird to think about. Advances will also be lost with time without us realizing it. Vector screens are truly impressive, but people don't know they exist. Will we someday get monochrome retinal implants that are convenient enough that people give up on traditional computer screens, and someday humanity forgets we all used to have full-color displays with sound in our pockets? Who knows.


    we mostly don’t use archeological artifacts do we?
Well, that's the thing, right? A 5,000-year-old sword or a 50,000-year-old pot is static. It's not quite the same as actually using and touching the objects, but you can see them in the museum and it's fine.

Something like a car engine or a video game machine is a collection of moving parts. No single part of them is interesting; it's how they work together. Problem is, each time you use them they get a little bit closer to death, and they also get closer to death just sitting there in a climate-controlled environment.


You do notice a lot of things about older items by actually using them. Little touches, weight, balance, the way the design is remarkably nice for some particular use or other in ways and for reasons you wouldn't have guessed just from looking at it, the satisfying feel of a hinge or knob or pull. Our local art museum has a collection of elaborate silver tea services—originally intended for actual use, not just decorative—and I bet one would notice a lot of things about them by actually using them, that one is unlikely to spot or understand just by looking, but no-one's likely to ever use them again—at least not for quite a while.


Absolutely. You can't appreciate e.g. how wonderfully balanced a knife is until you use it.

But, it's a narrower gap than with a more complex machine.


I love the Atari vector games! There’s an excellent video by Retro Game Mechanics Explained on how they work: https://youtu.be/smStEPSRKBs

Recently I stumbled on some Tektronix videos on YouTube. Those vector displays are even more mind blowing than Atari’s arcade games. I hope I can get to see one in person someday.


This can really be a labor of love. The volunteers at the Rhode Island Computer Museum, which is much more modestly funded than LCM, have ongoing restoration projects for rare vintage machines like the PDP-9 and PDP-12. They literally have a warehouse full of stuff like this[].

[] https://www.ricomputermuseum.org/collections-gallery


In the dark of winter, while I work to fill the hours, I troll eBay and Craigslist, looking at the $200 Macs and $400 Amigas and $800 Apple ][s and $2000 complete NeXT stations, and think it's another hobby that's gotten more expensive faster than I have the stomach for.

I also realize I'll get 95% of the enjoyment by just hitting the emulators at archive.org.

I gifted myself an Amiga Mini500 for Xmas, and once you play the games, and mess with Amiga Forever for an afternoon, and Amibian for ANOTHER afternoon... that itch is pretty well scratched.


A bit off-topic but relevant to the aging problem of tech: is there any commonly-available persistent storage medium you would use to store, say, family movies for 100 years? Or is that a hopeless cause? For instance, will Blu-ray players be around for much longer? USB drives?

At what point are we where "Either put it in the cloud or lose it" is the law of tech?


I think it is safe to say you have to plan on moving media you care about to a different format every couple of decades. From the first Edison wax cylinders to today's latest formats, the technology has changed that often, and if you care, you have to follow. Sure, you can keep older stuff working, and some do. You can say the same about photographs and books: most degrade after a few decades, but if you spend $$$ you can get archival grade that will last for a century or two (many centuries if you store it in a desert, but in human-friendly places a century is all you should count on).

I have some archival grade CDs and DVDs that claim they will last longer. However I'm not sure if readers will exist as long as the media.


I've always wondered if you could do something with dot-matrix-on-aluminum-foil. It wouldn't look the prettiest and the information density would be low, but it seems like it would be cheap and last forever.


The scene mentioning the big EMP-reset in Blade Runner 2049 made me wonder just how much would be lost to us in such a scenario. Much that we care about is currently on highly sensitive storage.

What format should I invest in (developing/purchasing/etc.) that would A) be easy-ish to read from, and B) be dense enough to beat the point where I might as well just be using paper/glass slides/microfiche/and so on, yet could be ramped back up from pretty crude tech?

Combine "end of civilization", "time-travelling with tech", and other thought-experiment scenarios for fun.


If it is end of civilization you care about then print out survival instructions. Basic low tech things that one person can do in a farm sized area with minimal help.

Don't worry about advanced stuff. How to make a transistor will be lost long before we develop a new civilization capable of needing transistors. So don't waste paper on it.


Every couple decades? More like every 5 years!


Depends on the format. Some have lasted longer than others. I just migrated my grandparents slides from the 1960s and 1970s.


I bet I could probably put movies and media on a Blu-ray using whatever format was the most stable over the past ten years and be sure that it would be readable by somebody without too much trouble 20 years from now. After that, however, planned obsolescence gets you.

I guess this is the first time I've faced the fact that unless it is printed out at archival quality in some manner that humans can read, it basically won't exist 200 years from now. The data will perhaps be around in some kind of conglomerated and bowdlerized format, but the "you" part of the data will be lost in obscurity, and that's assuming that some version of the cloud stays intact that long.

Odd.


Copy it to all your computing devices, and keep doing that when you get new ones.


Thanks. I was looking for something that would work without me being around or there being any kind of script or computer for somebody to maintain. Something akin to a family picture album, only with digital media


I suppose you could always laser-etch your bits on gold film and store it in a secure container. There are some products out there, like the M-DISC, that claim to last up to 1,000 years, but typical consumer storage media are only expected to last a few decades under optimal conditions.

https://www.mdisc.com/


We had a conversation about this on Ars Technica and I can't currently find it. We reached the same conclusions. I don't think you can fully set it and forget it: 3 copies, 2 formats, 1 off-site.

And plan to copy to new media every 8-10 years.


I wouldn't count on 'put it in the cloud' being the answer, for a number of reasons.

Personally, I think that the only answer to the "preserve the family stuph for 100 years" problem is to migrate it all to the most reliable media available every 5-10 years (it used to be every 3-5 years). Even if there were an archival medium that was 100% reliable 100 years from now, will there be something that can read it?


> Emulation is cheap but completely unrepresentative of the original experience.

Never enough, but THANK GOODNESS for emulation!

My favorite old skool computer is the IBM 704 that LISP originally ran on. This machine didn't have a visual display or anything fancy. It was the sort of thing where you'd hand in your punch cards and an operator would run your job, then give you back the program results in your mailbox or something.

When I was doing research for writing my blog posts about LISP about a year ago, it was so helpful that a turnkey SIMH environment for this machine existed; it let me run the original software and get definitive, concrete answers about its behavior. There were things I was curious about, particularly its handling of the T atom (and whether that remaps to TRUE, which is the true truth), that I wouldn't have been able to talk about had the emulator not been available.


For those unfamiliar with the unique look of vector displays, you've simply got to see one in motion to understand.

Asteroids is probably the most famous example. Look at the "ultra bright" bullets and the glowy trail they leave.

https://youtu.be/w60sfReTsRA?t=247


Persistent phosphors are magical.


The situation is only getting worse, as bootloaders are increasingly locked-down and internal hardware pairing is a thing (hello, Apple, but even AMD has joined that game with Epyc processors getting locked to motherboards). And then there's the soldered-in hardware that can't be replaced without a lot of expense and risk. Preservation at this point is going to be VMs and emulators.

And, in all honesty, while a working PDP-11/70 would be kind of neat, by today's standards it's dreadfully underpowered in speed even as it sucks down the kWh, and requires a lot of space, while SIMH on a modern SBC would run circles around it. A museum is a better place for it than a living room.


Some of these computers rely on parts that are hilariously simple individually; it's only in aggregate that they become a computer.

For example, even a child could manufacture this with $10 worth of tools and parts:

https://en.wikipedia.org/wiki/PDP-10#/media/File:KA10_mod_en...

Keeping machines from the '00s running is going to be a hell of a lot harder than keeping machines from the '70s running.


If you read the LCM tech blog it is exactly in this 'manufacturing replacement parts' where they absolutely shone.

Of course you will end up in a ship-of-Theseus-like situation: how many replacement parts does it take before it is no longer the original? But that's because these are working systems, unlike your average museum where you just get to look at stuff and the cases might as well be empty.


>> Vectrex

My surviving stuff consists of a Vectrex, an Interact with many tapes (including Microsoft Basic), and an I, Robot. I always thought the Interact belongs in a museum, but they all seem to have one already or don't care.

Maybe the solution to hardware failure is to keep the hardware, working or not, AND have emulation. But as you say vectors don't emulate well.


There’s this pervasive idea that computers are solid-state machines meant to work forever. But they’re full of moving parts, grease, fluids, and fans.

Only now with our fanless machines are we approaching the true idea that these machines just turn on and work. I’d bet an Apple TV has a better chance of working in 400 years than a Commodore 64.


There's also a pervasive idea that solid-state means working forever. Ask anyone who collects old computers and you'll find that ain't true.

Solid-state devices fail with age too, via electromigration, temperature-cycling fatigue, hot carrier injection, NBTI/PBTI, and a host of other causes. And it gets worse the smaller the geometries we work with. It's a huge topic of concern in the industry.

We won't be around to know, but I'd be confident enough to bet you a beer that Apple TV will be dead as a doornail in years or decades, not centuries.


Yeah. I've got quite a few old machines from various periods, but I mostly just store them or have them on static display. I'm afraid to turn most of them on. The last one I tried was a Mac IIcx and it let the smoke out pretty quickly.


I wonder what could be done to maximize life?

Something like running it in a well-cooled, humidity-controlled, noble-gas-filled enclosure.



