
Taking a trip down memory-chip lane

Filed under
Sci/Tech

REMEMBER your first time, when you sat in front of a keyboard and monochrome screen and joined a brave new world? You may have been playing Pong or Manic Miner, or carefully crafting your first lines of code. But you won't have forgotten the joy of discovering personal computers.

It's time to revisit your youth, because BBC Bs, ZX81s, Spectrums and Commodores are cool again, part of a wave of computing nostalgia. Today's stylish PCs may perform billions of calculations a second and store tens of billions of bytes of data, but for many, they have got nothing on the 32, 48 or 64-kilobyte machines that were the giants of the early 1980s.

This renewed interest in old-school computing is more than just a trip down memory-chip lane. Early computers are a part of our technological heritage, and also offer a unique perspective on how today's machines work. And within growing collections of original computers and home-made replicas, and the anecdote-filled web pages and blogs devoted to them, lies the equipment and expertise that will one day help unlock our past by reading countless computer files stored in outmoded formats.

Enthusiasts say they are inspired by old machines not just because the computer era was ushered in by monumental developments in electronics, mathematics and information science but also because the digital computer changed the course of the 20th century. During the second world war one of the earliest electronic computers, Colossus, enabled Allied code breakers in the UK to decipher Nazi messages. Another of the earliest programmable machines, ENIAC, completed in 1945, was used by the US army to calculate the trajectories of ballistic weapons with unprecedented speed. The rest, as they say, is history.

"They hark back to another time," says Hamish Carmichael, secretary of the UK's Computer Conservation Society, which works with the Science Museum in London to restore and rebuild classic machines. "And there's an element of detective work, in finding out how things were done originally." The society has helped the museum reconstruct the oldest working computer anywhere, an original Pegasus, made by British firm Ferranti in 1956. And it is working on an even older machine, an Elliot 403 dating from 1955.

What computers did for the military, they also did for the workplace, although the earliest models were a far cry from today's sleek laptops. The first commercial machine, UNIVAC I, was delivered to the US census bureau in 1951. Much larger than an SUV, it contained more than 5000 vacuum tubes and consumed 125 kilowatts of power, yet could perform just 1905 operations per second and store only 1000 words of data.

Most enthusiasts, however, are more familiar with the computers that appeared in their homes during the 1970s and 1980s. The Altair 8800 is often credited with kick-starting the personal computer revolution. Sold in kit form in 1975, the 8800 consisted of several circuit boards slotted together inside a blue box the size of an old record player.

Programming the 8800 involved configuring several switches to correspond to a primitive command and then flicking another to store it in the computer's memory. The designers at Micro Instrumentation Telemetry Systems (MITS) only expected to sell a few hundred kits to keen electronics hobbyists. But the idea of owning a programmable "electronic brain" proved so irresistible that they received thousands of orders for kits within weeks of launch.
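
For anyone who never flipped a front-panel switch, the procedure can be pictured with a short, purely illustrative sketch in Go (chosen only for readability, not for period accuracy); the frontPanel type, the deposit method and the byte values are all invented for the example, standing in for the Altair's data switches and its deposit toggle.

    package main

    import "fmt"

    // A toy model of front-panel programming: memory is a small byte array,
    // and deposit() stores whatever is currently "set on the data switches"
    // at the next address, much as the Altair's DEPOSIT NEXT switch advanced
    // through memory one location at a time.
    type frontPanel struct {
        memory  [256]byte
        address int
    }

    func (p *frontPanel) deposit(dataSwitches byte) {
        p.memory[p.address] = dataSwitches
        p.address++
    }

    func main() {
        panel := &frontPanel{}
        // Arbitrary example bytes standing in for machine-code instructions.
        program := []byte{0x01, 0x2A, 0xFF}
        for _, b := range program {
            panel.deposit(b) // set the switches, then flip deposit
        }
        fmt.Printf("first bytes of memory: % X\n", panel.memory[:4])
    }

Running the sketch prints the three deposited bytes followed by a zero, roughly what an Altair owner would check by stepping back through memory with the examine switches before hitting run.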


Full Story.

More in Tux Machines

digiKam 7.7.0 is released

After three months of active maintenance and another round of bug triage, the digiKam team is proud to present version 7.7.0 of its open-source digital photo manager. See below for the list of the most important features coming with this release. Read more

Dilution and Misuse of the "Linux" Brand

Samsung, Red Hat to Work on Linux Drivers for Future Tech

The metaverse is expected to uproot system design as we know it, and Samsung is one of many hardware vendors re-imagining data center infrastructure in preparation for a parallel 3D world. Samsung is working on new memory technologies that provide higher bandwidth for data traveling between CPUs, storage and other computing resources. The company also announced it was partnering with Red Hat to ensure these technologies have Linux compatibility. Read more

today's howtos

  • How to install go1.19beta on Ubuntu 22.04 – NextGenTips

    In this tutorial, we are going to explore how to install Go on Ubuntu 22.04. Golang is an open-source programming language that is easy to learn and use. It has built-in concurrency and a robust standard library, and it lets you build fast, reliable and efficient software that scales. Its concurrency mechanisms make it easy to write programs that get the most out of multicore and networked machines, while its novel type system enables flexible and modular program construction. Go compiles quickly to machine code and has the convenience of garbage collection and the power of run-time reflection. In this guide, we are going to learn how to install golang 1.19beta on Ubuntu 22.04. Go 1.19 has not yet had its stable release, and work on the documentation is still in progress. (A short sketch for checking the install appears after this list.)

  • molecule test: failed to connect to bus in systemd container - openQA bites

    Ansible Molecule is a project that helps you test your Ansible roles. I'm using Molecule to automatically test the Ansible roles of geekoops.

  • How To Install MongoDB on AlmaLinux 9 - idroot

    In this tutorial, we will show you how to install MongoDB on AlmaLinux 9. For those of you who didn't know, MongoDB is a high-performance, highly scalable document-oriented NoSQL database. Unlike SQL databases, where data is stored in rows and columns inside tables, MongoDB structures data in a JSON-like format inside records referred to as documents. The open-source nature of MongoDB makes it an ideal candidate for almost any database-related project. This article assumes you have at least basic knowledge of Linux, know how to use the shell, and, most importantly, host your site on your own VPS. The installation is quite simple and assumes you are running as the root account; if not, you may need to add 'sudo' to the commands to get root privileges. I will show you the step-by-step installation of the MongoDB NoSQL database on AlmaLinux 9. You can follow the same instructions for CentOS and Rocky Linux. (A small document-insert sketch appears after this list.)

  • An introduction (and how-to) to Plugin Loader for the Steam Deck. - Invidious
  • Self-host a Ghost Blog With Traefik

    Ghost is a very popular open-source content management system. It started as an alternative to WordPress and went on to become an alternative to Substack by focusing on memberships and newsletters. The creators of Ghost offer managed Pro hosting, but it may not fit everyone's budget. Alternatively, you can self-host it on your own cloud servers. On Linux Handbook, we already have a guide on deploying Ghost with Docker in a reverse proxy setup. Instead of an Nginx reverse proxy, you can also use another piece of software called Traefik with Docker. It is a popular open-source cloud-native application proxy, API gateway, edge router and more. I use Traefik to secure my websites using an SSL certificate obtained from Let's Encrypt. Once deployed, Traefik can automatically manage your certificates and their renewals. In this tutorial, I'll share the necessary steps for deploying a Ghost blog with Docker and Traefik. (A conceptual reverse-proxy sketch appears after this list.)
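
As a companion to the go1.19beta how-to above, here is a minimal sketch of how a fresh toolchain might be checked once it is installed: it prints the Go version the binary was built with and exercises the built-in concurrency the article mentions. The exact version string, and whether you run it through a go1.19beta1 wrapper from golang.org/dl or a plain go command, depends on how the toolchain was installed, so treat those details as assumptions.

    package main

    import (
        "fmt"
        "runtime"
        "sync"
    )

    func main() {
        // With a go1.19beta1 toolchain this is expected to report "go1.19beta1".
        fmt.Println("built with:", runtime.Version())

        // A small taste of Go's built-in concurrency: launch a few goroutines
        // and wait for all of them to finish.
        var wg sync.WaitGroup
        for i := 1; i <= 3; i++ {
            wg.Add(1)
            go func(n int) {
                defer wg.Done()
                fmt.Println("hello from goroutine", n)
            }(i)
        }
        wg.Wait()
    }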
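
To make the MongoDB how-to's point about JSON-like documents concrete, here is a small sketch using the official Go driver (go.mongodb.org/mongo-driver, v1 API assumed) to insert one document into a locally installed server on the default port. The database and collection names are invented for the example, and this is a sketch alongside the linked tutorial, not part of it.

    package main

    import (
        "context"
        "fmt"
        "log"
        "time"

        "go.mongodb.org/mongo-driver/bson"
        "go.mongodb.org/mongo-driver/mongo"
        "go.mongodb.org/mongo-driver/mongo/options"
    )

    func main() {
        ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
        defer cancel()

        // Connect to the MongoDB instance installed by the tutorial,
        // listening on the default port 27017.
        client, err := mongo.Connect(ctx, options.Client().ApplyURI("mongodb://localhost:27017"))
        if err != nil {
            log.Fatal(err)
        }
        defer client.Disconnect(ctx)

        // Instead of a row in a table, data goes in as a JSON-like document.
        coll := client.Database("exampledb").Collection("articles") // hypothetical names
        doc := bson.D{
            {Key: "title", Value: "MongoDB on AlmaLinux 9"},
            {Key: "tags", Value: bson.A{"nosql", "almalinux"}},
        }
        res, err := coll.InsertOne(ctx, doc)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println("inserted document with id:", res.InsertedID)
    }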
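
Finally, the Ghost article above revolves around putting a reverse proxy (Traefik) in front of the blog container. Traefik itself is configured rather than programmed, so the following is only a conceptual sketch, using Go's standard library, of what any reverse proxy does: accept requests on one address and forward them to a backend such as a Ghost container. The listen port and the backend address (Ghost's default container port is 2368) are assumptions for the example, and none of this replaces the Docker and Traefik configuration the tutorial describes.

    package main

    import (
        "log"
        "net/http"
        "net/http/httputil"
        "net/url"
    )

    func main() {
        // Assumed backend: a Ghost container listening on its default port.
        backend, err := url.Parse("http://127.0.0.1:2368")
        if err != nil {
            log.Fatal(err)
        }

        // Forward every incoming request to the backend. This is the core job
        // a reverse proxy performs; Traefik layers service discovery, routing
        // rules and automatic Let's Encrypt certificates on top of it.
        proxy := httputil.NewSingleHostReverseProxy(backend)

        log.Println("proxying :8080 -> 127.0.0.1:2368")
        log.Fatal(http.ListenAndServe(":8080", proxy))
    }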