
My Linux Story

Filed under
Linux

My Introduction to Computers

I got my first computer around 1992, after the personal computing hobby had been around quite a while. I know I felt so far behind. I remember feeling so lost and thinking I'd never know what I was doing. But it didn't really matter, because I never dreamed of doing anything actually computer related. The truth was, I just wanted a nice word processor from which I could print nice papers. I was in college, and typewritten papers were required; typing, for me, meant spending more time correcting errors than on actual research and writing. A computer seemed like the perfect answer.

I bought a second-hand Tandy 2000 from Radio Shack. It had everything and only cost $500. It had some version of DOS on it, and the computer guy put a few programs on it for me. Well, I used it without ever thinking to check for a modem until I quit college in my junior year of nursing school. That's another story, but two words will suffice - sponge bath! Well, and two more - dying patients. I would never be happy nursing.

But I returned to college in Fall 1998 and realized my old Tandy was a dinosaur. The papers the other students turned in weren't dot matrix. They had fancy fonts and pictures! I had to get a new computer.

I purchased my second computer at the local army base PX for $500, a Pionex I believe. It wasn't top of the line, but about second best. It came with Windows 98 and excellent tech support. It wasn't long before their friendly support staff was walking me through modem replacement, driver installation, and ultimately reformats and reinstalls.

I wasn't a year into using it before I became disenchanted and paranoid. Very paranoid. I feel so silly now, but I was so worried that some "hacker" would break into my machine in the few minutes I spent on the dial-up connection that I bogged the machine down with a firewall, antivirus, and a spyware hunter. Does grc.com mean anything to anyone? I had hacked this file and that file trying to keep my private information private and cover my tracks, and I was reformatting about every three months. That's what it was averaging: about every three months. Then someone mentioned Linux.

Enter Linux

I spent much of 2000 trying to convert to Linux. I tried SUSE, Red Hat, and Mandrake among a few others. I always tucked tail and went back to Windows. ...until Mandrake 7.2 hit Wal-Mart shelves in the Fall of 2000.

All of a sudden I could get a decent screen resolution, and KDE 1.99 (the 2.0 beta it shipped with) was actually usable. ...or understandable. There was a list of applications in the menu instead of a bunch of directories (anyone remember the old 1.x menus?). Well, I was inspired enough to try to get my modem and sound working. See, everything else was working just fine, either out of the box or using the Mandrake Control Center. I never went back to Windows again.

Well, it took a week of booting to Windows to look something up on the internet and booting back to Mandrake to try something before my modem finally dialed, but I've never felt so excited and proud in all my life.

My ISA sound card took another week, but it was much easier now. I could search from Mandrake and by then someone had said the fateful word "Google" to me. It took a bit more voodoo to get ISA cards to work, but within another week I was listening to system sounds and music files.

Well, with the release of the 2.4 kernel that same fall, I was able to use NVIDIA's proprietary drivers, and I was well on my path of discovery. I spent a few years learning Linux and helping others on the Mandrake mailing lists and Usenet newsgroups.

Broadband had arrived during 2001, and I had my firewall making me invisible, but mainly I felt this big sigh of relief. My paranoia was gone, and with it the weight of the world. I loved Linux, and the command line made sense. I never was able to form any patterns or retain all those mouse-clicking routines used in Windows, but I could understand what was going on in that terminal. I was home.

I switched to Gentoo in 2003 and started this website in 2005. The rest is probably history.

It's All in the Timing

I was lucky to have switched to Linux when I did. I tried lots of other distros just for fun. I began experimenting with howtos I'd run into. Back then howtos weren't "apt-get this, tick this box, and click OK." Back then they were 'get the source here, and the patch here, and this one here, and open this file and replace this function with that one, then do this and that to this config file, and then recompile this other program using this patch and edit its config file, then patch the kernel...' Usually there was no mention of dependencies; you discovered them when you started to compile. Texstar and I used to write back and forth - "Hey hey! I got this to work! Here's a screenshot." Those were the days. It's all too easy now.

But that was considerably easier than it had been in the '90s. This is why I have such respect for the really old guys that have been using Linux since "real men write their own drivers" and why Linus is my hero. This is why I shake my head at those who ask, "Is Linux ready for Prime Time?"

Hell yeah, it's ready - and has been for quite a while.


What's your story?

Join Don, Lisa, and me and share your Linux story. When, why, and how did you switch to Linux? Have you switched recently, have you been around since the '90s, or are you somewhere in between? Share your story.

You can comment here or use your own TuxBlog, free with your account here at Tuxmachines.org.

Don't be shy!

Year 2000

That's when I got exposed to it too... and to better options, better choice.

A very good story!

I too did something with my PC when I got pissed off at Windows... I got to work with Ubuntu 8.04. The rest is a little similar to your story... I got so excited working with Ubuntu because you know what is actually happening, with greater transparency than in Windows. You can see my love story with my PC after installing Linux here

Su
http://harshasrisri.wordpress.com

More in Tux Machines

KVM and Xen Project: Commercial Exploitation and Unikraft Work

  • Cloud, Linux vendors cash in on KVM-based virtualization

    Vendors such as Red Hat, IBM, Canonical, and Google rely on KVM-based virtualization technology for many of their virtualization products because it enables IT administrators to execute multiple OSes on the same hardware. As a result, it has become a staple in IT admins' virtual systems. KVM was first announced in October 2006 and was added to the mainline Linux kernel in February 2007, which means that if admins are running a Linux machine, they can run KVM out of the box. KVM is a Type 1 hypervisor, which means that each individual VM acts much like a regular Linux process, with resources allocated accordingly. Other Type 1 hypervisors include Citrix XenServer, Microsoft Hyper-V, Oracle VM Server for x86, and VMware ESXi.

  • Unikraft: Building Powerful Unikernels Has Never Been Easier!

    Two years ago, the Xen Project introduced Unikraft (http://unikraft.org) as an incubation project. Over the past two years, the Unikraft project has seen some great momentum. Since the last release, the community has grown about 20% and contributions have diversified a great deal. Contributions from outside the project founders (NEC) now make up 63% of all contributions, up from about 25% this time last year! In addition, a total of 56,739 lines were added since the last release (0.3). [...] Finally, the Unikraft team’s Simon Kuenzer recently gave a talk at FOSDEM titled “Unikraft: A Unikernel Toolkit”. Simon, a senior systems researcher at NEC Labs and the lead maintainer of Unikraft, spoke all about Unikraft and provided a comprehensive overview of the project, where it’s been and what’s in store.
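The KVM item above notes that KVM ships in the mainline kernel; on a running Linux system the hypervisor is exposed through the /dev/kvm device node, so its availability can be checked in a few lines. A minimal sketch, assuming nothing beyond the standard library (the helper name is mine, not from the article):

```python
import os


def kvm_available(dev: str = "/dev/kvm") -> bool:
    """Report whether the KVM device node is present and accessible.

    On a mainline Linux kernel, /dev/kvm exists when the kvm module is
    loaded and the CPU's virtualization extensions (Intel VT-x or AMD-V)
    are enabled; userspace hypervisors like QEMU open it for read/write.
    """
    return os.path.exists(dev) and os.access(dev, os.R_OK | os.W_OK)
```

On a non-Linux machine or with virtualization disabled in firmware, the check simply returns False.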

Gopher: When Adversarial Interoperability Burrowed Under the Gatekeepers' Fortresses

In the early 1990s, personal computers did not arrive in an "Internet-ready" state. Before students could connect their systems to UMN's network, they needed to install basic networking software that allowed their computers to communicate over TCP/IP, as well as dial-up software for protocols like PPP or SLIP. Some computers needed network cards or modems, and their associated drivers. That was just for starters. Once the students' systems were ready to connect to the Internet, they still needed the basic tools for accessing distant servers: FTP software, a Usenet reader, a terminal emulator, and an email client, all crammed onto a floppy disk (or two). The task of marshalling, distributing, and supporting these tools fell to the university's Microcomputer Center.

For the university, the need to get students these basic tools was a blessing and a curse. It was labor-intensive work, sure, but it also meant that the Microcomputer Center could ensure that the students' newly Internet-ready computers were also configured to access the campus network and its resources, saving the Microcomputer Center thousands of hours talking students through the configuration process. It also meant that the Microcomputer Center could act like a mini App Store, starting students out on their online journeys with a curated collection of up-to-date, reliable tools.

That's where Gopher comes in. While the campus mainframe administrators had plans to selectively connect their systems to the Internet through specialized software, the Microcomputer Center had different ideas. Years before the public had heard of the World Wide Web, the Gopher team sought to fill the same niche, by connecting disparate systems to the Internet and making them available to those with little-to-no technical expertise—with or without the cooperation of the systems they were connecting. Gopher used text-based menus to navigate "Gopherspace" (all the world's public Gopher servers).
The Microcomputer Center team created Gopher clients that ran on Macs, DOS, and in Unix-based terminals. The original Gopher servers were a motley assortment of used Macintosh IIci systems running A/UX, Apple's flavor of Unix. The team also had access to several NeXT workstations.
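The text-based menus the article describes were simple enough to be parsed by clients on all of those platforms. Per RFC 1436, each menu line is a one-character item type plus a display string, followed by a selector, host, and port, separated by tabs. A minimal, illustrative parser (the names here are mine, not from the article):

```python
from typing import NamedTuple


class GopherItem(NamedTuple):
    item_type: str   # one-character type code, e.g. "0" text file, "1" menu
    display: str     # human-readable label shown in the menu
    selector: str    # string the client sends to retrieve the item
    host: str
    port: int


def parse_menu_line(line: str) -> GopherItem:
    """Parse one Gopher menu line (RFC 1436).

    Format: type character + display string, then selector, host,
    and port, separated by tabs and terminated by CRLF.
    """
    type_and_display, selector, host, port = line.rstrip("\r\n").split("\t")
    return GopherItem(type_and_display[0], type_and_display[1:],
                      selector, host, int(port))
```

A full client would then open a TCP connection to that host and port, send the selector followed by CRLF, and read the response.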

IBM/Red Hat and POWER9/OpenBMC

  • Network Automation: Why organizations shouldn’t wait to get started

    For many enterprises, we don't need to sing the praises of IT automation - they already get it. They understand the value of automation, have invested in a platform and strategy, and have seen first-hand the benefits IT automation can deliver. However, according to a new report from Forrester Research, network automation is still new territory for many organizations. The report, "Jump-Start Your Network Automation," found that 56% of global infrastructure technology decision makers have implemented, are implementing, or are expanding or upgrading their implementation of automation software, while another 19% plan to implement it over the next 12 months. But those same organizations that are embracing IT automation haven't necessarily been able to take the same initiative when it comes to automating their networks. Even if they know it will be beneficial, the report found, organizations often struggle with even the most basic questions around automating their networks.

  • Using a story’s theme to inform the filmmaking: Farming for the Future

    The future of farming belongs to us all. At least that’s the message I got from researching Red Hat’s most recent Open Source Stories documentary, Farming for the Future. As a self-proclaimed city boy, I was intrigued by my assignment as director of the short documentary, but also felt like the subject matter was worlds away. If it did, in fact, belong to all of us how would we convey this to a general audience? How could we use the film’s theme to inform how we might approach the filmmaking to enhance the storytelling?

  • Raptor Rolls Out New OpenBMC Firmware With Featureful Web GUI For System Management

    While web-based GUIs for system management on server platforms with BMCs are far from new, Raptor Computing Systems, with their libre POWER9 systems, now has a fully functioning web-based solution for their OpenBMC-powered systems that remains fully open source. As part of Raptor Computing Systems' POWER9 desktops and servers being fully open source down to the firmware/microcode and board designs, Raptor has used OpenBMC for the baseboard management controllers but, until now, has lacked a full-featured web-based system management solution on the likes of the Talos II and Blackbird systems.

  • Introduction to open data sets and the importance of metadata

    More data is becoming freely available through initiatives such as institutions and research publications requiring that data sets be made freely available along with the publications that refer to them. For example, Nature magazine instituted a policy requiring authors to declare how the data behind their published research can be accessed by interested readers. To make it easier for tools to find out what's in a data set, authors, researchers, and suppliers of data sets are being encouraged to add metadata to their data sets. Data sets use various forms of metadata. For example, the US Government data.gov site uses the standard DCAT-US Schema v1.1, whereas the Google Dataset Search tool relies mostly on schema.org tagging. However, many data sets have no metadata at all. That's why you won't find all open data sets through search; you need to go to known portals and explore whether portals exist for the region, city, or topic of your interest. If you are deeply curious about metadata, you can see the alignment between DCAT and schema.org in the DCAT specification dated February 2020. The data sets themselves come in various forms for download, such as CSV, JSON, GeoJSON, and .zip. Sometimes data sets can be accessed through APIs. Another way that data sets are becoming available is through government initiatives to make data available. In the US, data.gov has more than 250,000 data sets available for developers to use. A similar initiative in India, data.gov.in, has more than 350,000 resources available. Companies like IBM sometimes provide access to data, like weather data, or give tips on how to process freely available data. For example, an introduction to NOAA weather data for JFK Airport is used to train the open source Model Asset eXchange Weather Forecaster (you can see the model artifacts on GitHub). When developing a prototype or training a model during a hackathon, it's great to have access to relevant data to make your solution more convincing.
There are many public data sets available to get you started. I'll go over some of the ways to find them and cover access considerations. Note that some of the data sets might require some pre-processing before they can be used, for example, to handle missing data, but for a hackathon, they are often good enough.

  • Red Hat Helps Omnitracs Redefine Logistics And Transportation Software

    Fleet management technology provider Omnitracs, LLC, has delivered its Omnitracs One platform on the foundation of Red Hat OpenShift. Using the enterprise Kubernetes platform along with Red Hat Ansible Automation Platform, Omnitracs One is a cloud-native offering and provides an enhanced user experience with a clear path towards future innovations. With Red Hat’s guidance, Omnitracs said it was able to embrace a shift from on-premises development technologies to cloud-native services, improving overall operations and creating a more collaborative development process culture.
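To make the metadata discussion in the open data item above concrete, here is what minimal schema.org tagging for a data set can look like, built as a Python dict and serialized to JSON-LD. `Dataset` and `DataDownload` are real schema.org types; the field values and URL are invented for illustration:

```python
import json

# Illustrative schema.org "Dataset" description in JSON-LD, the tagging
# style the Google Dataset Search tool relies on. Values are examples only.
dataset_metadata = {
    "@context": "https://schema.org/",
    "@type": "Dataset",
    "name": "JFK Airport hourly weather observations",
    "description": "Hourly NOAA weather observations for JFK Airport.",
    "license": "https://creativecommons.org/publicdomain/zero/1.0/",
    "distribution": [
        {
            "@type": "DataDownload",
            "encodingFormat": "CSV",
            "contentUrl": "https://example.org/data/jfk-weather.csv",
        }
    ],
}

print(json.dumps(dataset_metadata, indent=2))
```

Embedded in a page inside a `<script type="application/ld+json">` tag, a block like this is what lets search tools discover what a data set contains and how to download it.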
