
Why everybody should use GNU/Linux, and how?


GNU/Linux is getting bigger and bigger. Microsoft’s recent patent threats are definitely helping GNU/Linux gain mainstream popularity. Unfortunately, new users are often unsure why they should actually use GNU/Linux, or how to go about the transition. Hopefully, this article will fill that gap!

Why should everybody use GNU/Linux?

Number one reason: it’s fun. Windows XP has been around for so long, and changed so little, that it’s boring. Vista looks slightly more interesting, but it’s expensive and won’t run on a lot of hardware (including mine!). GNU/Linux caught up with XP a few years ago and beat Vista to features like 3D desktops and windows with variable transparency. Even if you stick to the basics, it’s fun to learn a new operating system and new applications. It’s especially fun when it doesn’t cost you anything.

Reason two, therefore: it’s free.

Reason three: community.

More Here.

More in Tux Machines

Mesa 10.3 release candidate 2

Mesa 10.3 release candidate 2 is now available for testing. The current plan of record is to have an additional release candidate each Friday until the 10.3 release on Friday, September 12th. The tag in the GIT repository for Mesa 10.3-rc2 is 'mesa-10.3-rc2'. I have verified that the tag is in the correct place in the tree. Mesa 10.3 release candidate 2 is available for download at ftp://freedesktop.org/pub/mesa/10.3/ Read more

Linux 3.17-rc3

I'm back to the usual Sunday release schedule, and -rc3 is out there now. As expected, it is larger than rc2, since people are clearly getting back from their Kernel Summit travels etc. But happily, it's not *much* larger than rc2 was, and there's nothing particularly odd going on, so I'm going to just ignore the whole "it's summer" argument, and hope that things are just going that well. Please don't prove me wrong. Linus Read more

Revisiting How We Put Together Linux Systems

Traditional Linux distributions are built around packaging systems like RPM or dpkg, and an organization model where upstream developers and downstream packagers are relatively clearly separated: an upstream developer writes code, and puts it somewhere online, in a tarball. A packager then grabs it and turns it into RPMs/DEBs. The user then grabs these RPMs/DEBs and installs them locally on the system. For a variety of uses this is a fantastic scheme: users have a large selection of readily packaged software available, in mostly uniform packaging, from a single source they can trust. In this scheme the distribution vets all software it packages, and as long as the user trusts the distribution all should be good. The distribution takes on the responsibility of ensuring the software is not malicious, of fixing security problems in a timely manner, and of helping the user if something goes wrong. Read more

See How Your Linux System Performs Against The Latest Intel/AMD CPUs

This holiday weekend (in the US) can be a great time to benchmark your Linux system against the latest AMD and Intel processors and see whether it's time for an upgrade. This weekend I'm working on many Linux CPU benchmarks for the upcoming Linux review of the Intel Core i7 5960X Haswell-E system (still waiting for Intel's review sample to arrive though...) and also have some other hardware in preparation for an unrelated launch that's happening next week from another vendor. I'm testing several different Intel/AMD CPUs, from the latest desktop CPUs to the Extreme Edition models to some slightly older parts. Beyond the raw performance results, there are also power consumption data and much more. Read more