
Kubuntu 6.06

Filed under: Linux, Reviews

With all the Ubuntu excitement of the past few days, it occurred to me that, being a KDE fan more so than a GNOME one, perhaps Kubuntu might be more my cup of tea. While perusing the downloads it also occurred to me that, hey, I have a 64-bit machine now! So I downloaded the Kubuntu 6.06 desktop amd64 ISO. Was it more appealing to a diehard KDE fan? Does 64-bit programming make much difference?

The boot is similar to Ubuntu's; in fact it's almost identical, except for the more attractive blue coloring, and instead of Ubuntu's goldish logo we have the blue Kubuntu one. Otherwise there didn't seem to be much difference until we reached the splash screen. As attractive as Ubuntu's GUI splash might be, Kubuntu's is much more so. It's clean and crisp, and I just personally prefer blue.


Once you reach the desktop, one finds a blue background of large, faint bubbles as the foundation for KDE 3.5.2. It is your basic KDE desktop consisting of K-apps for the most popular tasks; what's not included on the ISO is installable. Included graphics applications are Kooka, Krita, KPDF, and Gwenview. For Internet we find Akregator, Bluetooth chat, Konqueror, Konversation, Kopete, KPPP, KRDC, Krfb, KTorrent, and a wireless LAN manager. Multimedia includes amaroK, K3b, Kaffeine, KAudioCreator, KMix, and KsCD.


OpenOffice.org 2.0.2 rounds out the office category, along with Kontact, which KDE counts as an office application. There are plenty of system tools and utilities as well: utilities for software packaging, setting alarms, configuring groupware connections, managing print jobs, and calculating. System tools consist of KCron, Keep, KInfoCenter, KSysGuard, KSystemLog, Konsole, and QTParted.


On the desktop, as well as in the System menu, is an icon labeled Install. Installing to the hard drive is simplified compared to comparable Linux systems, and in this case it is very similar, if not identical, to the process found in Ubuntu. It starts with answering a few configuration questions such as language, timezone, keyboard, and user and machine names.


Next comes partitioning, if necessary, and setting the target partition and swap. Confirm the settings and press Install; all one does now is wait. It takes about 10 minutes for the installer to complete its work before asking if you'd like to reboot. That's it.


Like Ubuntu, the installer presumes you would like GRUB installed and doesn't bother to ask, and my first install attempt wasn't successful. The newly installed system would not boot; it just sat at the 'loading grub' screen blinking at me, in much the same manner as I encountered with the Ubuntu release candidate. After replacing GRUB with LILO, Kubuntu tried to boot, but lots of things failed, including the loading of needed modules and the start of the GUI. I booted the live CD and tried again, this time doing nothing else in the background, and achieved a bootable install. (The first time, I had been taking a bunch of screenshots.) I'm beginning to see a pattern emerge in all my installs of the Ubuntu family, and can sum it up in a few words of advice: do not do anything else while your new Ubuntu system installs. This of course detracts from the main advantage of using a live CD as an install medium, but on the other hand, the install takes such a short span of time that it's not a major sacrifice.

The installed system affords one the opportunity to install whatever applications one might need, as well as any third-party or proprietary drivers. (K)ubuntu software is installed through an application called Adept. Not only is it a software manager, it also takes care of system and security updates. In fact, one of the first things I saw when I booted Kubuntu for the first time was an icon in the system tray for Adept; clicking on it brought up an updater. Click to fetch the list of updates, and in a few seconds it will inform you if anything needs updating. In this case there were updates to the Adept software manager and GNOME install data. One can Apply Changes or Forget Changes and Quit. I clicked Apply Changes, and the updates were downloaded and installed in seconds without issue.


In the menu is an entry for Adept, which opens a window similar to Synaptic. You can search for specific packages by keyword, with tickable options, and right-click a package name to "Request Install." Then click the Apply Changes button, and your package, along with its dependencies, is downloaded and installed.


Clicking on "Add and Remove Programs" also brings up Adept, but in a different layout, one that lists available and installed applications by category. Ticking the little checkbox and clicking Apply Changes will install or remove the chosen programs.


Hardware detection was good, and pretty much everything worked out of the box. Kaffeine was able to play MPGs and the example files, but not AVIs. OpenOffice crashed and disappeared on my first attempt at using it, but functioned properly in all subsequent tests. The included KDE was a bit stripped down, with no games at all, but lots of choices are available through the software manager. The desktop itself was pretty, even if customized very little. Under the hood is a 2.6.15 kernel and X.Org 7.0, with GCC 4.0.3 installable.

The performance of the system was well above average. In fact, I'll just say it: that thing flies. Applications opened before I could move my mouse. There was no artifacting or delay in redrawing windows, no delay at all in switching between windows, no jerkiness when moving windows around. The menu popped right open without delay as well. The whole system felt light and nimble; I was quite impressed. Comparing the performance of KDE Kubuntu to GNOME Ubuntu is almost like comparing peaches to nectarines, and since I didn't test the x86 version of Kubuntu, I can't say with any authority or expertise that Kubuntu 64 outperforms the others. But I can say this is one of the fastest, if not the fastest, full-sized systems I've tested. Yes sir, Kubuntu was quite impressive.

Kubuntu 6.06 Review

Thanks for your review.
Just a few personal comments about this new release.

FIRST, THE BAD . . .

1) Yes, I profoundly dislike NOT being able to log in as root. Granted, I don't spend much time logged-in as user root, but when I'm going to do an extended session of system configuration and maintenance, it's the fastest, most efficient way. And, of course, Linux is supposed to be about choice.

So I googled "Kubuntu root login", and got to a Kubuntu forum where some user had asked the same question. In the forum, the next person had posted in reply (I'm paraphrasing here): I know how to enable root logins for Kubuntu, but I'm not going to tell you how because this is not a good idea.

I just couldn't believe my eyes. OSS is all about freedom and the ability to control your own machine. "THERE'S NO INFORMATION HIDING IN LINUX!!!" (Imagine Tom Hanks in the movie, A League of Their Own saying "There's no crying in baseball!")

As I sat there thinking about this, I began to think that these Ubuntu/Kubuntu folk are a different kind of folk than me--alien, strange, ungracious, and infuriating. All that wondering about the popularity of Ubuntu/Kubuntu increased. Why would anyone want to use a distro where a simple request for information was received with such an ignorant, uptight, shortsighted, narrowminded, hypocritical, and outright anal response?

Working myself up to a good simmer now, I then looked at the next post in the forum, where a user quietly and considerately explained exactly how to do it. OK, maybe these Kubuntu folk aren't jerks. There's too much rush to judgment and stereotyping going on in our world anyway.

2) I'm not too familiar with Debian, or Debian based distros--so this one is probably my fault. I couldn't get Nvidia's accelerated drivers working properly. I have an older 17" LCD monitor that still works great, but it's very fussy about sync rates. I can usually tinker around and get things working, but no go here. I admit I was impatient here, and if I'd spent more time, I could have gotten it to work.

3) Development compilers and libraries are not included with the basic live CD install. There is such a thing as designing a distro for beginners, but not treating them like idiots. Can't find a package for your distro that works? Then, you can get the source and compile your own. Basic tools to do this, in my opinion, should always be included with the basic install of a Linux distro.

NOW, THE GOOD.

1) I like the graphical package installer, Adept. It works well, and is well designed and integrated.

2) Despite the heavy load the Ubuntu/Kubuntu servers must have been undergoing with a new release, they were quick and responsive.

3) I agree with everything srlinux had to say in her (typically excellent) review, particularly when she says Kubuntu is very fast and responsive. I installed the amd64 version, and the speed was excellent.

4) Kubuntu very courteously found all the other distros on my hard disks, and added them to the Grub boot menu. If all distros would do this, installing and trying out multiple distros would certainly be easier.

Well, that's it. I think it might be interesting to see an in-depth comparison of the latest MEPIS (ver. 6, RC4) to this new Kubuntu release.

Gary Frankenbery

re: Kubuntu 6.06 Review

gfranken wrote:

Thanks for your review.
Just a few personal comments about this new release.

FIRST, THE BAD . . .

1) Yes, I profoundly dislike NOT being able to log in as root. Granted, I don't spend much time logged-in as user root, but when I'm going to do an extended session of system configuration and maintenance, it's the fastest, most efficient way. And, of course, Linux is supposed to be about choice.

Yeah, I didn't like that one too much either, especially when the Ubuntus hit the pipes. I'm used to it now, I guess, and it only annoys me slightly when I forget to sudo a command. I found that just setting a root password will fix that, tho.
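For anyone wanting to try that workaround, it amounts to a one-liner. This is a sketch of a typical session on an Ubuntu-family system, run from your regular sudo-capable account:

```shell
# Give the locked root account a password, enabling direct root logins.
sudo passwd root

# To undo it later and re-lock the root account:
sudo passwd -l root
```

Locking with `passwd -l` restores the default (K)ubuntu behavior, where root has no usable password and everything goes through sudo.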

Teehee on the Tom Hanks thing. Big Grin

gfranken wrote:

3) Development compilers and libraries are not included with the basic live CD install. There is such a thing as designing a distro for beginners, but not treating them like idiots. Can't find a package for your distro that works? Then, you can get the source and compile your own. Basic tools to do this, in my opinion, should always be included with the basic install of a Linux distro.

Yeah, that used to really irk me too. But like the sudo thing, I'm getting used to it; I'm seeing it more and more in distros these days. A comment on another site said to install build-essential to get gcc, make, and friends. It was nice seeing that comment before my installs; that way I didn't have to get all annoyed at it. Big Grin

Thanks for your kind words for me, I really appreciate that.

----
You talk the talk, but do you waddle the waddle?

It's not as bad as it looks

1) Yes, I profoundly dislike NOT being able to log in as root.

I must admit that I created a root password on the first machines where I installed Ubuntu (sudo passwd). But now I just leave the root user locked and use (sudo su) to get a shell with root rights, exiting it again after some system maintenance. It is possibly more secure, since script kiddies cannot log in as root via ssh (not installed by default, but the first thing I usually install) even if they try really hard.
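A minimal sketch of that workflow, assuming a stock (K)ubuntu install where the root account stays locked:

```shell
# Open a root shell without ever unlocking the root account.
sudo su -     # on Ubuntu, "sudo -i" is an equivalent shortcut

# ... perform your system maintenance here ...

exit          # drop root privileges when you're done
```

Because root never gets a password of its own, a remote attacker guessing ssh logins has no root account to brute-force; they would have to compromise a regular user first.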

2) I couldn't get Nvidia's accelerated drivers working properly.

After installing the nvidia package (apt-get install nvidia-glx), I usually run the reconfiguration of X.Org with (dpkg-reconfigure xserver-xorg); this creates a new /etc/X11/xorg.conf file, including sync ranges. This tool always shows the options chosen the last time it was run, so it is easy to use multiple times to tweak things.
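Spelled out as a session (with sudo added, on the assumption you are not already root; nvidia-glx is the package name as it stood in the Ubuntu 6.06 repositories):

```shell
# Install the binary NVIDIA driver from the repositories.
sudo apt-get install nvidia-glx

# Regenerate /etc/X11/xorg.conf interactively; the prompts include
# monitor horizontal sync and vertical refresh ranges.
sudo dpkg-reconfigure xserver-xorg
```

If the generated ranges still don't suit a fussy monitor, the HorizSync and VertRefresh lines in the Monitor section of /etc/X11/xorg.conf can also be edited by hand to match the monitor's documented limits.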

3) Development compilers and libraries are not included with the basic live CD install.

It is easy to correct (apt-get install build-essential), but many people don't need those tools. The current repositories are packed with all kinds of useful software. It is very seldom that I revert to Debian unstable to get the sources of something and repackage it to create an Ubuntu version of the same project.
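As a quick sketch (again assuming a non-root shell, hence the sudo):

```shell
# Pull in gcc, g++, make, and the libc development headers in one go.
sudo apt-get install build-essential

# Sanity check that a compiler toolchain is now available:
gcc --version
make --version
```

build-essential is a meta-package, so removing it later does not remove the compilers it pulled in; they have to be removed individually if you want them gone.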

Re: It's not as bad as it looks

Quote:
After installing the nvidia package (apt-get install nvidia-glx), I usually run the reconfiguration of X.Org with (dpkg-reconfigure xserver-xorg); this creates a new /etc/X11/xorg.conf file, including sync ranges. This tool always shows the options chosen the last time it was run, so it is easy to use multiple times to tweak things.
Thanks Jurgen, for taking the time to explain the nvidia thing. My weakness when configuring Debian based distros is definitely showing here.

Also thanks for your clarification on installing the development packages.

Kubuntu is definitely on my radar. It was certainly blazingly fast.

Regards,
Gary Frankenbery
Computer Science Teacher
Grants Pass High School, Oregon, USA
