
Threats to Linux: Expertise and acceptance

Filed under
Linux

Do you know what most large Solaris installations have in common? Mismanagement. What seems to happen is that the people in charge got there on the basis of large-system experience in the eighties and then forcefully apply that expertise whether or not it's appropriate to the technology. That's what happened to a lot of large business projects started on Solaris in the mid-to-late nineties; it's why there was a resurgence in mainframe sales as these projects were written off in 2001 and 2002, and why there's now a threat that the same thing is about to happen with Linux.

Linux installations, so far, have mainly been compromised by the expertise evolved to cope with the day-to-day emergencies associated with managing Microsoft's products. I think that's about to change as the big guys grab "the coming thing" and try to twist it into what they already know.

Look at Linux implementations in (bigger) business or government, and in the majority of cases what you see is people trying to treat it as a one-for-one substitute for Windows - producing rackmounts stuffed with PCs, all individually licensed from Red Hat, all running one application each, and all routinely shut down for patch installation and "preventative reboot."

It's not that the people doing this are dishonest or incompetent - quite the contrary: they're honestly doing what they've been taught to do. It's just that they haven't internalized the fundamental truth that Unix isn't Windows, and so they think their expertise applies. In reality, Linux isn't as good a Windows product as Windows is, so the net effect is generally to increase costs to the employer while decreasing benefits.

The mainframers all want to virtualize or partition, despite the fact that these technologies address problems that don't exist on Unix. The Windows generation wants to use lockdowns, proxies, anti-virus software, and the rackmount approach to SMP for the same reason: these are the things they know how to do and therefore the things they will do - and so what if the problems these solutions address don't exist in Linux.

It's insanely frustrating to hold a conversation with someone who's deeply committed to this kind of technological miscegenation. Typically you're dealing with someone who looks and sounds like a decent human being you'd be happy to have as a friend or neighbour - until you hit the job spot and what spews out are absolute certainties made up of absolute nonsense.

Recently, for example, I found myself explaining to a bunch of Windows people that DHCP started as Sun's bootp support for diskless devices, entered the Windows world as a means of temporarily assigning an IP address to a Windows 3.11 PC so it could be used to access the internet, and became unnecessary, and therefore inappropriate, for fixed network installations when Microsoft finally adopted TCP/IP.

These were bright people, honest and competent in their own way, but I would have won more converts arguing for the replacement of email by trained mice scurrying around carrying digitally inscribed slices of well-aged lunar cheese. As a group they agreed that it would be a good idea to use non-routable addresses internally, but nothing was going to change their true and certain knowledge that address allocations must be handled through DHCP.
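
The non-routable addressing the group agreed on is the set of private ranges reserved by RFC 1918. As a minimal sketch (the function name is illustrative, not from the article), here is how you might check whether an internal address falls in one of those ranges using Python's standard ipaddress module:

```python
import ipaddress

# The three RFC 1918 private (non-routable) ranges
PRIVATE_RANGES = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_rfc1918(addr: str) -> bool:
    """Return True if addr lies in an RFC 1918 private range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in PRIVATE_RANGES)

print(is_rfc1918("192.168.1.10"))  # True: private, won't route on the internet
print(is_rfc1918("8.8.8.8"))       # False: a public, routable address
```

On a fixed internal network like the one under discussion, addresses in these ranges can simply be assigned statically per host; DHCP adds a moving part without solving any problem such a network actually has.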

What's going on with them, and their mainframe predecessors, is management by knowledge accretion - the setting in stone of managerial reflexes gained through thirty years of experience and applied, unchanged, to technology they've never seen before.

As a process, accretion works well for making sandstone, but it's not so smart for IT management - and the consequences are usually bad for the technologies involved because the people responsible for the resulting failures blame the tool far more often than they blame themselves.

By Paul Murphy
ZDNet
