Ancient Greece had its Great Explainers, one of whom was Plato. The open source community has its Great Explainers, one of whom is Michael Tiemann.
Ten stories up, in a conference room at Red Hat's Raleigh, NC headquarters, Tiemann is prognosticating. The place affords the kind of scope he relishes: broad, sweeping, stretched to a horizon that (this morning, anyway) seems bright. As the company's VP of Open Source Affairs explains what differentiates an open source software company from other firms in a crowded market, he exhibits the idiosyncrasy that has marked his writing for decades: the tendency to pepper his exposition of open source principles with pithy maxims from a diverse range of philosophers, politicians, political economists, and popular writers. It's a habit born, he says, of the necessity of finding something that resonates with the many skeptics he's confronted over the years; necessity, he quips (quoting Plato, of course), is the mother of invention.
MapR's Big Data platform, based on open source Apache Hadoop, gained the endorsement of Amazon Web Services (AWS), which has included the company's software as the first Hadoop distribution in the new AWS Partner Network (APN) Competency Program.
Specifically, Amazon has deemed MapR a Big Data Competency Partner. The title is awarded to APN partners that "have demonstrated success in helping customers evaluate and use the tools, techniques, and technologies of working with data productively, at any scale," according to AWS.
The appeal of open source solutions to government agencies around the world is not surprising: these solutions address concerns that had previously kept governments from reaping the full benefits of the cloud, including security, governance, and data transparency. The number of countries actively using open source in their infrastructure is a testament to how well the model suits public-sector IT systems.
The real motivation for Sandstorm is, and always has been, making it possible for open source and indie developers to build successful web apps.
In today's popular software-as-a-service model, indie development simply is not viable. People do it anyway, but their software is not accessible to the masses. In order for low-budget software to succeed, and in order for open source to make any sense at all, users must be able to run their own instances of the software, at no cost to the developer. We've always had that on desktop and mobile. When it comes to server-side apps, hosting must be decentralized.
Brescia said that Bitnami's goal is to make deploying an application on a server as easy as installing an application on an endpoint computer. Bitnami's software library offers more than 90 open source applications and development environments that can be deployed as one-click installer packages on desktops, on virtual machines, and in the cloud.
A PRESENTATION by the European nuclear research organisation CERN at the recent Open Source Convention (OSCON) has provided a glimpse of where IT organisations will have to go to remain competitive: away from legacy proprietary approaches and toward open source.
CERN collects huge volumes of data every day from thousands of detectors at its particle collider ring, located under the border between France and Switzerland near Geneva. It organises and archives all of this data and distributes much of it to research scientists throughout the world over high-speed internet links. It presently maintains 100 petabytes of legacy data under management, and collects another 35 petabytes in every year of operation. One petabyte comprises one million gigabytes.
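To put those figures in perspective, here is a minimal sketch of the arithmetic, using only the numbers quoted above (a 100-petabyte baseline plus 35 petabytes per year) and assuming, purely for illustration, that the collection rate stays linear:

```python
# Sketch: project the size of CERN's archive from the figures in the text.
# Assumption (not from the source): growth stays linear at 35 PB/year.

PB_IN_GB = 1_000_000  # one petabyte is one million gigabytes (decimal units)

def projected_archive_pb(years_of_operation: int,
                         baseline_pb: int = 100,
                         rate_pb_per_year: int = 35) -> int:
    """Archive size in petabytes after a further number of years of operation."""
    return baseline_pb + rate_pb_per_year * years_of_operation

print(projected_archive_pb(0))              # today's holdings: 100 PB
print(projected_archive_pb(5))              # five years out: 275 PB
print(projected_archive_pb(5) * PB_IN_GB)   # the same figure in gigabytes
```

At that rate the archive would nearly triple in five years, which is why the presentation framed commodity open source infrastructure as the only economically viable way to keep up.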