Apache Spark has been an integral part of Mesos from its inception. Spark is one of the most widely used big data processing systems for clusters. Matei Zaharia, the CTO of Databricks and creator of Spark, talked about Spark's advanced data analysis power and new features in its upcoming 2.0 release in his MesosCon 2016 keynote.

Spark's Design Goals
Spark was created to meet two needs: to provide a unified engine for big data processing, and a concise high-level API for working with big data.
Two years ago, the Raspberry Pi seemed as if it might be eclipsed by a growing number of open-spec single board computer projects with faster processors, Android support, and more extensive features. But then the Raspberry Pi reestablished its lead with two great leaps forward.
It wasn't that long ago that the idea of managing petabytes of data and monitoring giant, busy computing clusters running thousands of services was something for the future, not the now. As it turned out, that was a mighty short future, and it's all happening now. These talks from MesosCon North America show how two different companies are solving configuration and data management issues with Apache Mesos and other tools.
To create sustained high performance, organizations must invest as much in their people and processes as they do in their technology, according to Puppet’s 2016 State of DevOps Report.
The 50+ page report, written by Alanna Brown, Dr. Nicole Forsgren, Jez Humble, Nigel Kersten, and Gene Kim, aimed to better understand how the technical practices and cultural norms associated with DevOps affect IT and organizational performance as well as ROI.
Container technology remains very big news, and if you bring up the topic almost everyone immediately thinks of Docker. But, there are other tools that can compete with Docker, and tools that can extend it and make it more flexible. CoreOS’s Rkt, for example, is a command-line tool for running app containers.
A recurring theme in our MesosCon North America 2016 series is solving difficult resource provisioning problems. The days of investing days or even weeks in spec’ing, acquiring, and setting up hardware and software to meet increased workloads are long gone. Now we see vast provisioning adjustments taking place in seconds.
Good code is cheap; it’s operational knowledge that’s holding back big data from solving the great problems of our time.
Solving those operational difficulties with a modular, easy-to-use system was the solution Mark Shuttleworth laid out in his keynote entitled “More Fun, Less Friction” at Apache Big Data in Vancouver in May.
“If we take the friction out, we can unleash all sorts of creativity,” Shuttleworth said.
Several home automation platforms support Python as an extension, but if you’re a real Python fiend, you’ll probably want Home Assistant, which places the programming language front and center.
This Week in Linux News: OSS Opportunity For New Grads, Why Cloud Foundry is Gaining Traction, & More
Open source knowledge is very valuable in today's job market. The 2016 Open Source Jobs Report from The Linux Foundation clearly showed that hiring managers are placing a high value on open source cloud, networking, and security skills. It also showed that DevOps is emerging as a red-hot job category.
CoreOS Linux, an open source Linux operating system, is now available in China. Microsoft Azure operator 21Vianet has become the first officially supported cloud provider to offer CoreOS Linux in China. Until now, many Chinese organizations have deployed CoreOS Linux internally, on their own.
Verizon Labs is building some impressive projects around Apache Mesos and relies on a lot of open source software for functionality: operating systems, networking, provisioning, monitoring, and administration. Open source software is popular at Verizon Labs because it gives them the flexibility and the functionality to do what they want to do, without fighting vendor restrictions.