
OSI, Third Decade of Open Source, 20 Years and Counting

Filed under: OSS
  • Why I want you to run for the OSI Board

    In the world of tech, we fit across three generations of contributors to free and open source software: those who were involved in the early days of free software; those who found places in the community after open source had been established; and the group paultag humorously dubbed the GNU generation (none of us have lived in a world without the explicit concept of user freedom).

    Within my cadre of FOSS-loving millennials, several of us have fairly similar stories, both inside our FOSS lives and out: we all had formative life experiences of financial hardship, and tech helped us emerge into comfortable, middle-class lifestyles. We’re all community-focused and have worked as community managers. We’ve been finalists for the same jobs.

    That is to say, while we have different opinions and different outlooks, we all come from fairly similar places.

  • FOSDEM: The Third Decade of Open Source

    This weekend I delivered the opening keynote at FOSDEM in Brussels. My subject was “The Third Decade of Open Source”: as OSI President, I summed up the main events of the last 20 years and some of the key facts behind them, then offered five trends that will shape the next decade.

  • Open Source Software: 20 Years and Counting

    When the decision was made to adopt the label open source, a rift opened up within the free software movement. Adherents of the traditional values, Stallman in particular, viewed the Open Source Initiative as pandering to corporate interests, concerned purely with the marketability of the idea and less with its social and ethical values.

    The debate still rages on: in 2016, Richard Stallman posted on the GNU website that “open source misses the point of free software” and that “supporters of open source considered the term a marketing campaign for free software…while not raising issues of right and wrong that they might not like to hear.”

    Disagreements aside, the value of open source to the tech industry over the past twenty years has been immense. Having fuelled a generation of thinkers and tinkerers and a whirlwind of technological advances, it will continue to grow and shape our digital future.

    In an increasingly digitized world, the core values of the movement are ones that we should consider as we move forward.

More in Tux Machines

Benchmarks on GNU/Linux

  • Linux vs. Windows Benchmark: Threadripper 2990WX vs. Core i9-7980XE Tested
    The last chess benchmark we’re going to look at is Crafty, and again we’re measuring performance in nodes per second. Interestingly, the Core i9-7980XE wins out here and saw the biggest performance uplift when moving to Linux: a 5% increase, as opposed to just 3% for the 2990WX, which made the Intel CPU 12% faster overall.
  • Which is faster, rsync or rdiff-backup?
    As our data grows (some filesystems have ballooned to over 800GB, with many small files), our nighttime backups have started running into the morning, causing serious disk I/O problems as our users wake up and regular usage rises. For years we have implemented a conservative backup policy: each server runs the backup twice, once via rdiff-backup to the onsite server, with 10 days of increments kept, and a second time via rsync to our offsite backup servers for disaster recovery.

    Simple, I thought. I would change the rdiff-backup to the onsite server to use the ultra-fast and simple rsync, then use borgbackup to create an incremental backup from the onsite backup server to our offsite backup servers. Piece of cake. And with each server running only one backup instead of two, they should complete in record time.

    Except, somehow, the rsync backup to the onsite backup server was taking almost as long as the original rdiff-backup to the onsite server and the rsync backup to the offsite server combined. What? I thought nothing was faster than the awesome simplicity of rsync, especially compared to the ancient Python-based rdiff-backup, which hasn't had an upstream release since 2009. (A rough sketch of the old and new flows follows below.)
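    For concreteness, here is a minimal sketch of the old two-backup policy and the proposed single-backup policy, written as a small Python wrapper around the tools named above. The hostnames, paths, and names (SRC, ONSITE_RSYNC, ONSITE_RDIFF, OFFSITE_RSYNC, BORG_REPO) are hypothetical, and the flags, retention window, and archive naming are illustrative assumptions rather than the author's exact configuration.

        #!/usr/bin/env python3
        # Sketch of the backup flows described above. All hosts and paths are
        # hypothetical; the rsync/rdiff-backup/borg invocations are standard,
        # but the retention and naming choices are assumptions.
        import subprocess

        SRC = "/srv/data/"                       # hypothetical data to protect
        ONSITE_RSYNC = "backup1:/backups/host/"  # rsync remote (single colon)
        ONSITE_RDIFF = "backup1::/backups/host"  # rdiff-backup remote (double colon)
        OFFSITE_RSYNC = "offsite1:/backups/host/"
        BORG_REPO = "offsite1:/backups/borg"     # borg repo on the offsite server

        def run(cmd):
            """Echo and execute one backup command, failing loudly on error."""
            print("+", " ".join(cmd))
            subprocess.run(cmd, check=True)

        def old_policy():
            """Original setup: each server backs up twice per night."""
            run(["rdiff-backup", SRC, ONSITE_RDIFF])  # onsite, incremental
            run(["rdiff-backup", "--remove-older-than", "10D",
                 ONSITE_RDIFF])                       # keep 10 days of increments
            run(["rsync", "-a", "--delete", SRC, OFFSITE_RSYNC])  # offsite mirror

        def new_policy():
            """Proposed setup: each server backs up once, onsite forwards offsite."""
            run(["rsync", "-a", "--delete", SRC, ONSITE_RSYNC])   # plain onsite mirror
            # Then, run from the onsite backup server: incremental push offsite.
            run(["borg", "create",
                 BORG_REPO + "::{hostname}-{now}", "/backups/host"])

        if __name__ == "__main__":
            new_policy()  # or old_policy()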

OSS Leftovers

  • Haiku: R1/beta1 release plans - at last
    At last, R1/beta1 is nearly upon us. As I’ve already explained on the mailing list, only two non-“task” issues remain in the beta1 milestone, and I have prototype solutions for both. The buildbot and other major services have been rehabilitated and will need only minor tweaking to handle the new branch, and mmlr has been massaging the HaikuPorter buildmaster so that it, too, can handle the new branch, though that work is not quite finished yet.
  • Haiku OS R1 Beta Is Finally Happening In September
    It's been five years since the last Haiku OS alpha for the inaugural "R1" release, but next month it looks like the first beta will finally arrive, sixteen years after this BeOS-inspired open-source operating system began development.
  • IBM Scores More POWER Open-Source Performance Optimizations
    Following our POWER9 Linux benchmarks earlier this year, IBM POWER engineers have continued exploring various areas for optimization within the interesting open-source workloads tested. Another batch of optimizations is pending for various projects.
  • DevConf.in 2018
    Earlier this month, I attended the DevConf.in 2018 conference in Bengaluru, KA, India. It was sort of the culmination of a cohesive team effort that began for me at DevConf.cz 2018 in Brno, CZ. I say “sort of” because the team is already gearing up for DevConf.in 2019.
  • The Unitary Fund: a no-strings attached grant program for Open Source quantum computing
    Quantum computing has the potential to be a revolutionary technology, from the first applications in cryptography and database search to more modern applications across simulation, optimization, and machine learning. This promise has led industrial, government, and academic efforts in quantum computing to grow globally: posted jobs in the field have grown six-fold in the last two years, and quantum computing hardware and platforms, designed by startups and tech giants alike, continue to improve. Now there are new opportunities to discover how best to program and use these new machines. As I wrote last year: the first quantum computers will need smart software.

    Quantum computing also remains a place where small teams and open research projects can make a big difference. The open nature is important, as Open Source software has the lowest barriers for others to understand, share, and build upon existing projects. In a new field that needs to grow, this rapid sharing and development is especially important. I’ve experienced this myself through leading the Open Source Forest project at Rigetti Computing and by watching the growing ecosystem of open projects like QISKit, OpenFermion, ProjectQ, Strawberry Fields, XaCC, Cirq, and many others. The hackathons and community efforts from around the world are inspiring.
  • SiFive Announces First Open-Source RISC-V-Based SoC Platform With NVIDIA Deep Learning Accelerator Technology
    SiFive, the leading provider of commercial RISC-V processor IP, today announced the first open-source RISC-V-based SoC platform for edge inference applications based on NVIDIA's Deep Learning Accelerator (NVDLA) technology.