Linux.com

News For Open Source Professionals

‘Master,’ ‘Slave’ and the Fight Over Offensive Terms in Computing (Kate Conger, New York Times, April 13, 2021)

Friday 7th of May 2021 12:14:49 AM

Nearly a year after the Internet Engineering Task Force took up a plan to replace words that could be considered racist, the debate is still raging.

Anyone who joined a video call during the pandemic probably has a global volunteer organization called the Internet Engineering Task Force to thank for making the technology work. The group, which helped create the technical foundations of the internet, designed the language that allows most video to run smoothly online. It made it possible for someone with a Gmail account to communicate with a friend who uses Yahoo, and for shoppers to safely enter their credit card information on e-commerce sites.

Now the organization is tackling an even thornier issue: getting rid of computer engineering terms that evoke racist history, like “master” and “slave” and “whitelist” and “blacklist.”

But what started as an earnest proposal has stalled as members of the task force have debated the history of slavery and the prevalence of racism in tech. Some companies and tech organizations have forged ahead anyway, raising the possibility that important technical terms will have different meanings to different people — a troubling proposition for an engineering world that needs broad agreement so technologies work together.

While the fight over terminology reflects the intractability of racial issues in society, it is also indicative of a peculiar organizational culture that relies on informal consensus to get things done.

The Internet Engineering Task Force eschews voting, and it often measures consensus by asking opposing factions of engineers to hum during meetings. The hums are then assessed by volume and ferocity. Vigorous humming, even from only a few people, could indicate strong disagreement, a sign that consensus has not yet been reached.

The I.E.T.F. has created rigorous standards for the internet and for itself. Until 2016, it required the documents in which its standards are published to be precisely 72 characters wide and 58 lines long, a format adapted from the era when programmers punched their code into paper cards and fed them into early IBM computers.

“We have big fights with each other, but our intent is always to reach consensus,” said Vint Cerf, one of the founders of the task force and a vice president at Google. “I think that the spirit of the I.E.T.F. still is that, if we’re going to do anything, let’s try to do it one way so that we can have a uniform expectation that things will function.”

The group is made up of about 7,000 volunteers from around the world. It has two full-time employees, an executive director and a spokesman, whose work is primarily funded by meeting dues and the registration fees of dot-org internet domains. It cannot force giants like Amazon or Apple to follow its guidance, but tech companies often choose to do so because the I.E.T.F. has created elegant solutions for engineering problems.

Its standards are hashed out during fierce debates on email lists and at in-person meetings. The group encourages participants to fight for what they believe is the best approach to a technical problem.

While shouting matches are not uncommon, the Internet Engineering Task Force is also a place where young technologists break into the industry. Attending meetings is a rite of passage, and engineers sometimes leverage their task force proposals into job offers from tech giants.

In June, against the backdrop of the Black Lives Matter protests, engineers at social media platforms, coding groups and international standards bodies re-examined their code and asked themselves: Was it racist? Some of their databases were called “masters” and were surrounded by “slaves,” which received information from the masters and answered queries on their behalf, preventing them from being overwhelmed. Others used “whitelists” and “blacklists” to filter content.

Mallory Knodel, the chief technology officer at the Center for Democracy and Technology, a policy organization, wrote a proposal suggesting that the task force use more neutral language. Invoking slavery was alienating potential I.E.T.F. volunteers, and the terms should be replaced with ones that more clearly described what the technology was doing, argued Ms. Knodel and the co-author of her proposal, Niels ten Oever, a postdoctoral researcher at the University of Amsterdam. “Blocklist” would explain what a blacklist does, and “primary” could replace “master,” they wrote.

On an email list, responses trickled in. Some were supportive. Others proposed revisions. And some were vehemently opposed. One respondent wrote that Ms. Knodel’s draft tried to construct a new “Ministry of Truth.”

Amid insults and accusations, many members announced that the battle had become too toxic and that they would abandon the discussion.

The pushback didn’t surprise Ms. Knodel, who had proposed similar changes in 2018 without gaining traction. The engineering community is “quite rigid and averse to these sorts of changes,” she said. “They are averse to conversations about community comportment, behavior — the human side of things.”

In July, the Internet Engineering Task Force’s steering group issued a rare statement about the draft from Ms. Knodel and Mr. ten Oever. “Exclusionary language is harmful,” it said.

A month later, two alternative proposals emerged. One came from Keith Moore, an I.E.T.F. contributor who initially backed Ms. Knodel’s draft before creating his own. His draft cautioned that fighting over language could bottleneck the group’s work and argued for minimizing disruption.

The other came from Bron Gondwana, the chief executive of the email company Fastmail, who said he had been motivated by the acid debate on the mailing list.

“I could see that there was no way we would reach a happy consensus,” he said. “So I tried to thread the needle.”

Mr. Gondwana suggested that the group should follow the tech industry’s example and avoid terms that would distract from technical advances.

Last month, the task force said it would create a new group to consider the three drafts and decide how to proceed, and members involved in the discussion appeared to favor Mr. Gondwana’s approach. Lars Eggert, the organization’s chair and the technical director for networking at the company NetApp, said he hoped guidance on terminology would be issued by the end of the year.

The rest of the industry isn’t waiting. The programming community that maintains MySQL, a type of database software, chose “source” and “replica” as replacements for “master” and “slave.” GitHub, the code repository owned by Microsoft, opted for “main” instead of “master.”

In July, Twitter also replaced a number of terms after Regynald Augustin, an engineer at the company, came across the word “slave” in Twitter’s code and advocated change.

But while the industry abandons objectionable terms, there is no consensus about which new words to use. Without guidance from the Internet Engineering Task Force or another standards body, engineers decide on their own. The World Wide Web Consortium, which sets guidelines for the web, updated its style guide last summer to “strongly encourage” members to avoid terms like “master” and “slave,” and the IEEE, an organization that sets standards for chips and other computing hardware, is weighing a similar change.

Other tech workers are trying to solve the problem by forming a clearinghouse for ideas about changing language.

That effort, the Inclusive Naming Initiative, aims to provide guidance to standards bodies and companies that want to change their terminology but don’t know where to begin.

The group got together while working on an open-source software project, Kubernetes, which like the I.E.T.F. accepts contributions from volunteers. Like many others in tech, it began the debate over terminology last summer.

“We saw this blank space,” said Priyanka Sharma, the general manager of the Cloud Native Computing Foundation, a nonprofit that manages Kubernetes. Ms. Sharma worked with several other Kubernetes contributors, including Stephen Augustus and Celeste Horgan, to create a rubric that suggests alternative words and guides people through the process of making changes without causing systems to break. Several major tech companies, including IBM and Cisco, have signed on to follow the guidance.

Priyanka Sharma and several other tech workers in the Inclusive Naming Initiative came up with a rubric to suggest alternative words.
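
The rubric is, at its core, a mapping from flagged terms to suggested, more descriptive replacements. As a rough sketch only (the word list and checker below are illustrative, not the Initiative's actual tooling), such a check might look like this:

```python
# Hypothetical term checker inspired by the replacements mentioned in the
# article; the mapping and function are for illustration only.
import re

REPLACEMENTS = {
    "master": "main",
    "slave": "replica",
    "whitelist": "allowlist",
    "blacklist": "blocklist",
}

def flag_terms(text: str) -> list[tuple[str, str]]:
    """Return (found_term, suggested_replacement) pairs for a block of text."""
    findings = []
    for term, suggestion in REPLACEMENTS.items():
        # Match whole words, case-insensitively, so "Master" is flagged too.
        if re.search(rf"\b{term}\b", text, flags=re.IGNORECASE):
            findings.append((term, suggestion))
    return findings

if __name__ == "__main__":
    sample = "Point the slave at the master and update the whitelist."
    for term, suggestion in flag_terms(sample):
        print(f"found '{term}': consider '{suggestion}'")
```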

Although the Internet Engineering Task Force is moving more slowly, Mr. Eggert said it would eventually establish new guidelines. But the debate over the nature of racism — and whether the organization should weigh in on the matter — has continued on its mailing list.

In a subversion of an April Fools’ Day tradition within the group, several members submitted proposals mocking diversity efforts and the push to alter terminology in tech.

Two prank proposals were removed hours later because they were “racist and deeply disrespectful,” Mr. Eggert wrote in an email to task force participants, while a third remained up.

“We build consensus the hard way, so to speak, but in the end the consensus is usually stronger because people feel their opinions were reflected,” Mr. Eggert said. “I wish we could be faster, but on topics like this one that are controversial, it’s better to be slower.”

Kate Conger is a technology reporter in the San Francisco bureau, where she covers the gig economy and social media. @kateconger

The post ‘Master,’ ‘Slave’ and the Fight Over Offensive Terms in Computing (Kate Conger, New York Times, April 13, 2021) appeared first on Linux.com.

Open Mainframe Project Launches Call for Proposals for the 2nd Annual Open Mainframe Summit on September 22-23

Wednesday 5th of May 2021 10:00:00 PM

Registration for the Virtual Event is now Open

SAN FRANCISCO, May 5, 2021 – The Open Mainframe Project (OMP), an open source initiative that enables collaboration across the mainframe community to develop shared tool sets and resources, today announced plans for its 2nd annual Open Mainframe Summit, the premier mainframe event of 2021. The event, set for September 22-23, is open to students, developers, users and contributors of Open Mainframe projects from around the globe looking to learn, network and collaborate. As a virtual event again this year, Open Mainframe Summit will feature content tracks that tackle both business and technical strategies for enterprise development and deployment.

In Open Mainframe Project’s inaugural event last year, more than 380 registrants from 175 companies joined the two-day conference that featured 36 sessions. Some of the most popular sessions were the Women in Tech panel, COBOL sessions, new mainframer journey and project overview sessions for Ambitus, Feilong, Polycephaly, and Zowe. The event report can be found here and all of the videos can be watched here.

“Open Mainframe Project is becoming the gateway to all educational tools and initiatives that run some of the world’s biggest enterprise systems,” said John Mertic, Director of Program Management at the Linux Foundation. “For our inaugural event last year, we merely dipped our toes in the water as a new summit. This year, we’ll see more change makers speaking about open source innovation, creativity and diversity in mainframe related technologies. We look forward to igniting conversations that are going to positively impact all facets of mainframes.”

Call for Proposals

The Call for Proposals is now open and will be accepting submissions until July 16, 2021. Interested speakers can submit proposals in five tracks: business overview; Linux on Z; z/OS; education and training; and diversity, equity and inclusion. Options for presentations include lightning talks, 30-minute sessions and panel discussions.

A program committee, which will include maintainers, active community members and project leaders, will review and rate the proposals once all the submissions are in. This year, Open Mainframe Project welcomes Greg MacKinnon, Distinguished Engineer at Broadcom, Inc; Joe Winchester, Technical Staff Member at IBM; Kimberly Andersson, Director of Experience Design at Rocket Software; Stacey Miller, Product Marketing Manager at SUSE; and Harry Williams, Chief Technology Officer at Marist College as the 2021 Open Mainframe Summit program committee.

Submit a proposal here: https://events.linuxfoundation.org/open-mainframe-summit/program/cfp/.

Whether a company is a member or contributor of Open Mainframe Project or is sponsoring the event has no impact on whether talks from their developers will be selected. However, being a community leader does have an impact, as program committee members will often rate talks from the creators or leaders of an open source project more highly. Proposals should focus on work with an open source project, such as one of the Open Mainframe Project’s 18 hosted projects or working groups, that adds value to the ecosystem.

Conference registration for the online event is $50 for general attendance and $15 for academia. Registration is now open; click here to register.

Thank you Sponsors

Open Mainframe Summit is made possible with support from our Platinum Sponsors Broadcom Mainframe Software, Rocket Software, and SUSE; our Gold Sponsor Vicom Infinity; and our Academic and Community Sponsors CD Foundation and the Fintech Open Source Foundation (FINOS). To become a sponsor, click here.

For more about Open Mainframe Project, visit https://www.openmainframeproject.org/

About the Open Mainframe Project

The Open Mainframe Project is intended to serve as a focal point for deployment and use of Linux and Open Source in a mainframe computing environment. With a vision of Open Source on the Mainframe as the standard for enterprise class systems and applications, the project’s mission is to build community and adoption of Open Source on the mainframe by eliminating barriers to Open Source adoption on the mainframe, demonstrating value of the mainframe on technical and business levels, and strengthening collaboration points and resources for the community to thrive. Learn more about the project at https://www.openmainframeproject.org.

About The Linux Foundation

The Linux Foundation is the organization of choice for the world’s top developers and companies to build ecosystems that accelerate open technology development and commercial adoption. Together with the worldwide open source community, it is solving the hardest technology problems by creating the largest shared technology investment in history. Founded in 2000, The Linux Foundation today provides tools, training and events to scale any open source project, which together deliver an economic impact not achievable by any one company. More information can be found at www.linuxfoundation.org.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see its trademark usage page: www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

###

The post Open Mainframe Project Launches Call for Proposals for the 2nd Annual Open Mainframe Summit on September 22-23 appeared first on Linux.com.

Linux Foundation Launches Open Source Digital Infrastructure Project for Agriculture, Enables Global Collaboration Among Industry, Government and Academia

Wednesday 5th of May 2021 08:00:00 PM

AgStack Foundation will build and sustain the global data infrastructure for food and agriculture to help scale digital transformation and address climate change, rural engagement and food and water security

SAN FRANCISCO, Calif., May 5, 2021 –  The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced the launch of the AgStack Foundation, the open source digital infrastructure project for the world’s agriculture ecosystem. AgStack Foundation will improve global agriculture efficiency through the creation, maintenance and enhancement of free, reusable, open and specialized digital infrastructure for data and applications.

Founding members and contributors include leaders from both the technology and agriculture industries, as well as across sectors and geographies. Members and partners include Agralogics, Call for Code, Centricity Global, Digital Green, Farm Foundation, farmOS, HPE, IBM, Mixing Bowl & Better Food Ventures, NIAB, OpenTeam, Our Sci, Produce Marketing Association, Purdue University / OATS & Agricultural Informatics Lab, the University of California Agriculture and Natural Resources (UC-ANR) and University of California Santa Barbara SmartFarm Project.

“The global Agriculture ecosystem desperately needs a digital makeover. There is too much loss of productivity and innovation due to the absence of re-usable tools and data. I’m excited to lead this community of leaders, contributors and members – from across sectors and countries – to help build this common and re-usable resource – AgStack – that will help every stakeholder in global agriculture with free and open digital tools and data,” said Sumer Johal, Executive Director of AgStack.

Thirty-three percent of all food produced is wasted, while nine percent of the people in the world are hungry or undernourished. These societal drivers are compounded with legacy technology systems that are too slow and inefficient and can’t work across the growing and more complex agricultural supply chain. AgStack will use collaboration and open source software to build the 21st century digital infrastructure that will be a catalyst for innovation on new applications, efficiencies and scale.

AgStack consists of an open repository to create and publish models, free and easy access to public data, interoperable frameworks for cross-project use and topic-specific extensions and toolboxes. It will leverage existing technologies such as agriculture standards (AgGateway, UN-FAO, CAFA, USDA and NASA-AR); public data (Landsat, Sentinel, NOAA and Soilgrids); models (UC-ANR IPM); and open source projects like Hyperledger, Kubernetes, Open Horizon, Postgres, Django and more.

“We’re pleased to provide the forum for AgStack to be built and to grow,” said Mike Dolan, general manager and senior vice president of projects at the Linux Foundation. “It’s clear that by using open source software to standardize the digital infrastructure for agriculture, that AgStack can reduce cost, accelerate integration and enable innovation. It’s amazing to see industries like agriculture use open source principles to innovate.”

For more information about AgStack, please visit: http://www.agstack.org

Member/Partner Statements

Call for Code

“Through Call for Code and IBM’s tech-for-good programs, we’ve seen amazing grassroots innovation created by developers who build solutions to address local farming issues that affect them personally,” said Daniel Krook, IBM CTO for Call for Code. “As thriving, sustainable open source projects hosted at the Linux Foundation, applications like Agrolly and Liquid Prep have access to a strong ecosystem of partners and will be able to accelerate their impact through a shared framework of open machine learning models, data sets, libraries, message formats, and APIs such as those provided by AgStack.”

Centricity Global

“Interoperability means working together and open source has proven to be the most practical means of doing so. Centricity Global looks forward to bringing our teams, tools and applications to the AgStack community and to propelling projects that deliver meaningful value long-term,” said Drew Zabrocki, Centricity Global. “Now is the time to get things done. The docking concept at AgStack is a novel way to bring people and technology together under a common, yet sovereign framework; I see great potential for facilitating interoperability and data sovereignty in a way that delivers tangible value on the farm forward across the supply value chain.”

Digital Green

“The explosion of agri-tech innovations from large companies to startups to governments to non-profits represents a game changer for farmers in both the Global South and North.  At the same time, it’s critical that we build digital infrastructure that ensures that the impact of these changes enables the aspirations of those most marginalized and builds their resilience, particularly in the midst of climate change. We’re excited about joining hands with AgStack with videos produced by & for farmers and FarmStack, a secure data sharing protocol, that fosters community and trust and puts farmers back in the center of our food & agricultural system,” said Rikin Gandhi, Co-founder and Executive Director.

Farm Foundation

“The advancements in digital agriculture over the past 10 years have led to more data than ever before—data that can be used to inform business decisions, improve supply and demand planning and increase efficiencies across stakeholders. However, the true potential of all that data won’t be fully realized without achieving interoperability via an open source environment. Interoperable data is more valuable data, and that will lead to benefits for farmers and others throughout the food and ag value chain,” said Martha King, Vice President of Programs and Projects, Farm Foundation.

farmOS

“AgStack’s goal of creating a shared community infrastructure for agricultural datasets, models, frameworks, and tools fills a much-needed gap in the current agtech software landscape. Making these freely available to other software projects allows them to focus on their unique value and build upon the work of others. We in the farmOS community are eager to leverage these shared resources in the open source record keeping tools we are building together,” said Michael Stenta, founder and lead developer, farmOS.

HPE

“The world’s food supply needs digital innovation that currently faces challenges of adoption due to the lack of a common, secure, community-maintained digital infrastructure. AgStack – A Linux Foundation’s Project, is creating this much needed open source digital infrastructure for accelerating innovation. We at Hewlett Packard Enterprise are excited about contributing actionable insights and learnings to solve data challenges that this initiative can provide and we’re committed to its success!” said Janice Zdankus, VP, Innovation for Social Impact, Office of the CTO, Hewlett Packard Enterprise.

Mixing Bowl & Better Food Ventures

“There are a lot of people talking about interoperability; it is encouraging to see people jump in to develop functional tools to make it happen. We share the AgStack vision and look forward to collaborating with the community to enable interoperability at scale,” said Rob Trice, Partner, The Mixing Bowl & Better Food Ventures.

NIAB

“Climate change is a global problem and agriculture needs to do its part to reduce greenhouse gas emissions during all stages of primary production. This requires digital innovation and a common, global, community-maintained digital infrastructure to create the efficient, resilient, biodiverse and low-emissions food production systems that the world needs. These systems must draw on the best that precision agriculture has to offer and aligned innovations in crop science, linked together through open data solutions. AgStack – A Linux Foundation Project, is creating this much needed open-source digital infrastructure for accelerating innovation. NIAB are excited to join this initiative and will work to develop a platform that brings together crop and data science at scale. As the UK’s fastest growing, independent crop research organization NIAB provides crop science, agronomy and data science expertise across a broad range of arable and horticultural crops,” said Dr Richard Harrison, Director of NIAB Cambridge Crop Research.

OpenTEAM

“Agriculture is a shared human endeavor and global collaboration is necessary to translate our available knowledge into solutions that work on the ground necessary to adapt and mitigate climate change, improve livelihoods, and biodiversity as well as the produce of abundant food fiber and energy.  Agriculture is at the foundation of manufacture and commerce and AgStack represents a collaborative effort at a scale necessary to meet the urgency of the moment and unlock our shared innovative capacity through free, reusable, open digital infrastructure.  OpenTEAM is honored to join with the mission to equip producers with tools that both support data sovereignty for trusted transactions while also democratizing site specific agricultural knowledge regardless of scale, culture or geography,” said Dr. Dorn Cox, project lead and founder of Open Technology Ecosystem for Agricultural Management and research director for Wolfe’s Neck Center for Agriculture & the Environment.

Our Sci

“AgStack provides a framework for a scalable base of open source software, and the shared commitment to keep it alive and growing.  We’re excited to see it succeed!” said Greg Austic, owner, Our Sci.

Produce Marketing Association

“The digitization of data will have tremendous benefits for the Fresh Produce and Floral industry in the areas of traceability, quality management, quality prediction and other efficiencies through supply chain visibility. The key challenge to adoption is interoperability and the development of a common, community-maintained digital infrastructure. I am confident that AgStack – A Linux Foundation’s Project, can create this much needed open-source digital infrastructure for accelerating innovation. We at Produce Marketing Association are excited about this initiative and we are committed to its success,” said Ed Treacy, VP of Supply Chain and Sustainability.

Purdue University

“We need fundamental technical infrastructure to enable open innovation in agriculture, including ontologies, models, and tools. Through the AgStack Project, the Linux Foundation will provide valuable cohesion and development capacity to support shared, community-maintained infrastructure. At the Agricultural Informatics Lab, we’re committed to enabling resilient food and agricultural systems through deliberate design and development of such infrastructure,” said Ankita Raturi, Assistant Professor, Agricultural Informatics Lab, Purdue University.

“True interoperability requires a big community and we’re excited to see the tools that we’ve brought to the open-source ag community benefiting new audiences.  OATS Center at Purdue University looks forward to docking the Trellis Framework for supply chain, market access and regulatory compliance through AgStack for the benefit of all,” said Aaron Ault, Co-Founder OATS Center at Purdue University.

UC Davis

“Translating 100+ years of UC agricultural research into usable digital software and applications is a critical goal in the UC partnership with the AgStack open source community. We are excited about innovators globally using UC research and applying it to their local crops through novel digital technologies,” said Glenda Humiston, VP of Agriculture and Natural Resources, University of California.

“Artificial Intelligence and Machine Learning are critical to food and agriculture transformation, and will require new computational models and massive data sets to create working technology solutions from seed to shelf. The AI Institute for Next Generation Food Systems is excited to partner with the AgStack open source community to make our work globally available to accelerate the transformation,” said Ilias Tagkopoulos, Professor, Computer Science at UC Davis and Director, AI Institute of Next Generation Food Systems.

About the Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, and more.  The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

###

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page:  https://www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

Media Contact

Jennifer Cloer
for Linux Foundation
503-867-2304
jennifer@storychangesculture.com

The post Linux Foundation Launches Open Source Digital Infrastructure Project for Agriculture, Enables Global Collaboration Among Industry, Government and Academia appeared first on Linux.com.

10 great sysadmin articles you might have missed from April 2021

Tuesday 4th of May 2021 08:40:49 PM

The best of April 2021 from Enable Sysadmin. Thank you to our contributors and to our readers.

April 2021 was a great month for Enable Sysadmin. We published 30 articles and received 549,684 pageviews from over 370k unique visitors. Today, we are looking back at our top ten articles to give readers a chance to catch up on any of the great content they may have missed. The list covers a variety of topics, and we are confident that some, if not all, will be of interest to you.

Topics: Linux, Linux Administration, Automation
Read More at Enable Sysadmin

The post 10 great sysadmin articles you might have missed from April 2021 appeared first on Linux.com.

Linux Foundation & CNCF Launch Free Kubernetes on Edge Training

Tuesday 4th of May 2021 03:00:36 PM

Offered on the edX.org learning platform, the new online course explores use cases and applications of Kubernetes at the edge

SAN FRANCISCO, May 4, 2021 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, and Cloud Native Computing Foundation (CNCF), which builds sustainable ecosystems for cloud native software, today at KubeCon + CloudNativeCon Europe (Virtual) announced the availability of a new online training course on edX.org, the online learning platform founded by Harvard and MIT. The course, Introduction to Kubernetes on Edge with K3s (LFS156x), takes a deep dive into the use cases and applications of Kubernetes at the edge using examples, labs, and a technical overview of the K3s project and the cloud native edge ecosystem.

In this 15-hour course, participants will learn the use cases for running compute in edge locations and about various supporting projects and foundations such as LF Edge and CNCF. The course covers how to deploy applications to the edge with open source tools such as K3s and k3sup, and how those tools can be applied to low-power hardware such as the Raspberry Pi. Students will learn the challenges associated with edge compute, such as partial availability and the need for remote access. Through practical examples, students will gain experience deploying applications to Kubernetes and get hands-on with object storage, MQTT, and OpenFaaS. The course also introduces the fleet management and GitOps models of deployment, and helps students understand messaging and how to interface with sensors and real hardware.
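
The labs themselves live in the course; purely as an illustration of the kind of edge messaging it covers, a small publisher running on a Raspberry Pi could look roughly like the sketch below. The broker hostname, topic, and sensor reading are invented for the example, and it assumes the paho-mqtt client library is installed.

```python
# Minimal sketch of publishing a sensor reading over MQTT from an edge
# device such as a Raspberry Pi. Hostname, topic, and reading are placeholders.
import json
import random
import paho.mqtt.publish as publish  # pip install paho-mqtt

BROKER_HOST = "k3s-node.local"      # hypothetical MQTT broker reachable from the device
TOPIC = "sensors/room1/temperature"

def read_temperature() -> float:
    # Stand-in for a real sensor read (e.g., via GPIO or I2C on a Pi).
    return round(20.0 + random.random() * 5.0, 2)

payload = json.dumps({"celsius": read_temperature()})
publish.single(TOPIC, payload, hostname=BROKER_HOST, port=1883)
print(f"published {payload} to {TOPIC}")
```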

LFS156x is designed primarily for developers who need to learn about the growing impact the cloud native movement is having on modernizing edge deployments, though others working with Kubernetes or edge computing will find the content of use.

The course was developed by Alex Ellis, a CNCF Ambassador and the Founder of OpenFaaS and inlets. Ellis is a respected expert on serverless and cloud native computing. He founded OpenFaaS, one of the most popular open-source serverless projects, where he has built the community via writing, speaking, and extensive personal engagement. As a consultant and CNCF Ambassador, he helps companies around the world navigate the cloud native landscape and build great developer experiences. Ellis also authored the existing Introduction to Serverless on Kubernetes (LFS157x) course.

“K3s fills a very specific need and helps lower the barrier to entry for development and operation teams,” said Alex Ellis, Founder of Inlets and OpenFaaS, CNCF Ambassador. “I’ve seen the project grow from Darren’s initial post on Hacker News, to a GA, production-ready Kubernetes distribution housed within CNCF. I’m excited to share this course with the community and customers alike, and am looking forward to seeing increased use of Kubernetes at the edge.”

Introduction to Kubernetes on Edge with K3s is available to begin immediately. Auditing the course through edX is free for ten weeks, or participants can opt for a paid verified certificate of completion, which provides access to the course for a full year and additional assessments and content to deepen the learning experience. 

About the Cloud Native Computing Foundation

Cloud native computing empowers organizations to build and run scalable applications with an open source software stack in public, private, and hybrid clouds. The Cloud Native Computing Foundation (CNCF) hosts critical components of the global technology infrastructure, including Kubernetes, Prometheus, and Envoy. CNCF brings together the industry’s top developers, end users, and vendors, and runs the largest open source developer conferences in the world. Supported by more than 500 members, including the world’s largest cloud computing and software companies, as well as over 200 innovative startups, CNCF is part of the nonprofit Linux Foundation. For more information, please visit www.cncf.io

About the Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, and more. The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see its trademark usage page: www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

# # #

The post Linux Foundation & CNCF Launch Free Kubernetes on Edge Training appeared first on Linux Foundation – Training.

The post Linux Foundation & CNCF Launch Free Kubernetes on Edge Training appeared first on Linux.com.

2021 State of the Edge Report Shows Impact of COVID-19 on Edge Use

Tuesday 4th of May 2021 12:47:37 AM

Christine Hall writes in ITPRo:

A recent edge report by the Linux Foundation concluded that COVID-19 has changed the prognosis on which industries will have the largest edge computing architecture footprint going forward.

“Our 2021 analysis shows demand for edge infrastructure accelerating in a post-COVID-19 world,” said Matt Trifiro, co-chair of State of the Edge and CMO of the edge data center company Vapor IO, in a statement. “We’ve been observing this trend unfold in real-time as companies re-prioritize their digital transformation efforts to account for a more distributed workforce and a heightened need for automation. The new digital norms created in response to the pandemic will be permanent. This will intensify the deployment of new technologies like wireless 5G and autonomous vehicles, but will also impact nearly every sector of the economy, from industrial manufacturing to healthcare.”

Read more at ITPro

The post 2021 State of the Edge Report Shows Impact of COVID-19 on Edge Use appeared first on Linux.com.

Open-source software economics and community health analytics: Enter CHAOSS

Monday 3rd of May 2021 04:21:57 PM

George Anadiotis at ZDNet writes:

CHAOSS stands for Community Health Analytics Open Source Software. It’s a Linux Foundation project, and its roots go back 15 years. A research team at the University of Juan-Carlos in Madrid, Spain, was trying to understand how software is being built in open source.

There was no tooling to help them do that, so they built their own open-source software. That was the foundation of what is now called GrimoireLab: A set of free, open-source software tools for software development analytics.

The tools gather data from several platforms involved in software development (Git, GitHub, Jira, Bugzilla, Gerrit, Mailing lists, Jenkins, Slack, Discourse, Confluence, StackOverflow, and more), merge and organize it in a database, and produce visualizations, actionable dashboards, and analytics.
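
As an illustration of that data-gathering step, the sketch below uses Perceval, GrimoireLab's collection component, to pull commits from a Git repository. The repository URL and clone path are placeholders, and the exact item fields should be verified against the Perceval version in use.

```python
# Minimal sketch of gathering commit data with Perceval (pip install perceval),
# the GrimoireLab tool that pulls raw data from Git, GitHub, Jira, and more.
from perceval.backends.core.git import Git

REPO_URL = "https://github.com/chaoss/grimoirelab-perceval.git"  # example repository
CLONE_PATH = "/tmp/perceval-demo.git"                             # local working clone

repo = Git(uri=REPO_URL, gitpath=CLONE_PATH)

# Each fetched item is a dict; the raw commit lives under the "data" key.
for item in repo.fetch():
    commit = item["data"]
    print(commit["commit"][:8], commit["Author"])
```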

Read more at ZDNet

The post Open-source software economics and community health analytics: Enter CHAOSS appeared first on Linux.com.

Certification Exam Prices Increase July 1 – Lock in Current Pricing

Friday 30th of April 2021 09:00:41 PM

Since we launched our first certification exam in August of 2014, all Linux Foundation performance-based certification exams have been priced at $300. To address the rise in costs associated with administering these exams, we will be implementing a modest price increase effective July 1, 2021. 

All performance-based exams will increase in price from $300 to $375. Bundles of performance-based certifications and their associated training courses will increase from $499 to $575. Bootcamp pricing will also increase from $999 to $1,200, and the Linux Foundation Certified IT Associate (LFCA) knowledge-based exam will increase from $200 to $250. We continue to provide the industry’s only free-retake guarantee (an automatic second attempt on most exams if your first is unsuccessful), and we are in the process of adding other features for exam takers, such as an enhanced interface and exam simulation labs.

We strive to make quality open source certifications as accessible as possible, so we want to provide plenty of notice; the old pricing will remain in place through June 30th.

Don’t forget that our Linux exams and training courses including LFCA, LFCE and LFCS are discounted 30% through the end of 2021 in recognition of the 30th anniversary of Linux. Use code LINUX30 at checkout to take advantage of these savings.

The post Certification Exam Prices Increase July 1 – Lock in Current Pricing appeared first on Linux.com.

May the Fourth be with you via Podman

Thursday 29th of April 2021 07:47:37 PM

A unique approach to rewatching the original Star Wars movie in a container.
Read More at Enable Sysadmin

The post May the Fourth be with you via Podman appeared first on Linux.com.

The Linux Foundation Announces Open Source Summit + Embedded Linux Conference 2021 Will Move From Dublin, Ireland to Seattle, Washington

Wednesday 28th of April 2021 07:40:00 AM

Calls for Speaking Proposals close June 13; OSPOCon and Linux Security Summit will also move to Seattle; all events will take place September 27 – October 1

SAN FRANCISCO, April 27, 2021 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, announced today that Open Source Summit + Embedded Linux Conference 2021, along with Linux Security Summit and OSPOCon, will take place in Seattle, Washington, USA, from September 27 – October 1.

Earlier in the year, it was announced that instead of separate North America and Europe editions of Open Source Summit + Embedded Linux Conference (OS Summit + ELC), only one would be held in 2021, located in Dublin, Ireland. The decision to move these events from Dublin, Ireland to Seattle, Washington, USA, has been made due to the current state of vaccination rates in Europe and upon review of past attendee survey results regarding where and when they would feel comfortable traveling this year.  

OS Summit + ELC will be held in a hybrid format, with both in-person and virtual offerings, to ensure that everyone who wants to participate is able to do so.

KVM Forum, which was also scheduled to take place in Dublin, will now be a virtual event taking place September 15 – 16. New details on Linux Plumbers Conference and Linux Kernel Maintainer Summit, also previously scheduled in Dublin, will be announced shortly. A second OSPOCon, OSPOCon Europe, will be held in London on October 6, 2021, with more details coming soon.

Registration for all events will open in June, after more details on local regulations and venue safety plans are available. 

Calls for Speaking Proposals
The Calls for Speaking Proposals for OS Summit + ELC and OSPOCon are open through Sunday, June 13 at 11:59pm PDT. Interested community members are encouraged to apply here. Speakers will be able to speak in person or remotely.

Linux Security Summit’s Call for Proposals is open through Sunday, June 27 at 11:59pm PDT.  Applications are being accepted here.

Sponsorships
Sponsorships are available for all events. Benefits include speaking opportunities, prominent branding, opportunities to support diversity and inclusion, lead generation activities, event passes, and more. View the sponsorship prospectus here or email us to learn more.  

Open Source Summit + Embedded Linux Conference 2021 is made possible thanks to Diamond Sponsors IBM and Red Hat, Platinum Sponsor Huawei and Gold Sponsor Soda Foundation, among others. For information on becoming an event sponsor, click here.

OSPOCon is presented by The Linux Foundation and the TODO Group and is made possible by Host Sponsors Eclipse Foundation and Huawei, and Supporter Sponsor Sauce Labs. For information on becoming an event sponsor, click here

Linux Security Summit is made possible by General Sponsor Technology Innovation Institute, and Supporter Sponsors IBM and Indeed. For information on becoming a sponsor, click here

Members of the press who would like to request a media pass should contact Kristin O’Connell at koconnell@linuxfoundation.org

About The Linux Foundation
Founded in 2000, The Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation projects are critical to the world’s infrastructure, including Linux, Kubernetes, Node.js, and more.  The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users, and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

Follow The Linux Foundation on Twitter, Facebook, and LinkedIn for all the latest news, event updates and announcements.

The Linux Foundation Events are where the world’s leading technologists meet, collaborate, learn and network in order to advance innovations that support the world’s largest shared technologies.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page: https://www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

####

Media Contact:

Kristin O’Connell
The Linux Foundation
koconnell@linuxfoundation.org

The post The Linux Foundation Announces Open Source Summit + Embedded Linux Conference 2021 Will Move From Dublin, Ireland to Seattle, Washington appeared first on Linux.com.

BPF: Application Development and libbpf

Tuesday 27th of April 2021 10:00:00 PM

Notes on BPF: Building applications using libbpf
Click to Read More at Oracle Linux Kernel Development

The post BPF: Application Development and libbpf appeared first on Linux.com.

Introducing the Production Engineering Track of the MLH Fellowship, powered by Facebook

Tuesday 27th of April 2021 09:27:07 PM

This article was originally posted at Major League Hacking

This Summer, Major League Hacking (MLH) is launching the Production Engineering Track of the MLH Fellowship, powered by Facebook. This is a 12-week educational program that will use industry-leading curriculum from Linux Foundation Training & Certification and hands-on, project-based learning to teach students how to become Production Engineers. The program will provide opportunities for 100 aspiring software engineers to broaden their skills and career options by learning important Production Engineering and DevOps skills. The program will run from June 7 to August 30, 2021 and will be open to students based in the United States, Mexico, and Canada who are enrolled in a 4-year degree-granting program. Applications are open and will run until May 28, 2021.

What is Production Engineering anyway?

Early career software engineers are passionate and motivated to learn new skills and create a positive impact on the world, but many have not been exposed to the wider array of career options that are available to them. Production Engineering, also known as Site Reliability Engineering and DevOps, is one of the most in-demand skill sets that leading technology companies are hiring for, yet it is not widely available as an educational option in university settings. 

Production Engineers (PEs) at Facebook are a hybrid between software and systems engineers and are core to engineering efforts that keep Facebook running smoothly and scaling efficiently. PEs work within Facebook’s product and infrastructure teams to make sure their products and services are reliable and scalable. This means writing code and debugging hard problems in production systems across Facebook services like Instagram, WhatsApp, and Oculus as well as backend services like Storage, Cache and Network.

What is the Production Engineering Track of the MLH Fellowship?

Initially launched in Summer 2020, the MLH Fellowship pairs early career software engineers with widely used open source projects like React, Jest, Docusaurus. This gives them the opportunity to apply their knowledge to real world projects, which helps them learn important concepts and patterns while also having production-level code to showcase in their portfolio. Through the Fellowship, MLH has created opportunities for hundreds of developers from around the world to level up and hone their skills in a 12-week cohort surrounded by peers and expert mentors. The Production Engineering Track takes this proven model and expands it to a broader range of technology disciplines, creating even more valuable opportunities for developers starting off their careers. 

Program participants will gain practical skills thanks to educational content from Linux Foundation Training & Certification’s LFS201 – Essentials of System Administration training course, which covers how to administer, configure and upgrade Linux systems, along with the tools and concepts necessary to efficiently build and manage a production Linux infrastructure. By pairing this industry leading curriculum with hands-on project-based learning, students in the Production Engineering Track can build on their foundational software engineering knowledge to learn a broader array of technology skills and open the door to a variety of exciting and challenging new career options. Creating these types of non-traditional learning experiences helps  empower developers from many diverse backgrounds and educational institutions to accelerate their careers and make an impact on the world.

Who is eligible?

The Production Engineering Track starts in June and guarantees a $3,600 educational stipend to each participant. Applications are open now until May 28, 2021. Eligible students are rising sophomores or juniors who are based in the United States, Mexico, or Canada, enrolled in a 4-year degree-granting program, and able to code. MLH invites and encourages people to apply who identify as women or non-binary. MLH also invites and encourages people to apply who identify as Black/African American or LatinX. In partnership with Facebook, MLH is committed to building a more diverse and inclusive tech industry and providing learning opportunities to under-represented technologists.

Learn more and apply

The post Introducing the Production Engineering Track of the MLH Fellowship, powered by Facebook appeared first on Linux.com.

Video: Open Policy Agent Is Now A CNCF Graduate

Monday 26th of April 2021 08:25:19 PM

OPA, or Open Policy Agent, is one of the most popular open-source tools for giving administrators fine-grained control over their cloud-native environments. The project was created by Styra and contributed to CNCF, and it recently reached graduated status at CNCF, joining mature projects like Kubernetes. Torin Sandall, VP of Open Source at Styra, sat down with Swapnil Bhartiya, CEO of TFiR and host of video interviews at Linux.com, to talk about the significance of graduation for a project like OPA, which is already being used in production, the importance of OPA in the cloud-native world, and some of the most exciting use cases of the project.
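
By way of illustration, a service typically asks a running OPA server for a decision through its REST data API. The sketch below assumes OPA is listening locally on its default port and that a hypothetical policy package (httpapi.authz) has already been loaded; the input document is likewise invented.

```python
# Minimal sketch of querying a local OPA server for a policy decision.
import requests  # pip install requests

# Hypothetical policy path; assumes a matching Rego policy is loaded in OPA.
OPA_URL = "http://localhost:8181/v1/data/httpapi/authz/allow"

decision = requests.post(
    OPA_URL,
    json={"input": {"user": "alice", "method": "GET", "path": ["salary", "alice"]}},
    timeout=5,
).json()

# OPA returns {"result": true} or {"result": false}; "result" is omitted
# when the rule is undefined for the given input.
print("allowed:", decision.get("result", False))
```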

The post Video: Open Policy Agent Is Now A CNCF Graduate appeared first on Linux.com.

OpenAPI Specification 3.1.0 Available Now

Monday 26th of April 2021 08:16:46 PM
Introduction

The OpenAPI Initiative (OAI), a consortium of forward-looking industry experts who focus on standardizing how APIs are categorized and described, released the OpenAPI Specification 3.1.0 in February. This new version introduces better support for webhooks and adds 100% compatibility with the latest draft (2020-12) of JSON Schema.

Complete information on the OpenAPI Specification (OAS) is available for immediate access here: https://spec.openapis.org/oas/v3.1.0. It includes definitions, specifications including Schema Objects, and much more.

Also, the OAI sponsored the creation of new documentation to make it easier to understand the structure of the specification and its benefits. It is available here: https://oai.github.io/Documentation. It is intended for HTTP-based API designers and writers wishing to benefit from having their API formalized in an OpenAPI Description document.

What is the OpenAPI Specification?

The OAS is the industry standard for describing modern APIs. It defines a standard, programming language-agnostic interface description for HTTP APIs, which allows both humans and computers to discover and understand the capabilities of a service without requiring access to source code, additional documentation, or inspection of network traffic.

The OAS is used by organizations worldwide, including Atlassian, Bloomberg, eBay, Google, IBM, Microsoft, Oracle, Postman, SAP, SmartBear, Vonage, and many more.

JSON Schema Now Supported

The Schema Object defines everything inside the `schema` keyword in OpenAPI. Previously, this was loosely based on JSON Schema and referred to as a “subset superset,” because it both added some things to and removed other things from JSON Schema. The OpenAPI and JSON Schema communities worked together to align the two specifications, and with them the tooling and overall approach.

Supporting modern JSON Schema is a significant step forward.

“The mismatch between OpenAPI JSON Schema-like structures and JSON Schema itself has long been a problem for users and implementers. Full alignment of OpenAPI 3.1.0 with JSON Schema draft 2020-12 will not only save users much pain but also ushers in a new standardized approach to schema extensions,” said Ben Hutton, JSON Schema project lead. 

“We’ve spent the last few years (and release) making sure we can clearly hear and understand issues the community faces. With our time-limited volunteer-based effort, not only have we fixed many pain points and added new features, but JSON Schema vocabularies allows for standards to be defined which cater for use cases beyond validation, such as the generation of code, UI, and documentation.”

Major Changes in OpenAPI Specification 3.1.0
  • JSON Schema vocabularies alignment
  • A new top-level element for describing webhooks that are registered and managed out of band (illustrated in the sketch below)
  • Support for identifying API licenses using standard SPDX identifiers
  • The PathItems object is now optional, making it simpler to create reusable libraries of components; reusable PathItems can be described in the components object
  • Support for describing APIs secured using client certificates
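
To make the list above concrete, here is a hedged sketch of a minimal OpenAPI 3.1.0 description, written as a Python dictionary purely for convenience, that uses the new top-level webhooks element and an SPDX license identifier. The title, webhook name, and schema are invented for illustration and are not taken from the announcement.

# Minimal, illustrative OpenAPI 3.1.0 description showing two of the new features:
# the top-level "webhooks" element and an SPDX license identifier. All names below
# (title, webhook, schema) are made up for the example.
import json

openapi_description = {
    "openapi": "3.1.0",
    "info": {
        "title": "Example Pet API",
        "version": "1.0.0",
        "license": {
            "name": "Apache 2.0",
            "identifier": "Apache-2.0",  # new in 3.1: an SPDX identifier instead of a URL
        },
    },
    # "paths" is optional in 3.1, so a description can consist of webhooks and
    # reusable components alone.
    "webhooks": {
        "newPet": {
            "post": {
                "requestBody": {
                    "content": {
                        "application/json": {
                            "schema": {"$ref": "#/components/schemas/Pet"}
                        }
                    }
                },
                "responses": {"200": {"description": "Webhook received"}},
            }
        }
    },
    "components": {
        "schemas": {
            "Pet": {
                "type": "object",
                "required": ["id", "name"],
                "properties": {
                    "id": {"type": "integer", "format": "int64"},
                    "name": {"type": "string"},
                },
            }
        }
    },
}

print(json.dumps(openapi_description, indent=2))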

Full OAS 3.1.0 details are available here: https://spec.openapis.org/oas/v3.1.0. The new documentation is available here: https://oai.github.io/Documentation/

Get Involved in the OAI

To learn more about participating in the evolution of the OAS, visit: https://www.openapis.org/participate/how-to-contribute

Conclusion

To achieve the impressive goal of full compatibility with modern JSON Schema, the OAS underwent important updates that make life much easier for tooling maintainers: they no longer need to guess which draft a schema uses by looking at where it is referenced from. That alone makes OAS 3.1.0 worth evaluating and strongly considering.
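
One mechanism behind this is dialect identification: an OAS 3.1 description can state which JSON Schema dialect its Schema Objects use. The fragment below is an illustrative sketch only; the field values simply point at the JSON Schema 2020-12 dialect, and the schema name is invented.

# Illustrative sketch: in OpenAPI 3.1 the top-level "jsonSchemaDialect" field declares the
# default dialect for Schema Objects, and an individual Schema Object may carry its own
# "$schema" keyword, so tools no longer have to guess which draft applies.
openapi_fragment = {
    "openapi": "3.1.0",
    "info": {"title": "Example", "version": "1.0.0"},
    "jsonSchemaDialect": "https://json-schema.org/draft/2020-12/schema",
    "components": {
        "schemas": {
            "Widget": {
                "$schema": "https://json-schema.org/draft/2020-12/schema",
                "type": "object",
                "properties": {"sku": {"type": "string"}},
            }
        }
    },
}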

The post OpenAPI Specification 3.1.0 Available Now appeared first on Linux.com.

Video: New Online Courses for RISC-V

Friday 23rd of April 2021 04:15:51 PM

RISC-V is a free and open instruction set architecture (ISA) enabling a new era of processor innovation through open standard collaboration. To help individuals get started with RISC-V, the Linux Foundation and RISC-V International have announced two new free online training courses through edX.org, the online learning platform founded by Harvard and MIT. Stephano Cetola, Technical Program Manager at RISC-V International, sat down with Swapnil Bhartiya, CEO of TFiR and host of video interviews at Linux.com, to talk about the new courses and who can benefit from them.

The post Video: New Online Courses for RISC-V appeared first on Linux.com.

Have you ever racked a server? 

Friday 23rd of April 2021 03:40:10 AM

There are sysadmins who have to rack servers as part of their jobs, while others have never set foot inside a chilly data center.
Read More at Enable Sysadmin

The post Have you ever racked a server?  appeared first on Linux.com.

Video: A New Online Course For Node.js

Thursday 22nd of April 2021 03:32:02 PM

The Linux Foundation and OpenJS Foundation recently released a new online training course targeted at the Node.js community. The course was developed by David Mark Clements, a long-time member of the Node.js community who wears many hats: he is a Principal Architect, technical author, public speaker, and OSS creator specializing in Node.js and browser JavaScript. Clements joined Swapnil Bhartiya, CEO of TFiR and host of video interviews at Linux.com, to talk about the new course and who can benefit from it.

The post Video: A New Online Course For Node.js appeared first on Linux.com.

Interview with Jory Burson, Community Director, OpenJS Foundation on Open Source Standards

Thursday 22nd of April 2021 01:00:00 PM

Jason Perlow, Editorial Director of the Linux Foundation, chats with Jory Burson, Community Director at the OpenJS Foundation about open standardization efforts and why it is important for open source projects.

JP: Jory, first of all, thanks for doing this interview. Many of us know you from your work at the OpenJS Foundation, the C2PA, and on open standards, and you’re also involved in many other open community collaborations. Can you tell us a bit about yourself and how you got into working on Open Standards at the LF?

JB: While I’m a relatively new addition to the Linux Foundation, I have been working with the OpenJS Foundation, which is hosted by the Linux Foundation, for probably three years now. As some of your readers may know, OpenJS is home to several very active JavaScript open source projects, and many of those maintainers are really passionate about web standards. Inside that community, we’ve got a core group of about 20 people participating actively at Ecma International on the JavaScript TCs, the W3C, the Unicode Consortium, the IETF, and some other spaces, too. What we wanted to do was create a space where those experts can get together, discuss things in a cross-project sort of way, and then also help onboard new people into this world of web standards, because it can be a very intimidating thing to try to get involved in from the outside.

The Joint Development Foundation is something I’m new to, but as part of that, I’m very excited to get to support the C2PA, which stands for Coalition for Content Provenance and Authenticity; it’s a new effort as well. They’re going to be working on standards related to media provenance and authenticity — to battle fakes and establish trustworthiness in media formats, so I’m very excited to get to support that project as it grows.

JP: When you were at Bocoup, which was a web engineering firm, you worked a lot with international standards organizations such as Ecma and W3C, and you were in a leadership role at the TC53 group, which is JavaScript for embedded systems. What are the challenges that you faced when working with organizations like that?

JB: There are the usual challenges that I think face any international or global team, such as coordination of meeting times and balancing the tension between asynchronously conducting business via email lists, GitHub, and that kind of thing. And then more synchronous forms of communication or work, like Slack and actual in-person meetings. Today, we don’t really worry as much about the in-person meetings, but still, there’s like, this considerable overhead of, you know, “human herding” problems that you have to overcome.

Another challenge is understanding the pace at which the organization you’re operating in really moves. This is a complaint we hear from many people who are new to standardization and are used to developing projects within their product team at a company. Even within an open source project, people are used to things moving perhaps a bit faster and don’t necessarily understand that there are built-in checks in the process, in some cases to ensure that everybody has a chance to review, everybody has an opportunity to comment fairly, and that kind of thing.

Sometimes, because that process is something that’s institutional knowledge, it can be surprising to newcomers in the committees — so they have to learn that there’s this other system that operates at an intentionally different pace. And how does that intersect with your work product? What does that mean for the back timing of your deliverables? That’s another category of things that is “fun” to learn. It makes sense once you’ve experienced it, but maybe running into it for the first time isn’t quite as enjoyable.

JP: Why is it difficult to turn something like a programming language into an internationally accepted standard? In the past, we’ve seen countless flavors of C and Pascal and things like that.

JB: That’s a really good question. I would posit that programming languages are some of the easier types of standards to move forward today because the landscape of what that is and the use cases are fairly clear. Everybody is generally aware of the concept that languages are ideally standardized, and we all agree that this is how this language should work. We’re all going to benefit, and none of us are necessarily, outside of a few cases, trying to build a market in which we’re the dominant player based solely on a language. In my estimation, that tends to be an easier case to bring lots of different stakeholders to the table and get them to agree on how a language should proceed.

In some of the cases you mentioned, as with C and Pascal, those are older languages. And I think there’s been a shift in how we think about some of those things. In the past it was much more challenging to put a new language out there and encourage adoption of it, and it was a much higher bar and a much more difficult task to get information out to people about how that language worked.

Today with the internet, we have a very easy distribution system for how people can read, participate, and weigh in on a language. So I don’t think we’re going to see quite as many variations in standardized languages, except in some cases where, for example, with JavaScript, TC53 is carving out a subset library of JavaScript, which is optimized for sensors and lower-powered devices. So long story short, it’s a bit easier, in my estimation, to do the language work. Where I think it gets more interesting and difficult is actually in some of the W3C communities where we have standardization activities around specific web API’s you have to make a case for, like, why this feature should actually become part of the platform versus something experimental…

JP: … such as for Augmented Reality APIs or some highly specialized 3D rendering thing. So what are the open standardization efforts you are actively working on at the LF now, at this moment?

JB: At this exact moment, I am working with the OpenJS Foundation standards working group, and we’ve got a couple of fun projects that we’re trying to get off the ground. One is creating a Learning Resource Center for people who want to learn more about what standardization activities really look like, what they mean, some of the terminologies, etc.

For example, many people say that getting involved in open source is overwhelming — it’s daunting because there’s a whole glossary of things you might not understand. Well, it’s the same for standardization work, which has its own entire new glossary of things. So we want to create a learning space for people who think they want to get involved. We’re also building out a feedback system for users, open source maintainers, and content authors. This will help them say, “here’s a piece of feedback I have about this specific proposal that may be in front of a committee right now.”

So those are two things. But as I mentioned earlier, I’m still very new to the Linux Foundation. And I’m excited to see what other awesome standardization activities come into the LF.

JP: Why do you feel that the Linux Foundation now needs to double down its open standards efforts?

JB: One of the things that I’ve learned over the last several years working with different international standards organizations is that they have a very firm command of their process. They understand the benefits of why and how a standard is made, why it should get made, those sorts of things. However, they don’t often have as strong a grasp as they ought to around how the software sausage is really made. And I think the Linux Foundation, with all of its amazing open source projects, is way closer to the average developer and the average software engineer and what their reality is like than some of these international standards developing boards because the SDOs are serving different purposes in this grander vision of ICT interoperability.

On the ground, we have, you know, the person who’s got to build the product, make sure it’s fit for purpose, make sure it’s conformant, and make it work for their customers. In the policy realm, we have standardization folks who are really good at making sure that the policy fits within a regulatory framework, is fair and equitable, and that everybody’s had a chance to bring concerns to the table, things the average developer may not have time to think about, whether that’s privacy or security or whatever it might be. So the Linux Foundation and other open source organizations need to fill more of the role of a bridge-builder between these populations, because they need to work together to make useful and interoperable technologies for the long term.

That’s not something that one group can do by themselves. Both groups want to make that happen. And I think it’s really important that the LF demonstrate some leadership here.

JP: Is it not enough to make open software projects and get organizations to use them? Or are open standards something distinctly different and separate from open source software?

JB: I think I’ll start by saying there are some pretty big philosophical differences in how we approach a standard versus an open source project. And I think the average developer is pretty comfortable with the idea that version 1.0 of an open source project may not look anything like version 2.0. There are often going to be cases and examples where there are breaking changes; there’s stuff that they shouldn’t necessarily rely on in perpetuity, and that there’s some sort of flex that they should plan for in that kind of thing.

The average developer has a much stronger sense with a standardization activity that those things should not change, and should not change dramatically in a short period. JavaScript is a good example of a language that changes every year; new features are added, but there aren’t breaking changes; it’s backward compatible. There are some guarantees in terms of a standard platform’s stability versus an open source platform, for example. And further, we’re developing more of a sense of what makes a higher bar, if you will, for open standards activities, including things like test suites, documentation, and a required number of reference implementations.

Those are all concepts that are kind of getting baked into the idea of what makes a good standard. There’s plenty of standards out there that nobody has ever even implemented — people got together and agreed how something should work and then never did anything with it. And that’s not the kind of standard we want to make or the kind of thing we want to promote.

But if we point to examples like JavaScript — here’s this community we have created, here’s the standard, it’s got this great big group of people who all worked on it together openly and equitably. It’s got great documentation, it’s got a test suite that accompanies it — so you can run your implementation against that test suite and see where the dragons lie. And it’s got some references and open source reference implementations that you can view.

Those sorts of things really foster a sense of trustworthiness in a standard. It gives you a sense that it’s something that’s going to stick around for a while, perhaps longer than an open source project, which may be sort of the beginning of a standardization activity. It may be a reference implementation of a standard, or some folks just sort of throwing spaghetti at a wall and trying to solve a problem together. And I think these are activities that are very complementary with each other. It’s another great reason why open source projects and organizations should be getting involved and supporting standardization activities.

JP: Do open standardization efforts make a case for open source software even stronger?

JB: I think so — I just see them as so mutually beneficial, right? Because in the case of an open standards activity, you may be working with some folks and saying, well, here’s what I’m trying to express, and here’s what it would look like if we take the prose — and most of the time, the standard is written in prose and a pseudocode sort of style. It’s not something you can feed into the machine and have it work. So open source projects, polyfills, and things of that sort can really help a community of folks working on a problem say, “Aha, I understand what you mean!”, “This is how we interpreted this, but it’s producing some unintended behaviors,” or “We see that this will be hard to test, or we see that this creates a security issue.”

It’s a way of putting your ideas down on paper, understanding them together, and having a tool everybody can pull and say, “Okay, let’s play with it and see if this is really working for what we need it for.”

Yes, I think they’re very compatible.

JP: Like peanut butter and jelly.

JB: Peanut butter and jelly. Yeah.

JP: I get why large organizations might want things like programming languages, APIs, and communications protocols to be open standards, but what are the practical benefits that average citizens get from establishing open standards?

JB: Open standards really help promote innovation and market activity for all players regardless of size. Now, granted, for the most part, a lot of the activities we’ve been talking about are funded by some bigger players. You know, when you look at the member lists of some of the standards bodies, it’s larger companies like the IBMs, Googles, and Microsofts of the world, the companies that provide a good deal more of the funding. Still, hundreds of small and midsize businesses are also benefiting from standards development.

You mentioned my work at Bocoup earlier — that’s another great example. We were a consulting firm that benefited heavily from participating in and leveraging open standards to help build tools and software for our customers. So it is a system that I think helps create an equitable playing field for all the parties. It’s one of those actual examples of a rising tide lifting all boats, if we’re doing it in a genuinely open and pro-competitive way. Now, that’s not always the case; in other types of standardization areas, it isn’t always true. But certainly, in our web platform standards, it has been. And it means that other companies and other content authors can build web applications, websites, services, digital products, that kind of thing. Everybody benefits, whether those people are also Microsoft customers, Google customers, and all that. So it’s an ecosystem.

JP: I think it’s great that we’ve seen companies like Microsoft, which used to have much more closed systems, embrace open standards over the last ten years or so. If you look at the first Internet Explorer they ever had out — there once were websites that only worked on that browser. Today, the very idea of a website that only works correctly on one company’s web browser is ridiculous, right? We now have open source engines that these browsers use, which embrace open standards and have become much more standardized. So I think that open standards have helped some of these big companies that were more closed become more open. We even see it happen at companies like Apple. They use the Bluetooth protocol to connect to their audio hardware and have adopted technologies such as the USB-C connector when previously they were using weird proprietary connectors. So they, too, understand that open standards are a good thing. That helps the consumer, right? I can go out and buy a wireless headset, and I know it’ll work because it uses the Bluetooth protocol. Could you imagine if we had nine different types of wireless networking instead of WiFi? You wouldn’t be able to walk into a store and buy something and know that it would work on your network. It would be nuts. Right?

JB: Absolutely. You’re pointing to hardware and the standards for physical products and goods versus digital products and goods in your example. So in using that example, do you want to have seven different adapters for something? No, it causes confusion and frustration in the marketplace. And the market winner is the one who’s going to be able to provide a solution that simplifies things.

That’s kind of the same thing with the web. We want to simplify the solutions for web developers so they’re not having to say, “Okay, what am I going to target? Am I going to target Edge? Am I going to target Safari?”

JP: Or is my web app going to work correctly in six years or even six months from now?

JB: Right!

JP: Besides web standards, are there other types of standardization you are passionate about, either inside the LF or in your spare time?

JB: It’s interesting, because I think in my career I’ve followed this journey of first getting involved because it was intellectually interesting to me. Then it was about getting involved because it made my job easier. Like, how does this help me do business more effectively? How does this help me make my immediate life, my life as a developer, and my life as an internet consumer a little bit nicer?

Beyond that, you start to think about the next order of magnitude: the social impact of our standardization activities. I often think about the role that standards have played in improving the lives of everyday people. For the last 100 years, we have had building standards, fire standards, and safety standards, all of these things. And because they were developed, adopted, and implemented in global policy, they have saved people’s lives.

Apply that to tech — of course, it makes sense that you would have safety standards to prevent the building from burning down — so what is the version of that for technology? What’s the fire safety standard for the web? And how do we actually think about the standards that we make, impacting people and protecting them the way that those other standards did?

One of the things that has changed in the last few years is that the Technical Advisory Group, or “TAG”, at the W3C is considering more of the social impact questions in its work. The TAG is a group of architects elected by the W3C membership to take a horizontal, global view of the technologies that the W3C standardizes. These folks say, “Okay, great; you’re proposing that we standardize this API. Have you considered it from an accessibility standpoint? Have you considered it from, you know, ease of use, security?” and that sort of thing.

In the last few years, they started looking at it from an ethical standpoint, such as, “what are the questions of privacy?” How might this technology be used for the benefit of the average person? And also, perhaps, how could it potentially be used for evil? And can we prevent that reality?

So one of the things I think is most exciting is the kind of technology advancing today that is less about whether we can make X and Y interoperable and more about whether we can make X and Y interoperable in a safe, ethical, economical, and ecological fashion — the space around NFTs right now is a case in point. And can we make technology beneficial in a way that goes above and beyond “okay, great, we made the website, quick, click here.”

So C2PA, I think, is an excellent example of a standardization activity the LF supports that could benefit people. One of the big issues of the last several years is the authenticity of the media we consume — whether it was altered or synthesized in some fashion, such as what we see with deepfakes. Now, the C2PA is not going to be able to say, and would not say, whether a media file is fake. Rather, it would allow an organization to ensure that the media it captures or publishes can be analyzed for tampering between steps in the editing process or before the time an end user consumes it. This would allow organizations and people to have more trust in the media they consume.

JP: If there was one thing you could change about open source and open standards communities, what would it be?

JB: So my M.O. is to try to make these spaces more human-interoperable. With an open source project or open standards project, we’re talking about some kind of technical interoperability problem that we want to solve. But it’s not usually the technical issues that cause delays or serious problems — nine times out of ten, it comes down to some human interoperability problem. Maybe it’s language differences, cultural differences, or expectations — it’s process-oriented. There’s some other thing that may cause that activity to fail to launch.

So if there were something that I could do to change communities, I would love to make sure that everybody has resources for running great and effective meetings. One big problem with some of these activities is that their meetings could be run more effectively and more humanely. I would want humane meetings for everyone.

JP: Humane meetings for everyone! I’m pretty sure you could be elected to public office on that platform. <laughs>. What else do you like to do with your spare time, if you have any?

JB: I love to read; we’ve got a book club at OpenJS that we’re doing, and that’s fun. So, in my spare time, I like to take time to read or do a crossword puzzle or something on paper! I’m so sorry, but I still prefer paper books, paper magazines, and paper newspapers.

JP: Somebody just told me recently that they liked the smell of paper when reading a real book.

JB: I think they’re right; I think it feels better. It has a distinctive smell, but there’s also something very therapeutic and analog about it, because I like to disconnect from my digital devices. So, you know, doing something soothing like that. I also enjoy painting outdoors and going outside, spending time with my four-year-old, and that kind of thing.

JP: I think we all need to disconnect from the tech sometimes. Jory, thanks for the talk; it’s been great having you here.

The post Interview with Jory Burson, Community Director, OpenJS Foundation on Open Source Standards appeared first on Linux Foundation.

The post Interview with Jory Burson, Community Director, OpenJS Foundation on Open Source Standards appeared first on Linux.com.

More in Tux Machines

Today in Techrights

today's howtos

  • Hans de Goede: Changing hidden/locked BIOS settings under Linux

    This all started with a Mele PCG09. Before testing Linux on this I took a quick look under Windows, and the device manager there showed an exclamation mark next to a Realtek 8723BS Bluetooth device, so BT did not work. Under Linux I quickly found out why: the device actually uses a Broadcom Wifi/BT chipset attached over SDIO and a UART for the Wifi and BT parts respectively. The UART-connected BT part was described in the ACPI tables with a HID (Hardware-ID) of "OBDA8723", which is not good. Now I could have easily fixed this with an extra initrd with a DSDT override, but that did not feel right. There was an option in the BIOS, named "WIFI", which controls what HID gets advertised for the Wifi/BT; it was set to "RTL8723", which is obviously wrong, but that option was grayed out. So instead of going for the DSDT override I really wanted to be able to change that BIOS option and set it to the right value. Some duckduckgo-ing found this blogpost on changing locked BIOS settings.

  • Test Day:2021-05-09 Kernel 5.12.2 on Fedora 34

    All logs report PASSED for each test done and were uploaded as prompted at the instruction page.

  • James Hunt: Can you handle an argument?

    This post explores some of the darker corners of command-line parsing that some may be unaware of. [...] No, I’m not questioning your debating skills; I’m referring to parsing command lines! Parsing command-line options is something most programmers need to deal with at some point. Every language of note provides some sort of facility for handling command-line options. All a programmer needs to do is skim the docs or grab the sample code, tweak to taste, et voilà! But is it that simple? Do you really understand what is going on? I would suggest that most programmers don’t really think that much about it. Handling the parsing of command-line options is just something you bolt onto your codebase, and then you move on to the more interesting stuff. Yes, it really does tend to be that easy and everything just works… most of the time. Most? I hit an interesting issue recently which expanded in scope somewhat. It might raise an eyebrow for some or be a minor bombshell for others. (A small illustrative sketch follows after this list.)

  • 10 Very Stupid Linux Commands [ Some Of Them Deadly ]

    If you are reading this page, then you are, like all of us, a Linux fan; you use the command line every day and absolutely love Linux. But even in love and marriage there are things that are a little bit annoying. In this article we are going to show you some of the most stupid Linux commands a person can find.
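
Returning to the command-line parsing post above: as a taste of the kind of corner case it alludes to, here is a small, hedged sketch (not from the post itself) of a classic surprise with Python's argparse, where a value that begins with a dash looks like an option until the conventional "--" end-of-options marker is used. The program and option names are invented for the example.

# Illustrative sketch of a common command-line parsing corner case using Python's argparse:
# positional values that start with "-" are mistaken for options unless the conventional
# "--" end-of-options marker is used.
import argparse

parser = argparse.ArgumentParser(prog="grep-lite")
parser.add_argument("-v", "--invert", action="store_true", help="invert the match")
parser.add_argument("pattern", help="pattern to search for")

# This fails: "-v" is consumed as the --invert flag, leaving the required pattern missing.
try:
    parser.parse_args(["-v"])
except SystemExit:
    print('parse_args(["-v"]) exits with an error: "-v" was read as an option')

# With "--", everything after it is treated as positional, so "-v" becomes the pattern.
args = parser.parse_args(["--", "-v"])
print(f"pattern={args.pattern!r}, invert={args.invert}")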

China Is Launching A New Alternative To Google Summer of Code, Outreachy

The Institute of Software Chinese Academy of Sciences (ISCAS), in cooperation with the Chinese openEuler Linux distribution, has been working on its own project akin to Google Summer of Code and Outreachy that pays university-aged students to become involved in open-source software development. "Summer 2021," as the initiative is simply called, or "Summer 2021 of Open Source Promotion Plan," provides university-aged students around the world funding from the Institute of Software Chinese Academy of Sciences to work on community open-source projects. It's just like Google Summer of Code but offers different funding levels based upon the complexity of the project; the funding options are 12000 RMB, 9000 RMB, or 6000 RMB. That's roughly $932 to $1,865 USD for students to devote their summer to working on open source. There are no gender or nationality restrictions with this initiative, but students must be at least eighteen years old. Read more

Kernel: Linux 5.10 and Linux 5.13

  • Linux 5.10 LTS Will Be Maintained Through End Of Year 2026 - Phoronix

    When it was announced, Linux 5.10, the latest Long Term Support release, was only going to be maintained until the end of 2022, but following enough companies stepping up to help with testing, Linux 5.10 LTS will now be maintained until the end of 2026. Linux 5.10 LTS was originally just going to be maintained until the end of next year, while prior kernels like Linux 5.4 LTS are being maintained until 2024, with Linux 4.19 LTS and 4.14 LTS going into 2024. The Linux 5.10 LTS window was short to begin with due to the limited number of developers and organizations helping to test new point release candidates and/or committing resources to using this kernel LTS series. But now there are enough participants committing to it that Greg Kroah-Hartman confirmed he, along with Sasha Levin, will maintain the kernel through December 2026.

  • Oracle Continues Working On The Maple Tree For The Linux Kernel

    Oracle engineers have continued working on the "Maple Tree" data structure for the Linux kernel, an RCU-safe, range-based B-tree designed to make efficient use of modern processor caches. The RFC patch series introducing this new data structure and making initial use of it was sent out last year; the latest 94 patches, now in a post-RFC state, were sent out last week.

  • Linux 5.13 Brings Simplified Retpolines Handling - Phoronix

    In addition to work like Linux 5.13 addressing some network overhead caused by Retpolines, this next kernel's return trampoline implementation itself is seeing a simplification. Merged as part of x86/core last week for the Linux 5.13 kernel were PPIN support for Xeon Sapphire Rapids, KProbes improvements, and other minor changes, plus a simplification of the Retpolines implementation used by some CPUs as part of the Spectre V2 mitigations. The x86/core pull request for Linux 5.13 also re-sorts and better documents Intel's increasingly long list of different CPU cores/models.

  • Linux 5.13 Adds Support For SPI NOR One-Time Programmable Memory Regions - Phoronix

    The Linux 5.13 kernel has initial support for dealing with SPI one-time programmable (OTP) flash memory regions. Linux 5.13 adds new MTD OTP functions for accessing SPI one-time programmable data. OTP areas are memory regions intended to be programmed once and can be used for permanent secure identification, immutable properties, and similar purposes. In addition to adding the core infrastructure support for OTP to the MTD SPI-NOR code in Linux 5.13, the functionality is wired up for Winbond and similar flash memory chips. The MTD subsystem has already supported OTP areas, but not for SPI-NOR flash memory.