Openwashing and Linux Foundation Openwash

Filed under
OSS
  • Huobi’s ‘Regulator-Friendly’ Blockchain Goes Open Source

    Huobi Chain, the regulator-facing public blockchain of exchange Huobi Group, is now open source and publicly available to all developers on GitHub, the firm said Tuesday.

    Nervos, a blockchain development startup, is providing part of the technical infrastructure for the project.

    The firms are developing pluggable components for the network that could enable regulators to supervise contract deployments, asset holdings and transfers, as well as the enforcement of anti-money-laundering regulations, Bo Wang, a Nervos researcher, told CoinDesk.

    The components will also allow financial institutions, such as banks and regulatory agencies, to freeze assets and accounts in case of emergencies via sidechains, according to Wang.

  • Is Open Source Broken?

    The movement to develop software applications and all manner of IT services through the open source model is fundamentally rooted in the notion of community contribution, but things have shifted.

  • Managing all your enterprise's APIs with new management gateways for review
  • See you at KubeCon!

    It’s that time of year again! We’re getting ready to head on out to San Diego for KubeCon + CloudNativeCon NA. For me, KubeCon always makes for an exciting and jam-packed week. 

  • Amazon Web Services, Genesys, Salesforce Form New Open Data Model

    To accelerate digital transformation, organizations in every industry are modernizing their on-premises technologies by adopting cloud-native applications. According to the International Data Corporation (IDC), global spend on cloud computing will grow from $147 billion in 2019 to $418 billion by 2024. Almost half of that investment will be tied to technologies that help companies deliver personalized customer experiences.

    One major challenge of this shift to cloud computing is that applications are typically created with their own data models, forcing developers to build, test, and manage custom code that’s necessary to map and translate data across different systems. The process is inefficient, delays innovation, and ultimately can result in a broken customer experience.

  • The Linux Kernel Mentorship program was a life changing experience

    Operating systems, computer architectures and compilers have always fascinated me. I like to go in depth to understand the important software components we depend on! My life changed when engineers from IBM LTC (Linux Technology Center) came to my college to teach us the Linux Kernel internals. When I heard about the Linux Kernel Mentorship program, I immediately knew that I wanted to be a part of it to further fuel my passion for Linux.

    One of the projects available to work on during the Linux Kernel Mentorship program was “Predictive Memory Reclamation”. I really wanted the opportunity to work on the core kernel, and I began working with my mentor Khalid Aziz during the application period, when he gave me a task on identifying the anonymous memory regions of a process. I learned a lot in the application period by reading various blogs, textbooks and commit logs.

    During my mentorship period, I worked to develop a predictive memory reclamation algorithm in the Linux kernel. The aim of the project was to reduce the time the kernel spends reclaiming memory to satisfy processes' requests for memory under memory pressure, i.e., when there is not enough free memory to satisfy a process's allocation. We implemented a predictive algorithm that forecasts memory pressure and proactively reclaims memory to ensure there is enough available for processes.
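    The kernel-side reclamation logic itself can't be reproduced in a short sketch, but the memory pressure the project targets can be observed from userspace through the kernel's PSI interface. A minimal sketch, assuming a PSI-enabled kernel (4.20 or later), where you would read the line from /proc/pressure/memory; the sample line here is illustrative:

```shell
# Parse the "some" avg10 figure from a PSI memory-pressure line.
# On a real system: psi_line=$(head -n1 /proc/pressure/memory)
psi_line="some avg10=0.31 avg60=0.12 avg300=0.04 total=123456"

# Field 2 is "avg10=<value>"; strip the key to keep the percentage.
avg10=$(printf '%s\n' "$psi_line" | awk '{ sub(/^avg10=/, "", $2); print $2 }')
echo "10s average memory pressure: ${avg10}%"
```

    A rising avg10 value is exactly the kind of signal a predictive reclaimer would act on before allocations start stalling.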

AWS and Salesforce

  • AWS, Salesforce join forces with Linux Foundation on Cloud Information Model

    Last year, Adobe, SAP and Microsoft came together and formed the Open Data Initiative. Not to be outdone, this week, AWS, Salesforce and Genesys, in partnership with The Linux Foundation, announced the Cloud Information Model.

    The two competing data models have a lot in common. They are both about bringing together data and applying a common open model to it. The idea is to allow for data interoperability across products in the partnership without a lot of heavy lifting, a common problem for users of these big companies’ software.

    Jim Zemlin, executive director at The Linux Foundation, says this project provides a neutral home for the Cloud Information model, where a community can work on the problem. “This allows for anyone across the community to collaborate and provide contributions under a central governance model. It paves the way for full community-wide engagement in data interoperability efforts and standards development, while rapidly increasing adoption rate of the community,” Zemlin explained in a statement.

More in Tux Machines

digiKam 7.7.0 is released

After three months of active maintenance and another bug triage, the digiKam team is proud to present version 7.7.0 of its open source digital photo manager. See below for the list of the most important features coming with this release. Read more

Dilution and Misuse of the "Linux" Brand

Samsung, Red Hat to Work on Linux Drivers for Future Tech

The metaverse is expected to uproot system design as we know it, and Samsung is one of many hardware vendors re-imagining data center infrastructure in preparation for a parallel 3D world. Samsung is working on new memory technologies that provide faster bandwidth inside hardware for data to travel between CPUs, storage and other computing resources. The company also announced it was partnering with Red Hat to ensure these technologies have Linux compatibility. Read more

today's howtos

  • How to install go1.19beta on Ubuntu 22.04 – NextGenTips

    In this tutorial, we are going to explore how to install Go on Ubuntu 22.04. Golang is an open-source programming language that is easy to learn and use. It has built-in concurrency and a robust standard library. It is reliable, builds quickly, and produces efficient software that scales well. Its concurrency mechanisms make it easy to write programs that get the most out of multicore and networked machines, while its novel type system enables flexible and modular program construction. Go compiles quickly to machine code and has the convenience of garbage collection and the power of run-time reflection. In this guide, we are going to learn how to install go1.19beta1 on Ubuntu 22.04. Go 1.19 has not yet seen a stable release, and its documentation is still very much a work in progress.
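    As a sketch, the usual tarball-based install looks like the following; the version and architecture here are assumptions, so check https://go.dev/dl/ for the current beta filename:

```shell
# Assumed version and architecture; adjust to match the file listed on go.dev/dl.
GO_VERSION="go1.19beta1"
GO_TARBALL="${GO_VERSION}.linux-amd64.tar.gz"

wget "https://go.dev/dl/${GO_TARBALL}"
sudo rm -rf /usr/local/go                   # remove any previous toolchain
sudo tar -C /usr/local -xzf "${GO_TARBALL}" # unpack to /usr/local/go
export PATH="$PATH:/usr/local/go/bin"       # add to ~/.profile to persist
go version
```

    If you already have a stable Go toolchain installed, the beta can instead be fetched side by side with `go install golang.org/dl/go1.19beta1@latest` followed by `go1.19beta1 download`.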

  • molecule test: failed to connect to bus in systemd container - openQA bites

    Ansible Molecule is a project to help you test your Ansible roles. I'm using Molecule to automatically test the Ansible roles of geekoops.
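    The "failed to connect to bus" error typically means systemd is not running as PID 1 inside the test container. As a sketch (the image name is hypothetical), a systemd container generally needs a privileged run, the host cgroup hierarchy mounted read-only, and an init process as PID 1 so that `systemctl` can reach the D-Bus socket:

```shell
# Hypothetical image name; the point is the set of flags a systemd
# container needs before `systemctl` commands can talk to the bus.
docker run -d --name systemd-test \
  --privileged \
  -v /sys/fs/cgroup:/sys/fs/cgroup:ro \
  registry.example.com/centos-systemd:latest /usr/sbin/init
```

    In a Molecule scenario, the same settings map to the `privileged`, `volumes`, and `command` keys of the platform definition in molecule.yml.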

  • How To Install MongoDB on AlmaLinux 9 - idroot

    In this tutorial, we will show you how to install MongoDB on AlmaLinux 9. For those of you who didn't know, MongoDB is a high-performance, highly scalable, document-oriented NoSQL database. Unlike SQL databases, where data is stored in rows and columns inside tables, in MongoDB data is structured in a JSON-like format inside records referred to as documents. MongoDB's open-source nature makes it an ideal candidate for almost any database-related project. This article assumes you have at least basic knowledge of Linux, know how to use the shell, and, most importantly, host your site on your own VPS. The installation is quite simple and assumes you are running as the root account; if not, you may need to add ‘sudo‘ to the commands to get root privileges. I will show you the step-by-step installation of the MongoDB NoSQL database on AlmaLinux 9. You can follow the same instructions for CentOS and Rocky Linux.
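    As a sketch of the standard approach on RHEL-family systems, MongoDB ships its own yum repository; the 6.0 series is assumed here, so check repo.mongodb.org for the current release train:

```shell
# Add the MongoDB repository (assumed 6.0 series for AlmaLinux/RHEL 9).
sudo tee /etc/yum.repos.d/mongodb-org-6.0.repo <<'EOF'
[mongodb-org-6.0]
name=MongoDB Repository
baseurl=https://repo.mongodb.org/yum/redhat/9/mongodb-org/6.0/x86_64/
gpgcheck=1
enabled=1
gpgkey=https://www.mongodb.org/static/pgp/server-6.0.asc
EOF

sudo dnf install -y mongodb-org       # server, shell, and tools
sudo systemctl enable --now mongod    # start now and on every boot
```

    The same repo file works on CentOS Stream and Rocky Linux 9, since they share the RHEL 9 package base.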

  • An introduction (and how-to) to Plugin Loader for the Steam Deck. - Invidious
  • Self-host a Ghost Blog With Traefik

    Ghost is a very popular open-source content management system. It started as an alternative to WordPress and went on to become an alternative to Substack by focusing on memberships and newsletters. The creators of Ghost offer managed Pro hosting, but it may not fit everyone's budget. Alternatively, you can self-host it on your own cloud servers. On Linux Handbook, we already have a guide on deploying Ghost with Docker in a reverse proxy setup. Instead of an Nginx reverse proxy, you can also use another piece of software called Traefik with Docker. It is a popular open-source cloud-native application proxy, API gateway, edge router, and more. I use Traefik to secure my websites using SSL certificates obtained from Let's Encrypt. Once deployed, Traefik can automatically manage your certificates and their renewals. In this tutorial, I'll share the necessary steps for deploying a Ghost blog with Docker and Traefik.
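    As a minimal sketch of the setup described above (the domain is hypothetical; Ghost listens on port 2368 inside its container), Traefik watches the Docker socket and routes to containers based on their labels:

```shell
# Hypothetical domain; replace blog.example.com with your own.
DOMAIN="blog.example.com"

# Shared network so Traefik can reach the Ghost container.
docker network create proxy

# Traefik v2: discover containers via the Docker provider, but only
# route to containers that explicitly opt in with traefik.enable=true.
docker run -d --name traefik --network proxy \
  -p 80:80 -p 443:443 \
  -v /var/run/docker.sock:/var/run/docker.sock:ro \
  traefik:v2.9 \
  --providers.docker=true \
  --providers.docker.exposedbydefault=false

# Ghost, routed by hostname; the labels tell Traefik which requests
# to forward and which container port to send them to.
docker run -d --name ghost --network proxy \
  -e url="https://${DOMAIN}" \
  --label 'traefik.enable=true' \
  --label "traefik.http.routers.ghost.rule=Host(\`${DOMAIN}\`)" \
  --label 'traefik.http.services.ghost.loadbalancer.server.port=2368' \
  ghost:5
```

    For production you would additionally configure Traefik's ACME (Let's Encrypt) certificate resolver and a TLS entrypoint; a docker-compose file can express the same setup declaratively.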