Mozilla: Bazel, TLS and Decentralisation

Filed under
Moz/FF
  • evaluating bazel for building firefox, part 2

    In our last post, we highlighted some of the advantages that Bazel would bring. The remote execution and caching benefits Bazel brings look really attractive, but it’s difficult to tell exactly how much they would benefit Firefox. I looked for projects that had switched to Bazel; a brief summary of each project’s experience follows.

    The Bazel rules for nodejs highlight Dataform’s switch to Bazel, which took about 2 months. Their build involves some combination of “NPM packages, Webpack builds, Node services, and Java pipelines”. Switching plus enabling remote caching reduced the average time for a build in CI from 30 minutes to 5 minutes; incremental builds for local development have been “reduced to seconds from minutes”. It’s not clear whether the local development experience is hooked up to the caching infrastructure as well.
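    Remote caching of the kind described above is enabled in Bazel through a handful of flags, usually collected in a `.bazelrc`. A minimal sketch (the endpoints here are hypothetical placeholders, not Dataform’s or Mozilla’s infrastructure):

```
# .bazelrc sketch -- endpoints are illustrative placeholders
build --remote_cache=grpcs://cache.example.com

# Optional: remote execution in addition to caching
build --remote_executor=grpcs://remote.example.com

# A config for local developers to read the shared cache
# without uploading their own results into it:
build:dev --remote_upload_local_results=false
```

    Whether local builds share the CI cache then comes down to whether developers run with these flags, which is exactly the question left open above.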

  • Validating Delegated Credentials for TLS in Firefox

    At Mozilla we are well aware of how fragile the Web Public Key Infrastructure (PKI) can be. From fraudulent Certification Authorities (CAs) to implementation errors that leak private keys, users, often unknowingly, are put in a position where their ability to establish trust on the Web is compromised. Therefore, in keeping with our mission to create a Web where individuals are empowered, independent and safe, we welcome ideas that are aimed at making the Web PKI more robust. With initiatives like our Common CA Database (CCADB), CRLite prototyping, and our involvement in the CA/Browser Forum, we’re committed to this objective, and this is why we embraced the opportunity to partner with Cloudflare to test Delegated Credentials for TLS in Firefox, which is currently undergoing standardization at the IETF.

    As CAs are responsible for the creation of digital certificates, they dictate the lifetime of an issued certificate, as well as its usage parameters. Traditionally, end-entity certificates are long-lived, exhibiting lifetimes of more than one year. For server operators making use of Content Delivery Networks (CDNs) such as Cloudflare, this can be problematic because of the potential trust placed in CDNs regarding sensitive private key material. Of course, Cloudflare has architectural solutions for such key material, but these add unwanted latency to connections and present operational difficulties. To limit exposure, a short-lived certificate would be preferable for this setting. However, constant communication with an external CA to obtain short-lived certificates could result in poor performance or, even worse, a complete lack of access to a service.

    The Delegated Credentials mechanism decentralizes the problem by allowing a TLS server to issue short-lived authentication credentials (with a validity period of no longer than 7 days) that are cryptographically bound to a CA-issued certificate. These short-lived credentials then serve as the authentication keys in a regular TLS 1.3 connection between a Firefox client and a CDN edge server situated in a low-trust zone (where the risk of compromise might be higher than usual and perhaps go undetected). This way, performance isn’t hindered and the compromise window is limited. For further technical details see this excellent blog post by Cloudflare on the subject.
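    The 7-day cap above is enforced by the client: a delegated credential encodes its expiry relative to the delegating certificate, and the client refuses any credential that is expired or that would remain valid for too long. A rough Python sketch of that acceptance check (names and structure are illustrative, not Firefox’s actual implementation):

```python
from datetime import datetime, timedelta

# Maximum remaining validity a client will accept, per the draft.
MAX_REMAINING_VALIDITY = timedelta(days=7)

def accept_credential(cert_not_before: datetime,
                      valid_time_seconds: int,
                      now: datetime) -> bool:
    """A delegated credential's expiry is expressed as seconds after
    the delegating certificate's notBefore; reject credentials that
    are expired or valid for more than 7 days from now."""
    expiry = cert_not_before + timedelta(seconds=valid_time_seconds)
    if expiry <= now:
        return False  # credential has already expired
    if expiry - now > MAX_REMAINING_VALIDITY:
        return False  # validity window exceeds the 7-day cap
    return True
```

    Because the server can mint these credentials itself, the edge never needs the certificate’s long-term private key, which is what limits the compromise window.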

  • Tantek Çelik: #Redecentralize 2019 Session: Decentralized Identity & Rethinking Reputation

    On Friday 2019-10-25 I participated in Redecentralize Conference 2019, a one-day unconference in London, England on the topics of decentralisation, privacy, autonomy, and digital infrastructure.

    I gave a 3 minute lightning talk, helped run an IndieWeb standards & methods session in the first open slot of the day, and participated in two more sessions. The second open session had no Etherpad notes, so this post is written from my recollection of one week ago.

    [...]

    We did not get into any deep discussions of any specific decentralized identity systems, and that was perhaps ok. Mostly there was discussion about the downsides of centrally controlled identity, and how each of us wanted more control over various aspects of our online identities.

    For anyone who asked, I posited that a good way to start with decentralized identity was to buy and use a personal domain name for your primary online presence, setting it up to sign in to sites, and building a reputation using that. Since you can pick the domain name, you can pick whatever facet(s) of your identity you wish to represent. It may not be perfectly distributed, however it does work today, and is a good way to explore a lot of the questions and challenges of decentralized identity.
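    In IndieWeb practice, wiring a personal domain up for web sign-in mostly means adding a few links to the site’s homepage. A minimal sketch, assuming a hypothetical domain and the public indieauth.com endpoint (any IndieAuth-compatible endpoint works):

```html
<!-- On https://example.com (hypothetical) -->
<head>
  <!-- rel="me" links tie the domain to existing profiles (RelMeAuth) -->
  <link rel="me" href="https://github.com/yourname">
  <!-- An IndieAuth endpoint lets you sign in to sites as your domain -->
  <link rel="authorization_endpoint" href="https://indieauth.com/auth">
</head>
```

    Sites supporting web sign-in then accept “example.com” as an identity, with authentication delegated to whichever services or endpoints the domain owner chooses.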

More in Tux Machines

Openwashing Deception and FUD (Misusing and Badmouthing the "Open Source" Brand)

Acquia/Drupal After the Vista Equity Partners Takeover

  • Acquia, Drupal founder Dries Buytaert on open source, Vista, CDPs

    Dries Buytaert: No. We were profitable, we really didn't need more investment. But at the same time, we have an ambitious roadmap and our competitors are well-funded. We were starting to receive a lot of inbound requests from different firms, including Vista. When they come to you, you've got to look at it. It made sense.

  • New Acquia Drupal tools show open source loyalty post-Vista deal

    Web content management vendor Acquia Inc. delivered new marketing automation and content personalization platforms for the open-source Drupal faithful and for commercial customers. In late September, venture capital firm Vista Equity Partners acquired a majority stake in Acquia, but commitment to Acquia Drupal open source content management applications remains steady, according to Acquia CMO Lynne Capozzi.

Microsoft Claims a Monopoly Over 'Open Source'

Bringing PostgreSQL to Government

  • Crunchy Data, ORock Technologies Form Open Source Cloud Partnership for Federal Clients

    Crunchy Data and ORock Technologies have partnered to offer a database-as-a-service platform by integrating the former's open source database with the latter's managed offering designed to support deployment of containers in multicloud or hybrid computing environments. The partnership aims to implement a PostgreSQL as a service within ORock's Secure Containers as a Service, which is certified for government use under the Federal Risk and Authorization Management Program, Crunchy Data said Tuesday.

  • Crunchy Data and ORock Technologies Partnership Brings Trusted Open Source Cloud Native PostgreSQL to Federal Government

    Crunchy Data and ORock Technologies, Inc. announced a partnership to bring Crunchy PostgreSQL for Kubernetes to ORock’s FedRAMP authorized container application Platform as a Service (PaaS) solution. Through this collaboration, Crunchy Data and ORock will offer PostgreSQL-as-a-Service within ORock’s Secure Containers as a Service with Red Hat OpenShift environment. The combined offering provides a fully managed Database as a Service (DBaaS) solution that enables the deployment of containerized PostgreSQL in hybrid and multi-cloud environments.

    Crunchy PostgreSQL for Kubernetes has achieved Red Hat OpenShift Operator Certification and provides Red Hat OpenShift users with the ability to provision trusted open source PostgreSQL clusters, elastic workloads, high availability, disaster recovery, and enterprise authentication systems. By integrating with the Red Hat OpenShift platform within ORock’s cloud environments, Crunchy PostgreSQL for Kubernetes leverages the ability of the Red Hat OpenShift Container Platform to unite developers and IT operations on a single FedRAMP-compliant platform to build, deploy, and manage applications consistently across hybrid cloud infrastructures.
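    For readers unfamiliar with what “containerized PostgreSQL on Kubernetes” looks like in practice, here is a generic illustration of the underlying pattern — a plain StatefulSet, not Crunchy’s actual operator resources, which use their own custom resource definitions:

```yaml
# Generic sketch of containerized PostgreSQL on Kubernetes/OpenShift.
# Operators like Crunchy's automate this (plus HA, backups, auth)
# via custom resources; this is only the bare underlying pattern.
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: pg-demo
spec:
  serviceName: pg-demo
  replicas: 1
  selector:
    matchLabels: {app: pg-demo}
  template:
    metadata:
      labels: {app: pg-demo}
    spec:
      containers:
      - name: postgres
        image: postgres:12
        ports:
        - containerPort: 5432
        env:
        - name: POSTGRES_PASSWORD
          valueFrom:
            secretKeyRef: {name: pg-secret, key: password}
```

    The value an operator adds over this bare sketch is precisely the managed pieces the announcement lists: provisioning, high availability, disaster recovery, and authentication integration.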