Devices: Project Things and More

Filed under: Linux
  • Mozilla Announces Project Things, ZFS Version 0.7.6, Kali Linux New Release and More

    Mozilla announced Project Things yesterday, "an open framework for connecting your devices to the web". According to the Mozilla Blog, "We kicked off 'Project Things', with the goal of building a decentralized 'Internet of Things' that is focused on security, privacy, and interoperability." A rough sketch of what talking to a device through such a gateway might look like appears after this list.

  • Lars and the Real Internet of Things - Part 1

    First, though, my history with home automation:

    When I was a teenager in the 1970s, I had an analog alarm clock with an electrical outlet on the back labeled "coffee". About ten minutes before the alarm would go off, it would turn on the power to the outlet. This was apparently to start a coffee maker that had been set up the night before. I, instead, used the outlet to turn on my record player so I could wake to music of my own selection. Ten years after the premiere of The Jetsons' automated utopia, this was the extent of home automation available to the average consumer.

    By the late 1970s and into the 1980s, the consumer home automation landscape changed. A Scottish electronics company conceived of a remote control system that would communicate over power lines. By the mid-1980s, the X10 system of controllers and devices was available at Radio Shack and many other stores.

    [....]

    My next blog posting will walk through the process of downloading and setting up a Mozilla Things Gateway.

  • Build your own phono preamplifier

    I was fortunate to receive a new phono cartridge for Christmas. What a lovely present! And of course, there is great pleasure (or, I suppose, great frustration, depending on one’s point of view) in all the tinkering required to remove the old phono cartridge, mount the new one, and correctly set things up.

    For some expert advice on this matter, I turned to the excellent instructional videos and articles by Michael Fremer, a vinyl enthusiast and audio journalist with many years of experience in all things phono. Rather than offer a single representative link here, I recommend searching for “Michael Fremer cartridge setup video” in your favorite search engine.

  • Atom C3000 based net appliance offers eight LAN ports

    Advantech’s FWA-1012VC follows a number of headless networking appliances that run Linux on Intel’s Atom C3000 (“Denverton”) SoC, including Aaeon’s recent FWS-2360 and Axiomtek’s NA362. The FWA-1012VC stands out from both competitors by offering more wireless expansion options.
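
To give a concrete flavor of the Project Things item above, here is a minimal Python sketch of polling a device through a local gateway. Everything in it is an illustrative assumption rather than the project's exact schema: the device description, the gateway address, and the endpoint path are placeholders, so consult the Project Things documentation for the real Web Thing API.

    import requests  # third-party HTTP library, assumed to be installed

    # Hypothetical Web Thing description for a connected lamp. The field names
    # loosely follow the shape of a Web Thing description but are illustrative,
    # not authoritative.
    lamp = {
        "name": "Living Room Lamp",
        "type": "onOffSwitch",
        "description": "A web-connected lamp",
        "properties": {
            "on": {
                "type": "boolean",
                "description": "Whether the lamp is switched on",
                "href": "/things/lamp/properties/on",
            }
        },
    }

    # Placeholder gateway address; a real Things Gateway exposes its own URL
    # and requires an authorization token.
    GATEWAY = "http://gateway.local:8080"

    def read_lamp_state():
        """Poll the lamp's 'on' property through the gateway's HTTP API."""
        resp = requests.get(GATEWAY + lamp["properties"]["on"]["href"])
        resp.raise_for_status()
        return resp.json()  # e.g. {"on": True}

    if __name__ == "__main__":
        print(read_lamp_state())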

More in Tux Machines

Benchmarks on GNU/Linux

  • Linux vs. Windows Benchmark: Threadripper 2990WX vs. Core i9-7980XE Tested
    The last chess benchmark we're going to look at is Crafty, and again we're measuring performance in nodes per second. Interestingly, the Core i9-7980XE wins out here and saw the biggest performance uplift when moving to Linux: a 5% increase, as opposed to just 3% for the 2990WX, which made the Intel CPU 12% faster overall. (A worked example of how such percentages are derived appears after this list.)
  • Which is faster, rsync or rdiff-backup?
    As our data grows (and some filesystems balloon to over 800GB, with many small files), we have started seeing our nighttime backups continue through the morning, causing serious disk I/O problems as our users wake up and regular usage rises. For years we have implemented a conservative backup policy - each server runs the backup twice: once via rdiff-backup to the onsite server, with 10 days of increments kept, and a second time via rsync to our offsite backup servers for disaster recovery. Simple, I thought. I will change the rdiff-backup to the onsite server to use the ultra-fast and simple rsync. Then, I'll use borgbackup to create an incremental backup from the onsite backup server to our offsite backup servers. Piece of cake. And with each server only running one backup instead of two, they should complete in record time. Except, somehow the rsync backup to the onsite backup server was taking almost as long as the original rdiff-backup to the onsite server and the rsync backup to the offsite server combined. What? I thought nothing was faster than the awesome simplicity of rsync, especially compared to the ancient Python-based rdiff-backup, which hasn't had an upstream release since 2009. (A rough sketch of this two-step policy as a wrapper script also appears after this list.)
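
For readers curious how the percentage figures in the Crafty comparison above are derived, the short sketch below reconstructs the arithmetic. The nodes-per-second values are invented for illustration and are not the article's measured data.

    # Hypothetical nodes-per-second figures (not the article's data), chosen
    # only to show how the quoted percentages are computed.
    i9_windows, i9_linux = 40.0e6, 42.0e6   # Core i9-7980XE
    tr_windows, tr_linux = 36.4e6, 37.5e6   # Threadripper 2990WX

    def uplift(baseline, result):
        """Relative speedup of 'result' over 'baseline', in percent."""
        return (result - baseline) / baseline * 100

    print(f"i9-7980XE gain on Linux: {uplift(i9_windows, i9_linux):.0f}%")  # ~5%
    print(f"2990WX gain on Linux:    {uplift(tr_windows, tr_linux):.0f}%")  # ~3%
    # "12% faster overall" compares the two CPUs directly under Linux:
    print(f"i9 vs 2990WX on Linux:   {uplift(tr_linux, i9_linux):.0f}%")    # ~12%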
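
The two-step policy described in the rsync/rdiff-backup item above (an onsite rdiff-backup keeping ten days of increments, plus an rsync mirror to an offsite server) can be outlined as a small wrapper, sketched below. The paths and host are placeholders, and the flags should be checked against your installed rdiff-backup and rsync versions.

    import subprocess

    # Placeholder locations; substitute your own paths and hosts.
    SOURCE = "/srv/data/"
    ONSITE = "/mnt/onsite-backup/data"         # onsite rdiff-backup target
    OFFSITE = "backup-offsite:/backups/data"   # offsite rsync target (over SSH)

    def run(cmd):
        """Run one backup step, aborting the script if it fails."""
        print("running:", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # 1. Incremental onsite backup, then prune increments older than 10 days.
    run(["rdiff-backup", SOURCE, ONSITE])
    run(["rdiff-backup", "--remove-older-than", "10D", ONSITE])

    # 2. Plain mirror to the offsite server for disaster recovery.
    run(["rsync", "-a", "--delete", SOURCE, OFFSITE])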

OSS Leftovers

  • Haiku: R1/beta1 release plans - at last
    At last, R1/beta1 is nearly upon us. As I’ve already explained on the mailing list, only two non-“task” issues remain in the beta1 milestone, and I have prototype solutions for both. The buildbot and other major services have been rehabilitated and will need only minor tweaking to handle the new branch, and mmlr has been massaging the HaikuPorter buildmaster so that it, too, can handle the new branch, though that work is not quite finished yet.
  • Haiku OS R1 Beta Is Finally Happening In September
    It's been five years since the last Haiku OS alpha release toward its inaugural "R1" version, but next month it looks like the first beta will be released, sixteen years after this BeOS-inspired open-source operating system started development.
  • IBM Scores More POWER Open-Source Performance Optimizations
    Following our POWER9 Linux benchmarks earlier this year, IBM POWER engineers have continued exploring various areas for optimization within the interesting open-source workloads tested. Another batch of optimizations is pending for various projects.
  • DevConf.in 2018
    Earlier this month, I attended the DevConf.in 2018 conference in Bengaluru, KA, India. It was sort of the culmination of a cohesive team effort that began for me at DevConf.cz 2018 in Brno, CZ. I say "sort of" because the team is already gearing up for DevConf.in 2019.
  • The Unitary Fund: a no-strings attached grant program for Open Source quantum computing
    Quantum computing has the potential to be a revolutionary technology, from the first applications in cryptography and database search to more modern quantum applications across simulation, optimization, and machine learning. This promise has led industrial, government, and academic efforts in quantum computing to grow globally. Posted jobs in the field have grown sixfold in the last two years. Quantum computing hardware and platforms, designed by startups and tech giants alike, continue to improve. Now there are new opportunities to discover how to best program and use these new machines. As I wrote last year: the first quantum computers will need smart software. Quantum computing also remains a place where small teams and open research projects can make a big difference. The open nature is important, as Open Source software has the lowest barriers for others to understand, share, and build upon existing projects. In a new field that needs to grow, this rapid sharing and development is especially important. I've experienced this myself through leading the Open Source Forest project at Rigetti Computing and also by watching the growing ecosystem of open projects like QISKit, OpenFermion, ProjectQ, Strawberry Fields, XaCC, Cirq, and many others. The hackathons and community efforts from around the world are inspiring. (A minimal example using one of these frameworks appears after this list.)
  • SiFive Announces First Open-Source RISC-V-Based SoC Platform With NVIDIA Deep Learning Accelerator Technology
    SiFive, the leading provider of commercial RISC-V processor IP, today announced the first open-source RISC-V-based SoC platform for edge inference applications based on NVIDIA's Deep Learning Accelerator (NVDLA) technology.
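
As a small, concrete taste of the open-source quantum frameworks named in the Unitary Fund item above, here is a minimal Bell-state circuit written with Cirq. It assumes a recent Cirq release and is purely illustrative; any of the listed frameworks would serve equally well.

    import cirq  # one of the open-source frameworks mentioned above

    # Two qubits prepared in a Bell state: Hadamard on the first, then CNOT.
    q0, q1 = cirq.LineQubit.range(2)
    circuit = cirq.Circuit([
        cirq.H(q0),
        cirq.CNOT(q0, q1),
        cirq.measure(q0, q1, key="m"),
    ])
    print(circuit)

    # Sample on the built-in simulator; only the all-zeros and all-ones
    # outcomes should appear, reflecting the entangled state.
    result = cirq.Simulator().run(circuit, repetitions=100)
    print(result.histogram(key="m"))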