Personal Computing on the fly

Filed under: Linux

The cloud. It's the talk of the town and has been growing for a while now.

I use desktop and laptop computers for everything from business tasks like creating spreadsheets, database entry, and document creation to so much more. On the personal side, the computer gets used for entertainment (or distraction), schoolwork, research, and communication. More than ever, computers get used for communication.

The internet and World Wide Web have been dominant in terms of communication: e-mail, social networks, forums, instant messaging and more. It is no secret that communication and porn have probably been the two biggest drivers of the internet.

What does the cloud have to offer? Mostly, it provides traditionally local activities as online services: data storage, office apps, multimedia playback and the like.

For me, the "pros" of the cloud boil down to one thing: portability. You can access your files and data anywhere you go, as long as you can get a connection.

I admit, this can be a very nice thing. Having a lot of data and extremely useful files and documents at hand wherever you go, without having to keep all of it on a laptop hard drive, is very handy.

The "cons" for me come down to only a few points, but very important ones: security, accessibility and privacy.

Talking about security, I still see WAY too many websites that collect sign-up information over insecure connections. There are WAY too many content providers and data storage providers reporting being hacked and having information stolen.

Accessibility is another problem. I don't care what their projected uptime is; most companies offering data storage, information-collecting apps and the like will be down sometime. The question is: when it goes down, for whatever reason, will it be at the time you need to get to that data? Third-party providers offer a lot of interesting data collection services.

I can think of two big providers offering beekeeping data collection services. Great idea, if the stored data is available. But what if collecting that data out in the field with your laptop or smartphone goes well, yet when you get somewhere you need to look at that information and show it to a customer, an employee, or just a buddy who is helping you, the service or website has crashed or is down for maintenance or some other reason?

I've been in that situation with third party providers and I have to tell you, it really sucks to not be able to get your data when you need it.

Lastly, I mentioned privacy. How vigorously will you defend who has access to your data on your computers? Will a third party defend your information as vigorously as you? Not likely.

Third-party providers just want to stay in business. They don't want to be in the middle of a headache. If the government wants access to your specific data, the third-party provider might give token resistance, but only as far as the law will allow. Some won't even hold out that long.

Now, even if the "if you have nothing to hide" crowd had a point, which they don't, you may have a hundred different reasons for keeping your information private, none of them illegal. Only you can keep your data as private as you want it to be.

As for me, I like building "clouds", but I want control over my cloud. I have my own servers in my own house running the cloud-type apps I want access to. I set them up as secure (https) sites with my own certificates, and I do scheduled downtime to best ensure they are up when I need them to be.
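As a rough illustration of the "own certificates" part, a self-signed certificate for a home server can be generated with OpenSSL along these lines (the hostname and file names here are hypothetical examples, not anything from a specific setup):

```shell
# Hypothetical sketch: create a self-signed TLS certificate for a
# home "cloud" server. Hostname and file names are examples only.
openssl req -x509 -newkey rsa:4096 -sha256 -days 365 -nodes \
  -keyout homecloud.key -out homecloud.crt \
  -subj "/CN=homecloud.local"

# Print the certificate subject to confirm it was created as expected.
openssl x509 -in homecloud.crt -noout -subject
```

The resulting key and certificate would then be pointed to from the web server's TLS configuration; browsers will warn about the self-signed certificate unless you import it as trusted, which is the usual trade-off of a private CA-less setup.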

Now, not everyone has, or wants, the ability to set up their own local servers. I understand that. But when I must choose a public, third-party cloud over a private, local one, I tend to stick with the big boys like Google. No, it's not nearly as private as I really want it to be, but they do have pretty good uptime and availability, and they will at least make the legal beagles get a warrant before they turn your information over.

I won't use it for anything other than things I don't mind being made public, though. The first rule of the internet still applies: "If you want it to be private, DON'T put it on the internet."

If you are willing and able to make private clouds, cloud computing is a very nice way to get things done on the run.

If not, you are taking your chances and really should minimize what you put out there.

More in Tux Machines

Default window manager switched to CTWM in NetBSD-current

For more than 20 years, NetBSD has shipped X11 with the "classic" default window manager of twm. However, it's been showing its age for a long time now. In 2015, ctwm was imported, but after that no progress was made. ctwm is a fork of twm with some extra features - the primary advantages are that it's still incredibly lightweight, but highly configurable, and has support for virtual desktops, as well as a NetBSD-compatible license and ongoing development. Thanks to its configuration options, we can provide a default experience that's much more usable to people experienced with other operating systems. Read more

Red Hat/Fedora Leftovers

  • Call an existing REST service with Apache Camel K

    With the release of Apache Camel K, it is possible to create and deploy integrations with existing applications that are quicker and more lightweight than ever. In many cases, calling an existing REST endpoint is the best way to connect a new system to an existing one. Take the example of a cafe serving coffee. What happens when the cafe wants to allow customers to use a delivery service like GrubHub? You would only need to introduce a single Camel K integration to connect the cafe and GrubHub systems. In this article, I will show you how to create a Camel K integration that calls an existing REST service and uses its existing data format. For the data format, I have a Maven project configured with Java objects. Ideally, you would have this packaged and available in a Nexus repository. For the purpose of my demonstration, I utilized JitPack, which lets me have my dependency available in a repository directly from my GitHub code. See the GitHub repository associated with this demo for the data format code and directions for getting it into JitPack.

  • Build a data streaming pipeline using Kafka Streams and Quarkus

    In typical data warehousing systems, data is first accumulated and then processed. But with the advent of new technologies, it is now possible to process data as and when it arrives. We call this real-time data processing. In real-time processing, data streams through pipelines; i.e., moving from one system to another. Data gets generated from static sources (like databases) or real-time systems (like transactional applications), and then gets filtered, transformed, and finally stored in a database or pushed to several other systems for further processing. The other systems can then follow the same cycle—i.e., filter, transform, store, or push to other systems. In this article, we will build a Quarkus application that streams and processes data in real-time using Kafka Streams. As we go through the example, you will learn how to apply Kafka concepts such as joins, windows, processors, state stores, punctuators, and interactive queries. By the end of the article, you will have the architecture for a realistic data streaming pipeline in Quarkus.

  • Fedora 32 : Can be better? part 012.

    Pidgin is a chat program which lets you log into accounts on multiple chat networks simultaneously. Pidgin can be installed on multiple operating systems and platforms. Pidgin is compatible with the following chat networks out of the box: IRC, Jabber/XMPP, Bonjour, Gadu-Gadu, Novell GroupWise Messenger, Lotus Sametime, SILC, SIMPLE, and Zephyr. Can it be better? The only problems a user in need of help may have are in the command-line environment; obviously, in that case, this graphical application cannot be used. I would suggest building a terminal application like WeeChat dedicated to Fedora users and including IRC channels. Now, let's install this application.

Touchégg 2.0.0 Released: A Linux Multi-Touch Gesture Recognizer App

For years, it continued to work in every desktop environment. However, as the Linux desktop has advanced a lot, Touchégg fails to work on desktop environments using modern technologies like Wayland compositors. Therefore, Jose has now completely rewritten the old version and released a new version, 2.0.0, after a gap of several years. The new release aims to make the app compatible with the new technology stacks incorporated in GNOME, KDE, and other desktops. Read more

Linux 5.10: Freedreno/MSM Driver and Broadcom Ethernet

  • MSM Adreno DRM Driver For Linux 5.10 Has DisplayPort, Per-Process Pagetables

    Rob Clark, founder of the Freedreno/MSM driver project and currently a Googler, sent in the MSM direct rendering manager driver updates targeting the upcoming Linux 5.10 merge window. This time around, the Adreno kernel graphics/display driver has some notable additions. With Linux 5.10, the MSM DRM driver now has DisplayPort output support for Adreno hardware with DP outputs.

  • Broadcom Has 200G Ethernet Link Speed Support Coming To Its Driver For Linux 5.10

    Broadcom engineers have prepared their Linux network driver infrastructure for supporting 200G link speeds. Coming to Broadcom's "bnxt_en" Linux network driver in Linux 5.10 are the necessary alterations for handling 200G links. It was back in late 2018 when Broadcom first announced the world's first 200G Ethernet controller utilizing 50G PAM-4 and PCI Express 4.0. Now as we approach the end of 2020 and prepping for an interesting 2021 of new hardware, bnxt_en is ready with this 200G Ethernet link speed.