Burden is on us to protect our data

Filed under: Security

If you had to guess, how many companies would you say have enough of your personal data stored in various databases to make even a rookie crook ready for prime-time conning?

Ten, perhaps? What about 50, 100 or 1,000?

You probably don't know the answer, and that is exactly the problem.

In the past six months, the personal data of millions of consumers have been lost, stolen or sold to identity thieves. The most recent case involved CitiFinancial, a financial unit of Citigroup Inc. that provides a wide variety of consumer loan products. The company disclosed that personal information (Social Security numbers, loan account data and addresses) on 3.9 million of its customers was lost by UPS in transit to a credit bureau. So far, CitiFinancial has said it has no reason to believe the information has been used inappropriately.

So far.

Every time we hear of one of these cases, the companies involved tell their customers not to worry. Trust us, they say. They pledge to enhance their security procedures.

The promises don't make me feel any safer about my personal data. How about you?

It's time for the federal government and the states to step in and make sure the companies fulfill those promises.

There have been some efforts to protect people's financial information. On June 1, a new federal rule took effect that requires businesses and individuals to destroy sensitive information derived from consumer credit reports.

I was initially encouraged when I heard about this rule. It seems to cover all the bases -- individuals, and both large and small organizations that use consumer reports, including consumer reporting companies, lenders, insurers, employers, landlords, government agencies, mortgage brokers, car dealers, attorneys, private investigators, debt collectors and people who pull consumer reports on prospective home employees, such as nannies or contractors.

There's just one little problem with this "Disposal Rule." There is no standard for how the documents have to be destroyed. Here's the direction the Federal Trade Commission is giving to businesses and individuals: "The proper disposal of information derived from a consumer report is flexible and allows the organizations and individuals covered by the rule to determine what measures are reasonable based on the sensitivity of the information, the costs and benefits of different disposal methods, and changes in technology."

How strong is a rule if it sets no standard? Basically, those who have our information get to decide how and when it is to be destroyed.

"The burden is completely on the consumer to protect what is important," said Evan Hendricks, editor and publisher of the newsletter, Privacy Times.


More in Tux Machines

FOSS in the European Union

  • Competition authorities first to implement DMS services
    The DRS are published as open source software using the European Union’s open source software licence EUPL, and are available on Joinup. The software provides connectors for most commonly-used document management systems, and includes scripts to create a database to implement the connecting web services.
  • Czech Republic is at the forefront of an open data international project
    At the beginning of the new year, an international project, “Open crowdsourcing data related to the quality of service of high-speed Internet”, was launched. It aims to encourage the development of open data based on users’ own measurements of high-speed Internet service.

Arch Linux News

  • Linux Top 3: Arch Anywhere, Bitkey and Vinux
    Arch Linux is a powerful rolling Linux distribution that hasn't always been particularly easy for new users to install and deploy. The goal of the Arch Anywhere system is to provide new and old users with the ability to install a fully custom Arch Linux system in minutes.
  • Arch Linux Preparing To Deprecate i686 Support
    Arch Linux is moving ahead with preparing to deprecate i686 (32-bit x86) support in its distribution. Due to declining usage of Arch Linux on i686, the project will be phasing out official support for the architecture. Next month's ISO spin will be the last to offer a 32-bit Arch Linux install. Following that will be a nine-month deprecation period during which i686 packages will still see updates. A minimal way to check whether a given machine is running a 32-bit install is sketched after this list.
  • News draft for i686 deprecation
    Finally found some time to write a draft for the news post on i686. Here it is: Title: i686 is dead, long live i686. Due to the decreasing popularity of i686 among the developers and the community, we have decided to phase out support for this architecture. The decision means that the February ISO will be the last that allows installing 32-bit Arch Linux. The next nine months are a deprecation period, during which i686 will still receive upgraded packages. Starting from November 2017, packaging and repository tools will no longer require i686 support from maintainers, effectively making i686 unsupported. However, as there is still some interest in keeping i686 alive, we would like to encourage the community to make it happen with our guidance. Depending on the demand, an official channel and mailing list will be created for second-tier architectures.
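
For readers who want to know whether a machine is affected, here is a minimal sketch (my own illustration, not part of the Arch announcement) that reports the machine architecture via the standard uname(2) call; it prints "i686" on the 32-bit x86 installs covered by the deprecation and "x86_64" on 64-bit ones.

    /* Minimal sketch (not from the Arch announcement): report the machine
     * architecture using uname(2). "i686" indicates a 32-bit x86 install
     * affected by the deprecation; "x86_64" installs are unaffected. */
    #include <stdio.h>
    #include <sys/utsname.h>

    int main(void)
    {
        struct utsname info;

        if (uname(&info) != 0) {
            perror("uname");
            return 1;
        }
        printf("machine: %s\n", info.machine);
        return 0;
    }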

LinuxCon Europe on 100G Networking

  • The World of 100G Networking
    Capacity and speed requirements for networking keep increasing, but going from where we are now to 100G networking isn't a trivial matter, as Christoph Lameter and Fernando Garcia discussed recently in their LinuxCon Europe talk about the world of 100G networking. It may not be easy, but with recently developed machine learning algorithms combined with new, more powerful servers, the idea of 100G networking is becoming feasible and cost effective. A back-of-the-envelope packet budget illustrating the scale of the problem is sketched after this list.
  • The World of 100G Networking by Christoph Lameter
    The idea of 100G networking is becoming feasible and cost effective. This talk gives an overview of the competing technologies in terms of technological differences and capabilities, and then discusses the challenges of using various kernel interfaces to communicate at these high speeds.
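
To give a sense of the scale the speakers are dealing with, here is a back-of-the-envelope sketch (my own illustration, not taken from the talk), assuming a saturated 100 Gbit/s link carrying standard 1500-byte frames: that works out to roughly 8.3 million packets per second, or a budget of about 120 ns per packet.

    /* Back-of-the-envelope illustration (not from the talk): the per-packet
     * time budget on a saturated 100 Gbit/s link carrying 1500-byte frames
     * (Ethernet preamble and inter-frame gap ignored for simplicity). */
    #include <stdio.h>

    int main(void)
    {
        const double link_bits_per_sec = 100e9;      /* 100 Gbit/s */
        const double frame_bits        = 1500 * 8.0; /* 1500-byte MTU frame */

        double packets_per_sec = link_bits_per_sec / frame_bits; /* ~8.3 million */
        double ns_per_packet   = 1e9 / packets_per_sec;          /* ~120 ns */

        printf("packets per second: %.1f million\n", packets_per_sec / 1e6);
        printf("time budget per packet: %.0f ns\n", ns_per_packet);
        return 0;
    }

A budget that tight is one reason the choice of kernel interface matters so much at these speeds.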

Development News

  • Oh, the things Vim could teach Silicon Valley's code slingers
    The Vim text editor turned 25 late last year – the first public iteration was launched on November 2, 1991, a couple of weeks after Linus Torvalds announced Linux. To celebrate Vim's anniversary, creator Bram Moolenaar recently dropped version 8.0. Ordinarily the update of a text editor wouldn't be worth mentioning, but this is the first major Vim release in ten years. In today's world, where web browsers drop major point updates (what they consider major, anyway) several times a year, Vim's lack of major updates is not just refreshing, but speaks of an entirely different approach to developing software. Even leaving aside the absurd version system of today's web browsers, eight releases in 25 years would be considered slow by today's software development standards. Interestingly, though, Vim's biggest rival, GNU Emacs, has a roughly similar development pace. GNU Emacs began life in the 1970s and is currently at version 25, which means it averages two releases to Vim's one, but that is still definitely on the slow side.
  • Learn to code site Code.org loses student work due to index bug
    Learn-to-code site Code.org is apologising to its students after a database table maxed out, dropping progress for an unknown number of participants. In its mea-culpa blog post, the group says it was burned by a database table with a 32-bit index, which tops out at roughly 2.1 billion values if signed (about 4.3 billion unsigned).
  • GCC 7.0 Lands The BRIG Frontend For AMD's HSA
    GCC 7 moved on to only bug/documentation fixes but an exception was granted to allow the BRIG front-end to land for AMD's HSA support in this year's GNU Compiler Collection update. As of this morning, the BRIG front-end has merged. BRIG is the binary form of the Heterogeneous System Architecture Intermediate Language (HSA IL). This BRING front-end also brings the libhsail-rt run-time into GCC. So far BRIG in GCC has just been tested on Linux x86_64.