Monthly Archives: August 2016

Quantum Leap Could Redefine Computing

No, I’m not talking about that Quantum Leap. IBM just made a really interesting announcement: it is enhancing its online quantum computing systems with a new API and improving its simulator so that it can handle 20 qubits.

While listening to the prebriefing was a bit like pretending I was Penny trying to understand Sheldon Cooper on The Big Bang Theory, I think this move showcases yet another huge approaching computing wave that could eclipse the one we currently are trying desperately, but largely failing, to ride.

I’ll share some thoughts on quantum computing and close with my product of the week: the Arlo Security Camera system from Netgear, which has to be the best comprehensive home security system on the market.

It is easy to get lost in the terminology surrounding quantum computing and glaze over. Basically, quantum computing is a revolutionary, not evolutionary, system that is pretty much indistinguishable from magic.

Let me give you an example. With a regular computing system, at the machine-language level you have 1s and 0s — a bit is one or the other. With quantum computing, a quantum bit, or qubit, is both at the same time. This is like someone asking whether your new car is black or white, and you can answer “yes” and be completely accurate.

In the world we think we live in, two opposites aren’t the same thing. In the quantum world, they sort of are. The most sick — or fun — explanation for this is Schrödinger’s cat (here’s a TED video about it), a thought experiment in which a cat sealed in a closed box with a mechanism that may or may not have killed it exists as both living and dead until the box is opened. Schrödinger supposedly was so disturbed by his own analysis that he decided to abandon quantum physics and take up biology. I’m guessing talking smack about cats forced a career change.

When we currently talk about parallel computing, we talk about taking a single program, breaking it up into parts, and then executing those parts at the same time to get around the limitations of Moore’s law and avoid the need to have a processor in our computer running hotter than the core of the sun. That gives you speed without the heat.

With quantum computing, things happen pretty much all at once. Because elements can be both things at the same time, operations basically can happen instantly — not sequentially — so the potential speed of solving a problem approaches instant.

The example of a practical application I was given years ago was decrypting the most secure data file. Traditional computing might take years, but true quantum computing would take only seconds, and those seconds would be spent interpreting the results, not producing them in the first place. Effectively, it should blow away any concept we have of speed.

The damn things even look weird, more like a cross between a traditional computer and something from the steampunk dimension.

It’s not just that it would be hard to understand a quantum computer — think what a nightmare it would be to program one or interface with the result.

Black Lab Linux Is a Rare Treat

The latest release of Black Lab Linux, an Ubuntu 16.04-based distribution, adds a Unity desktop option. You will not find Unity offered by any other major — or nearly any minor — Linux distribution outside of Ubuntu.

Black Lab Linux 8.0, the consumer version of PC/OpenSystems’ flagship distro, also updates several other prominent desktop options.

Black Lab Linux is a general purpose community distribution for home users and small-to-mid-sized businesses. PC/OpenSystems also offers Black Lab Enterprise Linux, a commercial counterpart for businesses that want support services.

Black Lab Linux is an outgrowth of OS4 OpenLinux, a distro the same developers released in 2008. Both the community and the commercial releases could be a great alternative for personal and business users who want to avoid the UEFI (Unified Extensible Firmware Interface) horrors of installing Linux on a computer bought off the shelf with Microsoft Windows preinstalled.

Black Lab offers its flagship releases with a choice of self or full support, and both come at a price upon launch. However, you can wait 45 days and get the same release with the self-support option for free. Black Lab Linux 8.0 became available for free late last year.

Black Lab 8.0 with Unity gave me a few problems, depending on the hardware I tested it on. It sometimes was slow to load various applications. It more than occasionally locked up. However, its performance usually was trouble-free on more resource-rich computers.

Its core set of specs is nice but nothing that outclasses other fully free Linux OS options. Here is a quick rundown of the updated packages. Remember that version 8.0 is based on Ubuntu 16.04, which is a solid starting point.

Open Source Devs to Give E2EMail Encryption a Push

Google last week released its E2EMail encryption code to open source as a way of pushing development of the technology.

“Google has been criticized over the amount of time and seeming lack of progress it has made in E2EMail encryption, so open sourcing the code could help the project proceed more quickly,” said Charles King, principal analyst at Pund-IT.

That will not stop critics, as reactions to the decision have shown, he told LinuxInsider.

However, it should enable the company to focus its attention and resources on issues it believes are more pressing, King added.

Google started the E2EMail project more than a year ago, as a way to give users a Chrome app that would allow the simple exchange of private emails.

The project integrates OpenPGP into Gmail via a Chrome extension. It brings improved usability and keeps all cleartext of the message body exclusively on the client.

E2EMail is built on a proven, open source JavaScript crypto library developed at Google, noted KB Sriram, Eduardo Vela Nava and Stephan Somogyi, members of Google’s Security and Privacy Engineering team, in an online post.

The early versions of E2EMail are text-only and support only PGP/MIME messages. For now, the project uses its own keyserver.

The encryption application eventually will rely on Google’s recent Key Transparency initiative for cryptographic key lookups. Google earlier this year released the project to open source with the aim of simplifying public key lookups at Internet scale.

The Key Transparency effort addresses a usability challenge hampering mainstream adoption of OpenPGP.

During installation, E2EMail generates an OpenPGP key and uploads the public key to the keyserver. The private key is always stored on the local machine.
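E2EMail handles that key lifecycle inside the Chrome extension itself. As a rough, hypothetical analogue of the same flow, here is how the steps might look done by hand with the GnuPG command line; the user ID and keyserver address below are placeholders, not anything E2EMail actually uses:

    # Generate a new OpenPGP key pair; the private key never leaves the local keyring
    gpg --quick-generate-key "Alice <alice@example.com>" rsa2048 default 1y

    # Publish only the public key to a keyserver (placeholder address and key ID)
    gpg --keyserver hkps://keys.example.org --send-keys ALICE_KEY_ID

    # Export the public key as ASCII armor to share it directly
    gpg --armor --export alice@example.com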

E2EMail uses a bare-bones central keyserver for testing. Google’s Key Transparency announcement is crucial to its further evolution.

 

Google Partially Benefits

Secure messaging systems in general could benefit from the open sourcing. Developers building apps could use a directory to find the public keys associated with an account, along with a public audit log of any key changes.

Encryption key discovery and distribution lie at the heart of the usability challenges that OpenPGP implementations have faced, suggested Sriram, Nava and Somogyi in their joint post.

Key Transparency delivers a solid, scalable and practical solution. It replaces the problematic web-of-trust model traditionally used with PGP, they pointed out.

Linux Begins

Once you have a sense of the vast potential of Linux, you may be eager to experience it for yourself. Considering the complexity of modern operating systems, though, it can be hard to know where to start.

As with many things, computers can be better understood through a breakdown of their evolution and operation. The terminal is not only where computers began, but also where their real power still resides. I’ll provide here a brief introduction to the terminal, how it works, and how you can explore further on your own.

Although “terminal,” “command line,” and “shell” are often used interchangeably, it helps to learn the general distinctions between these terms. The word “terminal” comes from the old days of Unix — the architecture on which Linux is based — when university campuses and research facilities had a room-sized computer, and users interacted with it by accessing keyboard-and-screen terminals scattered around the campus and connected to the central hub with long cables.

Today, most of us don’t deal with true terminals like those. Instead, we access emulators — interfaces on Unix-like systems that mimic the terminal’s control mechanism. The kind of terminal emulator you’re most likely to see is called a “pseudo-terminal.”

Also called a “terminal window,” a pseudo-terminal is an operating system application that runs within your normal graphical desktop session. It opens a window allowing interaction with the shell. Examples are GNOME Terminal and KDE’s Konsole. For the purpose of this guide, I’ll use “terminal” to refer exclusively to terminal emulators.

The “command line” is simply the type of control interface that one utilizes on the terminal, named for the fact that you write lines of text which are interpreted as commands.

The “shell” is the program the command line uses to understand and execute your commands. The common default shell on Linux is Bash, but there are others, such as Zsh and the traditional Unix C shell.
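If you are curious which shell you are running, two quick commands will tell you. The output shown here is only an example and will vary from system to system:

    $ echo $SHELL        # the login shell set for your user account
    /bin/bash
    $ cat /etc/shells    # the shells installed and registered on this system
    /bin/sh
    /bin/bash
    /usr/bin/zsh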

 

File Organization

The last thing you need to know before diving in is how files are organized. In Unix-like systems, directories are arranged in an upside-down tree, with the root filesystem (notated as “/” and different from the “/root” directory) as the starting point.

The root filesystem contains a number of directories within it, which have their own respective directories and files, and so on, eventually extending to encompass every file your computer can access. The directories directly within the root filesystem, in directory notation, are given right after the “/”.

For example, the “bin” directory contained right inside the root would be addressed as “/bin”. All directories at subsequent levels down are separated with a “/”, so the “bin” directory within the “usr” directory in the root filesystem would be denoted as “/usr/bin”. Furthermore, a file called “bash” (the shell), which is in “bin” in “usr” would be listed as “/usr/bin/bash”.
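A simplified sketch of that layout looks like this. The exact contents vary from one distribution to another, and on many systems “/bin” is actually a link to “/usr/bin,” so treat the paths as illustrative:

    /                 the root filesystem, the top of the tree
    /bin              essential commands, addressed as "/bin"
    /usr              shared programs and data
    /usr/bin          where most user commands live
    /usr/bin/bash     the Bash shell itself
    /home             personal directories for each user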

So how do you find these directories and files and do stuff with them? By using commands to navigate.

To figure out where you are, you can run “pwd” (“print working directory”) and you will get the full path to the directory you’re currently in.

To see where you can go, run “ls” to list directory contents. When run by itself, it returns the contents of the current directory, but if you put a space after it and then a path to a directory, it will print the contents of the directory at the end of the path.

Using “ls” can tell you more than that, though. If you insert “-l” between the command and the path with a single space on either side, you will get the “long” listing specifying the file owner, size and more.
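Put together, a short session looks something like this; the directory names, file sizes and dates are just an example:

    $ pwd
    /home/you
    $ ls
    Desktop  Documents  Downloads  notes.txt
    $ ls /usr
    bin  include  lib  local  share
    $ ls -l /usr/bin/bash
    -rwxr-xr-x 1 root root 1183448 Jun 18  2020 /usr/bin/bash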

 

Commands, Options, Arguments

This is a good time to explain the distinction between commands, options and arguments. The command, which is the program being run, goes first.

After that you can alter the functionality of the command by adding options, which are either one dash and one letter (“-a”) or two dashes and a word (“--all”).

The argument — the thing the command operates on — takes the form of a path. Many commands do not need arguments to provide basic information, but some lend far greater functionality with them, or outright require them.
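Putting the three pieces together, a single line such as the one below breaks down cleanly; the path is just an example:

    $ ls -l /usr/bin
      ^  ^  ^
      |  |  '-- argument: the path the command operates on
      |  '----- option: "-l" asks for the long listing
      '-------- command: the program being run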