TDD – Test-Driven Design

A newer, sometimes controversial best practice for writing good, maintainable code is to write the unit tests for your code before writing the code itself. This is called Test-Driven Development (or Design). I have had some good experiences with this methodology lately. I have used nose for Python and, for Perl, the built-in prove tool with Test::Simple or Test::More. Test::More is recommended by both Modern Perl and Enlightened Perl. While I haven’t used it yet, the Bash-specific Bats testing framework looks good for shell code.
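As a minimal sketch of the red/green rhythm in Python (plain unittest here, which nose can also discover and run; the slugify function is just an invented example, not from any project above):

```python
import unittest

def slugify(title):
    """Turn a post title into a URL slug (written after the tests below)."""
    return title.strip().lower().replace(" ", "-")

class TestSlugify(unittest.TestCase):
    # These tests were written first and watched fail ("red");
    # slugify() was then filled in until they passed ("green").
    def test_basic(self):
        self.assertEqual(slugify("Berkeley LUG"), "berkeley-lug")

    def test_strips_whitespace(self):
        self.assertEqual(slugify("  TDD  "), "tdd")
```

Running nosetests (or python -m unittest) discovers and runs TestSlugify; prove plays the same role for Perl’s Test::More.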

Standards exist for unit tests, and Perl developers have a long history of unit testing. nose, prove and Bats are all TAP (Test Anything Protocol) compliant. Though the xUnit family is popular, competing frameworks exist for many languages; Jasmine and others are ready to help unit test JavaScript code. The history of unit testing overlaps with many areas of computer science: The Mythical Man-Month, first published in 1975, already talks about unit tests.
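TAP is just a plain-text stream, which is why so many tools can produce and consume it. As an illustrative sketch (hand-rolled, not tied to any framework — prove, nose plugins and Bats emit this for you), a script can produce TAP like so:

```python
# A tiny hand-rolled TAP emitter, just to show what the protocol looks like.
def run_tap(tests):
    lines = ["1..%d" % len(tests)]  # the "plan": how many tests follow
    for i, (desc, ok) in enumerate(tests, 1):
        status = "ok" if ok else "not ok"
        lines.append("%s %d - %s" % (status, i, desc))
    return "\n".join(lines)

results = [("addition works", 1 + 1 == 2),
           ("string upper",   "tap".upper() == "TAP")]
print(run_tap(results))
# 1..2
# ok 1 - addition works
# ok 2 - string upper
```

Any TAP consumer (such as prove) can summarize streams in this format regardless of which language produced them.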

The time taken to write tests first arguably saves time overall. How can this be? Much of the work goes into clarifying specifications, which is required anyway, and that should lead to better, more efficient code. The theory is that the much more modular code required for unit tests is also easier to understand and maintain, and this theory has been supported by several research studies. The extensive regression testing can quickly identify changes that introduce bugs if the tests are run just before and/or just after a check-in to a revision control system. However, there isn’t quite consensus that TDD is the best way to code. What have your experiences been with Test-Driven Design?

We meet at Bobby G’s Pizzeria on the second and fourth Sundays of each month from Noon to 3PM in Berkeley near the Downtown Berkeley BART station. Bobby G’s is on University Ave near Shattuck Ave. We hope you join us, join the discussion on our email list and/or join us in #berkeleylug, by following the tabs at the top.

Internet Slowdown Protest

On September 10th websites protested online by displaying a symbolic loading icon on their sites. The results, measured by the number of calls, emails and comments conveyed to the FCC, Congress and the White House, were strong. This is very important to us now; it is difficult to imagine our lives before the Internet. In May 2014 I wrote about Net Neutrality. The US-focused article on net neutrality in the United States helps to summarize and remind us of some relevant history dating back a century and a half: “While the term is new, the ideas underlying net neutrality have a long pedigree in telecommunications practice and regulation. The concept of network neutrality originated in the age of the telegram in 1860 or even earlier, where standard (pre-overnight telegram) telegrams were routed ‘equally’ without discerning their contents and adjusting for one application or another. Such networks are ‘end-to-end neutral’.”


HTTPS Everywhere

Web sites that install an SSL/TLS certificate allow the encryption of web browser traffic. Why should you care? Public awareness of this need was raised in April 2014 by the OpenSSL Heartbleed bug. HTTPS is especially useful in preventing eavesdropping and man-in-the-middle attacks when using public Wi-Fi access points.

Encrypting web browser traffic is done with HTTP Secure (HTTPS). This provides an additional level of privacy between your computer and the web sites you view by layering SSL/TLS over HTTP. TLS superseded SSL in 1999.
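What that layering means can be glimpsed with Python’s standard ssl module (a small sketch, not specific to any browser):

```python
import ssl

# Build a client-side TLS context; modern Python enables certificate
# verification and hostname checking by default.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)   # certificate checking is on
print(ctx.check_hostname)                     # hostname checking is on

# A browser (or urllib) wraps its plain socket with such a context and
# then speaks ordinary HTTP through it -- that layering is all "HTTPS" means.
```

Without the certificate and hostname checks, an attacker on a shared Wi-Fi network could silently impersonate the site, which is exactly the man-in-the-middle scenario HTTPS defends against.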

On my computer I installed a browser extension available for Chrome, Firefox, Firefox for Android and Opera. The extension makes my browser prefer the use of HTTPS whenever possible. HTTPS Everywhere is produced as a collaboration between The Tor Project and the Electronic Frontier Foundation (EFF).

When reading about the use of HTTPS Everywhere I was surprised to find a related entry. I hope to work with our LUG members to update our website and its links to embedded content, and to automatically rewrite the requests of web browsers to our web server.
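On the server side, one common way to send browsers to HTTPS automatically is a rewrite rule (a sketch assuming an Apache server with mod_rewrite enabled; other web servers have equivalents):

```apache
# Redirect all plain-HTTP requests to HTTPS
# (e.g. in .htaccess or a virtual host block)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

The 301 status tells browsers and search engines the move is permanent, so subsequent visits go straight to the encrypted site.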


Vi IMproved

vi is a key text editing tool available on Unix systems. There are a number of implementations of the modal vi editor; the most common one on desktop Linux distributions, and the one I have learned and use, is Vim.

Having a consistent and capable editor available on any system matters, especially for people who move from machine to machine for whatever reason. People who administer Unix systems or who help others with their operating system (like many Linux User Group members) need to edit files quickly. The family of editors known as vi (including nvi, a.k.a. vi on BSD) is very useful for this. Some of the reasons for its success are the use of classic Unix philosophy, regular expressions, keyboard use (no mouse required) and plugins. The learning curve is a bit steep for beginners. Because these text editors are written for use in terminals they can be used even across platforms, with the help of Fink (Homebrew and others) on Mac OS X and Cygwin on Windows. Text editors are clearly not WYSIWYG word processors.

I like vim and I am biased. I’ve answered questions in #vim on Freenode. I’ve customized my installation with a personalized color scheme and written functions for use in my statusline. I use ‘set ruler number hlsearch laststatus=2 cmdheight=3’ and ‘syntax on’. I’ve googled and read other people’s vimrc files more than once. Over the years I’ve cobbled together configurations across many different machines. I’ve typed the vim command :help a lot. I’ve installed and used quite a few Vimscript plugins and am still pleasantly surprised to find a new one that suits my needs well as my needs change. I’ve experimented with more modern ways to install plugins, using git revision control rather than the traditional way of putting downloaded copies of files into a $HOME/.vim directory. I’ve written a little Vimscript, played a couple of games inside vim, and tried my best to teach vim to a number of people in person.
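Collected into a ~/.vimrc, the settings mentioned above look like this (the comments are just my annotations):

```vim
" Settings from the paragraph above, in ~/.vimrc form
set ruler          " show cursor line/column in the statusline
set number         " show line numbers in the left margin
set hlsearch       " highlight all matches of the last search
set laststatus=2   " always show the statusline, even with one window
set cmdheight=3    " taller command area so messages aren't truncated
syntax on          " enable syntax highlighting
```

Dropping lines like these into ~/.vimrc is the traditional first step of personalizing vim on a new machine.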

The commands used for editing with vi and vim can come up in the most interesting places. The Wikipedia article covers the history well, with quite a few references to our home, Berkeley. A small version of vi is included with Cisco products. Perhaps we might be able to find some UC Berkeley and O’Reilly sponsorship to develop a vi* unconference-style celebration in 2016 as vi/ex turns 40 years old!

What’s the best way to get started with vim? vimtutor is a great self-paced tutorial. Firing up vim and typing :help will provide you with the built-in documentation and instructions. Much has been written and said about vi and vim elsewhere. Mike Saunders of Linux Voice produced a great 12-minute video, Learn to love Vim. It was featured on the front page of Linux Voice and encouraged me to write this article. The cheat sheet he referenced is fantastic, and the related YouTube videos also look interesting. Time and patience are really all that’s needed, though these things seem to be in shorter supply than they used to be. While there is a learning curve, it is worth it. Two friends, Michael and Jim, have both taught vi professionally.

A number of books are available, notably from O’Reilly, New Riders, Packt and Pragmatic Bookshelf. At least one has a PDF download available. Purchasing from that page provides donations to Bram Moolenaar’s favorite charity, ICCF Holland, which helps needy children in Uganda. Having seen firsthand some needy children in Kenya, I encourage and support the work of ICCF Holland.

This article should in no way be interpreted as a negative statement about other editors of any kind. gedit, nano, elvis, BusyBox’s vi clone and nvi are other common text editors. I am focusing here on text editing, not on differentiated needs such as word processing. There are many good editors, most with highly specialized features for particular uses and/or particular environments. Yet few other editors can be so well customized and are as universally available as vim. Code-focused integrated development environments (IDEs) also need to edit text and have many helpful features for writing code; after scratching the surface I found that many of those features can be added to vim with some effort.

While Emacs is powerful and comes to mind, it is not as universally available. I learned some of the basics of Emacs over the years and spent a period of time studying it many moons ago, but I am not an expert. I realize that the Emacs approach of not using editing modes (modes are used by vim) has advantages. Are you an Emacs user? We welcome your blog post submissions.

I hope this article sparks informed discussion. Believe it or not, the choice of text editor has been known to elicit long discussions in some circles, though this is far less prevalent now. People who work with text editors know that the choice of an editor can dramatically affect efficiency. Informed comments are encouraged.


LinkChecker
I found a broken link on another website I maintain. I quickly surveyed the available solutions and chose to try the Python-based LinkChecker instead of the Perl- or Ruby-based alternatives, since I wanted something I could use from a command line, run locally and possibly script. WordPress has its own plugin for this, called Broken Link Checker, which I enjoy using on a couple of other websites I maintain.

After installing linkchecker under my local user I found that the invocation is pretty straightforward and the output options are numerous. By default it checks internal site links. I then found the --check-extern option, which is what I was looking for to check all the external links. By default ten threads are used at a time. While this option is good, I haven’t had time to try out all the other options available.
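The internal/external distinction is easy to illustrate with a toy example in pure Python (this is not LinkChecker’s code, just a sketch of the idea using the standard library; the URLs are made up):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags, resolved against a base URL."""
    def __init__(self, base):
        super().__init__()
        self.base = base
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base, value))

def split_links(base, html):
    """Split a page's links into internal and external by hostname."""
    parser = LinkExtractor(base)
    parser.feed(html)
    host = urlparse(base).netloc
    internal = [u for u in parser.links if urlparse(u).netloc == host]
    external = [u for u in parser.links if urlparse(u).netloc != host]
    return internal, external

page = '<a href="/about">About</a> <a href="http://example.org/x">Ext</a>'
internal, external = split_links("http://berkeleylug.com/", page)
print(internal)   # ['http://berkeleylug.com/about']
print(external)   # ['http://example.org/x']
```

A real checker then fetches each URL (LinkChecker does this with its default ten threads) and reports any that fail; --check-extern extends the crawl to the external list.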

The first link checker people often hear about is a Perl script available via CPAN. Another Perl option exists as well. The Python-based twill scripting language for web browsing looked intriguing. The Ruby gem link-checker was published a few years ago and caught my attention, but I have more experience with Python than Ruby. I am sure readers of this blog post would love to hear about the good and bad experiences of others using these and similar tools to keep the links on their websites from pointing to bad URLs.


A Very Basic Guide to Linux Distributions for Newcomers

There’s an old joke about standards: How many of them do you want?

Linux comes in a large number of packagings (called distributions), differing widely in how you build, administer, and use them. Which will be “best” or “friendliest” for you is difficult to say before you’ve tried them, and easy answers should be questioned.

I’m going to give you an easy answer. You should question it. 😉

Newcomers using modern-ish workstation or laptop machines should seriously consider starting with Bodhi Linux, Linux Mint, Ultimate Edition, MEPIS Linux, or PCLinuxOS. Here’s why:

1. Good installers that work on all likely hardware.

2. Polished, beautiful desktop environments with all the fixings including commonly wanted A/V software. Want Adobe Flash support? mp3 support? Oracle Java? Nvidia drivers included? No problem.

3. A solid and healthy community producing each of them (from all signs so far).

Nothing’s perfect, and all OSes suck, but all of these are highly respected by the overwhelming majority of informed independent reviewers, and garner no serious complaints.

There are a number of respected sources of general information on distro (distribution) choice. Here are three: Linux Distribution Chooser, DistroWatch Major Distributions, and Karsten’s Distribution Guide. (As always on the Web, watch for signs of obsolescence.)

Why did I mention five distros? Why not one? Because the reality of Linux is that you get choice, whether you like it or not, so you might as well come to terms with it. The good news is: You can flip a coin and you can’t really lose. They’re all good, and freedom is a good thing. Try one distro today. Next time you’re curious, you can try another, live with it for a while too, and compare. Or not, because choice is what this is about.

There’s a lot of questionable distro advice around. Even though all of it should be questioned, I think the bias in most of it will be obvious, and the flaws and errors easy to find. E.g., shouldn’t people pushing Ubuntu (w/ Kubuntu, Xubuntu, Lubuntu) disclose the lack of integrated A/V support and Nvidia hardware drivers? Shouldn’t they disclose *buntu’s inability to install on many PC systems the five I cite have no problem with? Shouldn’t many of them disclose being members of an advocacy organisation funded by Canonical, Ltd. (the commercial company publishing Ubuntu)? Shouldn’t people pushing Fedora disclose that it’s a short-term development platform for Red Hat, with occasional stability problems because it’s cutting edge?

My [non-]biases:

1. I have no connection to any of the above-cited distributions, neither the ones I recommend nor the ones I critique. I run Debian Testing on my own servers, Aptosid on my workstation, and CentOS on the ~3500 Internet servers I administer and architect for $DAYJOB.
2. Far from liking proprietary A/V software, Nvidia drivers, Oracle Java, the MP3 format, and Adobe Flash, I specifically don’t like any of them – but I know newcomers to Linux perceive a need for them, therefore I take that perception into account in recommendations.
3. I am not employed by, nor do I have a financial interest in or backing from, any Linux company. I used to work at several Linux-industry companies that were famous in the 2000s and no longer exist, and am now a senior system administrator at a large content company with no horse in these races.

Computers as Docker Platforms

Docker, which released its 1.0 version last month, is a powerful and efficient alternative to virtual machines, built on LXC, Linux cgroups, Linux namespaces and union-mounted file systems.
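To give a flavor of the workflow (a hypothetical example; the image, package and tag names are illustrative, not from any project mentioned here), a container image is described in a short Dockerfile:

```dockerfile
# Build: docker build -t hello-web .
# Run:   docker run -p 8080:80 hello-web
FROM debian:stable
RUN apt-get update && apt-get install -y nginx
COPY index.html /var/www/html/
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
```

Each running container gets its own namespaced view of processes, network and filesystem from the shared kernel, which is why it starts in seconds rather than booting a whole virtual machine.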

This is an ingenious implementation of boundary separation, as seen implemented in the past on computers as inter-process communication (IPC), OS virtualization (such as KVM and Solaris Containers) and Java servlets on Tomcat. As a side note, in the software world Java was heavily promoted with the promise of writing software once and running it anywhere; reality has not fulfilled that promise. Recently Java was surpassed by Python as the most popular language for teaching programming in higher education.

An operating system simply allows software applications to run, and keeping application boundaries clear has been tied to operating systems for a long time. UNIX software and the hardware used to run it have changed a lot; Docker may be the next step. The power of boundary separation and abstraction is that you don’t need to understand the other parts. It’s gotten so “easy” for end users that even without a full understanding of how things work, many people today in the developed world can use embedded computers in their TVs, thermostats, Raspberry Pis, phones/tablets/mobile devices (often running iOS or Android), laptops or desktop computers. Yet when something goes wrong, down the rabbit hole we must go to figure out what’s really going on across the layers of abstractions and boundaries. What do you think?

Due to viewing of the 2014 FIFA World Cup final we may not be able to get as much space at Bobby G’s as we need. We may seek more space at Cafe Au Coquelet down University on the same block, at 2000 University Ave.

We meet on the second and fourth Sundays of each month from noon to three in Berkeley near the Downtown Berkeley BART station near the corner of University & Shattuck. We hope you join us at Bobby G’s Pizzeria and/or join the discussion on our email list.