Wednesday, November 23, 2016

O'Reilly Book Bundle!

There's a pretty awesome pay-what-you-want sale over at Humble Book Bundle, featuring O'Reilly Unix books.  One of my all-time favorites, which I own in the original dead-tree version, Unix Power Tools, is included if you pay $8 or more.   If you pay $15 or more, you also get some pretty awesome networking tomes, including O'Reilly's excellent TCP/IP and DNS/BIND books, and a couple more.

I really cannot recommend Unix Power Tools enough, so I'll just cover a few reasons why I think everyone should read this book, even though it was last revised in 2002. It remains a fantastic tome, and is still well worth reading.



Three top reasons to read it:


1.  It has the best-written explanation I have ever read of the zen of Unix-like systems: the composability paradigm, and the sometimes baffling array of choices you have, such as between the various shells.

2. It covers most of what you need to be a competent user, developer, or system administrator on Unix-like systems, Linux or otherwise.

3. It will get you started in a hundred different things, whether it's awk or sed, or vi (lately vim), shell scripting, or the myriad standard utilities for finding things (learn when to use find, when to use grep, when to use locate/updatedb, and others; a tiny sketch follows below), and so many more things.
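
To give a flavor of that last point, here is a minimal sketch of when each finding tool earns its keep; the paths and patterns are only examples, not anything from the book:

# find: walk the filesystem, matching on name, age, size, and so on
find /var/log -name '*.log' -mtime -7
# grep: search inside files for text
grep -r 'connection refused' /var/log
# locate: query the prebuilt updatedb index; fast, but only as fresh as the last updatedb run
sudo updatedb
locate sshd_config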

This was the book that taught me enough that I started to actually grok the Unix way. Before this book, it was a random walk through chaos and entropy. After this book, it was still a random walk through chaos and entropy, but with a wry smile and a sense of understanding.

Even if you think you're a Unix expert, you'll learn lots from this book. My copy has been taken down and thumbed through dozens of times, and I've read it cover to cover twice. Go grab it!  And give the fine folks at O'Reilly some love; tell them linux code monkey sent ya.


Update: If you're one of the millions of people who was happily working away in your Windows-only world until someone had to go and bring Linux into things, and you are now expected to SSH into a wild Linux boxen and deal with it remotely, and that scares you, there's a book in the bundle that starts exactly there: installing PuTTY on Windows and learning what to do once you land in the mysterious bash shell of that remote Linux system you now have to learn, and may even learn to enjoy.  If this sounds like you, check out the Ten Steps To Linux Survival title in the Unix bundle above.

Thursday, November 17, 2016

Upgrading OpenSuSE LEAP 42.1 to 42.2

For some reason I can't find this documented anywhere as a step-by-step set of instructions that applies only to going from LEAP 42.1 to 42.2.  The SuSE docs are good, but they have to cover a bunch of versions and a lot of edge cases.  So, since this is the web, I puzzled, googled, puzzled, and then tried stuff.   It worked, and it's pretty easy, actually, but it seemed strange to me as I'm more used to Debian, Ubuntu, and Fedora.

It seems odd in 2016 to have to use sed to modify a text file so you can update from one major release of OpenSuSE to another. But you could also edit the zypper repo files by hand, if the following seems too convenient and easy.   Ubuntu and Debian are a bit friendlier when it's upgrade time.  I was kind of floored that YaST didn't have a "new OpenSuSE version detect/upgrade" menu.

Before starting you should check your drive space and clean up any old snapshots using snapper.  If your system has btrfs as the root filesystem type, I suggest zapping old snapshots until you have at least 12 gigs of free disk space.  You cannot believe the output of df if you want to know your real free space on SuSE Linux systems with btrfs; use this command as root:


btrfs filesystem show

Note that it doesn't show you your actual free space; you have to do a bit of mental math. In the example below, there is a little less than 10 gigs free, approximately, which is probably enough for your upgrade to work.  I always like to have 12 to 15 gigs free before I do any system upgrade, so the situation below is borderline to me:

Label: none uuid: b3b42cba-c08e-4401-9382-6db379176a1f
Total devices 1 FS bytes used 90.21GB
devid 1 size 100.00GB used 85.29GB path /dev/sda4


On the same system above, df -h might report the free gigabytes to be much higher than that, and that is why you cannot trust df: it doesn't work right with btrfs.  And to make life even weirder, the command btrfs filesystem df /, where df (disk free space) is right there in the name of the command, does not actually report free space, only totals and usage.  Sometimes I really want to reach through the internet and ask the authors of tools: what were you thinking?

There is a way to get actual free space from btrfs, which is this:

btrfs filesystem usage -h /

Besides giving me the unallocated free space in gigabytes, it also gives me a lot of stuff I don't care about.
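
For the snapshot cleanup mentioned earlier, a minimal snapper session goes something like this; the snapshot numbers below are only examples, pick old ones from your own list:

snapper list              # show existing snapshots and their numbers
snapper delete 42-57      # delete a range of old snapshots to reclaim btrfs space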

Also before starting, make sure your system is up to date (zypper dup), then list your zypper repos (zypper lr) and disable any that are third party. I disabled google-chrome before the upgrade.
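
A minimal sketch of that repo check, assuming the third-party repo's alias is google-chrome (check your own aliases with the list command):

zypper lr                     # list configured repositories and their aliases
zypper mr -d google-chrome    # disable a third-party repo by alias before the upgrade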

I also have a personal habit of archiving my entire pre-update /etc with tar (sudo tar czvf /root/etc-backup.tgz /etc).

 As root, the system upgrade commands to go from 42.1 to 42.2 are:


sudo sed -i 's/42\.1/42\.2/g' /etc/zypp/repos.d/*
zypper ref
zypper dup


Further decisions/actions may be required, usually involving selection of packages to be de-installed to avoid a broken system. For example:


Reading installed packages...
Computing distribution upgrade...
2 Problems:
Problem: libkdevplatform8-1.7.1-1.3.x86_64 requires kdevplatform = 1.7.1, but this requirement cannot be provided
Problem: kdevplatform-lang-5.0.1-1.1.noarch requires kdevplatform = 5.0.1, but this requirement cannot be provided

Problem: libkdevplatform8-1.7.1-1.3.x86_64 requires kdevplatform = 1.7.1, but this requirement cannot be provided
  deleted providers: kdevplatform-1.7.1-1.3.x86_64
 Solution 1: Following actions will be done:
  deinstallation of kdevelop4-plugin-cppsupport-4.7.1-1.4.x86_64
  deinstallation of kdevelop4-4.7.1-1.4.x86_64
  deinstallation of kdevelop4-lang-4.7.1-1.4.noarch
 Solution 2: keep obsolete kdevplatform-1.7.1-1.3.x86_64
 Solution 3: break libkdevplatform8-1.7.1-1.3.x86_64 by ignoring some of its dependencies


Choose from above solutions by number or skip, retry or cancel [1/2/3/s/r/c] (c): 


I chose 1 whenever given a choice like the one above, because I am pretty sure I can live without KDevelop and its various bits until I figure out how to get it reinstalled and working.
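
Once the upgrade is done, getting it back should just be a matter of searching and reinstalling. A hedged sketch, since the exact package name needs to be confirmed by the search:

zypper se kdevelop        # search for the kdevelop packages available in 42.2
zypper in kdevelop5       # hypothetical package name; install whatever the search actually shows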

Next I get to the big download stage.  I have to read and accept a few license agreements (page through or hit q, then type yes).


2307 packages to upgrade, 27 to downgrade, 230 new, 11 to reinstall, 89 to
remove, 4  to change vendor, 1 to change arch.
Overall download size: 1.98 GiB. Already cached: 0 B. After the operation, 517.6
MiB will be freed.
Continue? [y/n/? shows all options] (y): 


After all the EULA dances, it's time to get a cup of tea and wait for about 2 gigs of stuff to download through my rickety Canadian cable internet.

Next post will be on the joys of what worked and didn't work after this finishes.

Post-Script:   Everything seems quite stable after the upgrade. The official wiki docs on "System Upgrade" consist of a series of interleaved bits of advice, some of which apply to upgrading from pre-LEAP releases to LEAP, and some of which still generally apply to 42.1 to 42.2 LEAP updates.   On the SUSE IRC channel, Peter Linnell has informed me that starting in Tumbleweed there is a new package to make this process easier, called yast2-wagon.
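
Don't forget to re-enable any third-party repos you disabled earlier. A minimal sketch, assuming the google-chrome alias from above:

zypper mr -e google-chrome    # re-enable the third-party repo disabled before the upgrade
zypper ref                    # refresh repository metadata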

Sunday, November 6, 2016

Gitlab All The Things

Gitlab is awesome and you should be using it for everything.   In this quick blog post I will explain why I think that, and a few quick ways you can try it out for yourself.



1.  Distributed version control is the future. 

I have been, for many years, a passionate user of Mercurial and not a passionate fan of Git.  I was right about one thing: distributed version control enables ways of working which, once you adjust to their small differences, are so powerful and useful that you will probably not want to go back.   I was wrong about another thing, though: I believed that a personal choice of technology which is not the dominant or popular tool in its category was a matter of no consequence.  Wrong. Really wrong.

Subversion is a dead end, as are Perforce, Team Foundation Server, Clearcase, Visual SourceSafe, and every other pre-DVCS tool.  Centralized version control systems are an impediment to remote distributed teams.  In the future all teams will be at least partly remote and distributed, and in fact, that future is already mostly here.

Mercurial is a cool tool, it still has some use cases, and luckily it has Git interop capabilities. But I'm done with Mercurial.  The future is Git.

2. Git is the lingua-franca of software development.

The fact is that Git has won, and Git has a tooling ecosystem and a developer community that is essential.  Even if you keep using Mercurial on your own computer, that doesn't mean you are not going to have to interact with Git repositories, fork them on various git hosting sites, and learn Git.  So if you want to know two DVCSs, Mercurial is still a great "second" version control tool. I still use it for a few personal projects that I don't collaborate on with others.  But I have switched to Git, overcome my irrational anti-Git bias, and more than that, I have come to love it.

3.  Github is important, but it is not where you should keep your code, your company's code, or your open source project's code, as Github's current near-monopoly on git hosting is harmful.

This is perhaps a controversial statement, and perhaps seems fractious.    I'll keep my argument simple, and consistent with the arguments of others who believe as I do:

Large Enterprises: If you are a commercial company, you should host your own server and take responsibility for the security and durability (disaster recovery) of your most critical asset.   Enterprises should run their own Gitlab. Anything less is irresponsible.

Small Enterprises: Small businesses can easily run their own Gitlab, or can pay someone to host a git server on their behalf, and can find much better deals than Github offers.  If ethical arguments do not sway you, how about an appeal to thrift, always a powerful motivator for a small business: you can host your private git repositories on gitlab.com for free. Why are you paying Github?

Open Source Project Leaders and Open Source Advocates:   Why are you preaching open source collaboration while hosting your open source project on a site which is powered by closed source technology, and which concentrates your whole way of working in one datacenter owned by one company?  In what draconian alternative future are we living, where the meaning of Distributed has largely come to mean that all roads lead to one CENTRALIZED Github?

I am not against companies making money. Gitlab is owned by a company.  But that company values transparency, publishes almost all of its main product as free open source software (Gitlab Community Edition), and then has a plan to commercialize an enterprise product.   In an ideal world there would be six to twenty large sites hosting all the open source projects, and Github would be only ONE of them. I don't want to crush Github; I am just unhappy that a giant closed source commercial elephant is hogging an unfair amount of the market.  I happen to like Bitbucket very much, and would be much happier if Github, Gitlab, and Bitbucket were approximately equal in size and income.

It bothers me that it has become almost encoded into the culture of NodeJS, Go, and Ruby coders that their shared libraries/packages/modules must be hosted on Github.  That's bad and should be fixed.

4.  Gitlab is the best git server there is.

I love great tools.  I think Gitlab is the best git server there is.   It has built-in merge request (pull request) handling, built-in continuous integration, a built-in issue tracker, a built-in kanban-style visual issue board for moving issues through workflow stages, a built-in chat server (a Slack alternative, powered by Mattermost), and so much more that I could spend four or five blog posts on it. But in spite of the incredible depth of the features in Gitlab, getting started with it is incredibly easy.


5. Gitlab is also the best FREE and OPEN SOURCE git server there is.

I already said this above, but I feel quite passionate about it so I'm going to restate it a different way.  The future of trustworthy software is to hide nothing.  Transparency is a corporate virtue in an information age, and open source software is the right way to create software which can be trusted. Important tools and infrastructure should be built from inspectable parts. I include all tools used to write software in this.  All software development tools you wish to trust utterly should be open source.

That includes github, and github is only going to open its source if we force it to.  This requires you to stop using github. YOU and YOU and YOU.

Your version control software, and the operating system it runs on, should be open source.  You should not host your version control server on Windows. It should be hosted on Linux.  It should be a self-hosted or cloud-hosted Gitlab instance.  Or you can use the public gitlab.com to host your open source project.   Be fully open source. Don't host your open projects on closed proprietary services.

I have nothing specifically against Github or Bitbucket, other than that I believe they are appropriate only for closed source commercial companies hosting closed source projects and proprietary technologies.  Use proprietary tech to host your proprietary code. Fair and good.

Closed source software used to write software can no longer be trusted. We cannot afford to trust it. Even if we trust the people who create it, we cannot inspect it and find its security weak points, and we cannot understand our own ability to protect our customers, our employees, and the general public unless we can do anything necessary to verify the correctness of any element of the tools we choose to trust.   Just as we do not inspect every elevator before we ride in it, I do not believe that open source advocates actually read the entire Linux kernel, and every element of a Linux system.  What we are instead doing is evolving more rapidly toward a state which already is, and always will be, more secure than running a closed source commercial version control tool, or operating system.

Do not imagine me to be saying more than I am saying here. Commercial software is fine, up to the point where you have to trust your entire company, or the entire internet to run on it.   I also do not want a surgeon to use a Windows laptop running any Microsoft or non-Microsoft anti-virus software to visualize my internals while he performs surgery on me.

Now I'm actually NOT trashing Microsoft or my employer here, or yours.  I'm saying that if Microsoft, or my employer, were to receive, in good faith, a request from their customers to audit their codebases, I would like to believe that they (makers and sellers of closed source software) would grant access to the codebase used for their products to qualified auditors searching for security flaws and back doors, and I believe they will soon be legally obliged to do so. But the sad news is that those legal rights, which I believe will materialize, will be sadly inadequate, because looking at Windows source code is not the same as knowing that this is the exact, and only, code that is present on your Windows 10 machine.   Microsoft is loading new stuff on there every hour of every day, for all you know, and you can't turn it off or stop it.

OpenSSH vulnerabilities of the last few years notwithstanding, I would trust OpenSSH over any closed source SSH implementation, hands down.  No, many eyes do not make ALL bugs shallow.  But neither is closed source safe from attack, nor can it be made safe.  Networked systems must be made safe, and I believe that only a collaborative open source model will work to make systems actually safe. I do not believe that any single company, even Microsoft, will be able to secure the internet by working internally.  Doing that will require a collaborative effort by everybody.

We all matter. Open source matters because people matter. Open source is a fundamental democratic principle of an information age.

The cost to build closed source software is going to increase radically. The cost of supporting and auditing it for safety is only going to increase. Because of that, I believe that many software companies will over time evolve toward service provider (SaaS) models, which still allow companies to get paid and to hire great staff to work on their products, but in the end it will become cheaper to maintain an open source ecosystem than a closed source one. I believe that the public will move away from trusting giant binary blobs, and that our culture will shift to a fundamental distrust of binary blobs.

I believe that companies that are farther ahead on that curve will adapt to the coming future better than those who deny it.  I believe that the companies who embrace open source are the ones that will save themselves.

And that little rant is really a way of saying more about why I think Gitlab is great, and Github is a problem.  I do not, and cannot, trust Github. I can't trust it not to abuse its position and power. I can't actually trust it to stay online either. And neither should you. And neither should the tools that assume that their libraries or collections of gems should all be hosted on Github. Even Github shouldn't be happy having a bullseye that large on its back.

Put the distributed back in distributed version control.  Diversity is your best defence.

Get gitlab today, and use it.  I'm from the future, I'm right.  Please listen.

What are you waiting for, smart person? Go try it now.


1.   Start by creating an account on Gitlab.com so you don't have to install anything to try it out and get familiar with its features.   If you know Github, you already know how to use Gitlab.com, so this won't take long.

2.   If you have a modicum of Linux knowledge, or can even just read and type a few bash commands, you can spin up a fresh Ubuntu 16.04 LTS VM on your server virtualization hardware, or if you haven't got any, spin one up on Azure or Amazon or Google Cloud, and follow the easy instructions to set up your own Gitlab instance.   I recommend the "omnibus" install; it's awesome.  (A rough sketch of that install follows after this list.)

3. Learn a bit from the excellent documentation.   If you have questions you can ask them on the community forum.
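
For the curious, here is a rough sketch of the omnibus install on a fresh Ubuntu box, distilled from the official instructions. The external URL is a placeholder you would replace with your own, and you should follow gitlab.com's current install page rather than trusting this verbatim:

# prerequisites for the omnibus package
sudo apt-get install -y curl openssh-server ca-certificates
# add the GitLab CE package repository
curl -sS https://packages.gitlab.com/install/repositories/gitlab/gitlab-ce/script.deb.sh | sudo bash
# install GitLab Community Edition
sudo apt-get install -y gitlab-ce
# set external_url "http://gitlab.example.com" in /etc/gitlab/gitlab.rb, then apply it:
sudo gitlab-ctl reconfigure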