Dunc's programming and sysadmin blog

I'm an experienced C and Perl programmer and systems administrator, interested in large scale ruthless automation of systems. These are random musings about stuff I'm doing or thinking about.

Shared Norms of C Programming?

Following a discussion on the “Plain Ordinary C” LinkedIn newsgroup, I started wondering whether C programmers (or programmers more generally) develop a shared sense of normal ways of programming, rather in the sense of “Norms” in psychology.  Such rules of thumb might alert you to something “not right” simply because it’s very abnormal, and thus keep you away from the “Here be dragons” areas of the map.

The particular example was a C newbie writing a recursive call to main().  In fact it was worse than that: main() called function f(), which called main() again, i.e. it was Mutual Recursion involving main().  This seemed so weird and evil to me that I assumed it was a very confused example - and indeed, the recursive calls had no terminating conditions, so it could not possibly work.
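From memory, the code had roughly the following shape (reconstructed for illustration - only the use of main() and f() is from the original):

#include <stdio.h>

void f(void);

/* Mutual recursion involving main(): main() calls f(), and f() calls
 * main() again, with no terminating condition anywhere, so the program
 * can only recurse until it exhausts the stack.  (Calling main()
 * recursively is legal in C, unlike C++, but it is deeply abnormal.)
 */
int main(void)
{
    printf("in main\n");
    f();
    return 0;
}

void f(void)
{
    printf("in f\n");
    main();
}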

I suspect every individual C programmer develops their own personal sense of what’s normal, but the more interesting question is whether these individual senses combine into a Shared Norm.  I don’t have an answer to this, but it’s interesting to think about:-)

A couple of my suggestions are:

Do not call main() from anywhere - it’s the entry point.

Do not use recursion when a simple iterative solution is available.

Posted 528 weeks ago

Oh bugger

Just heard that Terry Pratchett has died.  Damn damn damn.

greatmen--

Posted 550 weeks ago

How Modifiable is Open Source Software?

Several times when trying to modify open-source software (eg. gnu tar), I’ve found that some changes “go with the grain of the design” (a woodworking metaphor) and are easy to make, whereas other changes “cut across the grain” and are nearly impossible.  In one case I gave up trying to add a feature (per-volume checksums) to gnu tar and instead wrote a simple filesystem traverser and multi-volume packer program (called tint).  Why was it so hard?  Because multi-volume support was added late to gnu tar, and was bodged in about 3 levels deep inside the “append some bytes to the tarball” function; even after extensive code-reading and grepping, I was simply unable to locate clearly defined “start new volume” and “finish volume” functionality inside the code.
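To make that concrete, what I was hoping to find was a clearly separated volume-boundary interface, something like the following sketch.  These functions are invented purely for illustration - they do not exist in the real gnu tar source - but if multi-volume handling were isolated like this, a per-volume checksum would only need to hook into start_new_volume() and finish_volume():

#include <stdio.h>
#include <stddef.h>

static unsigned long volume_checksum;   /* running checksum for the current volume */

void start_new_volume(int volume_number)
{
    printf("starting volume %d\n", volume_number);
    volume_checksum = 0;
}

void append_bytes_to_volume(const char *buf, size_t len)
{
    size_t i;
    for (i = 0; i < len; i++)           /* fold each byte into the checksum */
        volume_checksum += (unsigned char)buf[i];
    /* ... actually write buf to the current volume here ... */
}

void finish_volume(void)
{
    printf("volume checksum: %lu\n", volume_checksum);
}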

Of course, this is probably just me not being smart enough.  The “start” and “finish” volume functionality must have been in the code somewhere, I just couldn’t find it.

But I’ve often wondered how general this phenomenon is, and whether this casts any doubt on the “given enough eyeballs, all bugs are shallow” concept that Eric Raymond coined (see http://en.wikipedia.org/wiki/Eric_S._Raymond) a few years back.

Posted 550 weeks ago

Just finished giving this year’s Perl course to second year CS undergrads. Went pretty well, I enjoyed it, and I think the students enjoyed it too.

Mentioned DBIC and Catalyst for the first time, and expanded the Moose coverage this year.  www.doc.ic.ac.uk/~dcw/perl2014/

Posted 558 weeks ago

finished www.doc.ic.ac.uk/~dcw/PSD/article8

Hope everyone had a good Christmas.  Finally finished my new Professional Software Development article: a discussion of how I designed and built datadec, a tool that lets you define one or more Recursive Data Types (as used in functional programming languages like Haskell) and then automatically turns them into an ANSI C module implementing those types, which you can link into your programs.
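To give a flavour of the underlying idea, a recursive type such as a binary tree of integers is conventionally implemented in C as a tagged union plus one constructor function per alternative.  The following is a hand-written sketch of that general technique - it is not datadec’s actual generated code:

#include <stdlib.h>

typedef struct tree *tree;

/* tree = leaf(int n) or node(tree left, tree right) */
struct tree {
    enum { is_leaf, is_node } tag;
    union {
        struct { int n; } leaf;
        struct { tree left, right; } node;
    } u;
};

tree mk_leaf(int n)                     /* construct a leaf */
{
    tree t = malloc(sizeof(*t));        /* error handling omitted for brevity */
    t->tag = is_leaf;
    t->u.leaf.n = n;
    return t;
}

tree mk_node(tree left, tree right)     /* construct an interior node */
{
    tree t = malloc(sizeof(*t));
    t->tag = is_node;
    t->u.node.left = left;
    t->u.node.right = right;
    return t;
}

/* example use:  tree t = mk_node(mk_leaf(1), mk_leaf(2)); */

The point of datadec is that you never have to write this kind of boilerplate by hand.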

Posted 560 weeks ago

(Delayed from Friday 24th Nov) The second day of Dave Cross’s Advanced Perl course was also excellent, covering Moose, Catalyst, Plack and PSGI.  The Catalyst section was rather out of my comfort zone; I found the MVC stuff hard going (to be fair, I found learning Ruby on Rails just as slow going).  Catalyst is both very similar to Rails and very different.  The AutoCRUD plugin looks fantastic.  Dave’s course is highly recommended.

Posted 566 weeks ago

Finished the first day of Dave Cross’s excellent 2-day “Advanced Perl Techniques” course, learning lots of great things new to Perl in the last decade or so:-)  The most interesting topics were proper testing and using DBIx::Class, aka DBIC, Perl’s most popular ORM (Object Relational Mapper).  However, I still hate exceptions with a passion, even when they’re in Perl.

Posted 567 weeks ago

Fun with Perl threads and signals.  We’ve just raised a perl bug report (123188) after encountering this weird fatal error in the perl interpreter:

“Maximal count of pending signals (120) exceeded”

The cause of this was a patch added back in 2008 to fix a signal storm on the long-dead OS/2 operating system.  The patch is described here:

www.nntp.perl.org/group/perl.perl5.porters/2006/12/msg119236.html

Note that even the original patcher didn’t much like his own patch:-)  Despite this, the patch was added to Perl 5.8.8 and has stayed in ever since, occasionally causing problems for signal-intensive code.

However, our code that triggers this problem does no explicit signal handling; instead it uses Perl threads.  It’s robust, mature code that has worked flawlessly for at least 6 years, but now it trips up in Perl’s Thread::Queue dequeue function, which seems to use condition variables built from signals.  We’re currently working around the problem with a hand-compiled Perl 5.18.2 interpreter with

#define SIG_PENDING_DIE_COUNT  520    /* raised from the default of 120 */

what fun:-)

Posted 567 weeks ago

Wow!  Finally getting to grips with git branches; the problem is picking a suitable development model from all the possibilities that git offers.  I’m using a simple “short-lived feature branch” model.

Posted 569 weeks ago

Bash "ShellShock": fixing unsupported Ubuntu releases (update)

Yesterday I blogged about fixing the Shellshock vulnerability on unsupported Ubuntu Linux distributions.  There are now 2 CVEs, with different fixes:

CVE-2014-6271 was shown by:

env x='() { :;}; echo vulnerable to 6271' bash -c 'echo'

If it prints “vulnerable to 6271” then you’re vulnerable.

CVE-2014-7169 was shown by:

env X="() { (a)=>\\" bash -c '/dev/stdout echo vulnerable to 7169'

If it prints “vulnerable to 7169” then you’re vulnerable.

Ubuntu have released patches for both CVEs, in diff and dpatch form, and it’s merely a question of ensuring that these patches are applied to the older source packages (all available via launchpad and google) during a package rebuild, and that the debian/changelog names a newer version.

In older dpatch-using distros up to 11.04, this involves appending the CVE basenames to the list of debian_patches in debian/rules; in newer diff-using distros (eg 13.04), you append the CVE names to debian/patches/series.in.

We’ve rebuilt experimental patched packages for 7.04, 8.04, 9.04, 11.04 and 13.04, verified that they fix both CVEs (using the above tests) and released them to all our machines.  If anyone’s interested, you can download the source trees plus packages from

 http://www.doc.ic.ac.uk/~dcw/unsupported-ubuntu-shellshock.tgz

As ever, there are no guarantees whether they work for you, whether they break your systems, etc.  Use at your own risk.

If you try and rebuild any of these packages yourself, or adapt them for use with the XX.10 Ubuntu releases (we never use these), you will of course need local apt repos to get the build prerequisites (now that Ubuntu have nuked many older Ubuntu repos, idiots), and you may run into other weird problems.  The weirdest problem we hit (on 7.04 and 8.04) was latex refusing to install cos it was over 5 years old.  The fix: edit /usr/share/texmf-texlive/tex/latex/base/latex.ltx, search for “years old” and comment out the entire “if.. fi” stanza (see http://web.archive.org/web/20140223193323/http://my.opera.com/Michael-Dodl/blog/2011/06/24/latex-disabling-the-5-year-time-bomb).

Posted 574 weeks ago