Planet Debian

Planet Debian - http://planet.debian.org/

Vincent Fourmond: New home for Tioga

2 January, 2015 - 17:29
It's been a long time since RubyForge became unusable, and the Tioga project (on which ctioga2 is based) was still hosted there. I first want to thank RubyForge for hosting our projects for so long, and then to announce that Tioga is now maintained on SourceForge in a git repository. The web page, including the documentation, is hosted by SourceForge as well. A clone of the git repository is available on GitHub. Enjoy!

Russ Allbery: Review: An Imaginary Tale

2 January, 2015 - 12:39

Review: An Imaginary Tale, by Paul J. Nahin

Publisher: Princeton University Press
Copyright: 1998, 2007
Printing: 2010
ISBN: 0-691-14600-4
Format: Trade paperback
Pages: 259

Subtitled The Story of √-1, An Imaginary Tale is a history of the use of the number mathematicians normally call i. Nahin starts in the 16th century with the discussions of the time on solving cubic equations, specifically del Ferro's solution to the depressed cubic. He walks through how to approach that solution with imaginary numbers, provides a brief introduction to the complex number plane, and then explains that del Ferro didn't follow the logic in those directions at all. Mathematicians at the time were dubious even about negative numbers, since they were not intuitive representations of real-world quantities. The square root of a negative number was considered simply impossible.
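A classic worked instance (Bombelli's example, not quoted from the book) shows why the cubic forced imaginary numbers on mathematicians: the del Ferro/Cardano formula for the depressed cubic can demand the square root of a negative number on the way to a perfectly real answer.

```latex
% del Ferro/Cardano solution of the depressed cubic x^3 = px + q:
x = \sqrt[3]{\frac{q}{2} + \sqrt{\frac{q^2}{4} - \frac{p^3}{27}}}
  + \sqrt[3]{\frac{q}{2} - \sqrt{\frac{q^2}{4} - \frac{p^3}{27}}}

% For x^3 = 15x + 4 (p = 15, q = 4) the quantity under the square
% root is q^2/4 - p^3/27 = 4 - 125 = -121, so the formula produces
x = \sqrt[3]{2 + \sqrt{-121}} + \sqrt[3]{2 - \sqrt{-121}}
  = (2 + i) + (2 - i) = 4
% a real root, reachable only by computing through sqrt(-121),
% since (2 + i)^3 = 2 + 11i.
```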

Nahin continues on in this historical vein for three chapters, walking through the mathematical problems that arose from analysis of cubics and the constant appearance of imaginary numbers, the early attempts to find a geometric interpretation, and then the slow development of the modern conception of imaginary numbers in the 19th century. The emphasis throughout is on the specifics of the math, not on the personalities, although there are a few personal tidbits. Along the way, he takes frequent side journeys to highlight the various places complex numbers are useful in solving otherwise-intractable problems.

After that initial history come two chapters of applications of complex numbers: vector analysis, Kepler's laws, applications in electrical engineering, and more. He does win a special place in my heart by walking through the vector analysis problem that George Gamow uses to demonstrate complex numbers in One Two Three... Infinity: a treasure map whose directions depend on landmarks that no longer exist. Following that is a great chapter on deeper mathematical wizardry involving i, including Euler's identity, i^i, and a pretty good explanation of hyperbolic functions. The final chapter is an introduction to complex function theory.
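For the curious, the identities mentioned here are standard results (sketched here, not quoted from the book): Euler's formula at θ = π gives the famous identity, and writing i in exponential form gives a real principal value for i^i.

```latex
% Euler's formula, and the special case theta = pi:
e^{i\theta} = \cos\theta + i\sin\theta
\qquad e^{i\pi} + 1 = 0

% Writing i = e^{i\pi/2} gives a real (principal) value for i^i:
i^i = \left(e^{i\pi/2}\right)^i = e^{-\pi/2} \approx 0.2079
```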

One's opinion of this book is going to vary a lot depending on what type of history of math you like to read. Unfortunately, it wasn't what I was hoping for. That doesn't make it a bad book — other reviewers have recommended it highly, and I think it would be a great book for someone with slightly different interests. But it's a very mathematical book. It's full of proofs, calculations, and analysis, and assumes that you remember a reasonable amount of algebra and calculus to follow along. It's been a long time since I studied math, and while I probably could have traced the links between steps of his proofs and derivations with some effort, I found myself skimming through large chunks of this book.

I love histories of mathematics, and even popularizations of bits of it (particularly number theory), but with the emphasis on the popularization. If you're like me and are expecting something more like The Music of the Primes, or even One Two Three... Infinity, be warned that this is less about the people or the concepts and more about the math itself. If you've read books like that and thought they needed more equations, more detail, and more of the actual calculations, this may be exactly what you're looking for.

Nahin is fun to read, at least when I wasn't getting lost in steps that are obvious to him. He's quite enthusiastic about the topic and clearly loves being able to show how to take apart a difficult mathematical equation using a novel technique. His joy reminds me of when I was most enjoying my techniques of integration class in college. Even when I started skimming past the details, I liked his excitement.

This wasn't what I was looking for, so I can't exactly recommend it, but hopefully this review will help you guess whether you would like it. It's much heavier on the mathematics and lighter on the popularization, and if that's the direction you would have preferred the other similar books I've reviewed to go, this may be worth your attention.

Rating: 6 out of 10

Jeff Licquia: Happy 2015!

2 January, 2015 - 11:04
[en]

Look, it’s 2015. How’d that happen?

Blogging has been pretty much nonexistent this year, apart from a public service announcement about the Heartbleed bug and a public statement about boring encryption stuff.  (Which, if you’re actually interested in that sort of thing: I did finally get the requisite number of signatures and replaced my key in Debian just in time to avoid the Great Key Purge of 2014.)

It’s a new world.  Social media has supposedly made blogs obsolete, except that the social networks are now too busy competing with each other (and, sometimes, their own fans) to be terribly useful.  I’ve tried to write a blog post about that a few times, only to delete it out of frustration and long-windedness.

So there’s a resolution for 2015: get past these social media issues and get back to regular communication.  Maybe what I need is a good social media rant.  I’m sure you’re all waiting on pins and needles for that one.

Lots has been going on.  I’m an empty-nester now; the youngest kid started college this fall.  I’ve been busy at work and busy at skill freshening, including getting on this funky Haskell bandwagon that seems to be all the rage among the cool kids.  And plenty of other things going on, some of which probably deserve their own blog posts.

Maybe those will get written in 2015.  Plan: write.  Journey of a thousand miles starting with single steps, and all that.

Steve McIntyre: UEFI Debian installer work for Jessie, part 3

2 January, 2015 - 10:40

Time for another update on my work for UEFI improvements in Jessie!

I've got an i386-only UEFI netinst up and running right now, which will boot and install on the Asus X205TA machine I have. I've got the required i2c modules included in this build so that the installer can use the keyboard and trackpad on the machine - useful...! See #772578 for more details about that. Visit http://cdimage.debian.org/cdimage/unofficial/efi-development/jessie-upload1/ to download and test the image. There are a few other missing pieces of hardware support for this machine yet, but the basics are there. See the wiki for more information.

My initial i386 test CD is not yet going to do an amd64 installation for you, but it should let you get going with Debian on these machines! I'm going to continue working on that 32-64 support next.

WARNING: this CD is provided for testing only. Use at your own risk! If you try this on an early Intel-based Mac, it will not work. Otherwise, this should likely work for most folks using 32-bit x86 hardware just like any other Debian Jessie daily netinst build.

If you have appropriate (U)EFI hardware, please try this image and let me know how you get on, via the debian-cd and debian-boot mailing lists.

John Goerzen: Sound players: Adventures with Ampache, mpd, pulseaudio, Raspberry Pi, and Logitech Media Server

2 January, 2015 - 09:54

I finally decided it was about time to get my whole-house sound project off the ground. As an added bonus, I’d like to be able to stream music from my house to my Android phone.

Some Background

It was about 2.5 years ago that I last revisited the music-listening picture on Linux. I used Spotify for a while, but the buggy nature of its support for local music eventually drove me up the wall. I have a large collection of music that will never be on Spotify (local choirs, for instance), and this was an important feature.

When Google Play Music added the feature of uploading your local collection, I used that; it let me stream music from my phone in my car (using the Bluetooth link to the car). I could also listen at home on a PC, or plug my phone into various devices to play. But that was a hassle, and didn’t let me have music throughout the house.

Google Play is reasonable for that, but it has a number of really glaring issues. One is that it often gets album artwork wrong; it doesn’t use the ID3 tags embedded in the files, but rather tries to “guess”. Another is that the “sync” is only “add”. Move files to another place in your collection, or re-rip them to FLAC and replace your old MP3s, and suddenly they’re in Google Play twice. It won’t ever see updated metadata, either — quite a hassle for someone that uses Musicbrainz and carefully curates metadata.

My Hardware

I already have an oldish PC set up as an entertainment box running MythTV. It is a diskless system (boots PAE over the network and has NFS root) and very quiet. It has video output to my TV, and audio output via S/PDIF to my receiver. It is one logical audio frontend.

My workstation in my office is another obvious place.

My kitchen has a radio with a line-in jack, and I also have a small portable speaker with a line-in jack; those make up the last two output options. I also have a Raspberry Pi Model B that I bought a while ago and was looking for a purpose, so I thought: this should be cheap and easy, right? Well. Cheap, yes. Easy, not so much.

First attempt: Ampache

The Ampache project produces quite a nice piece of software. Ampache has matured significantly in the last 2.5 years, and the usability of its web-interface — with HTML5 and Flash players as options — is quite impressive. It is easily as usable as Google’s, though its learning curve is rather more steep. There are multiple Android apps for Ampache to stream remotely. And, while most are terribly buggy and broken, there is at least one that seems to work well.

Ampache can also output m3u/pls files for a standalone player. It does on-the-fly transcoding. There are some chinks in the armor, however. The set of codecs that are transcoded or passed through is a global setting, not a per-device setting. The bitrates are per-account, so you can’t easily have it transcode FLAC into 320Kbps mp3 for streaming on your LAN and 128Kbps MP3 for streaming to your phone. (There are some hacks involving IP address ranges and multiple accounts, but they are poorly documented and cumbersome.)

Ampache also has a feature called “localplay” in which it drives local players instead of remote ones. I tried to use this in combination with mpd to drive music to the whole house. Ampache’s mpd interface is a bit odd; it actually loads things up into mpd’s queue. Sadly it shares the same global configs as the rest. Even though mpd is perfectly capable of handling FLAC audio, the Ampache web player isn’t, so you have to either make it transcode mp3 for everything or forego the web player (or use a second account that has “transcode everything” set). Frustratingly, not one of the Android clients for Ampache is even remotely compatible with Localplay, and some will fail in surprising ways if you have been using Localplay on the web client.

So let’s see how this mpd thing worked out.

Ampache with mpd for whole-house audio

The primary method here is to use mpd’s pulseaudio driver. I configured it like so:


audio_output {
    type        "pulse"
    name        "MPD stream"
    #server     "remote_server"  # optional
    sink        "rtp"            # optional
    mixer_type  "software"
}

Then in /etc/pulse/system.pa:


load-module module-null-sink sink_name=rtp format=s16be channels=2 rate=44100 sink_properties="device.description='RTP Multicast Sink'"
load-module module-rtp-send source=rtp.monitor

This tells Pulse to use multicast streaming to the LAN for the audio packets. Pulse is supposed to have latency synchronization to achieve perfect audio everywhere. In practice, this works somewhat poorly. Plus I have to install pulse everywhere, which inserts its tentacles way too deeply into the ALSA stack for my taste. (alsamixer suddenly turns useless by default, for instance.)

But I gave it a try. After much fiddling — Pulse is rather poorly documented and the interactions of configuration tools with it even more so — I found a working configuration. On my MythTV box, I added:


load-module module-rtp-recv
load-module module-native-protocol-tcp auth-ip-acl=127.0.0.1;x.x.x.x

The real annoyance was getting it to set the output through the S/PDIF digital port. Finally I figured out the magic potion:


set-card-profile 0 output:iec958-stereo+input:analog-stereo

This worked reasonably well. The Raspberry Pi was a much bigger challenge, however.

I put Raspbian on it easily enough, and installed Pulse. But apparently it is well-known in the Pulse community that Pulse’s RTP does not work well with wifi. Multicast itself works poorly over wifi in general, and Pulse won’t do unicast RTP. Moreover, Pulse assumes very low latency, which wifi simply can’t deliver.

Moreover, pulseaudio on the rpi is something of a difficult beast to tame. It has crackling audio, etc. I eventually got it working decently with:

daemon.conf:


realtime-scheduling = yes
realtime-priority = 5
resample-method = src-sinc-fastest
default-sample-rate = 44100
default-fragments = 4
default-fragment-size-msec=20

and in system.pa, commented out the module-udev-detect and module-detect, and added:


load-module module-alsa-card device_id=0 tsched=0 fragments=10 fragment_size=640 tsched_buffer_size=4194384 tsched_buffer_watermark=262144

plus the rtp-recv and tcp protocol lines as before. This got it working with decent quality, but it was always out of sync by at least a few tenths of a second, if not a whole second, with the other rooms.

Brief diversion: mplayer

Many players besides pulseaudio can play the RTP streams that pulseaudio generates. mplayer, for instance, can. I found this worked on the Raspberry Pi:


mplayer -really-quiet -cache 64 -cache-min 95 -demuxer rawaudio -rawaudio format=0x20776172 rtp://224.0.0.56:46510

But this produced even worse sync. It became clear that Ampache was not going to be the right solution.

Logitech Media Server

I have looked at this a few times over the years, but somehow always skipped past it. This time I looked at it properly. It is an open source (GPL) server, originally designed to work with Logitech hardware, and there are now all sorts of software clients out there. There is a helpful wiki about it, although Logitech has rebranded the thing so many times that it’s not even internally consistent about what the heck it’s called (is it LMS? or Squeezebox Server? or Slimserver? The answer is: yes.)

LMS does no web-based streaming. At all. But I thought I’d give it a try. Installation is a little… weird (its .deb packages up binary perl modules for things that are already in Debian, for half a dozen architectures and many Perl versions.) I had some odd issues but eventually it worked. It scanned my media collection. I installed squeezelite on my workstation as my first player, and things worked reasonably well out of the box.

I proceeded to install squeezelite on the Raspberry Pi and MythTV server (where I had to compile libsoxr for wheezy, due to an obscure error I couldn’t find the cause of until I used strace). Log in to the LMS web interface, tell it to enable synchronization, and bam! Perfect synchronization! Incredible.

But I ran into two issues: one was that it used a lot of CPU even on my workstation (50% or more, even when idle). Strangely, it used far less CPU on the pi than did pulseaudio. Secondly, I’d get annoying clicks from it from time to time. Some debugging and investigation revealed that they were both somewhat related; it was getting out of sync with the pi and correcting. Initially I assumed this to be an issue with the pi, but tracked it down to something else. I told it to upsample to 48kHz and this made the problem go away. My command on PC hardware is:


squeezelite -a 200:10 -o dmix -u hLX -z

And on the Raspberry Pi:


squeezelite -a 800:10:16:0 -n kitchen -z

It is working perfectly now.

LMS even has an Android remote control app that can control all devices together, queue up playlists, etc. Very slick.

The Raspberry Pi

The Raspberry Pi, even the model B+, apparently has notoriously bad audio output. I noticed this first as very low volume on the output. I found an $11 USB DAC on Amazon that seems to do the trick, so we will see. I am also using the $10 Edimax USB wifi adapter, and it seems to work well.

Other Options

The big-name other option here is Sonos. They have perfect-sync audio already working. But they’re pricey, closed, and proprietary; you have to buy Sonos speakers, transmitters, etc. for the entire house. Their receivers start at $200, and it’s at least $350 to integrate an existing stereo into the system. I’d be shelling out over $1000 to use Sonos for this setup. As it is, I’ve bought less than $100 of equipment (a second Pi and its accessories) and am getting output that, while no doubt not quite as pristine, is still quite nice and acceptable.

Update: LMS with Ampache

It turns out that Logitech Media Server can automatically save created playlists to a spot where Ampache can find them. So an Ampache streaming player can be used to access the same collection LMS uses, with full features. Nice!

Russ Allbery: 2014 Book Reading in Review

2 January, 2015 - 06:04

This year, after a series of catastrophically horrible and unethical management decisions, I walked away from my job of seventeen years and found a new job.

As you might expect, reading wasn't the top priority for much of the year. I'm moderately surprised that I read as much as I did. The good side is that I'm now in a much better place both professionally and personally and no longer have to put up with draining and demoralizing nonsense happening on a regular basis. The downside for my review output is that the new job is more engrossing and is, in some ways, harder work, so I expect my reading totals going forward to stabilize somewhere below where they were in the past (although it's possible that the daily commute will change that equation somewhat).

As mentioned last year, I had a feeling that something like this would happen (although not that it would be anywhere near this bad), so I had no specific reading goals for the year. Next year, I'm going to see how it goes for the first few months, and might then consider setting some goals if I want to encourage myself to take more time for reading.

The below statistics are confined to the books I reviewed in 2014. I read three more books that I've not yet reviewed, partly because the end of the year isn't as packed with vacation as it was at Stanford. Those will be counted in 2015.

Despite the low reading totals for the year, I read two 10 out of 10 books. My favorite book of the year was Ann Leckie's Ancillary Justice, which was one of the best science fiction novels I've ever read. Highly recommended if you like the space opera genre at all. A close second was my favorite non-fiction book of the year and the other 10 out of 10: Allie Brosh's collection Hyperbole and a Half. Those of you who have read her blog already know her brilliant and insightful style of humor. Those who haven't are in for a treat.

I read a lot of non-fiction this year and not as much fiction, partly for mood reasons, so I don't have honorable mentions in the fiction department. In the non-fiction department, though, there are four more books worth mentioning. Cryptography Engineering, by Niels Ferguson, Bruce Schneier, and Tadayoshi Kohno, was the best technical book that I read last year, and a must-read for anyone who works on security or crypto software. David Graeber's Debt was the best political and economic book of the year and the book from which I learned the most. It changed the way that I think about debt and loans significantly. A close second, though, was David Roodman's Due Diligence, which is a must-read for anyone who has considered investing in microfinance or is curious about the phenomenon. We need more data-driven, thoughtful, book-length analysis like this in the world.

Finally, The Knowledge, by Lewis Dartnell, is an entertaining and quixotic project. The stated goal of the book is to document the information required to rebuild civilization after a catastrophe, with hopefully fewer false starts and difficult research than was required the first time. I'm dubious about its usefulness for that goal, but it's a fascinating and entertaining book in its own right, full of detail about industrial processes and the history of manufacturing and construction that are otherwise hard to come by without extensive (and boring) research. Recommended, even if you're dubious about the efficacy of the project.

The full analysis includes some additional personal reading statistics, probably only of interest to me.

Tim Retout: Looking back at 2014

2 January, 2015 - 04:38

I have a tendency to forget what I've been up to - so I made a list for 2014.

I started the year having recently watched many 30c3 videos online - these were fantastic, and I really should get round to the ones from 31c3. January is traditionally the peak time for the recruitment industry, so at work we were kept busy dealing with all the traffic. We'd recently switched the main job search to use Solr rather than MySQL, which helped - but we did spend a lot of time during the early months of the year converting tables from MyISAM to InnoDB.

At the start of February was FOSDEM, and Kate and I took Sophie (then aged 10 months) to her first software conference. I grabbed a spot in the Go devroom for the Sunday afternoon, which was awesome. Downside: we got horribly ill while in Brussels.

At work I was sorting out configuration management - this led to some Perl module backporting for Debian, and I uploaded Zookeeper at some point during the year as well. We currently make use of vagrant, chef and a combination of Debian packages and cpanm for Perl modules, but I have big plans to improve on that this year.

Over a break from work I hacked up apt-transport-tor, which lets you install Debian packages over the Tor network. (This was inspired by videos from 30c3 and/or LibrePlanet, I think?) Continuing the general theme of paranoia, I attended the Don't Spy On Us campaign's day of action in June.

Over the summer at work I was experimenting with Statsd and Graphite for monitoring. I also wrote Toggle, a Perl module for feature flags. In July I attended a London.pm meeting for the first time, and heard Thomas Klausner talk about OX - this nudged me into various talks at LPW (see below). Pubs have a lot to answer for.

At some point I got an IPv6 tunnel working at home (although my ISP-provided router's wireless doesn't forward it), and I had an XBMC install going on a Raspberry Pi as another fun hack.

In August and September I worked on packaging pump.io for Debian, and attended IndieWebCamp Brighton, where I delivered a talk/workshop on setting up TLS. (This all ties in to the paranoia theme.) I stalled the work on pump.io, partly because of licensing issues at build-dependency time (if you want to run all the tests) - but I expect I'll pick this up in 2015 once jessie is released.

November was the London Perl Workshop, where I presented my work from the summer on statsd/graphite and Toggle, and a Bread::Board lightning talk. LPW was more enjoyable for me this year than previous years, probably because of the interesting people discussing various aspects of how feature flags ought to work. Simultaneously was the Cambridge MiniDebConf (why do these always clash?) where I think I fixed at least one RC bug.

This is not an exhaustive list of everything I've done this year - there are more changes now lined up for 2015 which I haven't shared yet. But looking back, I'm pleased that the many small experiments I get up to do add up to something over time, and I can see that I'm achieving something. Here's to another year!

Chris Lamb: Goals

2 January, 2015 - 04:36

Dr. Guy Winch:

We used to think that happiness is based on succeeding at our goals, but it turns out not so much.

Most marathon runners, for example — not professionals but the amateur runners — their high for completing the marathon usually disappears even before their nipples stop bleeding.

My point is that the high lasts for a very short amount of time. If you track where people's happiness and satisfaction is, it is in "getting" or in making progress towards our goals. It's a more satisfying, life-affirming, motivating and happy thing than actually reaching them.

So it's a great thing to keep in mind... Health, for example. When you define health as something you want to do, living your life and looking back on the week and saying "That was a healthy week for me: I worked out this number of times, I did this amount of push-ups, I ate reasonably most of the time..." that's very satisfying and that's where you'll feel happy about yourself. And if you're too focused on a scale for some reason, then that's an external thing that you'll hit... and then what?

So it is about creating goals that are longer lasting and really focusing on the journey because that's really where we get our happiness and our satisfaction. Once it's achieved... now what?

Dirk Eddelbuettel: (Belated) 2014 Running Recap

2 January, 2015 - 03:10

Having just blogged about today's 5k, I noticed that I did not blog at all about running in 2014. There wasn't much, but I did run another Ragnar Relay in February, again as an Ultra team. I more-or-less live-tweeted it, though: Pre-race warning (and that was my route!), Team photo at start, Leg 1, Leg 2, Leg 3, Leg 4, Leg 5, Leg 6, and Chilling

I also ran the Philly Half-Marathon in November following up on the Penn R Workshop, and posted some tweets: Post race selfie, Strava result (thinking it was a 13.5 miler race--WTF), and Post-race chill.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Dirk Eddelbuettel: New Year's Run 2015

2 January, 2015 - 03:07

Nice, crisp and pretty cold morning for the 2015 New Year's 5k. I had run this before with friends: 2003, 2005, 2006 and 2007.

Needless to say, I am a little slower now: A hand-stopped 23:13 for a 7:27 min/mile pace is fine by me given the weather, lack of speedwork in the last few months, and generally reduced running---but also almost a minute slower than the last time I ran this.

In case you're a fellow running geek and into this, Strava has my result nicely aggregated.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Simon Kainz: Pigs-in-a-mud-pond Cake

2 January, 2015 - 02:00

Lacking some New Year's charms, this is our attempt at creating something ourselves:

Step 1

Make some pigs (Pigs being a symbol of luck!)

(Closeup, probably NSFW :-) )

Step 2

Make some pie. We made a simple two-layered chocolate pie, topped with some jam to keep it all from getting very dry. We then made the pond "walls" by sticking KitKat bars on the side of the pie. We used molten chocolate as "glue".

Step 3

Add some more chocolate and add the pigs! (Piggy relaxing in liquid chocolate! :-) )

Step 4

Enjoy! Happy 2015!

Lars Wirzenius: The shape of a solution

1 January, 2015 - 22:31

Each software tool exists to solve some problem. For each problem, there are many possible solutions. Even when different programs basically do the same thing, they can have quite different shapes.

As an example, this morning I was wondering if it would be possible for me to use notmuch to index my entire mail archive. For that, I needed to convert a number of mbox folders to Maildir format. That's a reasonably easy problem, given access to suitable programming libraries, but there's an existing tool for that, called mb2md. Unfortunately, it has the wrong shape for my needs.
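For comparison, the single-folder conversion by itself really is a short program once a mailbox library is in hand. A minimal sketch using Python's standard mailbox module (paths are hypothetical, and this deliberately ignores mb2md's extra features such as recursion and folder mapping):

```python
import mailbox

def mbox_to_maildir(mbox_path, maildir_path):
    """Copy every message from an mbox file into a Maildir folder."""
    src = mailbox.mbox(mbox_path)
    dst = mailbox.Maildir(maildir_path, create=True)
    try:
        for message in src:
            dst.add(message)   # each message becomes its own file
    finally:
        dst.flush()
        src.close()
        dst.close()
```

That is the "simple tool" shape: one mbox in, one Maildir out, nothing else to configure.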

mb2md doesn't just convert one mbox to one maildir. It's designed for a mail admin converting all server-side mbox folders for a user into a corresponding structure of Maildir folders. This seems to be necessary when switching IMAP servers. That's a fairly specialised problem, and the program has been written to make it easy for a mail admin to do that.

What I need is part of the problem solved by mb2md and indeed it can do just that part. However, the overall shape of mb2md is such that my part is hard to do. The incantation is quite unintuitive and requires careful reading of the documentation.

The shape of a solution matters. mb2md could easily have been written in a way that provides a simple tool for the single folder conversion, and then a more complex tool for the mail admin's more complicated problem. This would have resulted in a much more general tool, and that would make it easier for more people to use it without much effort.

Mail folder format conversions are a fairly esoteric thing to do. However, the lack of generality is a frequent issue with how programs are designed. It is easy to fall into the trap of writing a highly specialised tool, instead of taking a step back and making a more general purpose tool. The specialised tool will help a small number of people. The general tool will help many people.

Examples of this are fairly common. Debian has a set of tools for making Debian live CDs; they are not quite able to make a bootable hard disk image as well (thus, vmdebootstrap). There are programs for computing cyclomatic complexity that produce HTML reports, rather than something that can be processed by other programs without too much effort. There are tools for managing address books that are limited to specific cultures, e.g., by hardcoding assumptions about what a person's name looks like (thus, clab).

One of my favourite examples is xargs, which by default does the wrong thing by assuming its input is whitespace delimited. Any whitespace, not just newlines. Any sensible use requires adding the -0 option, which makes xargs that much more tedious to use.
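As an illustration (directory name hypothetical): a filename containing a space is silently split by default xargs, while the NUL-delimited pairing of find -print0 with xargs -0 keeps each name intact:

```shell
# A file whose name contains a space trips up whitespace splitting.
mkdir -p /tmp/xargs-demo
touch "/tmp/xargs-demo/two words.txt"

# Whitespace-delimited: the single filename becomes two bogus arguments.
find /tmp/xargs-demo -type f | xargs -n1 echo | wc -l    # prints 2

# NUL-delimited: one argument per file, spaces and all.
find /tmp/xargs-demo -type f -print0 | xargs -0 -n1 echo | wc -l    # prints 1
```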

Furthermore, I've often found that the more general tool is simpler. Its functional specification is simpler; its implementation is simpler and has fewer special cases; its user experience is simpler. That's not always true, but often it is.

Sometimes the general solution shape is not worth it. But it's always worth considering whether it might be.

One of the parts of the Unix culture I really like is the preference for general tools that are easy to combine together.

Thorsten Alteholz: My Debian Activities in December 2014

1 January, 2015 - 20:59

FTP assistant

This final month of the year has been rather quiet as well. The holiday season is not suited to lots of REJECTs, so all in all I marked 91 packages for accept and rejected only 14 packages. But be aware: the period of grace is over now.

Squeeze LTS

This was my sixth month that I did some work for the Squeeze LTS initiative, started by Raphael Hertzog at Freexian.

This month I got assigned a workload of 20.5h and I spent these hours to upload new versions of:

  • [DLA 99-1] flac security update
  • [DLA 100-1] mutt security update
  • [DLA 101-1] jasper security update
  • [DLA 102-1] tcpdump security update
  • [DLA 105-1] graphviz security update
  • [DLA 107-1] unbound security update
  • [DLA 108-1] nfs-utils security update
  • [DLA 110-1] libyaml security update
  • [DLA 109-1] libyaml-libyaml-perl security update
  • [DLA 117-1] qt4-x11 security update
  • [DLA 121-1] jasper security update
  • [DLA 122-1] eglibc security update
  • [DLA 123-1] firebird2.5 security update
  • [DLA 124-1] unzip security update

This month I also sponsored the upload of [DLA 126-1] ettercap security update. As far as I know, this is the first time that someone who is not (yet?) involved in Debian as a Debian Maintainer or Debian Developer has prepared a patch for Squeeze LTS. So many thanks to Nguyen Cong for doing the work, and thanks to Toshiba as well for allowing him to work on this package. I am sure there is more to come.

As December is the time of gifts, I also uploaded [DLA 104-1] pdns-recursor security update although no LTS sponsor indicated any interest.

Other packages

Unfortunately the Debian Med Advent Calendar wasn't as successful as in years before. Only five bugs, in the packages python-mne, avifile, biomaj-watcher, trimmomatic and uc-echo, have been closed. Things can only get better …

Junichi Uekawa: Hello 2015.

1 January, 2015 - 19:04
Hello 2015. Wondering what'll come of this year.

Russ Allbery: Review: Three Parts Dead

1 January, 2015 - 13:19

Review: Three Parts Dead, by Max Gladstone

Series: Craft #1 Publisher: Tor Copyright: October 2012 ISBN: 1-4668-0203-0 Format: Kindle Pages: 336

Tara Abernathy was a student in the Hidden Schools, learning Craft, until she was expelled. Literally expelled: thrown from the floating schools to crash painfully to earth in the Badlands, left to return to her family and village and a life of small workings of Craft and contracts on behalf of local farmers. She had largely resigned herself to that life until raiders started killing people. Tara is not the sort of person who could stand by and watch that, nor the sort to refrain from using Craft to fix the world. The result was undead guardians for the town, perhaps unwisely formed from the town's risen dead, and only a job offer saves Tara from the ungrateful attention of her neighbors.

That's how Tara finds herself employed by the firm of Kelethras, Albrecht, and Ao, in the person of partner Elayne Kevarian. Provisionally, depending on her performance on their job: the investigation of the death of a god.

It's possible to call Three Parts Dead urban fantasy if you squint at it the right way. It is fantasy that takes place largely in cities, it features the investigation of a crime (and, before long, several crimes), and Tara's attitude is reminiscent of an urban fantasy heroine. But this is considerably different from the normal fare of supernatural creatures. In this world, magic, called Craft, is an occupation that requires a great deal of precision and careful construction. Small workings are described much like traditional magic, although with an emphasis on metaphor. Larger workings more often come in the form of energy flows, contracts, and careful hedging, and the large Craft firms bear more resemblance to mergers and acquisitions specialists than to schools of wizards.

This means that the murder investigation of the god of Alt Coulomb involves a compelling mix of danger, magic, highly unusual library investigations, forensic accounting, hidden Craft machinery, unexpected political alliances, and an inhuman police force. Rather than the typical urban fantasy approach of being beaten up until the resolution of the mystery becomes obvious, Tara and her companions do quite a lot of footwork and uncover a more complex political situation than they were expecting. And, in keeping with this take on magic, the story culminates in a courtroom drama (of a sort). I really enjoyed this. It combines the stylistic elements of urban fantasy that I like with some complex and original world-building and a great take on magical contracts. I prefer worlds like this one, where any source of power people have lived with for a long time is surrounded by the controls, formal analysis, and politics that humans create around anything of value.

Tara is also a great protagonist. This is a coming of age story in a sense, and Tara is sometimes unsure of her abilities, but it's refreshingly devoid of worry or angst over new-found abilities. Tara enjoys her work, and approaches it with a well-written mix of uncertainty, impulsiveness, and self-confidence (sometimes warranted, sometimes not). I've read some good stories where the protagonist gets dragged into the story against their will, and some of them are quite good, but it's refreshing to read a book about someone who takes to the story like a duck to water. This is a believable portrayal of a character with a lot of native ability and intelligence, not much wisdom (yet), but a lot of thoughtful enthusiasm. I was disappointed to learn that she isn't the protagonist of the next book in the series.

The biggest flaw I found in this book is that Gladstone doesn't stick reliably to his world conception. At times, Craft collapses into something more like typical fantasy magical battles, instead of legal procedure and contract made concrete. I suppose this makes parts of the book more exciting, but I would have preferred a plot resolution that involved less combat and more argument. This isn't helped by the utterly hissable villain. There's a lot of complexity in understanding what happened and who was going to benefit (and how), but there is absolutely no doubt who the enemy is, and he's essentially without redeeming qualities. I would have preferred more nuance, given how satisfyingly complex the rest of the world-building is.

Three Parts Dead also occasionally suffers from the typical first novel problem of being a bit overstuffed. The world-building comes fast and thick, and nearly everything Tara does involves introducing new concepts. But the world does have a coherent history, and quite a lot of it. It used to be a more typical fantasy world ruled by gods, each with their own territory and worshippers (and Alt Coulomb is a throwback to this era), but an epic war between gods and Craft is in Tara's past, leading to the defeat or destruction of many of the gods. She lives in a time of uneasy truce between human and inhuman powers, featuring some very complex political and economic alliances. There's a lot of material here for an ongoing series.

This is a great first novel. It's not without its flaws, but I enjoyed it from beginning to end, and will definitely keep reading the series. Recommended.

Followed by Two Serpents Rise.

Rating: 8 out of 10

Riku Voipio: Crowdfunding better GCompris graphics

1 January, 2015 - 04:29
GCompris is the most established open source kids' educational game. Here we practice using the mouse with an Efika smartbook. In this subgame, the mouse is moved around to uncover an image behind it.

While GCompris is nice, it badly needs better graphics. Now the GCompris authors are running an Indiegogo crowdfund for exactly that: to get new unified graphics.

Why should you fund? Apart from the generic "I want to be nice to any OSS project" motivation, I see a couple of reasons specific to this crowdfund.

First, to show kids that apps can be changed! Instead of just consuming existing iPad apps, GCompris allows you to show kids how games are built and modified. With the new graphics, more kids will play longer, and eventually some will ask whether something can be changed or added.

Second, GCompris has recently become Qt/QML based, making it more portable than before. Wouldn't you like to see it on your Jolla tablet or a future Ubuntu phone? The crowdfund doesn't promise to make new ports, but if you are eager to show your friends nice-looking apps on your platform, this is probably one of the easiest ways to help make them happen.

Finally, as a nice way to say happy new year 2015 :)

Chris Lamb: 2014: Selected highlights

1 January, 2015 - 00:23

Previously: 2012 & 2013.


January

Was lent a 15-course baroque lute.

February

Grandpa's funeral. In December he was posthumously awarded the Ushakov Medal (pictured) for his service in the Royal Navy's Arctic Convoys during the Second World War.

March

A lot of triathlon training but also got back into cooking.

April

Returned to the Cambridge Duathlon.

May

Raced 50 and 100 mile cycling time trials & visited the Stratford Olympic pool (pictured).

June

Ironman Austria.

July

Paced my sister at the Downtow-Upflow Half-marathon. Also released the first version of the Strava Enhancement Suite.

August

Visited Cornwall for my cousin's wedding (pictured). Another month for sport including my first ultramarathon and my first sub-20 minute 5k.

September

Entered a London—Oxford—London cycling brevet, my longest single ride to date (269 km). Also visited the Tour of Britain and the Sri Chinmoy 24-hour endurance race.

October

London—Paris—London cycling tour (588 km).

November

Performed Handel's Messiah in Kettering.

December

Left Thread.com.

Wouter Verhelst: Perl 'issues'

31 December, 2014 - 23:46

I just watched a CCC talk in which the speaker claims Perl is horribly broken. Watching it was fairly annoying however, since I had to restrain myself from throwing things at the screen.

If you're going to complain about a language, you had better make sure you actually understand the language first. I won't deny that there are a few weird constructions in there, but hey. The talk boils down to a claim that Perl is horrible because the list "data type" is "broken".

First of all, Netanel, in Perl, lists are not arrays. Yes, that's confusing if you haven't done more than a few hours of Perl, but hear me out. In Perl, a list is an enumeration of values. A variable with an '@' sigil is an array; a construct consisting of an opening bracket ('(') followed by a number of comma- or arrow-separated values (',' or '=>'), followed by a closing bracket, is a list. Whenever you assign more than one value to an array or a hash, you need to use a list to enumerate the values. Subroutines in perl also use lists as arguments or return values. Yes, that last bit may have been a mistake.

Perl has a concept of "scalar context" and "list context". A scalar context is what a sub is in when you assign the return value of your sub to a scalar; a list context is when you assign the return value of your sub to an array or a hash, or when you use the list construct (the thing with brackets and commas) with sub calls (instead of hardcoded values or variables) as the individual values. This works as follows:

sub magic {
    if (wantarray()) {
        print "You're asking for a list!\n";
        return ('a', 'b', 'c');
    } else {
        print "You're asking for a scalar!\n";
        return 'a';
    }
}

print ("list: ", magic(), "\n");
print "scalar: " . magic() . "\n";

The above example will produce the following output:

You're asking for a list!
list: abc
You're asking for a scalar!
scalar: a

What happens here? The first print line creates a list (because things are separated by commas); the second one does not (the '.' is perl's string concatenation operator; as you can only concatenate scalars, the result is that you call the magic() sub in scalar context).

Yes, given that arrays are not lists, the name of the wantarray() sub is horribly chosen. Anyway.

It is documented that lists cannot be nested. Lists can only be one-dimensional constructs. If you create a list, and add another list as an element (or something that can be converted to a list, like an array or a hash), then the result is that you get a flattened list. If you don't want a flattened list, you need to use a reference instead. A reference is a scalar value that, very much like a pointer in C, contains a reference to another variable. This other variable can be an array, a hash, or a scalar. But it cannot be a list, because it must be a variable -- and lists cannot be variables.

If you need to create multi-dimensional constructs, you need to use references. Taking a reference is done by prepending a backslash to whatever it is you're trying to take a reference of; e.g., if you want to add a non-flattened array to a list, you instead create a reference to an anonymous array, like so:

$arrayref = [ 'this', 'is', 'an', 'anonymous', 'array'];

you can now create a multi-dimensional construct:

@multiarray = ('elem1', $arrayref);

Or you can do that in one go:

@multiarray = ('elem1', [ 'this', 'is', 'an', 'anonymous', 'array']);

Alternatively, you can create a non-anonymous array first:

@onedimarray = ('this', 'is', 'not', 'an', 'anonymous', 'array');
@multiarray = ('elem1', \@onedimarray);

In perl, curly brackets can be used to create a reference to anonymous hashes, whereas square brackets can be used to create a reference to anonymous arrays. This is all a basic part of the language; if you don't understand that, you simply don't understand Perl. In other words, whenever you see someone doing this:

%hash = {'a' => 'b'};

or

@array = [ '1', '2' ];

you can say that they don't understand the language. For reference, the assignment to %hash will result in an (unusable) hash with a single key that is a reference to an anonymous hash (which cannot be accessed anymore) and a value of undef; the assignment to @array will result in a two-dimensional array with one element in the first dimension, and two elements in the second.

The CGI.pm fix, which Netanel dismisses in the Q&A part of the talk as a "warning" that won't help (because it would be too late), is actually a proper fix, which will warn people in all cases. That is, if you do this:

%hash = { 'name' => $name, 'password' => $cgi->param('password') };

then CGI.pm's param() sub will notice that it's being called in list context and issue a warning, regardless of whether the user passes one or two password query parameters. It uses the wantarray() sub, and produces a warning if that returns true.
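The mechanism can be sketched with a hypothetical param()-like sub (this is not the actual CGI.pm code, just the wantarray() idea):

```shell
perl -e '
  # Hypothetical sketch: warn when called in list context, like the CGI.pm fix.
  sub param {
      warn "param() called in list context\n" if wantarray();
      return "secret";
  }
  my %hash = ("password" => param());   # list context: triggers the warning
  my $pw   = param();                   # scalar context: silent
  print $pw, "\n";
'
```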

In short, Perl is not the horribly broken construct that Netanel claims it to be. Yes, there are a few surprises (most of which exist for historical reasons), and yes, those should be fixed. This is why the Perl community has redone much of Perl for Perl 6. But the fact that there are a few surprises doesn't mean the whole language is broken. There are surprises in most languages; that is a fact of life.

Yes, the difference between arrays and hashes on the one hand, and lists on the other hand, is fairly confusing; it took me a while to understand this. But once you get the hang of it, it's not all that difficult. And then these two issues that Netanel found (which I suppose could be described as bugs in the core modules) aren't all that surprising anymore.

So, in short:

  • Don't stop using Perl. However, do make sure that whenever you use a language, you understand it first, so you don't get bitten by its historical baggage. This is true for any language, not just Perl.
  • Don't assume that just because you found issues with core modules, the whole language is suddenly broken.

What I do agree with is that if you want to use a language, you should understand its features. Unfortunately, this single line on the final slide of Netanel's talk is just about the only thing in the whole talk that sort of made sense to me.

Ah well.

Dirk Eddelbuettel: digest 0.6.8

31 December, 2014 - 19:12

Release 0.6.8 of digest package is now on CRAN and will get to Debian shortly.

This release opens the door to also providing the digest functionality at the C level to other R packages. Wush Wu is going to use the MurmurHash C implementation in his recently created FeatureHashing package.

We plan to export the other hashing function as well. Another small change attempts to overcome a build limitation on that other largely-irrelevant-but-still-checked-by-CRAN OS.

CRANberries provides the usual summary of changes to the previous version.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Michal Čihař: No Windows builds for Gammu and Wammu

31 December, 2014 - 18:00

For quite some time I produced Windows builds for both Wammu and Gammu by cross compiling on Linux. But this proved error-prone and took time to maintain the cross-compilation environment. I've decided to stop producing Windows binaries, and I don't expect to get back to that anytime soon.

This is actually no news for Wammu, where I removed Windows builds about two years ago as they proved too broken for normal usage, but for Gammu it's new, as the previous release had Windows builds. I lost the cross-compilation environment to a hard drive failure, and restoring it is simply too much work; even then it would not let me build a complete release (I never managed to build the Python modules properly).

So if anybody is interested in Windows binaries, they will need to produce them on Windows. I can help with fixing code or the existing setup scripts (they probably need adjustments, as they were tweaked for cross compiling), but somebody has to set up the environment with all dependencies and test the build on Windows.

Filed under: English Gammu Wammu | 0 comments | Flattr this!


Creative Commons License: copyright of each article belongs to its respective author.
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported license.