Planet Debian

Planet Debian - http://planet.debian.org/

Julian Andres Klode: apt 1.3 RC4 – Tweaking apt update

3 September, 2016 - 02:50

Has this ever happened to you: You run apt update, it fetches a Release file, then starts fetching DEP-11 metadata, then any pdiff index stuff, and then applies them; one after the other? Or this: You don’t see any update progress until very near the end? Worry no more: I tweaked things a bit in 1.3~rc4 (git commit).

Prior to 1.3~rc4, acquiring the files for an update worked like this: We create an object for the Release file; once a Release file is done, we queue the next objects (DEP-11 icons, .diff/Index files, etc.). There is no prioritizing, so usually we fetch the 5MB+ DEP-11 icons and components files first, and only then start working on other indices which might use pdiffs.

In 1.3~rc4 I changed the queues to be priority queues: Release files and .diff/Index files have the highest priority (once we have them all, we know how much to fetch). The second priority level goes to the .pdiff files, which are later passed to the rred process to patch an existing Packages, Sources, or Contents file. The third priority level is taken by all other index targets.

Actually, I implemented the priority queues back in June. There was just one tiny problem: pipelining. We might be inserting elements into our fetching queues in order of priority, but with pipelining enabled, lower-priority items might already have their HTTP requests sent before we even get to queue the higher-priority stuff.

Today I had an epiphany: We fill the pipeline up to a number of items (the depth, currently 10). So, let’s just fill the pipeline with items that have the same priority as (or a higher priority than) the maximum priority of the already-queued ones, and pretend it is full when we only have lower-priority items left.

And that works fine: First the Release and .diff/Index stuff is fetched, which means we can start showing accurate progress info from there on. Next, the pdiff files are fetched, meaning that we can apply them while other targets (think DEP-11 icon tarballs) are still downloading.
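To make the idea concrete, here is a minimal Python sketch of that fill heuristic (not apt's actual C++ code, just an illustration with made-up item names and priority values):

import heapq

PIPELINE_DEPTH = 10  # apt currently fills the pipeline up to 10 items

def fill_pipeline(pipeline, queue):
    # pipeline: list of (priority, name) already requested over the wire.
    # queue: min-heap of (-priority, name) still waiting to be requested.
    while queue and len(pipeline) < PIPELINE_DEPTH:
        priority, name = -queue[0][0], queue[0][1]
        # Pretend the pipeline is full as soon as the best remaining item
        # has a lower priority than something already in flight.
        if pipeline and priority < max(p for p, _ in pipeline):
            break
        heapq.heappop(queue)
        pipeline.append((priority, name))
    return pipeline

# Release/.diff/Index = 3, .pdiff = 2, everything else = 1 (invented values)
items = [(3, 'Release'), (3, 'Packages.diff/Index'), (2, 'Packages.pdiff'),
         (1, 'DEP-11 icons'), (1, 'Contents')]
heap = [(-prio, name) for prio, name in items]
heapq.heapify(heap)
print(fill_pipeline([], heap))

With this data only the two highest-priority items end up in the pipeline even though the depth would allow more, which is exactly the "pretend it is full" behaviour.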

This has a great effect on performance: For the 01 Sep 2016 03:35:23 UTC -> 02 Sep 2016 09:25:37 update of Debian unstable and testing with Contents and appstream for amd64 and i386, update time was reduced from 37 seconds to 24-28 seconds.

 

In other news

I recently cleaned up the apt packaging which renamed /usr/share/bug/apt/script to /usr/share/bug/apt. That broke on overlayfs, because dpkg could not rename the old apt directory to a backup name during unpack (only directories purely on the upper layer can be renamed). I reverted that now, so all future updates should be fine.

David re-added the Breaks against apt-utils I recently removed by accident during the cleanup, so no more errors about overriding dump solvers. He also added support for fingerprints in gpgv’s GOODSIG output, which apparently might come at some point.

I also fixed a few CMake issues, fixed the test suite for gpgv 2.1.15, allowed building with a system-wide gtest library (we really ought to add back a pre-built one in Debian), and modified debian/rules to pass -O to make. I wish debhelper would do the latter automatically (there’s a bug for that).

Finally, we fixed some uninitialized variables in the base256 code, out-of-bound reads in the Sources file parser, off-by-one errors in the tagfile comment stripping code[1], and some memcpy() calls with length 0. Most of these will be cherry-picked into the 1.2 (xenial) and 1.0.9.8 (jessie) branches (releases 1.2.15 and 1.0.9.8.4). If you forked off your version of apt at another point, you might want to do the same.

[1] those were actually causing the failures and segfaults in the unit tests on hurd-i386 buildds. I always thought it was a hurd-specific issue…

PS. Building for Fedora on OBS has a weird socket fd #3 that does not get closed during the test suite despite us setting CLOEXEC on it. Join us in #debian-apt on oftc if you have ideas.


Filed under: Debian, Ubuntu

Steve McIntyre: And about time, too!

2 September, 2016 - 20:54

A couple of weekends back, we had an awesome time at Jonathan and Charlene's wedding. I'd have blogged sooner, but I had to wait for this photo of the happy couple with the Debian gang... :-)

Unfortunately, somebody let Charlene hide at the back! Follow the link for more photos...

Iain R. Learmonth: Burgers 2016

2 September, 2016 - 17:20

Ana and I travelled to Cambridge last weekend for the Debian UK BBQ. We travelled by train and it was a rather scenic journey. In the past, on long journeys, I’ve used APRS-IS to beacon my location and plot my route, but I have recently obtained the GPS module for my Yaesu VX-8DE and I thought I’d give some real RF APRS a go this time.

While the APRS IGate coverage in the UK is a little disappointing, as is evidenced by the map, a few cool things did happen. I received a simplex APRS message from a radio amateur 2M0RRT with the text “test test IO86ML” (but unfortunately didn’t notice until we’d long passed by, sorry for not replying!) and quite a few of my packets, sent from a 5 watt handheld in Cambridge, were heard by the station M0BPQ-1 in North London (digipeated by MB7UM).

My APRS Position Reports for the Debian UK BBQ 2016

Some of you will know that since my trip to the IETF in Berlin, I’ve been without a working laptop (it got a bit smashed up on the plane). This also caused me to miss the reminder to renew the expiry on my old GPG key, which I have now retired. My new GPG key is not yet in the Debian keyring but can be found in the Tor Project keyring already. A request has already been filed to replace the key in the Debian keyring, and thanks to attendees at the BBQ, I have some nice shiny signatures on my new key. (I’ll get to returning those signatures as soon as I can.)

We’ve been making a lot of progress with Debian Live and the BBQ saw live-tasks being uploaded to the archive. This source package builds a number of metapackages, each of which configures a system to be used as a live CD for a different desktop environment. I would like as much of the configuration as possible to be done within the image itself, as this will help with reproducibility. A new upload for live-wrapper should be coming next week, and this version will allow these live-task-* packages to be used to build images for testing. I hope to have weekly builds of the live images running very soon.

As I’ve been without a computer for a while, I’m just getting back into things now, so expect that I’ll be slow to respond to communication for a while but I’ll also be making commits and uploads and trying to clear this backlog as quickly as I can (including my Tor Project backlog).

Daniel Pocock: Arrival at FSFE Summit and QtCon 2016, Berlin

2 September, 2016 - 15:46


The FSFE Summit and QtCon 2016 are getting under way at bcc, Berlin. The event comprises a range of communities, including KDE and VideoLAN, and a wide range of people active in other projects, including Debian, Mozilla, GSoC and many more, are also present.

Talks

Today, some time between 17:30 and 18:30 I'll be giving a lightning talk about Postbooks, a Qt and PostgreSQL based free software solution for accounting and ERP. For more details about how free, open source software can make your life easier by helping keep track of your money, see my comparison of free, open source accounting software.

Saturday, at 15:00 I'll give a talk about Free Communications with Free Software. We'll look at some exciting new developments in this area and once again contemplate the question: can we hope to use completely free and private software to communicate with our friends and families this Christmas? (Apologies to those who don't celebrate Christmas; the security of your communications is just as important too.)

A note about the entrance fee...

There is an entry fee for the QtCon event; however, people attending the FSFE Summit are invited to attend by making a donation. Contact FSFE for more details and consider joining the FSFE Fellowship.

Thadeu Lima de Souza Cascardo: GNU libc and Linux

1 September, 2016 - 18:34

Some time ago, I built a static program that I wanted to run on an Android tablet. Imagine my surprise when I saw a message saying "FATAL: kernel too old".

After some investigation, it turns out that GNU libc may assume some Linux features are present at build time. This means that a libc built against a given minimum Linux version might only work on that version or newer.

Since 2014, GNU libc itself has required Linux 2.6.32 as the minimum. Previously it was 2.6.16, a minimum set in 2012.

Debian, as of 2015, builds it with a required minimum Linux version of 3.2.
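The startup check that produces the "FATAL: kernel too old" message essentially compares the running kernel release against the minimum baked in at build time. Here is a rough Python illustration of that comparison (not glibc's actual code, just the idea):

import os

def kernel_too_old(minimum="3.2"):
    # Compare the running kernel release against a build-time minimum,
    # roughly what glibc does before aborting with "FATAL: kernel too old".
    release = os.uname().release.split("-")[0]            # e.g. '4.7.0'
    running = tuple(int(part) for part in release.split(".")[:3])
    wanted = tuple(int(part) for part in minimum.split("."))
    # Pad so that (3, 2) compares sensibly against (3, 2, 0).
    length = max(len(running), len(wanted))
    running += (0,) * (length - len(running))
    wanted += (0,) * (length - len(wanted))
    return running < wanted

if kernel_too_old("3.2"):
    raise SystemExit("FATAL: kernel too old")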

To give an idea about the history of these kernel releases, we have:

  • December 2009, Linus releases Linux 2.6.32.
  • November 2010, Red Hat releases RHEL 6.0, Linux 2.6.32.
  • February 2011, Debian 6.0 Squeeze, Linux 2.6.32.
  • January 2012, Linus releases Linux 3.2.
  • April 2014, GNU Libc requires Linux 2.6.32.
  • December 2015, Debian GNU Libc uploaded to unstable requires Linux 3.2.

So, at least for GNU libc upstream, it would appear that not many devices would stop being supported, though the situation would not be as good for binary versions of Debian. However, I have a small list of devices that might show otherwise.

  • Nokia N900 shipped with Linux 2.6.28.
  • Android emulator, a platform called Goldfish, uses Linux 2.6.29.
  • The Wii port most widely tested is based on Linux 2.6.32.

Many Android devices have been shipped with Linux 3.4, but I encountered at least one using Linux 3.0.

So, even though many new devices ship with newer versions of Linux, it's still possible to find some new and older devices using versions older than 3.2, and even versions older than 2.6.32 may be found.

Another interesting note is that, without a few patches, it's not possible to build Linux 2.6.32 with GCC 5 or newer. For that and many other reasons, it's important that we update: bug fixes, and the ability to make progress and use better software, are some of the other reasons.

So, it's imperative that we have device support upstream. Otherwise, the task of doing updates with forward porting becomes daunting. And that's the current state for many devices, which is why I have been trying to use new software with older versions of Linux. But as time passes, I realize how hard a task this is, as most new software these days, even a building block like GNU libc, requires ever newer versions of Linux.

For now, most gadgets I have support Linux 3.4 or newer. But before long, that support will be dropped as well. And that means there will be no more updates for those devices. It's a consequence of both targeting time-to-market and programmed obsolescence as business practices. Upstream support is no priority, and maintenance lasts only until the next model is available in the stores.

This is one more reason why we need to have more operating systems available for those devices. Systems that are designed to last more than a couple of years. As I said, upstream support is imperative, but as a step forward, I still believe we can provide the GNU experience to lots of devices out there.

Jamie McClelland: Trusted Mobile Device: How hard could it be?

1 September, 2016 - 10:36

I bought a new phone. After my experiences with Signal and the helpful comments readers gave regarding the ability to run Android and Signal without Google Play using microg, I thought I would give it a shot.

Since microg reports that signature spoofing is required and comes out-of-the-box with omnirom, I thought I'd aim for installing omnirom's version of Android 6 (marshmallow) after years of using CyanogenMod's version of Android.

The Nexus line of phones seemed well-supported by omnirom in particular (and the alternative ROM community in general) so I bought a Nexus 5x.

I carefully followed the directions for installing omnirom; however, when it came time to boot into omnirom, I just got the boot sequence animation over and over again.

Frustrated, I decided to go back to cyanogenmod and see if I could use one of the microg recommended methods for getting signature spoofing to work. The easiest seemed to be Needle by moosd, but alas, no such luck with Marshmallow. Someone else forked the code and might fix it one day. I then spent too much time trying to understand what xposed is before I gave up understanding it and just tried to install it (whoops, looks like the installer page is out of date, so instead I followed sketchy instructions from a forum thread). Well, to make a long story short, it resulted in a boot loop.

So, I decided to return to omnirom. After reading some vague references to omnirom and supersu, I decided to flash both of them together and voila, it worked!

Next, I decided to enable full disk encryption. Not so fast. After clicking through the screens and hitting the final confirmation, my phone rebooted and spent the next 5 hours showing me the omnirom boot animation. Somehow, powering down and starting again resulted in a working machine, but no disk encryption.

After much web searching, guessing and trial and error, I fixed the problem by clicking on the SuperSU option to "Full unroot" the device (I pressed "no" when prompted to attempt to restore stock image). Then I rebooted and followed the directions to encrypt the device. And it worked! Hooray!

I had to reboot and re-flash the supersu to regain su privileges.

All was great.

The first root action I decided to take was to install the cryptfs program from f-droid because using the same password to decrypt your device as you use to unlock the screen seems either tedious or insecure.

That process didn't work so well. I got a message saying: use this command from a root shell before you reboot: vdc cryptfs changepw <password>. I followed the advice, carefully typing in my 12 character password which includes numbers and letters.

Then, I happily did what I expected to be my last reboot when, to my horror, I was prompted to decrypt my disk with ... a numeric-only keypad.

That wasn't going to work. At this point I had already spent 5 days and about 8 hours on this project. Sigh. So, I started over.

Guess what? It only took me 25 minutes, but it seems that cryptfs is broken. Even with a numeric password it fails. Ok, I guess I need a long PIN to unlock my phone. This time it only took me 15 minutes to wipe and re-install everything.

There are only two positive things I can think of:

  • TWRP, which provides the recovery image, is really great. Every time something went wrong I booted into the TWRP recovery image and could fix anything.
  • I'm starting to get used to the error on startup warning me that "Your device is corrupt. It can't be trusted and may not work properly." It's a good thing to remember about all digital devices.

p.s. I haven't even tried to install microg yet... which was the whole point.

Enrico Zini: Links for September 2016

1 September, 2016 - 05:00
A Few Useful Mental Tools from Richard Feynman [archive]

«These tricks show Feynman taking the method of thought he learned in pure science and applying it to the more mundane topics most of us have to deal with every day.»

Pasta [archive]

A comprehensive introduction to pasta, to keep at hand in case I meet someone who has little familiarity with it.

MPTP: One Designer Drug and Serendipity [archive]

Abstract: Through an unlikely series of coincidences and fortunate accidents, the development of Parkinson’s disease in several illicit drug users was traced to their use of a meperidine analog contaminated with 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP). The discovery of a chemical capable of producing animal models of the disease has revitalized research efforts and resulted in important new information. The serendipitous finding also prompted consideration of what changes seem advisable if designer drugs are to be dealt with more efficaciously.

The Debunking Handbook: now freely available for download

The Debunking Handbook, a guide to debunking misinformation, is now freely available to download. Although there is a great deal of psychological research on misinformation, there's no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of myths.

Faulty neon light jams radio appliances [archive]

«An apparent interference source began plaguing wireless vehicle key fobs, cell phones, and other wireless electronics. Key fob owners found they could not open or start their vehicles remotely until their vehicles were towed at least a block away, nor were they able to call for help on their cell phones when problems occurred»

Calvin & Muad'Dib

Calvin & Hobbes with text taken from Frank Herbert's Dune. It's been around since 2013 and I consistently found it moving and deep.

When Birds Attack - Bike Helmet Hacks [archive]

Attacks on cyclists by Australian magpies have prompted several creative adaptations, including attaching an afro wig to the bike helmet.

Chris Lamb: Free software activities in August 2016

1 September, 2016 - 04:48

Here is my monthly update covering what I have been doing in the free software world (previously):

  • Worked on nsntrace, a userspace tool to perform network traces on processes using kernel namespaces:
    • Overhauled error handling to ensure the return code of the wrapped process is returned to the surrounding environment. (#10).
    • Permit the -u argument to also accept uids as well as usernames. (#16).
    • Always kill the (hard-looping) udp_send utility, even on test failures. (#13).
    • Updated configure.ac to look for iptables in /sbin & /usr/sbin (#11) and to raise an error if pcap.h is missing (#15).
    • Drop bashisms in #!/bin/sh script (#14) and ignore the generated manpage in the Git repository (#12).
  • Independently discovered a regression in the Django web development framework where field__isnull=False filters were not working with some foreign keys, resulting in extending the testsuite and release documentation. (#7104).
  • Proposed a change to django-enumfield (a custom field for type-safe constants) to ensure passing a string type to Enum.get returned None on error to match the documentation. (#36).
  • Fixed an issue in the Mopidy music player's podcast extension where the testsuite was failing tests in extreme timezones. (#40).
  • Proposed changes to make various upstreams reproducible:
    • botan, a crypto/TLS library for C++11. (#587).
    • cookiecutter, a project template generator, removing nondeterministic keyword arguments from appearing in the documentation. (#800).
    • pyicu, a Python wrapper for the IBM Unicode library. (#27).
  • Integrated a number of issues raised by @piotr1212 to python-fadvise, my Python interface to posix_fadvise(2), where the API was not being applied to open file descriptors (#1) and moving the .so to a module directory (#2).
  • Various improvements to try.diffoscope.org, a hosted version of the diffoscope in-depth and content-aware diff utility, including introducing an HTTP API (#21), updating the SSL certificate and correcting a logic issue where errors in diffoscope itself were not being detected correctly (b0ff49). Continued thanks to Bytemark for sponsoring the hardware.
  • Fixed a bug in django-slack, my library to easily post messages to the Slack group-messaging utility, correcting an EncodeError exception under Python 3 (#53) and updated the minimum required version of Django to 1.7 (#54).
  • Various updates to tickle-me-email, my Getting Things Done-inspired email toolbox, to also match / in IMAP's LIST separators (#6) and to encode the folder list as UTF-7 (#7). Thanks to @resiak.
  • Clarified the documentation for travis.debian.net — my hosted script to easily test and build Debian packages on the Travis CI continuous integration platform — regarding how to integrate with Github (#20).

Reproducible builds

Whilst anyone can inspect the source code of free software for malicious flaws, most Linux distributions provide binary (or "compiled") packages to end users.

The motivation behind the Reproducible Builds effort is to allow verification that no flaws have been introduced — either maliciously or accidentally — during this compilation process, by promising that identical binary packages are always generated from a given source.
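The verification step itself is conceptually simple: build the same source twice, independently, and compare checksums of the results. A minimal sketch of that idea (the file paths are hypothetical):

import hashlib

def sha256sum(path):
    # Return the SHA-256 hex digest of a file, read in chunks.
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Two independent builds of the same source should be bit-identical;
# if the digests differ, something nondeterministic crept into the build.
first = sha256sum("build1/hello_2.10-1_amd64.deb")
second = sha256sum("build2/hello_2.10-1_amd64.deb")
print("reproducible" if first == second else "NOT reproducible")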


Toolchain issues

I submitted the following patches to fix reproducibility-related toolchain issues:


My work in the Reproducible Builds project was also covered in our weekly reports. (#67, #68, #69, #70).


Diffoscope

diffoscope is our "diff on steroids" that will not only recursively unpack archives but will transform binary formats into human-readable forms in order to compare them:

  • Added a command-line interface to the try.diffoscope.org web service.
  • Added a JSON comparator.
  • In the HTML output, highlight lines when hovering to make it easier to visually track.
  • Ensure that we pass str types to our Difference class, otherwise we can't be sure we can render them later.
  • Testsuite improvements:
    • Generate test coverage reports.
    • Add tests for Haskell and GitIndex comparators.
    • Completely refactored all of the comparator tests, extracting out commonly-used routines.
    • Confirm rendering of text and HTML presenters when checking non-existing files.
    • Dropped a squashfs test as it was simply too unreliable and/or has too many requirements to satisfy.
  • A large number of miscellaneous cleanups, including:
    • Reworking the comparator setup/preference internals by dynamically importing classes via a single list.
    • Split exceptions out into dedicated diffoscope.exc module.
    • Tidying the PROVIDERS dict in diffoscope/__init__.py.
    • Use html.escape over xml.sax.saxutils.escape, cgi.escape, etc.
    • Removing hard-coding of manual page target names in debian/rules.
    • Specify all string format arguments as logging function parameters, not using interpolation.
    • Tidying imports, correcting indentation levels and dropping unnecessary whitespace.

disorderfs

disorderfs is our FUSE filesystem that deliberately introduces nondeterminism in system calls such as readdir(3).

  • Added a testsuite to prevent regressions. (f124965)
  • Added a --sort-dirents=yes|no option for forcing deterministic ordering. (2aae325)
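The class of bug disorderfs is designed to expose is easy to sketch: any build step that relies on the raw readdir() order bakes filesystem-dependent ordering into its output. A small Python illustration with invented file names; sorting the directory listing is the usual one-line fix:

import os

def concatenate(directory, output):
    # Concatenate every file in a directory -- a typical build step.
    # os.listdir() returns entries in readdir() order, which is not
    # guaranteed to be stable; disorderfs deliberately shuffles it.
    with open(output, "wb") as out:
        for name in sorted(os.listdir(directory)):  # sorted() makes it deterministic
            with open(os.path.join(directory, name), "rb") as part:
                out.write(part.read())

concatenate("src/snippets", "build/combined.txt")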

Other
  • Improved strip-nondeterminism, our tool to remove specific nondeterministic information after a build:
    • Match more styles of Java .properties files.
    • Remove hyphen from "non-determinism" and "non-deterministic" throughout package for consistency.
  • Improvements to our testing infrastructure:
    • Improve the top-level navigation so that we can always get back to "home" of a package.
    • Give expandable elements cursor: pointer CSS styling to highlight they are clickable.
    • Drop various trailing underlined whitespaces after links.
    • Explicitly log that build was successful or not.
    • Various code-quality improvements, including preferring str.format over concatenation.
  • Miscellaneous updates to our filter-packages internal tool:
    • Add --random=N and --url options.
    • Add support for --show=comments.
    • Correct ordering so that --show-version runs after --filter-ftbfs.
    • Rename --show-ftbfs to --filter-ftbfs and --show-version to --show=version.
  • Created a proof-of-concept reproducible-utils package to contain commonly-used snippets aimed at developers wishing to make their packages reproducible.


I also submitted 92 patches to fix specific reproducibility issues in advi, amora-server, apt-cacher-ng, ara, argyll, audiotools, bam, bedtools, binutils-m68hc1x, botan1.10, broccoli, congress, cookiecutter, dacs, dapl, dateutils, ddd, dicom3tools, dispcalgui, dnssec-trigger, echoping, eekboek, emacspeak, eyed3, fdroidserver, flashrom, fntsample, forkstat, gkrellm, gkrellm, gnunet-gtk, handbrake, hardinfo, ircd-irc2, ircd-ircu, jack-audio-connection-kit, jpy, kxmlgui, libbson, libdc0, libdevel-cover-perl, libfm, libpam-ldap, libquvi, librep, lilyterm, mozvoikko, mp4h, mp4v2, myghty, n2n, nagios-nrpe, nikwi, nmh, nsnake, openhackware, pd-pdstring, phpab, phpdox, phpldapadmin, pixelmed-codec, pleiades, pybit, pygtksourceview, pyicu, python-attrs, python-gflags, quvi, radare2, rc, rest2web, roaraudio, rt-extension-customfieldsonupdate, ruby-compass, ruby-pg, sheepdog, tf5, ttf-tiresias, ttf-tiresias, tuxpaint, tuxpaint-config, twitter-bootstrap3, udpcast, uhub, valknut, varnish, vips, vit, wims, winswitch, wmweather+ & xshisen.


Debian GNU/Linux
Patches contributed

I also submitted 22 patches to fix typos in debian/rules files against ctsim, f2c, fonts-elusive-icons, ifrit, ldapscripts, libss7, libvmime, link-grammar, menulibre, mit-scheme, mugshot, nlopt, nunit, proftpd-mod-autohost, proftpd-mod-clamav, rabbyt, radvd, ruby-image-science, snmpsim, speech-tools, varscan & whatmaps.

Debian LTS

This month I have been paid to work 15 hours on Debian Long Term Support (LTS). In that time I did the following:

  • "Frontdesk" duties, triaging CVEs, etc.
  • Authored the patch & issued DLA 596-1 for extplorer, a web-based file manager, fixing an archive traversal exploit.
  • Issued DLA 598-1 for suckless-tools, fixing a segmentation fault in the slock screen locking tool.
  • Issued DLA 599-1 for cracklib2, a pro-active password checker library, fixing a stack-based buffer overflow when parsing large GECOS fields.
  • Improved the find-work internal tool adding optional colour highlighting and migrating it to Python 3.
  • Wrote an lts-missing-uploads tool to find mistakes where there was no corresponding package in the archive after an announcement.
  • Added optional colour highlighting to the lts-cve-triage tool.
Uploads
  • redis 2:3.2.3-1 — New upstream release, move to the DEP-5 debian/copyright format, ensure that we are running as root in LSB initscripts and add a README.Source regarding our local copies of redis.conf and sentinel.conf.
  • python-django:
    • 1:1.10-1 — New upstream release.
    • 1:1.10-2 — Fix test failures due to mishandled upstream translation updates.

  • gunicorn:
    • 19.6.0-2 — Reload logrotate in the postrotate action to avoid processes writing to the old files and move to DEP-5 debian/copyright format.
    • 19.6.0-3 — Drop our /usr/sbin/gunicorn{,3}-debian and related Debian-specific machinery to be more like upstream.
    • 19.6.0-4 — Drop "template" systemd .service files and point towards examples and documentation instead.

  • adminer:
    • 4.2.5-1 — Take over package maintenance, completely overhauling the packaging with a new upstream version, moving to virtual-mysql-server to support MariaDB, updating the package names of dependencies and fixing the outdated Apache configuration.
    • 4.2.5-2 — Correct the php5 package names.

Bugs filed (without patches)
RC bugs

I filed 3 RC bugs with patches:



I additionally filed 8 RC bugs for packages that access the internet during build against autopkgtest, golang-github-xenolf-lego, pam-python, pexpect, python-certbot, python-glanceclient, python-pykka & python-tornado.



I also filed 74 FTBFS bugs against airlift-airline, airlift-slice, alter-sequence-alignment, apktool, atril, auto-apt-proxy, bookkeeper, bristol, btfs, caja-extensions, ccbuild, cinder, clustalo, colorhug-client, cpp-netlib, dimbl, edk2, elasticsearch, ganv, git-remote-hg, golang-codegangsta-cli, golang-goyaml, gr-radar, imagevis3d, jacktrip, jalv, kdepim, kiriki, konversation, libabw, libcereal, libdancer-plugin-database-perl, libdist-zilla-plugins-cjm-perl, libfreemarker-java, libgraph-writer-dsm-perl, libmail-gnupg-perl, libminc, libsmi, linthesia, lv2-c++-tools, lvtk, mate-power-manager, mcmcpack, mopidy-podcast, nageru, nfstrace, nova, nurpawiki, open-gram, php-crypt-gpg, picmi, projectl, pygpgme, python-apt, python-django-bootstrap-form, python-django-navtag, python-oslo.config, qmmp, qsapecng, r-cran-sem, rocs, ruby-mini-magick, seahorse-nautilus, shiro, snap, tcpcopy, tiledarray, triggerhappy, ucto, urdfdom, vmmlib, yara-python, yi & z3.


FTP Team

As a Debian FTP assistant I ACCEPTed 90 packages: android-platform-external-jsilver, android-platform-frameworks-data-binding, camlpdf, consolation, dfwinreg, diffoscope, django-restricted-resource, django-testproject, django-testscenarios, gitlab-ci-multi-runner, gnome-shell-extension-taskbar, golang-github-flynn-archive-go-shlex, golang-github-jamesclonk-vultr, golang-github-weppos-dnsimple-go, golang-golang-x-time, google-android-ndk-installer, haskell-expiring-cache-map, haskell-hclip, haskell-hdbc-session, haskell-microlens-ghc, haskell-names-th, haskell-persistable-record, haskell-should-not-typecheck, haskell-soap, haskell-soap-tls, haskell-th-reify-compat, haskell-with-location, haskell-wreq, kbtin, libclipboard-perl, libgtk3-simplelist-perl, libjs-jquery-selectize.js, liblemon, libplack-middleware-header-perl, libreoffice, libreswan, libtest-deep-json-perl, libtest-timer-perl, linux, linux-signed, live-tasks, llvm-toolchain-3.8, llvm-toolchain-snapshot, lua-luv, lua-torch-image, lua-torch-nn, magic-wormhole, mini-buildd, ncbi-vdb, node-ast-util, node-es6-module-transpiler, node-es6-promise, node-inline-source-map, node-number-is-nan, node-object-assign, nvidia-graphics-drivers, openhft-chronicle-bytes, openhft-chronicle-core, openhft-chronicle-network, openhft-chronicle-threads, openhft-chronicle-wire, pycodestyle, python-aptly, python-atomicwrites, python-click-log, python-django-casclient, python-git-os-job, python-hypothesis, python-nosehtmloutput, python-overpy, python-parsel, python-prov, python-py, python-schema, python-tackerclient, python-tornado, pyvo, r-cran-cairo, r-cran-mi, r-cran-rcppgsl, r-cran-sem, ruby-curses, ruby-fog-rackspace, ruby-mixlib-archive, ruby-tzinfo-data, salt-formula-swift, scapy3k, self-destructing-cookies, trollius-redis & websploit.

Michal Čihař: Weblate 2.8

31 August, 2016 - 16:30

Quite on schedule (just one day later), Weblate 2.8 is out today. This release brings Subversion support and an improved zen mode.

Full list of changes:

  • Documentation improvements.
  • Translations.
  • Updated bundled javascript libraries.
  • Added list_translators management command.
  • Django 1.8 is no longer supported.
  • Fixed compatibility with Django 1.10.
  • Added Subversion support.
  • Separated XML validity check from XML mismatched tags.
  • Fixed API to honor HIDE_REPO_CREDENTIALS settings.
  • Show source change in zen mode.
  • Alt+PageUp/PageDown/Home/End now works in zen mode as well.
  • Add tooltip showing exact time of changes.
  • Add option to select filters and search from translation page.
  • Added UI for translation removal.
  • Improved behavior when inserting placeables.
  • Fixed auto locking issues in zen mode.

If you are upgrading from an older version, please follow our upgrading instructions.

You can find more information about Weblate on https://weblate.org, the code is hosted on Github. If you are curious how it looks, you can try it out on the demo server. You can log in there with the demo account using the demo password or register your own user. Weblate is also being used on https://hosted.weblate.org/ as an official translating service for phpMyAdmin, OsmAnd, Aptoide, FreedomBox, Weblate itself and many other projects.

Should you be looking for hosting of translations for your project, I'm happy to host them for you or help with setting it up on your infrastructure.

Further development of Weblate would not be possible without people providing donations; thanks to everybody who has helped so far! The roadmap for the next release is just being prepared, and you can influence this by expressing support for individual issues, either by comments or by providing a bounty for them.

Filed under: Debian English Gammu phpMyAdmin SUSE Weblate

Joey Hess: late summer

31 August, 2016 - 08:15

With days beginning to shorten toward fall, my house is in initial power saving mode. Particularly, the internet gateway is powered off overnight. Still running electric lights until bedtime, and still using the inverter and other power without much conservation during the day.

Indeed, I had two laptops running cpu-melting keysafe benchmarks for much of today and one of them had to charge up from empty too. That's why the house power is a little low, at 11.0 volts now, despite over 30 amp-hours of power having been produced on this mostly clear day. (1 week average is 18.7 amp-hours)

September/October is the tricky time where it's easy to fall off a battery depletion cliff and be stuck digging out for a long time. So time to start dusting off the conservation habits after summer's excess.

I think this is the first time I've mentioned any details of living off grid with a bare minimum of PV capacity in over 4 years. Solar has a lot of older posts about it, and I'm going to note down the typical milestones and events over the next 8 months.

Mike Gabriel: credential-sheets: User Account Credential Sheets Tool

31 August, 2016 - 03:05
Preface

This little piece of work has been pending on my todo list for about two years now. For our local school project "IT-Zukunft Schule" I wrote the little tool credential-sheets. It is a little Perl script that turns a series of import files (CSV format), as they have to be provided for user mass import into GOsa² (i.e. LDAP), into a series of A4 sheets with little cards on them, containing initial user credential information. The upstream sources are on Github and I have just uploaded this little tool to Debian.

Introduction

After mass import of user accounts (e.g. into LDAP), most site administrators have to create information sheets (or snippets) containing those new credentials (like username, password, policy of usage, etc.). With this tiny tool, providing these pieces of information to multiple users becomes really simple. Account data is taken from a CSV file and the sheets are output as PDF using easily configurable LaTeX template files.

Usage

Synopsis: credential-sheets [options] <CSV-file-1> [<CSV-file-2> [...]]

Options

The credential-sheets command accepts the following command-line options:
   --help Display a help with all available command line options and exit.

   --template=<tpl-name>
          Name of the template to use.

   --cols=<x>
          Render <x> columns per sheet.

   --rows=<y>
          Render <y> rows per sheet.

   --zip  Create a ZIP file at the end.

   --zipfilename=<zip-file-name>
          Alternative ZIP file name (default: name of parent folder).

   --debug
          Don't remove temporary files.
CSV File Column Arrangement

The credential-sheets tool can handle any sort of column arrangement in the given CSV file(s). It expects the CSV file(s) to have column names in their first line. The given column names have to map to the VAR-<column-name> placeholders in credential-sheets's LaTeX templates. The shipped-with templates (students, teachers) can handle these column names:
  • login -- The user account's login id (uid)
  • lastName -- The user's last name(s)
  • firstName -- The user's first name(s)
  • password -- The user's password
  • form -- The form name/ID (student template only)
  • subjects -- A list of subjects taught by a teacher (teacher template only)
If you create your own templates, you can be very flexible in using your own column names and template names. Only make sure that the column names provided in the CSV file(s)'s first line match the variables used in the customized LaTeX template.

Customizations

The shipped-with credential-sheets templates are expected to be installed in /usr/share/credential-sheets/ for system-wide installations. When customizing templates, simply place a modified copy of any of those files into ~/.credential-sheets/ or /etc/credential-sheets/. For further details, see below. The credential-sheets tool uses these configuration files:
  • header.tex (LaTeX file header)
  • <tpl-name>-template.tex (where <tpl-name> is students or teachers on default installations; this is extensible by defining your own template files, see below).
  • footer.tex (LaTeX file footer)
Search paths for configuration files (in listed order):
  • $HOME/.credential-sheets/
  • ./
  • /etc/credential-sheets/
  • /usr/local/share/credential-sheets/
  • /usr/share/credential-sheets/
You can easily customize the resulting PDF files generated with this tool by placing your own template files, header and footer where appropriate.

Dependencies

This project requires the following dependencies:
  • Text::CSV Perl module
  • Archive::Zip Perl module
  • texlive-latex-base
  • texlive-fonts-extra
Copyright and License

Copyright © 2012-2016, Mike Gabriel <mike.gabriel@das-netzwerkteam.de>. Licensed under GPL-2+ (see COPYING file).

Daniel Stender: My work for Debian in August

31 August, 2016 - 00:42

Here again is a little list of my humble off-time contributions that I'm happy to add to the large amount of work we're completing all together each month. Then there is one more "new in Debian" (meaning: "new in unstable") announcement. First, the uploads (a few of them are from July):

  • afl/2.21b-1
  • djvusmooth/0.2.17-1
  • python-bcrypt/3.1.0-1
  • python-latexcodec/1.0.3-4 (closed #830604)
  • pylint-celery/0.3-2 (closed #832826)
  • afl/2.28b-1 (closed #828178)
  • python-afl/0.5.4-1
  • vulture/0.10-1
  • afl/2.30b-1
  • prospector/0.12.2-1
  • pyinfra/0.1.1-1
  • python-afl/0.5.4-2 (fix of elinks_dump_varies_output_with_locale)
  • httpbin/0.5.0-1
  • python-afl/0.5.5-1 (closed #833675)
  • pyinfra/0.1.2-1
  • afl/2.33b-1 (experimental, build/run on llvm 3.8)
  • pylint-flask/0.3-2 (closed #835601)
  • python-djvulibre/0.8-1
  • pylint-flask/0.5-1
  • pytest-localserver/0.3.6-1
  • afl/2.33b-2
  • afl/2.33b-3

New packages:

  • keras/1.0.7-1 (initial packaging into experimental)
  • lasagne/0.1+git20160728.8b66737-1

Sponsored uploads:

  • squirrel3/3.1-4 (closed #831210)

Requested or suggested for packaging:

  • yapf: Python code formatter
  • spacy: industrial-strength natural language processing for Python
  • ralph: asset management and DCIM tool for data centers
  • pytest-cookies: Pytest plugin for testing Cookiecutter templates
  • blocks: another deep learning framework build on the top of Theano
  • fuel: data provider for Blocks and Python DNN frameworks in general
New in Debian: Lasagne (deep learning framework)

Now that the mathematical expression compiler Theano is available in Debian, deep learning frameworks and toolkits which have been built on top of it can become available within Debian, too (like Blocks, mentioned before). Theano is a general computing engine in its own right which has been developed with a focus on machine learning and neural networks, and which features its own declarative tensor language. The toolkits built upon it vary in how much they abstract the bare features of Theano, whether they are "thick" or "thin", so to speak. When the abstraction gets higher, you gain more end-user convenience, up to the level that you have the architectural components of neural networks available for combination like in a Lego box, while the more complicated things which are going on "under the hood" (like how the networks are actually implemented) are hidden. The downside is that thick abstraction layers usually make it difficult to implement novel features like custom layers or loss functions. So more experienced users and specialists might seek out the lower-abstraction toolkits, where you have to think more in terms of Theano.

I've got an initial package of Keras in experimental (1.0.7-1); it runs (only a Python 3 package is available so far) but needs some more work (e.g. building the documentation with mkdocs). Keras is a minimalistic, highly modular DNN library inspired by Torch1. It has a clean, rather easy API for experimenting and fast prototyping. It can also run on top of Google's TensorFlow, and we're going to have it ready for that, too.

Lasagne follows a different approach. Like Keras and Blocks, it's a Python library to create and train multi-layered artificial neural networks in/on Theano for applications like image recognition and classification, speech recognition, image caption generation, or other purposes like style transfer from paintings to pictures2. It abstracts Theano as little as possible, and could be seen more as an extension or an add-on than an abstraction3. Therefore, knowledge of how things work in Theano is needed to make full use of this piece of software.
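To give an impression of how thin that layer is, here is a minimal sketch of a small network using the standard Lasagne 0.1 API (nothing beyond what the upstream documentation shows): the Theano symbolic variables, the loss expression and the update rule all stay visible instead of being hidden behind a model object.

import theano
import theano.tensor as T
import lasagne

# Symbolic inputs: a batch of flattened 28x28 images and integer labels.
x = T.matrix('inputs')
y = T.ivector('targets')

# A small multi-layer perceptron: input -> dense (ReLU) -> softmax output.
l_in = lasagne.layers.InputLayer(shape=(None, 784), input_var=x)
l_hidden = lasagne.layers.DenseLayer(l_in, num_units=100,
                                     nonlinearity=lasagne.nonlinearities.rectify)
l_out = lasagne.layers.DenseLayer(l_hidden, num_units=10,
                                  nonlinearity=lasagne.nonlinearities.softmax)

# Loss, parameters and updates are plain Theano expressions.
prediction = lasagne.layers.get_output(l_out)
loss = lasagne.objectives.categorical_crossentropy(prediction, y).mean()
params = lasagne.layers.get_all_params(l_out, trainable=True)
updates = lasagne.updates.nesterov_momentum(loss, params,
                                            learning_rate=0.01, momentum=0.9)

# Compile a training function exactly as one would in bare Theano.
train_fn = theano.function([x, y], loss, updates=updates)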

With the new Debian package (0.1+git20160728.8b66737-1)4, the whole required software stack (the corresponding Theano package, NumPy, SciPy, a BLAS implementation, and the nvidia-cuda-toolkit and NVIDIA kernel driver to carry out computations on the GPU5) can be installed most conveniently by a single apt-get install python{,3}-lasagne command6. If wanted, the documentation package lasagne-doc is available for offline use (no running around remote airports looking for a WiFi spot), either in the Python 2 or the Python 3 branch, or both flavours altogether7. While others have to spend a whole weekend gathering, compiling and installing the needed libraries, you can grab yourself a fresh cup of coffee. These are the advantages of a fully integrated system (subliminal message, as always: desktop users, switch to Linux!).

When the installation of packages has completed, the MNIST example of Lasagne can be used for a quick check that the whole library stack works properly8:

$ THEANO_FLAGS=device=gpu,floatX=float32 python /usr/share/doc/python-lasagne/examples/mnist.py mlp 5
Using gpu device 0: GeForce 940M (CNMeM is disabled, cuDNN 5005)
Loading data...
Downloading train-images-idx3-ubyte.gz
Downloading train-labels-idx1-ubyte.gz
Downloading t10k-images-idx3-ubyte.gz
Downloading t10k-labels-idx1-ubyte.gz
Building model and compiling functions...
Starting training...
Epoch 1 of 5 took 2.488s
  training loss:        1.217167
  validation loss:      0.407390
  validation accuracy:      88.79 %
Epoch 2 of 5 took 2.460s
  training loss:        0.568058
  validation loss:      0.306875
  validation accuracy:      91.31 %

The example of how to train a neural network on the MNIST database of handwritten digits is refined (it also provides --help) and explained in detail in the Tutorial section of the documentation in /usr/share/doc/lasagne-doc/html/. Very good starting points are also the IPython notebooks that are available from the tutorials by Eben Olson9 and Geoffrey French at PyData London 201610. There you have Theano basics, examples for employing convolutional neural networks (CNN) and recurrent neural networks (RNN) for a range of different purposes, how to use pre-trained networks for image recognition, etc.

  1. For a quick comparison of Keras and Lasagne with other toolkits, see Alex Rubinsteyn's PyData NYC 2015 presentation on using LSTM (long short term memory) networks on varying length sequence data like Grimm's fairy tales (https://www.youtube.com/watch?v=E92jDCmJNek 27:30 sq.) 

  2. https://github.com/Lasagne/Recipes/tree/master/examples/styletransfer 

  3. Great introduction to Theano and Lasagne by Eben Olson on the PyData NYC 2015: https://www.youtube.com/watch?v=dtGhSE1PFh0 

  4. The package is currently "freelancing" in collab-maint; setting up a deep learning packaging team within Debian is at the stage of discussion. 

  5. Only available for amd64 and ppc64el. 

  6. You would need "testing" as a package source in /etc/apt/sources.list to install it from the archive at the present time (I have had that for years, but whether Debian testing can be recommended as a production system is going to be discussed elsewhere), but it's coming up for Debian 9. The cuda-toolkit and pycuda are in the non-free section of the archive, thus non-free (mostly used in combination with contrib) must be added alongside main. Plus, the Theano packages merely suggest it in order to keep Theano in main, so --install-suggests is needed to pull it in automatically with the same command, or it must be given explicitly. 

  7. For dealing with Theano in Debian, see this previous blog posting 

  8. As suggested in the guide From Zero to Lasagne on Ubuntu 14.04. cuDNN isn't available as an official Debian package yet, but can be downloaded as a .deb package after registration at https://developer.nvidia.com/cudnn. It integrates well out of the box. 

  9. https://github.com/ebenolson/pydata2015 

  10. https://github.com/Britefury/deep-learning-tutorial-pydata2016, video: https://www.youtube.com/watch?v=DlNR1MrK4qE 

Christoph Egger: DANE and DNSSEC Monitoring

31 August, 2016 - 00:11

At this year's FrOSCon I repeated my presentation on DNSSEC. In the audience, there was the suggestion that proper monitoring plugins for a DANE and DNSSEC infrastructure were not easily available. As I already had some personal tools around and some spare time to burn, I've just started a repository with some useful tools. It's available on my website and has mirrors on Gitlab and Github. I intend to keep this repository up-to-date with my personal requirements (which also means adding an xmpp check soon) and am happy to take any contributions (either by mail or as "pull requests" on one of the two mirrors). It currently has smtp (both ssmtp and starttls) and https support, as well as support for checking for a valid DNSSEC configuration of a zone.
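To give an idea what such a check involves, here is a rough Python sketch of an https/DANE comparison. It is not one of the plugins from the repository, just an illustration assuming the dnspython module is available and the TLSA record uses selector 0 (full certificate) with matching type 1 (SHA-256); a real plugin would additionally have to ensure the TLSA answer was DNSSEC-validated:

import hashlib
import ssl
import dns.resolver

def check_dane_https(host, port=443):
    # Compare the served certificate against the published TLSA record.
    # Only handles selector 0 (full cert) / matching type 1 (SHA-256).
    pem = ssl.get_server_certificate((host, port))
    der = ssl.PEM_cert_to_DER_cert(pem)
    digest = hashlib.sha256(der).digest()
    answers = dns.resolver.query('_%d._tcp.%s.' % (port, host), 'TLSA')
    for record in answers:
        if record.selector == 0 and record.mtype == 1 and record.cert == digest:
            return True
    return False

print(check_dane_https('example.org'))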

While working on it, it turned out some things can be complicated. My language of choice was python3 (if only because the ssl library has improved a lot since 2.7); however, ldns and unbound in Debian lack python3 support in their bindings. This seems fixable, as the source in Debian is buildable and usable with python3, so it just needs packaging adjustments. Funnily enough, the ldns module in Debian, which is only needed for check_dnssec, is currently buggy for both python2 and python3, and ldns' python3 support is somewhat lacking, so I spent several hours hunting SWIG problems.

Rhonda D'Vine: Thomas D

30 August, 2016 - 23:12

It's not often that an artist touches you deeply, but Thomas D managed to do so, to the point that I am (only half) jokingly saying that if there were a church of Thomas D I would absolutely join it. His lyrics always stood out for me in the context of the band through which I found out about him, and the way he lives his life is definitely outstanding. And additionally there are these special songs that give so much and share a lot. I feel sorry for the people who don't understand German well enough to be able to appreciate him.

Here are three songs that I suggest you listen to closely:

  • Fluss: This song gave me a lot of strength in a difficult time of my life. And it still works wonders when I feel down, to get my ass up from the floor again.
  • Gebet an den Planeten: This song gives me shivers. Let the lyrics touch you. And take the time to think about it.
  • An alle Hinterbliebenen: This song might be a bit difficult to deal with. It's about loss and how to deal with suffering.

Like always, enjoy!


Joachim Breitner: Explicit vertical alignment in Haskell

30 August, 2016 - 20:35

Chris Done’s automatic Haskell formatter hindent has been released in a new version, and is getting quite a bit of deserved attention. He is polling Haskell programmers on whether two or four spaces are the right indentation. But that is just cosmetics…

I am in principle very much in favor of automatic formatting, and I hope that a tool like hindent will eventually be better at formatting code than a human.

But it currently is not there yet. Code is literature meant to be read, and good code goes to great lengths to be easily readable, and formatting can carry semantic information.

The Haskell syntax was (at least I get that impression) designed to allow authors to write nice-looking, easy-to-understand code. One important tool here is vertical alignment of corresponding concepts on different lines. Compare

maze :: Integer -> Integer -> Integer
maze x y
| abs x > 4  || abs y > 4  = 0
| abs x == 4 || abs y == 4 = 1
| x ==  2    && y <= 0     = 1
| x ==  3    && y <= 0     = 3
| x >= -2    && y == 0     = 4
| otherwise                = 2

with

maze :: Integer -> Integer -> Integer
maze x y
| abs x > 4 || abs y > 4 = 0
| abs x == 4 || abs y == 4 = 1
| x == 2 && y <= 0 = 1
| x == 3 && y <= 0 = 3
| x >= -2 && y == 0 = 4
| otherwise = 2

The former is a quick to grasp specification, the latter (the output of hindent at the moment) is a desert of numbers and operators.

I see two ways forward:

  • Tools like hindent get improved to the point that they are able to detect such patterns and indent them properly (which would be great, but very tricky, and probably never complete) or
  • We give the user a way to indicate intentional alignment in a non-obtrusive way that gets detected and preserved by the tool.

What could such ways be?

  • For guards, it could simply detect that within one function definition, there are multiple | in the same column, and keep them aligned.
  • More generally, one could take the approach lhs2TeX takes (which, IMHO, with careful input, a proportional font and the great polytable LaTeX backend, produces the most pleasing code listings). There, two spaces or more indicate an alignment point, and if two such alignment points are in the same column, their alignment is preserved – even if there are lines in between! (A rough sketch of such a detection pass follows after the example below.)

    With the latter approach, the code up there would be written

    maze :: Integer -> Integer -> Integer
    maze x y
    | abs x > 4   ||  abs y > 4   = 0
    | abs x == 4  ||  abs y == 4  = 1
    | x ==  2     &&  y <= 0      = 1
    | x ==  3     &&  y <= 0      = 3
    | x >= -2     &&  y == 0      = 4
    | otherwise                   = 2

    And now the intended alignment is explicit.
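A rough sketch (in Python, not tied to any particular formatter) of how such alignment points could be detected: every non-space column preceded by at least two spaces counts as an alignment point, and points sharing a column across several lines are the alignments worth preserving.

def alignment_points(line):
    # Columns where a token starts after a run of two or more spaces.
    return [col for col in range(2, len(line))
            if line[col] != ' ' and line[col - 2:col] == '  ']

def aligned_columns(lines):
    # Group lines by alignment points that occur in the same column.
    columns = {}
    for number, line in enumerate(lines):
        for col in alignment_points(line):
            columns.setdefault(col, []).append(number)
    # Columns shared by at least two lines are intentional alignment.
    return {col: nums for col, nums in columns.items() if len(nums) > 1}

guards = [
    "| abs x > 4   ||  abs y > 4   = 0",
    "| abs x == 4  ||  abs y == 4  = 1",
    "| otherwise                   = 2",
]
print(aligned_columns(guards))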

(This post is cross-posted on reddit.)

Petter Reinholdtsen: First draft Norwegian Bokmål edition of The Debian Administrator's Handbook now public

30 August, 2016 - 15:10

In April we started to work on a Norwegian Bokmål edition of the "open access" book on how to set up and administrate a Debian system. Today I am happy to report that the first draft is now publicly available. You can find it on the "Get the Debian Administrator's Handbook" page (under Other languages). The first eight chapters have a first draft translation, and we are working on proofreading the content. If you want to help out, please start contributing using the hosted weblate project page, and get in touch using the translators mailing list. Please also check out the instructions for contributors. A good way to contribute is to proofread the text and update weblate if you find errors.

Our goal is still to make the Norwegian book available on paper as well as in electronic form.

Dirk Eddelbuettel: RProtoBuf 0.4.5: now with protobuf v2 and v3!

30 August, 2016 - 09:55

A few short weeks after the 0.4.4 release of RProtoBuf, we are happy to announce a new version 0.4.5 which appeared on CRAN earlier today.

RProtoBuf provides R bindings for the Google Protocol Buffers ("Protobuf") data encoding library used and released by Google, and deployed as a language and operating-system agnostic protocol by numerous projects.

This release brings support for the recently released 'version 3' Protocol Buffers standard, used e.g. by the (very exciting) gRPC project (which was just released as version 1.0). RProtoBuf continues to support 'version 2' but now also cleanly supports 'version 3'.

Changes in RProtoBuf version 0.4.5 (2016-08-29)
  • Support for version 3 of the Protocol Buffers API

  • Added 'syntax = "proto2";' to all proto files (PR #17)

  • Updated Travis CI script to test against both versions 2 and 3 using custom-built .deb packages of version 3 (PR #16)

  • Improved build system with support for custom CXXFLAGS (Craig Radcliffe in PR #15)

CRANberries also provides a diff to the previous release. The RProtoBuf page has an older package vignette, a 'quick' overview vignette, a unit test summary vignette, and the pre-print for the JSS paper. Questions, comments etc should go to the GitHub issue tracker off the GitHub repo.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

David Moreno: Webhook Setup with Facebook::Messenger::Bot

30 August, 2016 - 01:49

The documentation for the Facebook Messenger API points out how to set up your initial bot webhook. I just committed a quick patch that makes it very easy to set up a quick script to get it done using the unreleased and still-in-progress Perl module Facebook::Messenger::Bot:

use Facebook::Messenger::Bot;

use constant VERIFY_TOKEN => 'imsosecret';

my $bot = Facebook::Messenger::Bot->new(); # no config specified!
$bot->expect_verify_token( VERIFY_TOKEN );
$bot->spin();

This should get you sorted. What endpoint would that be, though? Well, that depends on how you’re giving Facebook access to your Plack .psgi application.

Michal Čihař: motranslator 1.1

29 August, 2016 - 23:00

Four months after the 1.0 release, motranslator 1.1 is out. If you happen to use it for untrusted data, this might as well be called a security release, though that is still not a good idea until we remove the usage of eval() to evaluate the plural formula.

Full list of changes:

  • Improved handling of corrupted mo files
  • Minor performance improvements
  • Stricter validation of plural expression

motranslator is a translation library used in current phpMyAdmin master (upcoming 4.7.0), with a focus on speed and memory usage. It uses Gettext MO files to load the translations. It also comes with a testsuite (100% coverage) and basic documentation.

The recommended way to install it is using Composer from the Packagist repository:

composer require phpmyadmin/motranslator

The Debian package will probably become available around the time phpMyAdmin 4.7.0 is out, but if you need it earlier, just let me know.

Filed under: Debian English phpMyAdmin

Zlatan Todorić: Support open source motion comic

29 August, 2016 - 20:25

There is an ongoing campaign for a motion comic. It will be done entirely with FLOSS tools (Blender, Krita, GNU/Linux) and, besides that, it really looks great (and no, it is not only for the kids!). Please support this effort if you can, because it also shows the power of Free Software tools. All will be released under a Creative Commons Attribution-ShareAlike license together with all sources.


Creative Commons License: the copyright of each article belongs to its respective author.
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.