Planet Debian

Planet Debian - http://planet.debian.org/

Mario Lang: Qt 5.11 will fix major issues with JAWS on Windows

28 November, 2017 - 17:22

I published a rant about problems with Qt accessibility on Windows a few months ago. That posting got an unusual amount of attention, as it was shared on Hacker News and briefly climbed near the top of the front page.

Apparently, ranting about it after a year of being ignored was not the worst thing to do. I can now confirm that the current dev version of Qt works properly with JAWS for Windows and QTextEdit widgets. This is quite a substantial fix, as it will likely improve the accessibility of many Windows applications written in Qt.

So this bug is finally (after more than a year of waiting) fixed. Thanks to André de la Rocha for implementing UI Automation support, which is apparently what was missing to make JAWS happy.

Thomas Lange: Building your own FAI installation image

28 November, 2017 - 14:59

I just returned from the MiniDebConf in Cambridge, where I gave two talks: one about building disk images (also cross-architecture) using fai-diskimage with FAI, and the other about making d-i easier for beginners. The ideas from that talk were also the inspiration for creating the FAI.me web page, which makes FAI easier to use: you can use it without even installing it. Is this FAI as a service (FaaS)?

On the web page you can easily configure a customized installation image, which will then be created for you. Booting this image, you get a fully unattended installation based on FAI technology, with all software packages already included on the installation image. The announcement has more details. I'm very excited to get your feedback on this project.

FAI FAI.me

Dirk Eddelbuettel: #11: (Much) Faster Package (Re-)Installation via Caching

28 November, 2017 - 09:19

Welcome to the eleventh post in the rarely rued R rants series, or R4 for short. Time clearly flies as it has been three months since our last post on significantly reducing library size via stripping. I had been meaning to post on today's topic for quite some time, but somehow something (working on a paper, releasing a package, ...) always got in the way.

Just a few days ago Colin (of Efficient R Programming fame) posted about speeding up package installation. His recommendation? Remember that we (usually) have multiple cores, and use several of them via options(Ncpus = XX). It is an excellent point, and it bears repeating.
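As a quick illustration, this is all it takes (a minimal sketch; the value 4 is arbitrary, match it to your machine):

## in ~/.Rprofile, or interactively at the start of a session
options(Ncpus = 4)
## several packages (and their dependencies) can now be installed in parallel
install.packages(c("dplyr", "igraph"))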

But it turns out I have not one but two salient recommendations too. Today covers the first; we should hopefully get to the second pretty soon. Both have one thing in common: you will be fastest if you avoid doing the work in the first place.

What?

One truly outstanding tool for this in the context of the installation of compiled packages is ccache. It is actually a pretty old tool that has been around for well over a decade, and it comes from the folks who brought us Samba.

What does it do? Well, in a nutshell, it "hashes" a checksum of a source file once the preprocessor has operated on it, and stores the resulting object file. In the case of a rebuild with unchanged code you get the object file back pretty much immediately. The idea is very similar to memoisation (as implemented in R, for example, in the excellent little memoise package by Hadley, Jim, Kirill and Daniel): if you have to do something even moderately expensive a few times, do it once and then recall the result the other times.
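The same idea in a few lines of R, using memoise (the toy function is made up for illustration):

library(memoise)
slow_square <- function(x) { Sys.sleep(1); x * x }   # stand-in for expensive work
fast_square <- memoise(slow_square)
fast_square(4)    # first call pays the one-second cost
fast_square(4)    # second call is served from the cache, near-instantly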

This happens (at least to me) more often than not in package development. Maybe you change just one of several source files. Maybe you change only the R code, the Rd documentation or a test file, yet still need a full reinstallation. In all these cases, ccache can help tremendously, as illustrated below.

How?

Because essentially all our access to compilation happens through R, we need to set this in a file read by R. I use ~/.R/Makevars for this and have something like these lines on my machines:

VER=
CCACHE=ccache
CC=$(CCACHE) gcc$(VER)
CXX=$(CCACHE) g++$(VER)
CXX11=$(CCACHE) g++$(VER)
CXX14=$(CCACHE) g++$(VER)
FC=$(CCACHE) gfortran$(VER)
F77=$(CCACHE) gfortran$(VER)

That way, when R calls the compiler(s) it will prefix the call with ccache, and ccache will then speed up recompilation of unchanged sources.
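You can verify that the cache is actually being hit by asking ccache for its statistics:

ccache --show-stats    # or the short form: ccache -s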

There is an additional issue due to how R is used. Often we install from a .tar.gz. These will be freshly unpacked, and hence have "new" timestamps. This would usually lead ccache to skip the file (for fear of "false positives"), so we have to override this. Similarly, the tarball is usually unpacked in a temporary directory with an ephemeral name, creating a unique path. That too needs to be overridden. So in my ~/.ccache/ccache.conf I have this:

max_size = 5.0G
# important for R CMD INSTALL *.tar.gz as tarballs are expanded freshly -> fresh ctime
sloppiness = include_file_ctime
# also important as the (temp.) directory name will differ
hash_dir = false

Show Me

A quick illustration will round out the post. Some packages are meatier than others: more C++ with more templates usually means longer build times. Below is a quick chart comparing times for a few such packages (i.e. RQuantLib, dplyr, rstan) as well as igraph ("merely" a large C package), lme4, and Rcpp. The worst among these is still my own RQuantLib package, wrapping (still just parts of) the enormous and Boost-heavy QuantLib library.

Pretty dramatic gains. Best of all, we can of course combine these with other methods such as Colin's use of multiple CPUs, or even a simple MAKE="make -j4" to have multiple compilation units built in parallel. So maybe we all get to spend less time on social media and other timewasters as we spend less time waiting for our builds. Or maybe that is too much to hope for...
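As an aside, that MAKE setting can live in ~/.Renviron so it applies to every source installation (a sketch; four jobs is an arbitrary choice):

## ~/.Renviron
MAKE=make -j4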

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Sean Whitton: Debian Policy call for participation -- November 2017, pt. 2

27 November, 2017 - 23:40

Here are some more of the bugs against the Debian Policy Manual. In particular, there really are quite a few patches needing seconds from DDs. Please consider getting involved.

Consensus has been reached and help is needed to write a patch

#568313 Suggestion: forbid the use of dpkg-statoverride when uid and gid ar…

#578597 Recommend usage of dpkg-buildflags to initialize CFLAGS and al.

#582109 document triggers where appropriate

#587991 perl-policy: /etc/perl missing from Module Path

#592610 Clarify when Conflicts + Replaces et al are appropriate

#613046 please update example in 4.9.1 (debian/rules and DEB_BUILD_OPTIONS)

#614807 Please document autobuilder-imposed build-dependency alternative re…

#628515 recommending verbose build logs

#664257 document Architecture name definitions

#682347 mark ‘editor’ virtual package name as obsolete

Wording proposed, awaiting review from anyone and/or seconds by DDs

#582109 document triggers where appropriate

#610083 Remove requirement to document upstream source location in debian/c…

#636383 10.2 and others: private libraries may also be multi-arch-ified

#645696 [copyright-format] clearer definitions and more consistent License:…

#649530 [copyright-format] clearer definitions and more consistent License:…

#662998 stripping static libraries

#682347 mark ‘editor’ virtual package name as obsolete

#683495 perl scripts: ”#!/usr/bin/perl” MUST or SHOULD?

#688251 Built-Using description too aggressive

#737796 copyright-format: support Files: paragraph with both abbreviated na…

Merged for the next release

#683495 perl scripts: ”#!/usr/bin/perl” MUST or SHOULD?

#877674 [debian-policy] update links to the pdf and other formats of the do…

#878523 [PATCH] Spelling fixes

Lior Kaplan: AGPL enforced: The Israeli ICT authority releases code

27 November, 2017 - 15:06

Data.gov.il was created in 2011, after the Israeli social justice protests, as part of the public participation initiative, and started to offer data held by the government. Back then the website was based on Drupal. In 2016 it was changed to CKAN, a system designed for releasing data. This system is licensed under the AGPLv3, which requires source code availability for anyone who can access the system over a network, de facto for every user.

Since the change to CKAN, open source people have asked the state to release the code according to the license, but didn't get a clear answer, even though the violation was clear all along. This led Gai Zomer to file a formal complaint in March 2017 with the Israeli State Comptroller. Absurdly, that same month the ICT authority announced a policy to release source code it owns, while failing to release code it had taken from others and adapted.

With the end of the summer break and the Jewish holidays, and after I wasn't able to get the source, I decided to switch to legal channels. With the help of Jonathan Klinger and my company, Kaplan Open Source Consulting, we notified them that they should provide the source code or we would go to court.

Well, it worked. Within three days the CKAN extensions were available on the website, but in a problematic way, so users weren't able to download them easily. This is why we decided not to publicize this code release and to let them fix it first. In addition, we made it clear that all of the source code should be available, not only the extensions. Furthermore, if they release it anyway, it is recommended to use a git repository instead of just "dumping" a tarball. So we told them that if they aren't going to make a git repository, we'll do it ourselves, though in any case we would prefer that they do it.

While this issue is still pending, the ICT authority had a conference called “the citizen 360” about e-gov and open government in which they reaffirmed their open source plans.

Now, a month later, after our second letter to them, the about page on data.gov.il was updated with links to the ICT authority's GitHub account, which has the sources for the website and the extensions. A big improvement, and an important milestone, as the commit to the repository was done from an official (gov.il) email address.

Beyond congratulating the Israeli ICT authority for this step forward, and our satisfaction that our insistence bore fruit, we would like to see the repository updated on a regular basis and the code contributed back to the various CKAN extensions (e.g. the Hebrew translation). In general, we'd like them to take inspiration from how data.gov.uk handles technical transparency. If we allow ourselves to dream, we'd like to see Israel become a dominant member of the CKAN community and among the other governments who use it.

We're happy to be the catalyst for open source in the Israeli government, and we promise to keep insisting where needed. We know that, due to other requests and notifications, more organizations are on their way to releasing code.

(This post is a translation from Hebrew of a post in Kaplan Open Source Consulting at https://kaplanopensource.co.il/2017/11/20/data-gov-il-code-release/)


Filed under: Government Policy, Israeli Community, Proud to use free software

Clint Adams: Fundie Ann is still livid about the ark

27 November, 2017 - 03:35

My high school biology teacher is still alive. His wife is in poor health so they sold the Holsteins, but the rest of the farm is still the same.

You may be wondering what dairy cows have to do with Switzerland.

Posted on 2017-11-26 Tags: etiamdisco

Dirk Eddelbuettel: rfoaas 1.1.1: Updated and extended

27 November, 2017 - 02:08

FOAAS upstream is still at release 1.1.0, but added a few new accessors a couple of months ago. So this new version of rfoaas updates to these: asshole(), cup(), fyyff(), immensity(), programmer(), rtfm(), thinking(). We also added test coverage and in doing so noticed that our actual tests never ran on Travis. Yay. Now fixed.

As usual, CRANberries provides a diff to the previous CRAN release. Questions, comments etc should go to the GitHub issue tracker. More background information is on the project page as well as on the github repo

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Sven Hoexter: half-assed Oracle JDK 9 support for java-package

26 November, 2017 - 19:45

Sitting at home in a not-so-decent state made me finally fiddle with java-package to deal with Oracle Java 9 builds. For now I've added only some half-assed support for JDK 9 amd64 builds; that's what you download as "jdk-9.0.1_linux-x64_bin.tar.gz" from the Oracle Java pages. It's a works-for-me thing, but maybe someone finds it useful. The source is here.

git clone https://git.sven.stormbind.net/java-package.git
cd java-package
sed -i -e 's#lib_dir="/usr/share/java-package"#lib_dir="./lib"#' make-jpkg

and you can just start using it in this directory without creating and installing the java-package Debian package.

Side note: If you try out Java within a chroot, mount /proc into it. I wasted half an hour finding that out this morning.
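For example, something along these lines before entering the chroot (the path is illustrative):

sudo mount -t proc proc /srv/chroot/sid/proc
sudo chroot /srv/chroot/sid /bin/bash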

Urvika Gola: Hack4Climate – Saving Climate while Sailing on the Rhine

26 November, 2017 - 19:39

Last week in Germany, a few miles away from the COP23 conference where political leaders and activists met to discuss the climate, a group of developers and environmentalists (100, to be exact) participated in Hack4Climate to work on the same global problem: climate change.

COP23, the Conference of the Parties, happens yearly to discuss and plan action for combating climate change, especially under the Paris Agreement. This year it took place in Bonn, Germany, which is home to the United Nations Campus. Despite the ongoing efforts by governments, it is the need of the hour that every single person living on Earth contributes at a personal level to fighting this problem. After all, all of us, myself included, have somehow contributed to climate change, knowingly or unknowingly. That's where the role of technology comes in: to create solutions by providing a pool of resources and correct facts, so that everyone can start taking healthy steps.

I will try to put into words the thrilling experience Pranav Jain and I had participating as 2 of the 100 participants selected from all over the world for Hack4Climate. Pranav was also working closely with Rockstar Recruiting and the Hack4Climate team to spread awareness and bring in more participants before the actual event. It was a 4-day hackathon which took place on a *cruise ship* in front of the United Nations Campus. Before the hackathon began, we had informative sessions from delegates of various institutions and organisations like the UNFCCC (United Nations Framework Convention on Climate Change), the MIT Media Lab, IOTA and Ethereum. These sessions helped us all get more insight into the climate problem from both a technical and an environmental angle. We focussed on using Distributed Ledger Technology (blockchain) and open source, which can potentially help combat climate change.

Venue of Hack4Climate: the Scenic Crystal cruise ship stopping by the UN Campus in Bonn, Germany (Source)

 

The 20 teams worked on creating solutions which could fit into areas like identifying and tracking emissions, carbon pricing, distributed energy, sustainable land use, and sustainable transport.

Pranav Jain and I worked on Green – Low Carbon Diamonds. We used blockchain to track the carbon emitted in the mining of minerals, particularly diamonds. Our project tracks the mining, cutting and polishing process for every unique diamond available for purchase. It could also certify a carbon offset for each process step and help the diamond company improve efficiency and save money. Our objective was to track carbon emissions throughout the supply chain, considering the kind of machinery, transport and power being used.

We wanted to raise awareness among ordinary customers by putting the numbers (the carbon footprint) before them, so that they know how much energy and fossil fuel was consumed for a particular mineral. This would help them make a smart, climate-friendly and greener decision during their purchase. After all, our climate is more precious than diamonds.

Each project track had support from a particular company, which gave more insight and support on data and the business model. Our project track was sponsored by Everledger, a company which believes that transparency is the key to ensuring ethical trade.

Project flow (Source: Everledger)

Everledger's CEO, Leanne, talked about women in technology and swiftly made us realize how we need equal representation of all genders to tackle this global problem. I talked about Outreachy with other female participants, and amidst such a diverse set of participants I felt really connected to the few people I met who were open source contributors. The open source community has always been very warm and fun to interact with. We exchanged which conferences we attend, like FOSDEM and DebConf, and which projects we work on. The current Outreachy round 15 is ongoing; applications for round 16 of the Outreachy internships will open in February 2018 for the May to August 2018 internship round. You can check this link here for more information on projects under Debian and Outreachy. Good luck!

Lastly and most importantly, thank you Nick Beglinger (CleanTech21 CEO) and his team, who put on this extraordinary event despite the initial challenges and made us all believe that yes, we can combat climate change by moving further, faster, and together.

Thank you Debian, for always supporting us! :)

A few pictures…

Pranav Jain pitching the final product; Scenic Crystal, the Rhine and the Hack4Climate tee

 

Thank you for reading!


Renata Scheibler: Choosing a system for the blog

26 November, 2017 - 19:00

This blog was created because I am supposed to report on my journey through the Outreachy internship.

Let me start by saying that I'm biased towards blog systems that use flat files instead of ones that require a database. It is so much easier to make the posts available through other means (such as backing them up in a Git repository) that ensure their content will live on even if the site is taken down or dies. It is also so much better to download the content this way, instead of pulling down a huge database file, which may cost a significant amount of money to transfer. Having your content in flat files, in a format shared among many systems (such as Markdown), should also assure a smooth transition to a new system, should a change become necessary at some point.

I have experimented with some options while working on projects. I played with Lektor while contributing to PyBeeWare. I liked Lektor, but I found its documentation severely lacking. I worked with Grav while we were working towards getting tem.blog.br back online. Grav is a good CMS and it is definitely an alternative to Wordpress, but, well, it needs a server to host it.

At first, I thought about using Jekyll. It is a good site generator, and there is even a Codecademy course on how to create a website with it and deploy it to GitHub Pages, which I took a while ago. I could have chosen it to develop this blog, but it is written in Ruby. Which is fine, of course; the first steps I took into learning how to program were in Ruby, using Chris Pine's Learn to Program (versão pt-br). So, what is my objection to Ruby? It so happens that I expect most of the development for the Outreachy project to be done in Python (and maybe some Javascript), and I thought that adding a third language might make my life a bit harder.

That is how I ended up with Pelican. I had played a bit with it while contributing to the PyLadies Brazil website. During Python Sul, the regional Python conference we had last September, we also had a sprint to build the PyLadies Caxias do Sul website using Pelican, hosting it on GitHub Pages. It went smoothly. Look how awesome it turned out:

So, how do you make one of those? Hang on tight, I will explain it in detail in my next post! ;)

Sean Whitton: Debian Policy call for participation -- November 2017, pt. 1

26 November, 2017 - 05:14

Here are some of the bugs against the Debian Policy Manual. In particular, there really are quite a few patches needing seconds from DDs. Please consider getting involved.

Consensus has been reached and help is needed to write a patch

#172436 BROWSER and sensible-browser standardization

#273093 document interactions of multiple clashing package diversions

#299007 Transitioning perms of /usr/local

#314808 Web applications should use /usr/share/package, not /usr/share/doc/…

#425523 Describe error unwind when unpacking a package fails

#452393 Clarify difference between required and important priorities

#476810 Please clarify 12.5, “Copyright information”

#484673 file permissions for files potentially including credential informa…

#491318 init scripts “should” support start/stop/restart/force-reload - why…

#556015 Clarify requirements for linked doc directories

Wording proposed, awaiting review from anyone and/or seconds by DDs

#756835 Extension of the syntax of the Packages-List field.

#786470 [copyright-format] Add an optional “License-Grant” field

#835451 Building as root should be discouraged

#845255 Include best practices for packaging database applications

#846970 Proposal for a Build-Indep-Architecture: control file field

#864615 please update version of posix standard for scripts (section 10.4)

#874090 Clarify wording of some passages

#874095 copyright-format: Use the “synopsis” term established in the de…

#877674 [debian-policy] update links to the pdf and other formats of the do…

#882445 Proposed change of offensive packages to -offensive

Merged for the next release

#683495 perl scripts: ”#!/usr/bin/perl” MUST or SHOULD?

#877674 [debian-policy] update links to the pdf and other formats of the do…

#878523 [PATCH] Spelling fixes

Joachim Breitner: Existence and Termination

26 November, 2017 - 03:54

I recently had some intense discussions that revolved around issues of existence and termination of functions in Coq, about axioms and what certain proofs actually mean. We came across some interesting questions and thoughts that I’ll share with those of my blog readers with an interest in proofs and interactive theorem proving.

tl;dr
  • It can be meaningful to assume the existence of a function in Coq, and under that assumption prove its termination and other properties.
  • Axioms and assumptions are logically equivalent.
  • Unsound axioms do not necessarily invalidate a theory development, when additional meta-rules govern their use.

Preparation

Our main running example is the infamous Collatz series. Starting at any natural number, the next one is calculated as follows:

Require Import Coq.Arith.Arith.

Definition next (n : nat) :nat :=
  if Nat.even n then n / 2 else 3*n + 1.

If you start with some positive number, you are going to end up reaching 1 eventually. Or are you? So far nobody has found a number for which that does not happen, but we also do not have a proof that no such number exists. It is one of the great mysteries of mathematics, and if you can solve it, you'll be famous.

A failed definition

But assume we had an idea on how to prove that we are always going to reach 1, and tried to formalize this in Coq. One attempt might be to write

Fixpoint good (n : nat) : bool :=
  if n <=? 1
    then true
    else good (next n).

Theorem collatz: forall n, good n = true.
Proof. (* Insert genius idea here.*) Qed.

Unfortunately, this does not work: Coq rejects this recursive definition of the function good, because it does not see how that is a terminating function, and Coq requires all such recursive function definitions to be obviously terminating – without this check there would be a risk of Coq’s type checking becoming incomplete or its logic being unsound.

The idiomatic way to avoid this problem is to state good as an inductive predicate... but let me explore another idea here.
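For reference, such an inductive predicate might look like this (a sketch with made-up names, not code from this development):

(* reaches_one n holds iff the Collatz series starting at n eventually
   reaches 1; no termination argument is needed to accept this definition. *)
Inductive reaches_one : nat -> Prop :=
  | reaches_base : forall n, n <= 1 -> reaches_one n
  | reaches_step : forall n, 1 < n -> reaches_one (next n) -> reaches_one n.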

Working with assumptions

What happens if we just assume that the function good, described above, exists, and then perform our proof:

Theorem collatz
  (good : nat -> bool)
  (good_eq : forall n,
     good n = if n <=? 1 then true else good (next n))
  : forall n, good n = true.
Proof. (* Insert genius idea here.*) Qed.

Would we accept this as a proof of Collatz’ conjecture? Or did we just assume what we want to prove, in which case the theorem is vacuously true, but we just performed useless circular reasoning?

Upon close inspection, we find that the assumptions of the theorem (good and good_eq) are certainly satisfiable:

Definition trivial (n: nat) : bool := true.

Lemma trivial_eq: forall n,
  trivial n = if n <=? 1 then true else trivial (next n).
Proof. intro; case (n <=? 1); reflexivity. Qed.

Lemma collatz_trivial: forall n, trivial n = true.
Proof.
  apply (collatz trivial trivial_eq).
Qed.

So clearly there exists a function of type nat -> bool that satisfies the assumed equation. This is good, because it means that the collatz theorem is not simply assuming False!

Some (including me) might already be happy with this theorem and proof, as it clearly states: “Every function that follows the Collatz series eventually reaches 1”.

Others might still not be at ease with such a proof. Above we have seen that we cannot define the real collatz series in Coq. How can the collatz theorem say something that is not definable?

Classical reasoning

One possible way of getting some assurance is to define good as a classical function. The logic of Coq can be extended with the law of the excluded middle without making it inconsistent, and with that axiom we can define a version of good that is pretty convincing (sorry for the slightly messy proof):

Require Import Coq.Logic.ClassicalDescription.
Require Import Omega.
Definition classical_good (n:nat) : bool :=
  if excluded_middle_informative (exists m, Nat.iter m next n <= 1)
  then true else false.

Lemma iter_shift:
  forall a f x (y:a), Nat.iter x f (f y) = f (Nat.iter x f y).
Proof.
 intros. induction x. reflexivity. simpl. rewrite IHx. reflexivity. Qed.

Lemma classical_good_eq: forall n,
  classical_good n = if n <=? 1 then true else classical_good (next n).
Proof.
  intros.
  unfold classical_good at 1.
  destruct (Nat.leb_spec n 1).
  * destruct (excluded_middle_informative _); try auto.
    contradict n0. exists 0. simpl. assumption.
  * unfold classical_good.
    destruct (Nat.eqb_spec (next n) 0); try auto.
    destruct (excluded_middle_informative _), (excluded_middle_informative _); auto.
    - contradict n0.
      destruct e0.
      destruct x; simpl in *. omega.
      exists x. rewrite iter_shift. assumption.
    - contradict n0.
      destruct e0.
      exists (S x). simpl. rewrite iter_shift in H0. assumption.
Qed.

Lemma collatz_classical: forall n, classical_good n = true.
Proof. apply (collatz classical_good classical_good_eq). Qed.

The point of this is not so much to use this particular definition of good, but merely to convince ourselves that the assumptions of the collatz theorem above encompass “the” Collatz series, and that the theorem thus constitutes a proof of the Collatz conjecture.

The main take-away so far is that existence and termination of a function are two separate issues, and it is possible to assume the former, prove the latter, and not have done a vacuous proof.

The ice gets thinner

Sections

Starting with the above Theorem collatz, there is another train of thought I invite you to follow along.

Probably the “genius idea” proof will be more than a few lines long, and we probably want to be able to declare helper lemmas and other things along the way. Doing all that in the body of the collatz proof is not very convenient, so instead of using assumptions, we might write:

Section collatz.
Variable good : nat -> bool.
Variable good_eq : forall n, good n = if n <=? 1 then true else good (next n).

Theorem collatz2 : forall n, good n = true.
Proof. (* Insert genius idea here.*) Qed.
End collatz.

So far so good: Clearly, I just refactored my code a bit, but did not make any significant change. The theorems collatz2 and collatz are equivalent.

Sound axioms

But note that we do not really intend to instantiate collatz2. We know that the assumptions are satisfiable (e.g. since we can define trivial or classical_good). So maybe, we would rather avoid the Section mechanism and simply write

Axiom good : nat -> bool.
Axiom good_eq : forall n, good n = if n <=? 1 then true else good (next n).

Theorem collatz3 : forall n, good n = true.
Proof. (* Insert genius idea here.*) Qed.

I assume this will make a few of my readers’ eyebrows go up: How can I dare to start with such Axioms? Do they not invalidate my whole development?

On the other hand, all that a Coq axiom is doing is saying “the following theorems are under the assumption that the axiom holds”. In that sense, collatz3 and collatz2 are essentially equivalent.

Unsound axioms

Let me take it one step further, and change that to:

Axiom unsafeFix : forall a, (a -> a) -> a.
Axiom unsafeFix_eq : forall f, unsafeFix f = f (unsafeFix f).
Definition good : nat -> bool :=
	unsafeFix (fun good n => if n <=? 1 then true else good (next n)).

Theorem collatz4 : forall n, good n = true.
Proof. (* Insert genius idea here.*) Qed.

At this point, the majority of my readers will cringe. The axiom unsafeFix is so blatantly unsound (in Coq), how do I even dare to think of using it. But bear with me for a moment: I did not change the proof. So maybe the collatz4 theorem is still worth something?

I want to argue that it is: Both unsafeFix and unsafeFix_eq are unsound in their full generality. But as long as I instantiate them only with functions f which have a fixpoint, I cannot prove False this way. So while “Coq + unsafeFix” is unsound, “Coq + unsafeFix + unsafeFix_eq + the meta-rule that these axioms are only applied to permissible f” is not.

In that light, my collatz4 proof carries the same meaning as the collatz3 proof; it is just less convenient to check. If I want to check the validity of collatz3, I may have to look for uses of admit, misleading uses of syntax, or other smells. To check the validity of collatz4, I additionally have to check the meta-rule -- tedious, but certainly possible (e.g. by inspecting the proof term).

Beyond Collatz

The questions discussed here did not come up in the context of the Collatz series (for which I unfortunately do not have a proof), but rather in the verification of Haskell code in Coq using hs-to-coq. I started with the idiomatic Haskell definition of “Quicksort”:

quicksort :: Ord a => [a] -> [a]
quicksort [] = []
quicksort (p:xs) = quicksort lesser ++ [p] ++ quicksort greater
    where (lesser, greater) = partition (<p) xs

This function does not terminate in a way that is obvious to the Coq type checker. Conveniently, hs-to-coq can optionally create the Coq code using the unsafeFix axiom above, producing (roughly):

Definition quicksort {a} `{Ord a} : list a -> list a :=
  unsafeFix (fun quicksort xs =>
    match xs with
      | nil => nil
      | p :: xs => match partition (fun x => x <? p) xs with
         | (lesser, greater) => quicksort lesser ++ [p] ++ quicksort greater
      end
    end).

I then proved

Theorem quicksort_sorted:
  forall a `(Ord a) (xs : list a), StronglySorted R (quicksort xs).

and

Theorem quicksort_permutation:
  forall a `(Ord a) (xs : list a), Permutation  (quicksort xs) xs.

These proofs proceed by well-founded induction on the length of the argument xs, and hence encompass a termination proof of quicksort. Note that with an only partially correct but non-terminating definition of quicksort (e.g. quicksort := unsafeFix (fun quicksort xs => quicksort xs)) I would not be able to conclude these proofs.

My (not undisputed) claim about the meaning of these theorems is therefore

If the Haskell equations for |sort| actually have a fixed point, then the use of |unsafeFix| in its definition does not introduce any inconsistency. Under this assumption, we showed that |sort| always terminates and produces a sorted version of the input list.

Do you agree?

Reproducible builds folks: Reproducible Builds: Weekly report #134

25 November, 2017 - 21:12

Here's what happened in the Reproducible Builds effort between Sunday November 12 and Saturday November 18 2017:

Past and upcoming events

We plan to hold an assembly at 34C3 - hope to see you there!

On November 17th Chris Lamb presented at the Open Compliance Summit in Yokohama, Japan on how reproducible builds can ensure the long-term sustainability of technology infrastructure.

GSoC and Outreachy updates

We are pleased to announce that Juliana Oliveira R (jwnx) will be mentored by Mattia Rizzolo on Reproducible Builds / diffoscope in this round of Outreachy!

Reproducible work in other projects

Bootstrapping and Diverse Double Compilation

Jan Nieuwenhuizen's work on bootstrapping also made progress in MesCC: it now compiles a less heavily patched TinyCC into a mes-tcc which passes 41/69 of mescc's C tests.

Qt tests and __FILE__

Our patched GCC that we use for testing Debian unstable has uncovered an interesting issue with certain Qt build-time test suites. The issue is that test suites using QFINDTESTDATA depend on __FILE__ to expand to something that may be reused after compilation as a real filesystem path when starting from the same working directory as the original compilation.

However, this behaviour is not explicitly guaranteed by formal documentation about __FILE__, and thus when it is rewritten to be build-path-independent (by a combination of our patched dpkg and GCC), the Qt tests break because their usage is no longer expanded to a real path as they expected.
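The fragile pattern boils down to something like this (a simplified C sketch, not Qt's actual implementation):

#include <stdio.h>

/* __FILE__ expands to the path the compiler was given at build time.
   If that path is rewritten to be build-path-independent, it may no
   longer name an existing file when the test binary runs. */
int main(void) {
    printf("expecting test data near: %s\n", __FILE__);
    return 0;
}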

Several very short patches were suggested to resolve this issue, including a one-liner that allows our patched GCC to specifically rewrite __FILE__ in Qt test code to a real path which takes advantage of the ability to specify multiple mappings using BUILD_PATH_PREFIX_MAP.

Separately, work is under way to address the other unrelated concerns raised about the patch by GCC upstream back in August.

Packages reviewed and fixed, and bugs filed

Reviews of unreproducible packages

35 package reviews have been added, 56 have been updated and 31 have been removed this week, adding to our knowledge about identified issues.

1 issue type has been updated:

tests.reproducible-builds.org
  • Ed Maste (FreeBSD support):

    • Give ntpd a moment to write its PID file.
    • Start with the correct time.
  • kpcyrd (Archlinux support):

    • Pass SOURCE_DATE_EPOCH from jenkins_node_wrapper.sh.
    • Set SOURCE_DATE_EPOCH.
    • Use $ROOTCMD properly.
    • Set pkgext to .pkg.tar.xz.
    • Fix lost packages.
    • Correctly recognize __END__.
    • pacman.conf is owned by root after upgrade.
    • Add repos to pacman.conf.
  • Holger Levsen:

    • Arch Linux:
      • Re-enable the builders.
      • Add a third builder job to use new resources.
    • FreeBSD:
      • Ignore freebsd_master_git?????????.tar.xz when looking for unreproducible artifacts.
      • Document that munin-node was finally configured (and how denyhosts was configured too).
      • Our test VM has been upgraded to 11.1.
      • Document that poudriere was installed, user mattia created and filesystem resized.
    • Debian: Update documentation to reflect that the database is now kept in PostgreSQL.
    • Redistribute 13 cores and 24GB RAM from pb17 to pb3 and pb4 (used to build LEDE, Arch & coreboot) and the FreeBSD VM.

Weekly QA work

During our reproducibility testing, FTBFS bugs have been detected and reported by:

  • Adam Borowski (1)
  • Adrian Bunk (30)
  • Andreas Beckmann (2)
  • Christoph Biedl (1)
  • Helmut Grohne (2)
  • James Cowgill (1)
  • Matthias Klose (4)

reproducible-website development
  • Chris Lamb:
    • Update some broken links and references on the "contribute" page (1, 2, 3)
    • Add a missing ")". Thanks to itd for the patch!

Misc.

This week's edition was written by Chris Lamb, Holger Levsen and Ximin Luo & reviewed by a bunch of Reproducible Builds folks on IRC & the mailing lists.

Joey Hess: Happy Thanksgiving

25 November, 2017 - 04:44

After thanksgiving at my sister's, I got the fun of a big family visit without the turkey day stress. We ate lemons and stuffed three people inside a tree.

Lisandro Damián Nicanor Pérez Meyer: Removing Qt 4 from Debian testing (aka Buster): some statistics

25 November, 2017 - 03:21
I have just looked at our Qt 4 removal wiki page to see how we are doing. Out of 438 bugs filed:

  • 88 (20.09%) have already been fixed, either by porting the app/library to Qt 5 or by removal from the archive. In most cases the code has been ported; most of the removals are due to Qt 5 replacements already being available in the archive, and a few are due to dead upstreams.
  • 3 (0.68%) packages are marked as fixed and an upload is pending.
  • 65 (14.84%) of the open bugs are on packages maintained inside the Qt/KDE team. Many of them should get a Qt 5 version with the next KF5 uploads.
We started filing bugs around September 9. That means roughly 11 weeks have passed, which gives us around 8 packages fixed per week, i.e. 1.14 packages per day. Not bad at all!

So, how can you help?

If you are a maintainer of any of the packages still affected try to get upstream to make a port and package it.

If you are not a maintainer you might want to take a look at the list of packages on our wiki page and try to create a patch for them. If you can submit it directly upstream, even better.

Dirk Eddelbuettel: Rcpp 0.12.14: Some deprecation and minor updates

24 November, 2017 - 20:49

The fourteenth release in the 0.12.* series of Rcpp landed on CRAN yesterday after a somewhat longer-than-usual gestation period (and word is it may have been due to some unrelated disturbances from lots of changes within the main r-devel build).

This release follows the 0.12.0 release from July 2015, the 0.12.1 release in September 2015, the 0.12.2 release in November 2015, the 0.12.3 release in January 2016, the 0.12.4 release in March 2016, the 0.12.5 release in May 2016, the 0.12.6 release in July 2016, the 0.12.7 release in September 2016, the 0.12.8 release in November 2016, the 0.12.9 release in January 2017, the 0.12.10 release in March 2017, the 0.12.11 release in May 2017, the 0.12.12 release in July 2017 and the 0.12.13 release in late September 2017, making it the fifteenth release at the steady and predictable bi-monthly release frequency.

Rcpp has become the most popular way of enhancing GNU R with C or C++ code. As of today, 1246 packages (and hence 77 more since the last release) on CRAN depend on Rcpp for making analytical code go faster and further, along with another 91 in BioConductor.

This release is relatively minor compared to other releases, but follows through on the deprecation of the old vectors for Date and Datetime (which were terrible: I was influenced by the vector design in QuantLib at the time and didn't really understand yet how a SEXP vector should work) that we announced with Rcpp 0.12.8 a year ago. So now the new vectors are the default, but you can flip back if you need to with a #define.

Otherwise Dan rounded a corner with the improved iterators he contributed, and Kirill improved the output stream implementation, suppressing a warning with newer compilers.

Changes in Rcpp version 0.12.14 (2017-11-17)
  • Changes in Rcpp API:

    • New const iterators functions cbegin() and cend() added to MatrixRow as well (Dan Dillon in #750).

    • The Rostream object now contains a Buffer rather than allocating one (Kirill Müller in #763).

    • New DateVector and DatetimeVector classes are now the default fully deprecating the old classes as announced one year ago.

  • Changes in Rcpp Package:

    • DESCRIPTION file now lists doi information per CRAN suggestion.
  • Changes in Rcpp Documentation:

    • Update CITATION file with doi information and PeerJ preprint.

Thanks to CRANberries, you can also look at a diff to the previous release. As always, details are on the Rcpp Changelog page and the Rcpp page which also leads to the downloads page, the browseable doxygen docs and zip files of doxygen output for the standard formats. A local directory has source and documentation too. Questions, comments etc should go to the rcpp-devel mailing list off the R-Forge page.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Daniel Pocock: Free software in the snow

24 November, 2017 - 15:31

There is an increasing number of events for free software enthusiasts to meet in an alpine environment for hacking and fun.

In Switzerland, Swiss Linux is organizing the fourth edition of the Rencontres Hivernales du Libre in the mountain resort of Saint-Cergue, a short train ride from Geneva and Lausanne, 12-14 January 2018. The call for presentations is still open.

In northern Italy, not far from Milan (Malpensa) airport, Debian is organizing a Debian Snow Camp, a winter getaway for developers and enthusiasts in a mountain environment where the scenery is as diverse as the Italian culinary options. It is hoped the event will take place 22-25 February 2018.

Norbert Preining: TeX Live Cockpit

24 November, 2017 - 14:10

I have been working for quite some time on a new front end for the TeX Live Manager, tlmgr. Early versions have leaked into TeX Live, but the last month or two has seen many changes in tlmgr itself, in particular support for JSON output. These changes were mostly driven by the needs (or convenience) of the new frontend: TLCockpit.

With the update of the main TeX Live infrastructure packages that we made available the other day, and the subsequent update of TLCockpit in TeX Live, it is now time to announce TLCockpit a bit more formally. So here it is. We took lots of inspiration from the excellent TeX Live Utility, which has been serving Mac users for a long time. TLCockpit tries to provide similar functionality and ease of use for users on all platforms, not only the Mac.

Cross-platform development is a pain, and the current TeX Live Manager contains a GUI written in Perl/Tk. While that GUI is functional, in fact nearly functionally complete with respect to the command line version, its appearance is not much of a pleasure. Complaints that it is not styled according to the underlying OS, that the UX is not optimal, that it is feature-overloaded, etc., are common. And I agree: back when I wrote that GUI it was less for actual consumption than to at least have some GUI for those poor Windows users.

Fast-forward about 10 years, and the implementation of the tlmgr shell (something I haven't reported on in detail yet!) opened up new routes for GUI development. And since Scala is very common in my current job, I decided to implement a frontend in Scala, first based on the Swing toolkit, but I soon switched over to ScalaFX. That is also the reason there have been so many posts on ScalaFX on my blog recently.

Features

Let us go through the features of TLCockpit as of today. Starting up the program, one sees the loading screen on the right. The indicator in the menu bar shows that the underlying tlmgr is busy loading the local and remote databases. When this is done, the list of packages appears. In the list, the binary sub-packages are ordered below their main packages and can be expanded. For each package the package name, the short description, and the installation status are shown:

Below the list of packages there is also a search entry field: entering text here and pressing Go (or Enter) will restrict the display of packages to those where the package name or short description contains the search term.

Using the Packages menu entry one can change the list view to sort by collections, as shown on the left.

On switching to the Updates tab, one is presented (after a hopefully short loading time) with the list of available updates. For each update the package name, the status (update, new on server, removed on server, etc.), the short description, the local and remote revisions (with TeX Catalogue versions given if available), and the download size are shown:

While visiting the Updates tab, the menu entry Updates allows updating either all packages or, if infrastructure updates are available, only the TeX Live Manager.


The last of the three tabs is Backup; it lists the currently available backups and allows restoring packages. For each package the revision and date of creation are shown.

Clicking on a package with the right mouse button anywhere in the three tabs brings up a context menu with options to install, remove, or update the package, and to show detailed information about it. Selecting the Info entry opens a details window in which all kinds of information are listed. For installed documentation files, clicking on the file name will open the file with the default application for its file type:

Configuration of the TeX Live installation is possible via the menu entry Options, where General options allow customizing the TeX Live installation and Paper allows customizing the default paper size for the supported programs:

This completes the list of current functionality.

Well, more or less. The attentive reader might have spotted two additional features at the lower part of the application window: the Experts only and Debug panes. The former gives access to a text field that allows entering arbitrary tlmgr shell commands, but be careful what you enter. The latter gives access to three tabs containing the normal, debug, and error output of tlmgr as well as of the application itself. It will open automatically in case something unforeseen happens.

Requirements

Although TLCockpit is written in Scala, all the necessary Scala-related libraries are packed into the distributed jar file; that means only a recent Java installation which includes JavaFX is necessary. In case you see errors like java.lang.NoClassDefFoundError: javafx/event/EventTarget, your Java is too old.
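A quick smoke test looks like this (the jar file name is illustrative; use the name of the release file you downloaded):

java -version            # should report a recent Java 8 (or later) with JavaFX available
java -jar tlcockpit.jar  # starts the GUI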

I have tested the release jar files on Linux (Debian) with Java 8u152 from the OpenJDK project and on Windows (10) with 8u154 from Sun. Both worked without any problems. Mac OS is untested due to lack of devices, but most users there use TLU anyway.

Development

Development happens in the GitHub tlcockpit project; please use the issues functionality there to report bugs, request features, and suggest improvements. As mentioned above, the program is written in Scala with ScalaFX as the toolkit, so if you are an expert in either of those, please join in and help!

I myself use IntelliJ for development; the checkout can be opened with IntelliJ without any problems. Deployment is done via sbt assembly, which packs all the necessary libraries into one jar file.

Closing

While far from perfect, I believe it is an improvement over the tlmgr GUI, and I hope that some users of TeX Live Manager will find TLCockpit useful.

Enjoy.

Louis-Philippe Véronneau: DebConf Videoteam sprint report - day 5

24 November, 2017 - 12:00

This is the last report of the videoteam sprint in Cambridge. Here are the links to the previous reports:

Cambridge Mini-DebConf livestream

You can find all the instructions on how to watch the livestream for the Cambridge Mini-DebConf here.

Report

We did not have as much time to sprint today as on previous days, since we had to move our gear to ARM during the afternoon and prepare the room.

tumbleweed

Stefano scraped metadata for two new DebConfs today! More on the way.

He also worked on tuning and licensing the metadata scraping tools and the automated YouTube upload scripts we are using.

After some tinkering, tumbleweed was also able to fix a problem we've been having with our sponsor loop since DC16: the only thing we were able to loop was a bunch of PNG files, but we can now loop an MPEG-2 video. Hurray!

RattusRattus

Andy finished the sponsor loop and wrote documentation on how to make one today.

olasd

Nicolas started his day by building us working Cat6 to Serial adapters for our tally lights. We should now have enough for all of our cameras.

He also worked on displaying a background with the Cambridge Mini-DebConf logo and fixed the Mini-DebConf livestream webviewer on our documentation page.

pollo

I worked a bit on our documentation website to modify the default template and add a webviewer for our livestream. The idea is to have a permanent webviewer for our streams at the mini-DebConfs.

My thing wasn't working, mainly because I'm not very good with JS and I had to cook us some lunch, so olasd fixed it for me.

Thank you!

This sprint would not have been possible without the support of Debian, the folks at the Cambridge mini-DebConf and their sponsors.

Thank you!

Russ Allbery: Review: Range of Ghosts

24 November, 2017 - 10:51

Review: Range of Ghosts, by Elizabeth Bear

Series: Eternal Sky #1
Publisher: Tor
Copyright: March 2012
ISBN: 0-7653-2754-6
Format: Hardcover
Pages: 334

Temur is one of many grandsons of the Great Khagan. We meet him on a bloody battlefield, in the aftermath of a fight between his cousin and his brother over the succession. His brother lost and Temur was left for dead with a vicious cut down the side of his neck, improbably surviving among the corpses and even more improbably finding a surviving horse (or being found by one). But a brief reprieve and a potential new family connection are cut brutally short when they are attacked by ghosts and his lover is pulled away into the sky.

Once-Princess Samarkar was heir and then a political wife of a prince, but her marriage was cut short in bloody insult when her husband refused to consummate the marriage. Now, she's chosen a far different path: be neutered in order to become a wizard. If she survives the surgery, she gains independence from her brother, freedom from politics, and possibly (if she's lucky) magical power.

Range of Ghosts is the first book of a fantasy trilogy set in what appears to be an analogue of what's now far northwest China and the central Asian steppes (although the geography doesn't exactly follow ours). There are mountainous strongholds, a large city-based civilization in the east, a civilization with onion domes and a different god in the west, and nomadic horse clans in the north. That said, there's also, as one discovers a bit later in the book, a race of huge bipedal cat people, just to be sure you don't think this is too much like our world.

I had a hard time with the start of this book due to the brutality. Just in the first 70 pages or so, we get a near-fatal wound that a character has to hold closed with his hand (for pages), human sacrifice, essentially medieval invasive surgery, a graphic description of someone losing an eye, and then (I think a little later in the book) serious illness with high fever. And this is Elizabeth Bear, which means the descriptions are very well-written and vivid, which was... not what I wanted. Thankfully, the horror show does slow down by the middle of the book.

The opening also didn't quite connect with me. There's a lot about war, the aftermath of war, and the death of Temur's family, but I found Temur mostly boring. The reader enters the story in the aftermath, so none of the death and battle really touched me. Temur's steppe mythology is vaguely interesting, but only vaguely.

Samarkar saved this book for me. She's pragmatic, introspective, daring, but also risk-conscious. She pays attention and learns and studies, and she takes the opportunity to learn from everyone she can. The magical system she's learning also has some nicely-described effects without being too mechanical, and I liked the interweaving of magic with science. As she started taking charge of the plot, I thought this book got better and better. Also, full points for the supposedly pampered concubine (one of Samarkar's relatives) turning out to have iron determination and considerable ability to put up with hardship when it was required. That was refreshing.

More positive points to this book for allowing both men and women to choose to become neutered and become wizards. Same principle, so why not the same effect? One of the things I like about fantasy is the opportunity to explore human society with little tweaks and differences, and the ones that poke at gender roles and ask why we gender something that clearly could be gender-neutral make me happy.

I wasn't as fond of the hissable villain. I think I'm getting increasingly grumbly about books in which the villain is so obviously evil as to be basically demonic. Maybe Bear will twist this around in later books, but this one opens with human sacrifice, and the villain doesn't get any more appealing as we go along. I think I wasn't in the mood to read about someone plotting horrible dark things, keeping hostages, and practicing black magic, particularly since Bear's vivid descriptions make it a bit hard to tune the horrors out.

Thankfully, there isn't that much of the villain, and there's rather a lot of Samarkar, which left me generally happy with the book and wanting more. However, be warned that this is in absolutely no way a standalone book. Essentially nothing is resolved in this volume; you will need the sequel (if not the third book as well) for any sense of completed story whatsoever.

Followed by The Shattered Pillars.

Rating: 7 out of 10


Creative Commons License: copyright of each article belongs to its author.
Articles are licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported license.