Planet Debian

Planet Debian - https://planet.debian.org/

Norbert Preining: Switching from NVIDIA to AMD (including tensorflow)

14 May, 2020 - 10:55

I have been using my GeForce 1060 extensively for deep learning, both with Python and R. But between the perpetually painful fiddling with the closed source drivers on every kernel update, and the collapse of my computer’s PSU and/or GPU, I decided to finally make the switch to an AMD graphics card and the open source stack. And you know what, within half a day I had everything running, including Tensorflow. Yay for Open Source!

Preliminaries

So what is the starting point: I am running Debian/unstable with an AMD Radeon 5700. First of all I purged all NVIDIA related packages, and there are a lot of them, I have to say. Be sure to search for nv and nvidia and get rid of all such packages. For safety I rebooted and checked again that no NVIDIA related kernel modules were loaded.
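Roughly, the cleanup looks like this (a sketch only; run as root, and check what the listing shows before purging, since the exact package set differs per system):

# list everything NVIDIA-related that is installed
dpkg -l | grep -i nvidia
# purge it (adjust the patterns to what the listing shows)
apt purge '^nvidia-.*' '^libnvidia-.*'
# after a reboot, verify that no NVIDIA kernel modules are loaded
lsmod | grep -i nvidia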

Firmware

Debian ships the package firmware-amd-graphics, but this is not enough for the current kernel and current hardware. Better is to clone git://git.kernel.org/pub/scm/linux/kernel/git/firmware/linux-firmware.git and copy everything from its amdgpu directory to /lib/firmware/amdgpu.
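In commands, that is roughly (a sketch; the copy needs to be done as root):

git clone git://git.kernel.org/pub/scm/linux/kernel/git/firmware/linux-firmware.git
cp linux-firmware/amdgpu/* /lib/firmware/amdgpu/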

I didn’t do that at first, and booting the kernel then hung during the switch to the AMD framebuffer. If you see this behaviour, your firmware files are too old.

Kernel

The advantage of having an open source driver in the kernel is that you don’t have to worry about incompatibilities (unlike with NVIDIA, where every new kernel release requires patching the driver). For recent AMD GPUs you need a rather new kernel; I have 5.6.0 and 5.7.0-rc5 running. If you compile your own kernels, make sure that you have all the necessary kernel config options turned on. In my case this is

CONFIG_DRM_AMDGPU=m
CONFIG_DRM_AMDGPU_USERPTR=y
CONFIG_DRM_AMD_ACP=y
CONFIG_DRM_AMD_DC=y
CONFIG_DRM_AMD_DC_DCN=y
CONFIG_HSA_AMD=y

When installing the kernel, be sure that the firmware is already updated so that the correct firmware is copied into the initrd.
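If you update the firmware after the kernel packages are already installed, regenerating the initrd by hand should pick up the new files (a sketch, as root):

update-initramfs -u -k all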

Support programs and libraries

All the following is more or less an excerpt from the ROCm Installation Guide!

AMD provides a Debian/Ubuntu APT repository for software as well as kernel sources. Put the following into /etc/apt/sources.list.d/rocm.list:

deb [arch=amd64] http://repo.radeon.com/rocm/apt/debian/ xenial main

and also put the public key of the ROCm repository into /etc/apt/trusted.gpg.d/rocm.asc.
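Something along these lines should do it (a sketch, as root; the key URL is my assumption, so check the ROCm Installation Guide for the canonical location):

wget -qO /etc/apt/trusted.gpg.d/rocm.asc http://repo.radeon.com/rocm/rocm.gpg.key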

After that apt-get update should work.

I installed rocm-dev-3.3.0, rocm-libs-3.3.0, hipcub-3.3.0, and miopen-hip-3.3.0 (and of course their dependencies), but not rocm-dkms, which contains the kernel module. If you have a sufficiently recent kernel (see above), the driver source in the kernel itself is newer.
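That is, with the package names from above:

apt-get install rocm-dev-3.3.0 rocm-libs-3.3.0 hipcub-3.3.0 miopen-hip-3.3.0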

The libraries and programs are installed under /opt/rocm-3.3.0, and to make the libraries available to Tensorflow (see below) and other programs, I added /etc/ld.so.conf.d/rocm.conf with the following content:

/opt/rocm-3.3.0/lib/

and ran ldconfig as root.

Last but not least, add the udev rule that is normally installed by rocm-dkms by putting the following into /etc/udev/rules.d/70-kfd.rules:

SUBSYSTEM=="kfd", KERNEL=="kfd", TAG+="uaccess", GROUP="video"

This allows users in the video group to access the GPU.
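To activate the rule without a reboot, reloading udev should suffice (a sketch, as root):

udevadm control --reload-rules && udevadm trigger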

Up to here you should be able to boot into the system and have X running on top of AMD GPU, including OpenGL acceleration and direct rendering:

$ glxinfo
name of display: :0
display: :0  screen: 0
direct rendering: Yes
server glx vendor string: SGI
server glx version string: 1.4
...
client glx vendor string: Mesa Project and SGI
client glx version string: 1.4
...
Tensorflow

Considering how hard it was to get the correct libraries for running Tensorflow on GPUs (see here and here), it is a pleasure to see that with open source all this pain is relieved.

There is already work done to make Tensorflow run on ROCm: the tensorflow-rocm project. They provide up-to-date PyPI packages, so a simple

pip3 install tensorflow-rocm

is enough to get Tensorflow running with Python:

>>> import tensorflow as tf
>>> tf.add(1, 2).numpy()
2020-05-14 12:07:19.590169: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libhip_hcc.so
...
2020-05-14 12:07:19.711478: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 7444 MB memory) -> physical GPU (device: 0, name: Navi 10 [Radeon RX 5600 OEM/5600 XT / 5700/5700 XT], pci bus id: 0000:03:00.0)
3
>>>
Tensorflow for R

Installation is trivial again since there is a tensorflow package for R; just run (as a user in the group staff, which normally owns /usr/local/lib/R):

$ R
...
> install.packages("tensorflow")
..

Do not call the R function install_tensorflow() since Tensorflow is already installed and functional!

With that done, R can use the AMD GPU for computations:

$ R
...
> library(tensorflow)
> tf$constant("Hellow Tensorflow")
2020-05-14 12:14:24.185609: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libhip_hcc.so
...
2020-05-14 12:14:24.277736: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1247] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 7444 MB memory) -> physical GPU (device: 0, name: Navi 10 [Radeon RX 5600 OEM/5600 XT / 5700/5700 XT], pci bus id: 0000:03:00.0)
tf.Tensor(b'Hellow Tensorflow', shape=(), dtype=string)
>
AMD Vulkan

From the Vulkan home page:

Vulkan is a new generation graphics and compute API that provides high-efficiency, cross-platform access to modern GPUs used in a wide variety of devices from PCs and consoles to mobile phones and embedded platforms.

Several games use the Vulkan API when it is available, and it is said to be more efficient.

There are Vulkan libraries for Radeon shipped with Mesa, in the Debian package mesa-vulkan-drivers, but my guess is that they are a bit outdated.

The AMDVLK project provides the latest version, and to my surprise it was rather easy to install, again by following the advice in their README. The steps are basically (always follow what is written for Ubuntu):

  • Install the necessary dependencies
  • Install the Repo tool
  • Get the source code
  • Make 64-bit and 32-bit builds
  • Copy driver and JSON files (see below for what I did differently!)

All as described in the linked README. Just to make sure, I removed the JSON files /usr/share/vulkan/icd.d/radeon* shipped by Debian’s mesa-vulkan-drivers package.

Finally, I deviated a bit by not editing the file /usr/share/X11/xorg.conf.d/10-amdgpu.conf, but instead copying it to /etc/X11/xorg.conf.d/10-amdgpu.conf and adding this section there:

Section "Device"
        Identifier "AMDgpu"
        Option  "DRI" "3"
EndSection


To be honest, I did not follow the Copy driver and JSON files step literally, since I don’t want to copy self-made files into system directories under /usr/lib. So what I did is:

  • copied the driver files to /opt/amdvlk/lib, so I now have /opt/amdvlk/lib/i386-linux-gnu/amdvlk32.so and /opt/amdvlk/lib/x86_64-linux-gnu/amdvlk64.so there
  • adjusted the location of the driver file in the two JSON files /etc/vulkan/icd.d/amd_icd32.json and /etc/vulkan/icd.d/amd_icd64.json (which were installed above under Copy driver and JSON files); see the sketch after this list
  • added a file /etc/ld.so.conf.d/amdvlk.conf containing the two lines:
    /opt/amdvlk/lib/i386-linux-gnu
    /opt/amdvlk/lib/x86_64-linux-gnu
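After the adjustment, amd_icd64.json looks roughly like this (a sketch; the api_version value below is just an example, use whatever your AMDVLK build installed):

{
    "file_format_version": "1.0.0",
    "ICD": {
        "library_path": "/opt/amdvlk/lib/x86_64-linux-gnu/amdvlk64.so",
        "api_version": "1.2.136"
    }
}

As with the ROCm libraries above, run ldconfig as root after adding the amdvlk.conf file.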

With this in place, I don’t pollute the system directories, and still the new Vulkan driver is available.

But honestly, I don’t really know whether it is used and working, because I don’t know a good way to check.
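One way to check might be the vulkaninfo tool from the Debian package vulkan-tools, which reports the active driver; I have not verified this on my setup, so treat it as a suggestion:

vulkaninfo | grep -i driver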

With all that in place, I can run my usual set of Steam games (The Long Dark, Shadow of the Tomb Raider, The Talos Principle, Supraland, …) and I haven’t seen any visual problems so far. As a bonus, KDE/Plasma is now running much better, since NVIDIA and KDE have traditionally had some incompatibilities.

The above might sound like a lot of work, but considering that most of the parts are not really packaged within Debian, and that all of this is a rather new open source stack, I was surprised that I got everything working smoothly in half a day.

Thanks to all the developers who have worked hard to make this all possible.

Mike Gabriel: Q: Remote Support Framework for the GNU/Linux Desktop?

13 May, 2020 - 14:14

TL;DR: For those (admins) of you who run GNU/Linux on staff computers: How do you organize your graphical remote support in your company? Get in touch, share your expertise and experiences.

Researching on FLOSS based Linux Desktops

When bringing GNU/Linux desktops to a generic crowd of productive office users at a large scale, graphical remote support is a key feature for organizing helpdesk support teams' workflows.

In a research project that I am currently involved in, we investigate the different available remote support technologies (VNC screen mirroring, ScreenCasts, etc.) and the available frameworks that allow one to provide a remote support infrastructure 100% on-premise.

In this research project we intend to find FLOSS solutions for everything required to provide a large scale GNU/Linux desktop to end users, but we will likely have to recommend non-free solutions if a FLOSS approach is not available for certain demands. Depending on the resulting costs, bringing forth a new software solution instead of dumping big money into subscription contracts for non-free software is seen as a possible alternative.

As a member of the X2Go upstream team and maintainer of several remote desktop related tools and frameworks in Debian, I'd consider myself sort of into the topic. The available (as FLOSS) underlying technologies for plumbing a remote support framework are pretty much clear (x11vnc, recent PipeWire-related approaches in Wayland compositors, browser-based screencasting). However, I still lack a good spontaneous answer to the question: "How do you efficiently organize, software-side, a helpdesk scenario for 10,000+ users regarding graphical remote support?"

Framework for Remote Desktop in Webbrowsers

In fact, in the context of my X2Go activities, I am currently planning to put together a Django-based framework for running X2Go sessions in a web browser. The framework that we will come up with (two developers have already been hired for an initial sprint in July 2020) will be designed to be highly pluggable and it will probably be easy to add remote support / screen sharing features further on.

And still, I walk around with this question in mind: Am I missing anything? Is there anything already out there that provides a remote support solution that is 100% FLOSS, enterprise grade, scales up well, has a modern UI design, etc.? Something that I simply haven't come across yet?

Looking forward to Your Feedback

Please get in touch (OFTC/Freenode IRC, Telegram, Email), if you can fill the gap and feel like sharing your ideas and experiences.

light+love
Mike

Russ Allbery: Review: Gideon the Ninth

13 May, 2020 - 12:22

Review: Gideon the Ninth, by Tamsyn Muir

Series: The Locked Tomb #1
Publisher: Tor
Copyright: September 2019
ISBN: 1-250-31317-1
Format: Kindle
Pages: 448

Despite being raised there, Gideon Nav is an outsider in the Ninth House. Her mother, already dead, fell from the sky with a one-day-old Gideon in tow, leaving her an indentured servant. She's a grumpy, caustic teenager in a world of moldering corpses, animated skeletons, and mostly-dead adults whose parts are falling off. Her world is sword fighting, dirty magazines, a feud with the house heir Harrowhark, and a determination to escape the terms of her indenture.

Gideon does get off the planet, but not the way that she expects. She doesn't get accepted into the military. She ends up in the middle of a bizarre test, or possibly an ascension rite, mingling with and competing with the nobility of the empire alongside her worst enemy.

I struggled to enjoy the beginning of Gideon the Ninth. Gideon tries to carry the story on pure snark, but it is very, very goth. If you like desiccated crypts, mostly-dead goons, betrayal, frustration, necromancers, black robes, disturbing family relationships, gloom, and bitter despair, the first six chapters certainly deliver, but I was sick of it by the time Gideon gets out. Thankfully, the opening is largely unlike the rest of the book. What starts as an over-the-top teenage goth rebellion turns into a cross between a manor house murder mystery and a competitive escape room. This book is a bit of a mess, but it's a glorious mess.

It's also the sort of glorious mess that I don't think would have been written or published twenty years ago, and I have a pet theory that attributes this to the invigorating influence of fanfic and writers who grew up reading and writing it.

I read a lot of classic science fiction and epic fantasy as a teenager. Those books have many merits, obviously, but emotional range is not one of them. There are a few exceptions, but on average the genre either focused on puzzles and problem solving (how do we fix the starship, how do we use the magic system to take down the dark god) or on the typical "heroic" (and male-coded) emotions of loyalty, bravery, responsibility, authority, and defiance of evil. Characters didn't have messy breakups, frenemies, anxiety, socially-awkward love affairs, impostor syndrome, self-hatred, or depression. And authors weren't allowed to fall in love with the messiness of their characters, at least on the page.

I'm not enough of a scholar to make the argument well, but I suspect there's a case to be made that fanfic exists partially to fill this gap. So much of fanfic starts from taking the characters on the canonical page or screen and letting them feel more, live more, love more, screw up more, and otherwise experience a far wider range of human drama, particularly compared to what made it into television, which was even more censored than what made it into print. Some of those readers and writers are now writing for publication, and others have gone into publishing. The result, in my theory, is that the range of stories that are acceptable in the genre has broadened, and the emotional texture of those stories has deepened.

Whether or not this theory is correct, there are now more novels like this in the world, novels full of grudges, deflective banter, squabbling, messy emotional processing, and moments of glorious emotional catharsis. This makes me very happy. To describe the emotional payoff of this book in any more detail would be a huge spoiler; suffice it to say that I unabashedly love fragile competence and unexpected emotional support, and adore this book for containing it.

Gideon's voice, irreverent banter, stubborn defiance, and impulsive good-heartedness are the center of this book. At the start, it's not clear whether there will be another likable character in the book. There will be, several of them, but it takes a while for Gideon to find them or for them to become likable. You'll need to like Gideon well enough to stick with her for that journey.

I read books primarily for the characters, not for the setting, and Gideon the Ninth struck some specific notes that I will happily read endlessly. If that doesn't match your preferences, I would not be too surprised to hear you bounced off the book. There's a lot here that won't be to everyone's taste. The setting felt very close to Warhammer 40K: an undead emperor that everyone worships, endless war, necromancy, and gothic grimdark. The stage for most of the book is at least more light-filled, complex, and interesting than the Ninth House section at the start, but everything is crumbling, drowning, broken, or decaying. There's quite a lot of body horror, grotesque monsters, and bloody fights. And the ending is not the best part of the book; roughly the last 15% of the novel is composed of two running fight scenes against a few practically unkillable and frankly not very interesting villains. I got exhausted by the fighting long before it was over, and the conclusion is essentially a series cliffhanger.

There are also a few too many characters. The collection of characters and the interplay between the houses is one of the strengths of this book, but Muir sets up her story in a way that requires eighteen significant characters and makes the reader want to keep track of all of them. It took me about halfway through the book before I felt like I had my bearings and wasn't confusing one character for another or forgetting a whole group of characters. That said, most of the characters are great, and the story gains a lot from the interplay of their different approaches and mindsets. Palamedes Sextus's logical geekery, in particular, is a great counterpoint to the approaches of most of the other characters.

The other interesting thing Muir does in this novel that I've not seen before, and that feels very modern, is to set the book in essentially an escape room. Locking a bunch of characters in a sprawling mansion until people start dying is an old fictional trope, but this one has puzzles, rewards, and a progressive physical structure that provides a lot of opportunities to motivate the characters and give them space to take wildly different problem-solving approaches. I liked this a lot, and I'm looking forward to seeing it in future books.

This is not the best book I've read, but I thoroughly enjoyed it, despite some problems with the ending. I've already pre-ordered the sequel.

Followed by Harrow the Ninth.

Rating: 8 out of 10

Kunal Mehta: MediaWiki packages for Ubuntu 20.04 Focal available

13 May, 2020 - 03:21

Packages for the MediaWiki 1.31 LTS release are now available for the new Ubuntu 20.04 LTS "Focal Fossa" release in my PPA. Please let me know if you run into any errors or issues.

In the future these packages will be upgraded to the MediaWiki 1.35 LTS release, whenever that's ready. It's currently delayed because of the pandemic, but I expect that it'll be ready for the next Debian release.

Evgeni Golov: Building a Shelly 2.5 USB to TTL adapter cable

12 May, 2020 - 17:44

When you want to flash your Shelly 2.5 with anything but the original firmware for the first time, you'll need to attach it to your computer. Later flashes can happen over the air (at least with ESPHome or Tasmota), but the first one cannot.

In theory, this is not a problem as the Shelly has a quite exposed and well documented interface:

However, on closer inspection you'll notice that your normal jumper wires don't fit as the Shelly has a connector with 1.27mm (0.05in) pitch and 1mm diameter holes.

Now, there are various tutorials on the Internet on how to build a compatible connector using Ethernet cables and hot glue, or with the legs of a female header socket, and you can even buy cables on Amazon for 18€! But 18€ sounded like a lot, and the female header socket approach, while it worked, was pretty finicky to use, so I decided to build something different.

We'll need 6 female-to-female jumper wires and a 1.27mm pitch male header. The jumper wires I had at home; the header I got is an SL 1X20G 1,27 from reichelt.de for 0.61€. It's a 20 pin one, so we can make 3 adapters out of it if needed. Oh, and we'll need some isolation tape.

The first step is to cut the header into 6 pin chunks. Make sure not to cut too close to the 6th pin as the whole thing is rather fragile and you might lose it.

It now fits very well into the Shelly with the longer side of the pins.

Second step is to strip the plastic part of one side of the jumper wires. Those are designed to fit 2.54mm pitch headers and won't work for our use case otherwise.

As the connectors are still too big, even after removing the plastic, the next step is to take some pliers and gently press the connectors until they fit the smaller pins of our header.

Now is the time to put everything together. To avoid short circuiting the pins/connectors, apply some isolation tape while assembling, but not too much as the space is really limited.

And we're done, a wonderful (lol) and working (yay) Shelly 2.5 cable that can be attached to any USB-TTL adapter, like the pictured FTDI clone you get almost everywhere.

Yes, in an ideal world we would have soldered the header to the cable, but I didn't feel like soldering on that limited space. And yes, shrink-wrap might be a good thing too, but again, limited space and with isolation tape you only need one layer between two pins, not two.

Daniel Silverstone: The Lars, Mark, and Daniel Club

12 May, 2020 - 14:00

Last night, Lars, Mark, and I discussed Jeremy Kun's The communicative value of using Git well post. While a lot of our discussion was spawned by the article, we did go off-piste a little, and I hope that my notes below will enlighten you all as to a bit of how we see revision control these days. It was remarkably pleasant to read an article where the comments section wasn't a cesspool of horror, so if this posting encourages you to go and read the article, don't stop when you reach the bottom -- the comments are good and useful too.

This was a fairly non-contentious article for us; though each of us had points we wished to bring up and chat about, it turned into a very convivial chat. We saw the main thrust of the article as being about using the metadata of revision control to communicate intent, process, and decision making. We agreed that it must be possible to do so effectively with Mercurial (thus deciding that the mention of it was simply a bit of clickbait / red herring) and indeed Mark figured that he was doing at least some of this kind of thing way back with CVS.

We all discussed how knowing the fundamentals of Git's data model improved our ability to work with the tool. Lars and I mentioned how jarring it had originally been to come to Git from revision control systems such as Bazaar (bzr), but how over time we came to appreciate Git for what it is. For Mark this was less painful because he came to Git early enough that there was little more than the fundamental data model, without much of the porcelain which now exists.

One point which we all, though Mark in particular, felt was worth considering was that of distinguishing between published and unpublished changes. The article touches on it a little, but one of the benefits of the workflow which Jeremy espouses is that of using the revision control system as an integral part of the review pipeline. This is perhaps done very well with Git based workflows, but can be done with other VCSs.

With respect to the points Jeremy makes regarding making commits which are good for reviewing, we had a general agreement that things were good and sensible, to a point, but that some things were missed out on. For example, I raised that commit messages often need to be more thorough than one-liners, but Jeremy's examples (perhaps through expedience for the article?) were all pretty trite one-liners which perhaps would have been entirely obvious from the commit content. Jeremy makes the point that large changes are hard to review, and Lars pointed out that Greg Wilson did research in this area, and at least one article mentions 200 lines as a maximum size of a reviewable segment.
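As an entirely invented illustration of the kind of thoroughness I mean (this example is mine, not from Jeremy's article), compare a one-liner like "Fix cache bug" with a message that carries the reasoning:

Fix race in cache invalidation on concurrent flush

The flush path dropped the lock before clearing the dirty flag, so a
concurrent writer could observe a clean cache that still held stale
entries. Hold the lock across the flag update, and add a regression
test exercising that interleaving.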

I had a brief warble at this point about how reviewing needs to be able to consider the whole of the intended change (i.e. a diff from base to tip) not just individual commits, which is also missing from Jeremy's article, but that such a review does not need to necessarily be thorough and detailed since the commit-by-commit review remains necessary. I use that high level diff as a way to get a feel for the full shape of the intended change, a look at the end-game if you will, before I read the story of how someone got to it. As an aside at this point, I talked about how Jeremy included a 'style fixes' commit in his example, but I loathe seeing such things and would much rather it was either not in the series because it's unrelated to it; or else the style fixes were folded into the commits they were related to.

We discussed how style rules, commit bisectability, and other rules which may exist for a codebase (adherence to which forms part of the checks that a code reviewer performs) are there to be held to when they help the project, and to be broken when they get in the way of good communication between humans.

In this, Lars talked about how revision control histories provide valuable high level communication between developers. Communication between humans is fraught with error, and the rules are not always clear on what will work and what won't, since this depends on the communicators, the context, etc. However, whatever communication rules are in place should be followed. We often say that it takes two people to communicate information, but when you're writing commit messages or arranging your commit history, the second party is often a nebulous "other", and so the code reviewer fulfils that role to concretise it for the purpose of communication.

At this point, I wondered a little about what value there might be (if any) in maintaining the metachanges (pull request info, mailing list discussions, etc) for historical context of decision making. Mark suggested that this is useful for design decisions etc but not for the style/correctness discussions which often form a large section of review feedback. Maybe some of the metachange tracking is done automatically by the review causing the augmentation of the changes (e.g. by comments, or inclusion of design documentation changes) to explain why changes are made.

We discussed how the "rebase always vs. rebase never" feeling flip-flopped in us for many years until, as an example, what finally won Lars over was that he wants the history of the project to tell the story, in the git commits, of how the software has changed and evolved in an intentional manner. Lars said that he doesn't care about the meanderings, but rather about a clear story which can be followed and understood.

I described this as the switch from "the revision history is about what I did to achieve the goal" to being more "the revision history is how I would hope someone else would have done this". Mark further refined that to "The revision history of a project tells the story of how the project, as a whole, chose to perform its sequence of evolution."

We discussed how project history must necessarily then contain issue tracking, mailing list discussions, wikis, etc. There exist free software projects where part of their history is forever lost because, for example, the project moved from SourceForge to GitHub but made no effort (or was unable) to migrate issues or some such. Linkages between changes and the issues they relate to can easily be broken, though at least with mailing lists you can often rewrite URLs if you have something consistent like a Message-Id.

We talked about how cover notes, pull request messages, etc. can thus also be lost to some extent. Is this an argument to always use merges whose message bodies contain those details, rather than always fast-forwarding? Or is it a reason to encapsulate all those discussions into git objects which can be forever associated with the changes in the DAG?

We then diverted into discussion of CI, testing every commit, and the benefits and limitations of automated testing vs. manual testing; though I think that's a little too off-piste for even this summary. We also talked about how the audiences of commit messages perhaps include software, with the recent movement toward conventional commits, and how, with respect to commit messages for machine readability, it can become very complex/tricky to craft good commit messages once there are multiple disparate audiences. For projects the size of the Linux kernel this kind of thing would be nearly impossible, but for smaller projects, perhaps there's value.

Finally, we all agreed that we liked the quote at the end of the article, and so I'd like to close out by repeating it for you all...

Hal Abelson famously said:

Programs must be written for people to read, and only incidentally for machines to execute.

Jeremy agrees, as do we, and extends that to the metacommit information as well.

Petter Reinholdtsen: Debian Edu interview: Yvan Masson

12 May, 2020 - 11:30

It has been way too long since my last interview, but as the Debian Edu / Skolelinux community is still active, and new people keep showing up on the IRC channel #debian-edu and the debian-edu mailing list, I decided to give it another go. I was hoping someone else might pick up the idea and run with it, but this has not happened as far as I can tell, so here we are… This time the announcement of a new free software tool to create a school year book triggered my interest, and I decided to learn more about its author.

Who are you, and how do you spend your days?

My name is Yvan MASSON, and I live in France. I have my own one-person business in computer services. The work consists of visiting my customers (people's homes, local authorities, small businesses) to give advice, install computers and software, fix issues, and provide training in computer usage. I spend the rest of my time enjoying my family and promoting free software.

What is your approach for promoting free software?

When I think that free software could be suitable for someone, I explain what it is, with simple words, give a few known examples, and explain that while there is no fee it is a viable alternative in many situations. Most people are receptive when you explain how it is better (I simplify the arguments here, I know that it is not so simple): Linux works on older hardware, there are no viruses, and the software can be audited to ensure the user is not spied upon. I think the most important thing is to keep a clear but moderate speech: when you try too hard to convince, people feel attacked and stop listening.

How did you get in contact with the Skolelinux / Debian Edu project?

I can not remember how I first heard of Skolelinux / Debian Edu, but it was probably on planet.debian.org. As I have been working for a school, I have an interest in this type of project.

The school I am involved in is a school for "children" between 14 and 18 years old. The French government has recommended free software since 2012, but they do not always use free software themselves. The school computers are still using the Windows operating system, but all of them have the classic set of free software: Firefox ESR, LibreOffice (with the excellent extension Grammalecte that indicates French grammatical errors), SumatraPDF, Audacity, 7zip, KeePass2, VLC, GIMP, Inkscape…

What do you see as the advantages of Skolelinux / Debian Edu?

It is free software! Built on Debian, I am sure that users are not spied upon, and that it can run on low end hardware. This last point is very important, because we really need to improve "green IT". I do not know enough about Skolelinux / Debian Edu to tell how it is better than another free software solution, but what I like is the "all in one" solution: everything has been thought of and prepared to ease installation and usage.

I like Free Software because I hate using something that I can not understand. I do not say that I can understand everything nor that I want to understand everything, but knowing that someone / some company intentionally prevents me from understanding how things work is really unacceptable to me.

Secondly, and more importantly, free software is a requirement to prevent abuses regarding human rights and environmental care. Humanity can not rely on tools that are in the hands of a small group of people.

What do you see as the disadvantages of Skolelinux / Debian Edu?

Again, I do not know this project well enough. Maybe a dedicated website? The Debian wiki works well for documentation, but is not very appealing to someone discovering the project. Also, as Skolelinux / Debian Edu uses OpenLDAP, it probably means that Windows workstations cannot use centralized authentication. Maybe the project could use Samba as an Active Directory domain controller instead, allowing Windows desktop usage when necessary.

(Editor's note: in fact Windows workstations can use the centralized authentication in a Debian Edu setup, at least for some versions of Windows, but the fact that this is not well known can be seen as an indication of the need for better documentation and marketing. :-)

Which free software do you use daily?

Nothing original: Debian testing/sid with Gnome desktop, Firefox, Thunderbird, LibreOffice…

Which strategy do you believe is the right one to use to get schools to use free software?

Every effort to spread free software into schools is important, whatever it is. But I think, at least where I live, that the IT professionals maintaining school networks are still very "Microsoft centric". Schools will use any working solution, but they need people to install and maintain it. How to make these professionals aware of free software and train them with solutions like Debian Edu / Skolelinux is a really good question :-)

Jacob Adams: Roman Finger Counting

12 May, 2020 - 07:00

I recently wrote a final paper on the history of written numerals. In the process, I discovered this fascinating tidbit that didn’t really fit in my paper, but I wanted to put it somewhere. So I’m writing about it here.

If I were to ask you to count as high as you could on your fingers you’d probably get up to 10 before running out of fingers. You can’t count any higher than the number of fingers you have, right? The Romans could! They used a place-value system, combined with various gestures to count all the way up to 9,999 on two hands.

The System

(Note that in this diagram 60 is, in fact, wrong, and this picture swaps the hundreds and the thousands.)

We’ll start with the units. The last three fingers of the left hand, middle, ring, and pinkie, are used to form them.

Zero is formed with an open hand, the opposite of the finger counting we’re used to.

One is formed by bending the middle joint of the pinkie, two by including the ring finger and three by including the middle finger, all at the middle joint. You’ll want to keep all these bends fairly loose, as otherwise these numbers can get quite uncomfortable. For four, you extend your pinkie again, for five, also raise your ring finger, and for six, you raise your middle finger as well, but then lower your ring finger.

For seven you bend your pinkie at the bottom joint, for eight you add your ring finger, and for nine you include your middle finger as well. This mirrors what you did for one, two, and three, but bending the fingers at the bottom joint instead.

This leaves your thumb and index finger for the tens. For ten, touch the nail of your index finger to the inside of your top thumb joint. For twenty, put your thumb between your index and middle fingers. For thirty, touch the nails of your thumb and index fingers. For forty, bend your index finger slightly towards your palm and place your thumb between the middle and top knuckle of your index finger. For fifty, place your thumb against your palm. For sixty, leave your thumb where it is and wrap your index finger around it (the diagram above is wrong). For seventy, move your thumb so that the nail touches between the middle and top knuckle of your index finger. For eighty, flip your thumb so that the bottom of it now touches the spot between the middle and top knuckle of your index finger. For ninety, touch the nail of your index finger to your bottom thumb joint.

The hundreds and thousands use the same positions on the right hand, with the units being the thousands and the tens being the hundreds. One account, from which the picture above comes, swaps these two, but the first account we have uses this ordering.

Combining all these symbols, you can count all the way to 9,999 yourself on just two hands. Try it!
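A toy sketch of the place-value split involved, in Python (my own illustration, not from any of the sources):

def roman_finger_split(n):
    # Split n (0-9999) the way the Roman system assigns digits to the hands:
    # units and tens on the left hand, hundreds and thousands on the right.
    assert 0 <= n <= 9999
    return {
        "left hand, last three fingers (units)": n % 10,
        "left hand, thumb and index (tens)": n // 10 % 10,
        "right hand, thumb and index (hundreds)": n // 100 % 10,
        "right hand, last three fingers (thousands)": n // 1000 % 10,
    }

print(roman_finger_split(9999))  # all four positions show 9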

History

The Venerable Bede

The first written record of this system comes from the Venerable Bede, an English Benedictine monk who died in 735.

He wrote De computo vel loquela digitorum, “On Calculating and Speaking with the Fingers,” as the introduction to a larger work on chronology, De temporum ratione. (The primary calculation done by monks at the time was calculating the date of Easter, the first Sunday after the first full moon of spring).

He also includes numbers from 10,000 to 1,000,000, but it's unknown whether these were inventions of the author; they were likely rarely used regardless. They require moving your hands to various positions on your body, as illustrated below, from Jacob Leupold's Theatrum Arithmetico-Geometricum, published in 1727:

The Romans

If Bede was the first to write it down, how do we know that it came from Roman times? It's referenced in many Roman writings, including this bit from the Roman satirist Juvenal, who died in 130:

Felix nimirum qui tot per saecula mortem distulit atque suos iam dextera computat annos.

Happy is he who so many times over the years has cheated death And now reckons his age on the right hand.

Because of course the right hand is where one counts hundreds!

There’s also this Roman riddle:

Nunc mihi iam credas fieri quod posse negatur: octo tenes manibus, si me monstrante magistro sublatis septem reliqui tibi sex remanebunt.

Now you shall believe what you would deny could be done: In your hands you hold eight, as my teacher once taught; Take away seven, and six still remain.

If you form eight with this system and then remove the symbol for seven, you get the symbol for six!

Sources

My source for this blog post is Paul Broneer’s 1969 English translation of Karl Menninger’s Zahlwort und Ziffer (Number Words and Number Symbols).

Markus Koschany: My Free Software Activities in April 2020

12 May, 2020 - 00:53

Welcome to gambaru.de. Here is my monthly report (+ the first week in May) that covers what I have been doing for Debian. If you’re interested in Java, Games and LTS topics, this might be interesting for you.

Debian Games

Playonlinux
  • Scott Talbert did a fantastic job by porting playonlinux, a user-friendly frontend for Wine, to Python 3 (#937302). I tested his patch and uploaded the package today. More testing and feedback is welcome. Scott’s work basically prevented the removal of one of the most popular packages in the games section. I believe this will also give interested people more time to package the Java successor of playonlinux called Phoenicis.
  • Reiner Herrmann ported ardentryst, an action role playing game, to Python 3 to fix a release critical Py2 removal bug (#936148). He also packaged the latest release of xaos, a real-time interactive fractal zoomer, and improved various packaging details. I reviewed both of them and sponsored the upload for him.
  • I packaged new upstream releases of minetest, lordsawar, gtkatlantic and cutemaze.
  • I also sponsored a new simutrans update for Jörg Frings-Fürst.
Debian Java

Misc
  • I packaged new versions of wabt and binaryen, required to build Webassembly code from source.
Debian LTS

This was my 50th month as a paid contributor, and I have been paid to work 11.5 hours on Debian LTS, a project started by Raphaël Hertzog. In that time I did the following:

  • I completed the security update of Tomcat 8 in Stretch released as DSA-4673-1 and Tomcat 8 in Jessie soon to be released as DLA-2209-1.
  • I am currently assigned more hours and my plan is to invest the time in a project to improve our knowledge about embedded code copies and their current security impact which I want to discuss with the security team. The rest will be spent on Stretch security updates which will become the new LTS release soon.
ELTS

Extended Long Term Support (ELTS) is a project led by Freexian to further extend the lifetime of Debian releases. It is not an official Debian project, but all Debian users benefit from it without cost. The current ELTS release is Debian 7 "Wheezy". This was my 23rd month, and I have been paid to work 2 hours on ELTS.

  • I prepared the fix for CVE-2019-18218 in php5 released as ELA-227-1.
  • I checked jetty for unfixed vulnerabilities and discovered that the version in Wheezy was not affected by CVE-2019-17632. No further action was required.
  • It turned out that the apache2 package in Wheezy was not affected by vulnerable embedded expat code because it depends on already fixed system packages.

Thanks for reading and see you next time.

Julien Danjou: Interview: The Performance of Python

11 May, 2020 - 18:40

Earlier this year, I was supposed to participate in dotPy, a one-day Python conference happening in Paris. This event has unfortunately been cancelled due to the COVID-19 pandemic.

Both Victor Stinner and I were supposed to attend that event. Victor had prepared a presentation about Python performance, while I was planning on talking about profiling.

Rather than being completely discouraged, Victor and I sat down (remotely) with Anne Laure from Behind the Code (a blog run by Welcome to the Jungle, the organizers of the dotPy conference).

We discussed Python performance, profiling, speed, projects, problems, analysis, optimization and the GIL.

You can read the interview here.

Gunnar Wolf: Certified printer fumes

11 May, 2020 - 14:00

After losing a fair bit of hair due to quality and reliability issues with our home laser multifunctional (a Brother DCP1600-series, which we bought after checking it was meant to work on Linux… and it does, but with a very buggy, proprietary driver, besides the printer itself being of quite low quality), we decided it was time to survey the market again and get a color inkjet printer. I was not much of an enthusiast of the idea, until I found that all of the major manufacturers now offer refillable ink tanks instead of the darn expensive cartridges of past decades. Let’s see how it goes!

Of course, with over 20 years of training, I did my homework. I was about to buy an Epson printer, but decided for an HP Ink Tank 410 Wireless printer. The day it arrived, not wanting to fuss around too much to get to see the results, I connected it to my computer using the USB cable. Everything ran smoothly and happily! No driver hunting needed, print quality is superb… I hope, years from now, we stay with this impression.

Next day, I tried to print over WiFi. Of course, it requires configuration. And, of course, the configuration strongly wants you to do it from a Windows or MacOS machine — which I don’t have. OK, fall back to Android — for which an app download is required (and does not thrill me, but what can I say). Oh — and the app needs location services to even run. Why‽ Maybe because it interacts with the wireless network in a non-authenticated, WiFi Direct way?

Anyway, things seem to work. But they don’t — my computers can identify and connect to the printer from CUPS, but nothing ever comes out. Printer paused, they say. The printer’s web interface is somewhat ambiguous — following the old HP practices, I tried http://192.168.1.75:9100/ (no point in hiding my internal IP), and got a partial webpage sometimes (and nothing at all other times). Seeing the printer got detected over ipps://, my immediate reaction was to try pointing the browser to port 631. Seems to work! Got some odd messages… but it seems I’ll soon debug the issue away. I am not a familiar meddler in the dark lands of CUPS, our faithful print server, but I had to remember my toolkit…

# cupsenable HP_Ink_Tank_Wireless_410_series_C37468_ --release

Success in enabling, but it self-auto-disabled right away… lpstat -t was not more generous, reporting only that it was still paused.

… Some hours later (mix in attending kids and whatnot), I finally remembered to try cupsctl --debug-logging, and magically /var/log/cups/error_log turns from being quiet to being quite chatty. And, of course, my first print job starts being processed:

D [10/May/2020:23:07:20 -0500] Report: jobs-active=1
(...)
D [10/May/2020:23:07:25 -0500] [Job 174] Start rendering...
(...)
D [10/May/2020:23:07:25 -0500] [Job 174] STATE: -connecting-to-device
(...)

Everything looks fine and dandy so far! But, hey, given nothing came out of the printer… keep reading one more second of logs (a couple dozen lines)

D [10/May/2020:23:07:26 -0500] [Job 174] Connection is encrypted.
D [10/May/2020:23:07:26 -0500] [Job 174] Credentials are expired (Credentials have expired.)
D [10/May/2020:23:07:26 -0500] [Job 174] Printer credentials: HPC37468 / Thu, 01 Jan 1970 00:00:00 GMT / 28A59EF511A480A34798B6712DEEAE74
D [10/May/2020:23:07:26 -0500] [Job 174] No stored credentials.
D [10/May/2020:23:07:26 -0500] [Job 174] update_reasons(attr=0(), s=\"-cups-pki-invalid,cups-pki-changed,cups-pki-expired,cups-pki-unknown\")
D [10/May/2020:23:07:26 -0500] [Job 174] STATE: -cups-pki-expired
(...)
D [10/May/2020:23:08:00 -0500] [Job 174] envp[16]="CUPS_ENCRYPTION=IfRequested"
(...)
D [10/May/2020:23:08:00 -0500] [Job 174] envp[27]="PRINTER_STATE_REASONS=cups-pki-expired"

My first stabs were attempts to get CUPS not to care about expired certificates, but that knob seems to have been hidden or removed from its usual place. Anyway, I was already frustrated.

WTF‽ Well, yes, turns out that from the Web interface, I paid some attention to this the first time around, but let it pass (speaks wonders about my security practices!):

So, the self-signed certificate the printer issued to itself expired 116 years before even being issued. (Is this maybe a Y2K38 bug? Sounds like it!) Interestingly, my CUPS log mentions the printer credentials as expiring at the beginning of the Unix Epoch (01 Jan 1970 00:00:00 GMT).

OK, let’s clickety-click away on the Web interface… It didn’t take me long to get to Network ⇒ Advanced settings ⇒ Certificates:

However, clicking on Configure leads me to the not very reassuring…

I don’t remember what I did for the next couple of minutes. Kept fuming… until I parsed the output of lpstat -t again, and found that:

# lpstat -t
(...)
device for HP_Ink_Tank_Wireless_410_series_C37468_: ipps://HPF43909C37468.local:443/ipp/print
(...)

Hmmmm… CUPS is connecting using good ol’ port 443, as if it were a Web thingy… What if I do the same?

Click on “New self-signed certificate”, click on Next, a couple of reloads… And a very nice color print came out of the printer, yay!

Now, it still baffles me (of course I checked!): the self-signed certificate is now said to come from Issuer: CN=HPC37468, L=Vancouver, ST=Washington, C=US, O=HP, OU=HP-IPG. Alright, not that it matters (I can import a more meaningful one if I really feel like it), but why is it Issued On: 2019-06-14 and set to Expires On: 2029-06-11?

Anyway, print quality is quite nice. I hope to keep the printer long enough to rant at the certificate being expired in the future!

Dirk Eddelbuettel: #1 T^4: Adding Some Color to the Shell

11 May, 2020 - 06:04

The first proper video (following last week’s announcement) is up for the new T^4 series of video lightning talks with tips, tricks, tools, and toys. Today we just do a little enhancement for the shell: enabling color output (if it is not already on by default).
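A minimal sketch of the kind of tweak this is about, assuming a GNU userland (my guess at the gist; see the slides and video for the actual settings):

# in ~/.bashrc (or your shell's equivalent): ask common tools to colorize
alias ls='ls --color=auto'
alias grep='grep --color=auto'
eval "$(dircolors -b)"   # sets LS_COLORS from the default database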

The slides are available here.

Next week we continue on shell customization by looking at the prompt.

Also of note, a new repo at GitHub to support the series: use it to open issues for comments, criticism, suggestions, or feedback.

If you like this or other open-source work I do, you can now sponsor me at GitHub. For the first year, GitHub will match your contributions.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Enrico Zini: Fraudsters and pirates

11 May, 2020 - 06:00
Adele Spitzeder - Wikipedia (history, people, archive.org, 2020-05-11)

Adelheid Luise "Adele" Spitzeder ([ˈaːdl̩haɪt ʔaˈdeːlə ˈʃpɪtˌtseːdɐ]; 9 February 1832 – 27 or 28 October 1895), also known by her stage name Adele Vio, was a German actress, folk singer, and con artist. Initially a promising young actress, Spitzeder became a well-known private banker in 19th-century Munich when her theatrical success dwindled. Running what was possibly the first recorded Ponzi scheme, she offered large returns on investments by continually using the money of new investors to pay back the previous ones. At the height of her success, contemporary sources considered her the wealthiest woman in Bavaria.

Anne Bonny - Wikipedia (history, people, archive.org, 2020-05-11)

Anne Bonny (possibly 1697 – possibly April 1782)[1][2] was an Irish pirate operating in the Caribbean, and one of the most famous female pirates of all time.[3] The little that is known of her life comes largely from Captain Charles Johnson's A General History of the Pyrates.

Mary Read - Wikipedia (history, people, archive.org, 2020-05-11)

Mary Read (1685 – 28 April 1721), also known as Mark Read, was an English pirate. She and Anne Bonny are two of the most famed female pirates of all time, and among the few women known to have been convicted of piracy during the early 18th century, at the height of the "Golden Age of Piracy".

Women in piracy - Wikipedia (history, people, archive.org, 2020-05-11)

While piracy was predominantly a male occupation, a minority of pirates were women.[1] On many ships, women (as well as young boys) were prohibited by the ship's contract, which all crew members were required to sign.[2]:303

Ben Hutchings: Debian LTS work, April 2020

11 May, 2020 - 00:27

I was assigned 20 hours of work by Freexian's Debian LTS initiative, and carried over 8.5 hours from March. I worked 26 hours this month, so I will carry over 2.5 hours to May.

I sent a (belated) request for testing an update of the linux package to 3.16.82. I then prepared and, after review, released Linux 3.16.83, including a large number of security fixes. I rebased the linux package onto that and will soon send out a request for testing. I also spent some time working on a still-embargoed security issue.

I did not spend significant time on any other LTS activities this month, and unfortunately missed the contributor meeting.

Russell Coker: IT Asset Management

10 May, 2020 - 20:42

In my last full-time position I managed the asset tracking database for my employer. It was one of those things that “someone” needed to do, and it seemed that the only way “someone” wouldn’t equate to “no-one” was for me to do it, which was OK. We used Snipe IT [1] to track the assets. I don’t have enough experience with asset tracking to say that Snipe is better or worse than average, but it basically did the job. Asset serial numbers are stored, you can have asset types that allow you to just add one more of a particular item, purchase dates are stored (which makes warranty tracking easier), and every asset is associated with a person or listed as available. While I can’t say that Snipe IT is better than other products, I can say that it will do the job reasonably well.

One problem that I didn’t discover until way too late was the fact that the finance people weren’t tracking serial numbers and that some assets in the database had the same asset IDs as the finance department and some had different ones. The best advice I can give to anyone who gets involved with asset tracking is to immediately chat to finance about how they track things, you need to know if the same asset IDs are used and if serial numbers are tracked by finance. I was pleased to discover that my colleagues were all honourable people as there was no apparent evaporation of valuable assets even though there was little ability to discover who might have been the last person to use some of the assets.

One problem that I’ve seen at many places is treating small items like keyboards and mice as “assets”. I think that anything that is worth less than 1 hour’s pay at the minimum wage (the price of a typical PC keyboard or mouse) isn’t worth tracking, treat it as a disposable item. If you hire a programmer who requests an unusually expensive keyboard or mouse (as some do) it still won’t be a lot of money when compared to their salary. Some of the older keyboards and mice that companies have are nasty, months of people eating lunch over them leaves them greasy and sticky. I think that the best thing to do with the keyboards and mice is to give them away when people leave and when new people join the company buy new hardware for them. If a company can’t spend $25 on a new keyboard and mouse for each new employee then they either have a massive problem of staff turnover or a lack of priority on morale.


Norbert Preining: Updating Dovecot for Debian

10 May, 2020 - 19:54

A tweet by a friend pointed me at the removal of Dovecot from Debian/testing, which surprised me a bit. Investigating the situation, it seems that Dovecot in Debian is lagging a bit behind in releases and hasn’t seen responses to some RC bugs. This sounds critical to me, as Dovecot is a core part of many mail setups, so I prepared updated packages.

Based on the latest released version of Dovecot, 2.3.10, I have made a package starting from the current Debian packaging, adjusted to the newer upstream. The package builds on Debian Buster (10), Testing, and Unstable on the i386 and amd64 architectures. The packages are available on OBS, as usual:

For Unstable:

deb https://download.opensuse.org/repositories/home:/npreining:/debian-dovecot/Debian_Unstable/ ./

For Testing:

deb https://download.opensuse.org/repositories/home:/npreining:/debian-dovecot/Debian_Testing/ ./

For Debian 10 Buster:

deb https://download.opensuse.org/repositories/home:/npreining:/debian-dovecot/Debian_10/ ./

To make these repositories work, don’t forget that you need to import my OBS gpg key: obs-npreining.asc, best to download it and put the file into /etc/apt/trusted.gpg.d/obs-npreining.asc.

These packages are provided without any warranty. Enjoy.

NOKUBI Takatsugu: Virtual Background using webcam

10 May, 2020 - 17:50

I made a webpage that produces a virtual background with a webcam.

https://knok.github.io/virtbg/

Source code:
https://github.com/knok/knok.github.io/tree/master/virtbg

Some online meeting software (Zoom, Microsoft Teams) supports virtual backgrounds, but I want to use other software like Jitsi (or Google Meet), so I made it.

To make this, I referred to the article “Open Source Virtual Background”.
The following figure is the diagram.

That solution depends on Docker, a GPU, and v4l2loopback (which only works on Linux), so I wanted to make a more generic one. By making it a webpage and using OBS Studio with plugins (obs-v4l2sink, OBS-VirtualCam, or OBS (macOS) Virtual Camera), you can use the solution on more platforms.

Making it a single webpage also avoids the overhead of inter-process communication over HTTP via Docker.

This is an example animation:

A snapshot using Jitsi:

Unfortunately, BodyPix releases only pretrained models, no training data.

I still need to make some improvements:

  • Accept arbitrary background images
  • Support choosing the camera device
  • A more useful UI

Russ Allbery: Review: Golden Gates

10 May, 2020 - 11:24

Review: Golden Gates, by Conor Dougherty

Publisher: Penguin
Copyright: 2020
ISBN: 0-525-56022-X
Format: Kindle
Pages: 249

This review, for reasons that will hopefully become clear later, starts with a personal digression.

I have been interested in political theory my entire life. That sounds like something admirable, or at least neutral. It's not. "Interested" means that I have opinions that are generally stronger than my depth of knowledge warrants. "Interested" means that I like thinking about and casting judgment on how politics should be done without doing the work of politics myself. And "political theory" is different than politics in important ways, not the least of which is that political actions have rarely been a direct danger to me or my family. I have the luxury of arguing about politics as a theory.

In short, I'm at high risk of being one of those people who has an opinion about everything and shares it on Twitter.

I'm still in the process (to be honest, near the beginning of the process) of making something useful out of that interest. I've had some success when I become enough a part of a community that I can do some of the political work, understand the arguments at a level deeper than theory, and have to deal with the consequences of my own opinions. But those communities have been on-line and relatively low stakes. For the big political problems, the ones that involve governments and taxes and laws, those that decide who gets medical treatment and income support and who doesn't, to ever improve, more people like me need to learn enough about the practical details that we can do the real work of fixing them, rather than only making our native (and generally privileged) communities better for ourselves.

I haven't found my path helping with that work yet. But I do have a concrete, challenging, local political question that makes me coldly furious: housing policy. Hence this book.

Golden Gates is about housing policy in the notoriously underbuilt and therefore incredibly expensive San Francisco Bay Area, where I live. I wanted to deepen that emotional reaction to the failures of housing policy with facts and analysis. Golden Gates does provide some of that. But this also turns out to be a book about the translation of political theory into practice, about the messiness and conflict that results, and about the difficult process of measuring success. It's also a book about how substantial agreement on the basics of necessary political change can still founder on the shoals of prioritization, tribalism, and people who are interested in political theory.

In short, it's a book about the difficulty of changing the world instead of arguing about how to change it.

This is not a direct analysis of housing policy, although Dougherty provides the basics as background. Rather, it's the story of the political fight over housing told primarily through two lenses: Sonja Trauss, founder of BARF (the Bay Area Renters' Federation); and a Redwood City apartment complex, the people who fought its rent increases, and the nun who eventually purchased it. Around that framework, Dougherty writes about the Howard Jarvis Taxpayers Association and the history of California's Proposition 13, a fight over a development in Lafayette, the logistics challenge of constructing sufficient housing even when approved, and the political career of Scott Wiener, the hated opponent of every city fighting for the continued ability to arbitrarily veto any new housing.

One of the things Golden Gates helped clarify for me is that there are three core interest groups that have to be part of any discussion of Bay Area housing: homeowners who want to limit or eliminate local change, renters who are vulnerable to gentrification and redevelopment, and the people who want to live in that area and can't (which includes people who want to move there, but more sympathetically includes all the people who work there but can't afford to live locally, such as teachers, day care workers, food service workers, and, well, just about anyone who doesn't work in tech). (As with any political classification, statements about collectives may not apply to individuals; there are numerous people who appear to fall into one group but who vote in alignment with another.) Dougherty makes it clear that housing policy is intractable in part because the policies that most clearly help one of those three groups hurt the other two.

As advertised by the subtitle, Dougherty's focus is on the fight for more housing. Those who already own homes whose values have been inflated by artificial scarcity, or who want to preserve such stratified living conditions as low-density, large-lot single-family dwellings within short mass-transit commute of one of the densest cities in the United States, don't get a lot of sympathy or focus here except as opponents. I understand this choice; I also don't have much sympathy. But I do wish that Dougherty had spent more time discussing the unsustainable promise that California has implicitly made to homeowners: housing may be impossibly expensive, but if you can manage to reach that pinnacle of financial success, the ongoing value of your home is guaranteed. He does mention this in passing, but I don't think he puts enough emphasis on the impact that a single huge, illiquid investment that is heavily encouraged by government policy has on people's attitude towards anything that jeopardizes that investment.

The bulk of this book focuses on the two factions trying to make housing cheaper: Sonja Trauss and others who are pushing for construction of more housing, and tenant groups trying to manage the price of existing housing for those who have to rent. The tragedy of Bay Area housing is that even the faintest connection of housing to the economic principle of supply and demand implies that the long-term goals of those two groups align. Building more housing will decrease the cost of housing, at least if you build enough of it over a long enough period of time. But in the short term, particularly given the amount of Bay Area land pre-emptively excluded from housing by environmental protection and the actions of the existing homeowners, building more housing usually means tearing down cheap lower-density housing and replacing it with expensive higher-density housing. And that destroys people's lives.

I'll admit my natural sympathy is with Trauss on pure economic grounds. There simply aren't enough places to live in the Bay Area, and the number of people in the area will not decrease. To the marginal extent that growth even slows, that's another tale of misery involving "super commutes" of over 90 minutes each way. But the most affecting part of this book was the detailed look at what redevelopment looks like for the people who thought they had housing, and how it disrupts and destroys existing communities. It's impossible to read those stories and not be moved. But it's equally impossible to not be moved by the stories of people who live in their cars during the week, going home only on weekends because they have to live too far away from their jobs to commute.

This is exactly the kind of politics that I lose when I take a superficial interest in political theory. Even when I feel confident in a guiding principle, the hard part of real-world politics is bringing real people with you in the implementation and mitigating the damage that any choice of implementation will cause. There are a lot of details, and those details matter. Without the right balance between addressing a long-term deficit and providing short-term protection and relief, an attempt to alleviate unsustainable long-term misery creates more short-term misery for those least able to afford it. And while I personally may have less sympathy for the relatively well-off who have clawed their way into their own mortgage, being cavalier with their goals and their financial needs is both poor ethics and poor politics. Mobilizing political opponents who have resources and vote locally isn't a winning strategy.

Dougherty is a reporter, not a housing or public policy expert, so Golden Gates poses problems and tells stories rather than describes solutions. This book didn't lead me to a brilliant plan for fixing the Bay Area housing crunch, or hand me a roadmap for how to get effectively involved in local politics. What it did do is tell stories about what political approaches have worked, how they've worked, what change they've created, and the limitations of that change. Solving political problems is work. That work requires understanding people and balancing concerns, which in turn requires a lot of empathy, a lot of communication, and sometimes finding a way to make unlikely allies.

I'm not sure how broad the appeal of this book will be outside of those who live in the region. Some aspects of the fight for housing generalize, but the Bay Area (and I suspect every region) has properties specific to it or to the state of California. It has also reached an extreme of housing shortage that is rivaled in the United States only by New York City, which changes the nature of the solutions. But if you want to seriously engage with Bay Area housing policy, knowing the background explained here is nearly mandatory. There are some flaws (I wish Dougherty had talked more about traffic and transit policy, although I realize that could be another book), but this is an important story told well.

If this somewhat narrow topic is within your interests, highly recommended.

Rating: 8 out of 10

Andrew Cater: CD / DVD testing for Buster release 4 - 202005092130 - Slowing down a bit - but still going.

10 May, 2020 - 04:39
The last few architectures are being built in the background. Schweer has just confirmed successful testing of all the Debian Edu images - thanks to him, as ever, and to all involved. We're slowing down a bit - it's been a long, hot day and it's not quite over yet. The images release looks to be well on course. As ever, the point release incorporates security fixes, and some packages have been removed. The release announcement at https://www.debian.org/News/2020/20200509 gives the details.

Andrew Cater: CD image testing for Buster release 4 - 202005091950 - Most install images checking out well

10 May, 2020 - 02:56
Lots of hard work going on. Schweer has just validated all of the Debian Edu images. Most of the normal install images have gone through tests with only a few minor hitches. Now moving on to the live images; these take longer to download and test, but we're working through them gradually.

As ever: a point release doesn't mean that the Debian you have is now obsolete - an apt-get / aptitude update will bring you up to the latest release very quickly. If you are updating regularly, you will have most of these files anyway. One small thing: the tools may report that the release version has changed. This is quite normal: base files have changed to reflect the new point release, and that triggers the notification. It is a small warning so that you are not taken by complete surprise; it is entirely expected in the circumstances of a Debian point release.
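For example (a minimal sketch; run as root or via sudo):

apt-get update
apt-get upgrade

The first command fetches the new package lists for the point release; the second installs the updated packages.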

Thanks to the other folk doing the hard work: 10+ hours and continuing.


Creative Commons License: the copyright of each article belongs to its respective author. This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.