Planet Debian - https://planet.debian.org/

Russ Allbery: Review: Cold Fire

Monday 27th of May 2019 02:19:00 AM

Review: Cold Fire, by Kate Elliott

Series: Spiritwalker #2
Publisher: Orbit
Copyright: September 2011
ISBN: 0-316-19635-5
Format: Kindle
Pages: 512

Cold Fire is the sequel to Cold Magic and picks up directly where the last book left off. Elliott does a good job reminding the reader of the events in the previous book, but as the second book in a series with strong trilogy structure, it's not a good place to start.

The story opens with more political intrigue. Cat, Bea, and Rory meet, somewhat more formally, the force behind the political radicals, who comes complete with intimidating prophecies about the role of Cat and Bea in upcoming political upheavals. This is followed by some startling revelations about the headmaster of the school Cat and Bea were attending at the start of Cold Magic, which cast the politics of this series in a new and more complicated light. But before long, Cat is thrown into the spirit world for a frightening, revealing, and ominous confrontation with an entirely different power, and from there to literally the other side of the world.

The challenge of a trilogy is always what to do in the second book. The first book introduces the characters and lays the groundwork of the story, and the third book is the conclusion towards which the whole series builds. The second book is... awkward. The plot needs to move forward to keep the reader engaged, so it needs some intermediate climax, but it can't resolve the central conflict of the series. The problem is particularly acute when the trilogy is telling a single story split across three books, as is the case here. Elliott takes one of the limited choices: throw the protagonist into an entirely different side quest that can have its own climax without resolving the main plot.

That side quest involves this world's version of the Caribbean, an introduction to a much different type of magic than the two (or arguably three) seen so far, and the salt plague. Cat washes up with little but the clothes on her back, in the worst possible location, and has to navigate a new social structure, a new set of political complexities, and an entirely foreign culture, all while caught in a magical geas. The characters from the first book do slowly filter back into the story, but Cat has to rely primarily on her own ingenuity and her own abilities.

I know very little about the region and therefore am not the reviewer to comment on Elliott's Caribbean, although I do think she was wise (as she mentions in the book) to invent an entirely fictional patois rather than trying to adopt one from our world. I can say that the political situation follows the overall trend of this series: what if no one ever decisively won a war, and every culture remained in an uneasy standoff? This story takes place in Expedition (referred to a few times in the first book): a carved-out enclave of independent local rule that serves as a buffer between traders from Cat's Europe and a powerful local civilization built on substantial fire magic. The trolls are here too and play a significant role, although this is not their home. The careful balance of power, and the lack of conquest or significant colonialism, feel refreshingly different. Elliott manages to pull off combining that world with the threat of a version of the Napoleonic Wars without too much cognitive dissonance, at least for me.

The strength of this book is its ability to portray the simmering anger and hope of rebellion and radicalism. The background politics are clearly inspired by the French Revolution and the subsequent popular uprisings such as the June Rebellion (known in the US primarily due to Les Miserables), and they feel right to me. Society is fractured along class fault lines, people are careful about what they say and to whom, radicals meet semi-openly but not too openly, and the powers-that-be periodically try to crush them and re-establish dominance. But beneath the anger and energy is an excited, soaring optimism, a glimpse at a possible better world to fight for, that I enjoyed as an emotional backdrop to Cat's story.

That said, none of this moves the plot of the first book forward very far, which is a little unsatisfying. We're given some significant revelations about the world at the very start of this book, and pick up the fraught political maneuverings and multi-sided magical conflict at the end of the book, but the middle is mostly Cat navigating friendships and social judgment. Oh, and romantic tensions.

It was obvious from the first book that this was going to turn into a romance of the "bicker until they fall in love" variety. I'm somewhat glad Elliott didn't drag that out into the third book, since I find the intermediate stages of those romances irritating. But that means there's a lot of conflicted feelings and people refusing to talk to each other and miscommunication and misunderstanding and apparent betrayal in this book. It's all very dramatic in a way that I found a little eye-roll-inducing, and I would have preferred to do without some of the nastier periods of blatant miscommunication. But Elliott does even more work to redeem Andevai, and I continue to like Cat even when she's being an idiot. She has the substantial merits of erring on the side of fighting for what she believes and being unable to stay quiet when she probably should.

I think this was a bit weaker than Cold Magic for primarily structural reasons, and it ran into a few of my personal dislikes, but if you liked the first book, I think you'll like this as well. Both Cat and Bea have grown and changed substantially since the first book, and are entering the final book with new-found confidence and power. I'm looking forward to the conclusion.

Followed by Cold Steel.

Rating: 7 out of 10

Keith Packard: snek-scoping

Monday 27th of May 2019 01:02:06 AM

Snek Adopts More Python Scoping

Python's implicit variable declarations are tricky and Snek had them slightly wrong. Fixing this meant figuring out how they work in Python, then figuring out the simplest possible expression to make the result fit in the ROM.

Local Variable Declaration

Local variables are declared in Python either as formal parameter names, or by placing them on the left hand side of a simple assignment operator:

def foo(a, b):
    c = 12
    return a + b + c

There are three local variables in function foo — a, b and c.

Global Variable Declaration

Global variables are declared in Python in one of two ways:

1) A simple assignment at global scope

2) A simple assignment in a function which also has a 'global' statement including the same name

a = 12

def foo(c):
    global b
    b = c

This defines both 'a' and 'b' as globals.

Global Variable Usage

Global variables can be used within functions without explicitly declaring them.

a = 12

def foo(c):
    return a + c

You may be explicit about a's scope using a 'global' statement

a = 12

def foo(c):
    global a
    return a + c

These two forms are equivalent, unless you also include an assignment expression with a on the LHS (left hand side):

a = 12

def foo(c):
    a = 13
    return a + c

is not the same as

a = 12

def foo(c):
    global a
    a = 13

as the former declares a new local, 'a', and leaves the global unchanged while the latter changes the global value.

Local Variable Usage

Python3 does whole-function analysis to figure out whether a name is local or not. If there is any assignment of a name within a function, that name references a local variable. Consider the following:

a = 12

def foo(c):
    b = a + c
    return b

def bar(c):
    b = a + c
    a = 1
    return b

The function 'foo' references the global named 'a', while the function 'bar' attempts to reference the local named 'a' before it has been assigned a value and, hence, generates an error.
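To make that concrete, here is roughly what CPython reports for the 'bar' case (a small runnable sketch; the exact wording of the error message varies between Python versions):

a = 12

def bar(c):
    b = a + c   # CPython already treats 'a' as local here...
    a = 1       # ...because of this later assignment
    return b

try:
    bar(13)
except UnboundLocalError as err:
    print(err)  # local variable 'a' referenced before assignment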

Snek doesn't do this whole-function analysis, so 'bar' uses the global 'a' in the first statement as it hasn't yet reached the definition of 'a' as a local variable.

Augmented Assignments

Python Augmented Assignment statements are similar to C's Compound assignment operators — +=, *=, /=, etc. The Python reference has this to say about them:

"An augmented assignment expression like x += 1 can be rewritten as x = x + 1 to achieve a similar, but not exactly equal effect."

Because they work similarly to assignment statements, they can declare a new variable in the current scope if no such name has appeared in a previous assignment, global, or nonlocal statement. Also, because they reference the variable on the RHS (right hand side), they need that variable to have already been defined before the statement executes.
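Both behaviours (declaring a name in the current scope, and requiring the name to already have a value) are easy to see in standard Python; a minimal sketch with made-up names:

def counter():
    n = 0
    n += 1      # fine: 'n' was bound by the plain assignment above
    return n

def broken():
    m += 1      # this statement makes 'm' local to broken()...
    return m    # ...but 'm' has no value yet when the += runs

print(counter())        # prints 1

try:
    broken()
except UnboundLocalError as err:
    print(err)          # local variable 'm' referenced before assignment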

Scoping in Snek

Because Snek doesn't do whole-function analysis, it can't 'see' later assignments in a function, and so a function with a use-before-assignment generates the following (non-Pythonic) result:

a = 12

def foo(c):
    b = a + c
    a = 1
    return b

> foo(13)
25

Fixing this would require additional tracking within the compiler, which I may add at some point, but for now, saving memory during compilation seems useful.

Snek Augmented Assignments

While Snek doesn't currently handle the general case of use-before-assignment involving separate statements, the simpler case with augmented assignments doesn't require saving any state during compilation, and seems more useful to catch because without the check you would get:

a = 12

def foo(c):
    a += c
    return a

> foo(13)
25
> a
12

The value of 'a' is left as 12 because the augmented assignment fetches 'a' first, which finds the global variable 'a', but then when it assigns the resulting value, it creates a new local variable 'a', just as if this code looked like:

a = 12

def foo(c):
    b = a + c
    return b

Checking this case requires adding a special case for augmented assignment within a function, to see whether the name has been declared or included in a 'global' statement in the function.

Dirk Eddelbuettel: nanotime 0.2.4

Monday 27th of May 2019 12:18:00 AM

Another minor maintenance release of the nanotime package for working with nanosecond timestamps arrived on CRAN yesterday.

nanotime uses the RcppCCTZ package for (efficient) high(er) resolution time parsing and formatting up to nanosecond resolution, and the bit64 package for the actual integer64 arithmetic. Initially implemented using the S3 system, it now uses a more rigorous S4-based approach thanks to a rewrite by Leonardo Silvestri.

This release adds the [[ accessor; this had not come up before in direct use or via data.table (which, to its credit, has supported nanotime for years already). But it came up in another usage pattern so we quickly added it, as it really is merely a dispatch to the excellent bit64 package underlying this.

Changes in version 0.2.4 (2019-05-25)
  • Define [[ method (Dirk in #45 fixing #44).

We also have a diff to the previous version thanks to CRANberries. More details and examples are at the nanotime page; code, issue tickets etc at the GitHub repository.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Iustin Pop: Corydalis v0.4 released!

Sunday 26th of May 2019 09:53:41 PM

Today I managed to do two things that I’m proud of: first, I cast my vote in the Romanian Euro-Parliament elections and the referendum (it makes me cringe that we have to vote about such a thing, in 2019; or, it makes me happy we can vote about it, take your pick). Since this is not a political blog, let’s skip my rants about that, and move on to the subject at hand: second, after what seems like an eternity, I’ve finally managed to put together a new Corydalis release. Why so long? Well…

At first, after the previous release (in March last year), I stopped for a while, which turned into ~6 months of no activity; only in October did I really start working on it again. Once I restarted, I had three main things in progress in parallel, and only now have I managed to finish them all. April/May felt like a long, hard push to get things finished, and I'm very happy with the result.

Background info: Corydalis is a web-based image/video viewer to be used for local (non-cloud) image collections. Think of it as Geeqie but web-based and understanding your entire image collection. And 100× harder to initially setup and run.

Beware: long post and many/large images below!

New features

Movies!

First and most important for me: movies support. Not perfect, as it fully relies on the browser, but good enough for a start.

There are two ways to view a movie. First, in the old image viewer: click on the preview, or press enter, and it will open the raw bytes in a separate browser window (tab). Clunky, but it works, and so far I haven't found a way to nicely embed a <video> element in the canvas-based image viewer.

Second, in the new browsing UI (next paragraph), they render nicely in a lightbox.

Things to fix here, quite a few:

  • should pre-render movies in formats accepted by the browser
  • and at reasonable sizes (4K is not a good option)
  • orientation is often problematic
  • etc. etc.

But I can see my movies as well.

New image browsing UI

Second, and a very important UI improvement: finally a friendly UI to browse images; the single-image viewer was good (and is still the fastest way to look at images sequentially), but looking at a collection of pictures was beyond painful (tables, oh my!).

So it looks like this now:

Image browsing view

Or, when opening (clicking/touching) on an image or movie, it opens a “light-box” as follows:

Landscape-orientation lightbox

or:

Portrait-orientation lightbox

Whereas the previous version of “image browsing” was just showing a table view with some small thumbnails:

Image list view

Quite a change, right?

The table-based view is still present, as sometimes (for library analysis) it’s more useful, but it’s no longer the default. Well, in all cases but one—folder view (to be fixed). And one can switch at will between image/folder views on one hand, and grid/list on the other.

The same new UI applies also when looking at folders, except that activating a folder doesn’t open a light-box, but instead goes to the old image viewer for that specific image. Not sure what the best UI would be here.

All the new features here are courtesy of many cool JavaScript libraries: masonry, infinite scroll, images loaded, fancybox.

Improved repository scanning

And finally, for actually being able to work with large libraries: reworked repository scanning, and now it works a) reliably, and b) with a nice progress report. It looks like this when starting:

Starting scan

and later it looks like this:

Rendering previews

Or, if you’ve misconfigured something, it can look like:

Oops!

Other changes

There have been many internal improvements, which will unblock more features in the future, but among the more user-visible ones: more search atoms, fewer cases of “image not viewable” when looking at folders, better/more lenient EXIF parsing, search atom parser fixes, etc.

On the library curation side, there have been a number of low-profile improvements, but nothing very solid yet. I would need something like “you haven’t used camera/lens X in N months, you should sell it”, or some other kind of more practical insight into all the picture data, but for now there are only some graphs like:

Camera statistics over time

Note the above graph is very simplistic since there were very few images in the public site when I generated it. Normally, with tens of thousands of images, it gives a much more interesting “trends-over-time” view.

Demo site and documentation

The above screenshots are from the demo site at demo.corydalis.io, which runs the new version. I also took the opportunity to refresh and increase the number of pictures I have here, to show a (tiny) bit how it works for larger repositories.

The documentation, including a detailed changelog, is as usual on corydalis.readthedocs.io.

The code is, as before, on GitHub. The installation process is probably still the clunkiest thing, but… I should probably create a docker image that allows quick experimentation. To me installation is clear, which most likely only means that nobody else tried it yet :-P.

Code stats

As before, Corydalis is still a tiny bit of Haskell on the backend, driving a humongous amount of JavaScript libraries on the frontend.

More precisely, and counting only code lines, not comment/white space: my own code is ~5.5K lines of Haskell, 500 lines of JavaScript, and less than one hundred lines of CSS. The libraries I depend on, on the other hand: 83K lines of JavaScript, and 12.7K lines of CSS. It’s true that Plotly is the elephant in the room, since it accounts for 58K of those 83K lines, i.e. 70% of the code.

What this tells me is that while I might be able to write some Haskell, I’m still very unaccustomed to front-end development, beyond putting other people’s things together. Well, it works so far, but I should really learn more—there’s so much more that could be done before this small webapp would be a real app.

Just looking at changes on my own part of the code-base since last release:

$ git diff v0.3.0.. src/ static/corydalis/ test/ app/ doc/
…
55 files changed, 4223 insertions(+), 1425 deletions(-)

I.e. significant churn, given that the files in all those directories total ~10K lines now.

On the (frontend) dependencies side, the list of things shipped with Corydalis has increased, and I still don’t know what the best model for keeping them up to date would be. Just linking to CDNs is something I don’t want to do, so I’m stuck for the moment with embedding things.

Testing

One positive aspect of the repository-scanning rework is that it enabled a significant internal change: decoupling the app state from “global” variables (IORefs) and moving it to a context/handler pattern, which means testing more complex setups can now be done. I haven’t done much to expand testing in this release (sadly), but nevertheless, coverage (“expressions”, as hpc says) improved to a paltry ~30%, from the previous ~10% I think.

However, that’s just the Haskell/backend side. I have no idea how to test the JS part, and am mostly relying for now on the fact that I don’t have much JS code myself. Also, see the notes above about front-end development. One day I’ll learn PureScript. Or Elm. Or wait for GhcJS. Or or or…

Future

I’m at the stage now where Corydalis has definitely scratched 90% of the initial itch, but there are still some things that I need myself, so I think I’ll keep developing it, but not like in the past two months. Ideally at a lower but more constant pace.

On the other hand, I’m somewhat tempted to move my public pictures from SmugMug to it, becoming self-hosting (yay!), but that would require some more work to make it somewhat viable as an external image hosting site (and not one for internal use), so not sure yet which direction to go into.

Or, it might be another year without a release, we’ll see.

And as usual, comments are very welcome, either here or on GitHub.

Laura Arjona Reina: MiniDebConf Marseille 2019

Sunday 26th of May 2019 09:14:25 AM

I attended the MiniDebConf Marseille (France) this weekend (25–26 May 2019).

I’m very happy that I could meet new Debian and free software friends and meet again other Debian friends.

I gave a talk about the Welcome team and some examples of non-packaging contributions to Debian. You can see the slides in the Welcome team wiki page and the video will be linked there when it is available (probably soon, thanks to our awesome Debian Video Team!).

I could also talk face to face with some Debian mates about the publicity team and other things. Everybody was very welcoming, a day full of good moments.

The conference is still going on today, but I’m already on my way home (I couldn’t attend on Sunday).

I also paid attention to how the event was organised; maybe I can find the opportunity and the resources to host a Debian event in my area in the future.

I gave away some Debian/free software stickers and got some more made by Debian France, to ensure I always have some for future Debian contributors.

Steve Kemp: Language/Communication development is fascinating

Sunday 26th of May 2019 09:01:00 AM

We have a child who is now reaching 2.5 years old, and watching the language develop is fascinating. No doubt every parent experiences a similar level of amazement, but it's still new to me.

Part of the fun of watching our child grow his communication skills is obviously his bilingual nature; his mother speaks Finnish to him, and I speak English. Of course the pair of us communicate in English almost exclusively, but by contrast basically every other conversation he hears will be in Finnish.

Today I was struck by a new milestone, as he said the word "dog" for the first time.

I continue to read books to him, and of course they're simple books with lots of pictures. For over a year now he's been able to follow simple instructions:

  • Can you point to the dog?
  • Can you point to the cat?

I remember being really impressed when he was coming up to two years old, and he was consistently able to play that game - despite the various dogs/cats being drawn in different styles, and from different perspectives. (Cartoon dogs vary a lot; but he always was able to recognize them. He's obviously internalized the nature of the dog...)

Anyway, he speaks pretty well, getting into two- to four-word sentences now. His favourite words are, predictably enough, "Äiti" (mother) and "en" (no). But he had never said "dog" until today; instead he has said:

  • Woof-woof
    • Probably as a result of months of me saying "dogs go woof", "cats go miow", & etc.
  • Hauva
    • Finnish for "doggy".
  • Dana
    • We have a neighbour with a dog. This dog moved in with us for a few days, and he fell in love.
    • That dog is called Dana, so suddenly all dogs became Dana.

Anyway today we were walking to the park and he said "iso-dog", "iso" being Finnish for "big". Indeed there was a big dog in front of him.

Good dog. Good boy.

Some of our conversations are quite intricate, some of the instructions we give him he can clearly understand/follow along - in two languages - but when I hear him use a new word, especially an English word, I'm suddenly reminded how awesome everything is.

Gunnar Wolf: Towel Day 2019

Sunday 26th of May 2019 03:39:32 AM

Today we went to celebrate a good friend's birthday. And while most of my social circles are in some way geeky or geekier... This one is definitively geekiest. Not so much in the Free Software alignment scale, but in many, many other ways.

I was (pleasantly!) surprised to find we were four fellow potential hitchhikers (on the photo above, Jesús Wong; Susana and Aaron were also towel-bearers).
Oh, but you are still asking yourself what this is about?
I gather you have not yet read The Hitchhiker's Guide to the Galaxy, by Douglas Adams. International Towel Day has been observed annually on May 25 since 2001. And why? In Adams' words:

A towel, it says, is about the most massively useful thing an interstellar hitchhiker can have. Partly it has great practical value. You can wrap it around you for warmth as you bound across the cold moons of Jaglan Beta; you can lie on it on the brilliant marble-sanded beaches of Santraginus V, inhaling the heady sea vapours; you can sleep under it beneath the stars which shine so redly on the desert world of Kakrafoon; use it to sail a miniraft down the slow heavy River Moth; wet it for use in hand-to-hand-combat; wrap it round your head to ward off noxious fumes or avoid the gaze of the Ravenous Bugblatter Beast of Traal (such a mind-bogglingly stupid animal, it assumes that if you can't see it, it can't see you — daft as a brush, but very very ravenous); you can wave your towel in emergencies as a distress signal, and of course dry yourself off with it if it still seems to be clean enough.

More importantly, a towel has immense psychological value. For some reason, if a strag (strag: non-hitch hiker) discovers that a hitchhiker has his towel with him, he will automatically assume that he is also in possession of a toothbrush, face flannel, soap, tin of biscuits, flask, compass, map, ball of string, gnat spray, wet weather gear, space suit etc., etc. Furthermore, the strag will then happily lend the hitch hiker any of these or a dozen other items that the hitch hiker might accidentally have "lost." What the strag will think is that any man who can hitch the length and breadth of the galaxy, rough it, slum it, struggle against terrible odds, win through, and still knows where his towel is, is clearly a man to be reckoned with.

Thanks to the DC18 organizers for providing such a handy gift, thanks to Andreas Tille for kindly reminding us the observation of this important festivity, and thanks to Felipe Esquivel for providing photographic evidence.

Russ Allbery: Review: The Raven Tower

Sunday 26th of May 2019 12:32:00 AM

Review: The Raven Tower, by Ann Leckie

Publisher: Orbit
Copyright: February 2019
ISBN: 0-316-38871-8
Format: Kindle
Pages: 432

Mawat is the heir to the Raven's Lease, raised with the self-assurance, determination, stubbornness, and certainty of one who will become the interface between his people and their god. Thankfully, he also has some good sense. One sign of that good sense is Eolo, his servant and companion: thoughtful, careful, curious, guarded, and well-accustomed to keeping secrets and private counsel. Eolo is the window through which the reader sees the city-state of Vastai, confident and certain in its divine protection and its long-standing bargain with the god Raven.

Raven manifests in the Instrument, a designated raven who can speak and give advice to the ruler of Vastai, the Raven's Lease. The Lease cannot be harmed, cannot be killed, because they are a sacrifice to the Raven. When the Instrument dies, so does the Lease, and a new Lease is chosen as the new Instrument is hatched. Vastai and the kingdom of Iraden have flourished under this arrangement for centuries. Mawat, hot-headed and sure of himself, will be the next Lease once his father sacrifices his life to the Raven. As this book opens, Mawat and Eolo are hurrying back to the city in anticipation of that event.

Mawat is very surprised when he arrives in Vastai and finds the Instrument dead, his uncle the apparent new Raven's Lease, and his father supposedly fled but not properly dead as the expected sacrifice. This is not how the world was supposed to work. Either someone is lying, or things have gone horribly wrong.

In another fantasy novel, that would be the story. Hot-headed but good-hearted Mawat walks into unexpected political intrigue, and his loyal and cautious servant Eolo untangles it for him, proving that Mawat's one redeeming feature is his good choice in friends. This is not that book, because the protagonist of The Raven Tower is not actually Eolo.

The protagonist is a large rock.

This is the second work I've read recently, after N.K. Jemisin's Broken Earth trilogy, that uses a second-person narrator as a world-building hint. From the start of The Raven Tower, Eolo is addressed as "you" and observed and commented on by the narrator. The easy initial assumption is that the narrator is the Raven, but if so there's a drastic mismatch between how the people of Iraden see their god and how the narrator describes events. The reader learns more about the nature of the narrator only slowly, through flashbacks into the far past. It becomes clear quickly that the narrator is a god, a being whose every statement must either be true, become true, or lead to their death. What god, and how the narrator relates to Raven, Eolo, Mawat, or the city of Vastai, remains murky until the very end of the book.

I think this story is going to wrong-foot some readers. It starts in the form of a fantasy political intrigue involving lines of succession made more complicated by divinity and magic, but that's not what The Raven Tower is about at all. The flashbacks are less background than the heart of the story: a slow and careful examination of the nature of power and the relationship between humans and gods in this world. If your reaction to the antics of the gods of classical mythology is bafflement at why they risk so much and involve themselves in so much drama, this might be the book for you.

I'm not sure I can do better than Light's comment in her review (spoilers in the comments): "I fuckin' love that rock." In a world full of gods who meddle and support kingdoms and go to wars, the narrator of this novel much prefers to watch and analyze and take time to draw proper conclusions. They're also prone to deciding to think about something for a week or two. This is a relentlessly self-aware and introspective book in a way that I found soothing and oddly compelling, particularly once the narrator rock makes friends largely by accident and has to work through the unexpected feelings of emotional entanglement. (My favorite supporting character in this book by far is Myriad, and it takes some doing to get me to fall in love with a mosquito swarm.) It turns into a story about restraint, careful navigation of dangerous situations, oppression, historical injustice, and a very long game.

The downside is that Eolo's story gets somewhat sidelined. The ending is going to be unsatisfying for a lot of readers since the surface story doesn't get a lot of closure. I liked Eolo for a whole host of reasons and wanted more of an end to their story than I got. (I'm using "they" for pronouns by default here. Eolo is trans and passing as male, but it's unclear to me from the story whether they identify as male or non-binary.) Eolo and Mawat provide an important outside perspective, and ground the longer story and make it more immediate, but they're present here more to provide key pieces of the puzzle than to drive the story themselves.

That caveat aside, I really enjoyed this book. It's less immediately engaging and emotionally engrossing than the Imperial Radch novels, but it's a story that slowly grew on me and will stick with me for a long time. There's something deeply relatable in how the narrator relentlessly examines their interactions with the world, and what makes them happy, sad, and interested, and still arrives at conclusions that are a messy combination of logic and emotion because that's what is actually true. The story is beautifully constructed to show that change over time. It's full of tradeoffs and limitations and partial truths, and it lets them sit there on the page and be felt rather than resolving all of them. It's the sort of novel that gets better the more I think about it.

Be warned going in that you're not getting the medieval court drama with gods that you may think you're getting, but otherwise, highly recommended.

Rating: 8 out of 10

Jonathan Wiltshire: RC candidate of the day (5)

Saturday 25th of May 2019 05:00:47 PM

Sometimes the list of release-critical bugs is overwhelming, and it’s hard to find something to tackle.

In #929269 we find that coturn always overwrites its database with a blank file during upgrades. It should probably be created once and not shipped as part of the package.

Iustin Pop: The rain _will_ come

Saturday 25th of May 2019 03:00:00 PM

I haven’t written a blog post in a month because I’ve been entirely caught up in preparing a new Corydalis release… to the extent that today I felt completely fed up and wanted to do something else.

Initially the weather forecast said rain all afternoon, and around noon it was quite overcast, but then it lightened up, and the forecast moved the rain to late afternoon (5-6pm) and decreased the intensity (<1mm/hr, which is nothing). So (as it was still a bit before 3pm) a bike ride seemed in order. Especially as I recently realised I hadn’t ridden my road bike in—gasp—10 months :( So, thinking about a 2-2½ hour slow ride, I got ready, took my road bike shoes, took the Garmin, took the bike pump, got into the bike room, pumped up the wheels, and got ready to ride.

First partial surprise: power meter battery dead. Like dead dead, didn’t even manage to wake it up. Kind of expected, since on my last ride in July last year it said low battery, but I was kind of hoping it still had some juice. Never mind, a ride is a ride.

Second surprise: as I exit the garage, the sun is nowhere to be seen, clouds all around, and the air is cooler. So scratch my planned ride, I’ll just take a short ride around the neighbourhood to move a bit (both myself and the bike). I re-check the forecast and it says some showers all over for the next 2 hours, but still not bad (3-5mm/h, but short spikes only).

Everything starts nicely, and to my surprise I haven’t forgotten how to clip in when starting at a stop light, getting it on the first try. The road bike system is asymmetric (you can only clip in on one side of the pedal), and I’m very much used to the SPD one from MTB, where you don’t have to look, just clip in. But there’s a trick even with the road system: you have to put your foot on the pedal quite precisely at “12:15” o’clock, and then you’re good, as the inertia of the pedal will have put the correct side up at this position. I feared I had forgotten this, but all good, and I take a leisurely ride.

Towards Regensdorf one could clearly see rain already, and all through my ride the thunder was loud, but I stayed dry. Very dry, no rain at all. I thought the forecast was too pessimistic; I rode 30 minutes with nothing more than a few scattered raindrops.

And then, right at minute 30, and with less than 2km to go, the road starts getting wet. I was surprised as I couldn’t yet feel the rain, but the road was quite wet. A minute later, the rain is quite heavy, and another 30 seconds later it turns a bit into hail, as it was making a “plonk” noise hitting my helmet and was painful on my skin. So in basically this last kilometre I go from dry to completely drenched and cold, and I was very happy this was the end of my short ride, as I was neither dressed nor prepared for rain.

Checking the forecast later, it showed that for about 5 or so minutes it rained at a rate of 15mm/h, which is pretty strong, before mellowing down to what had been predicted in the morning.

Lesson learned, but thankfully it was on a short ride, and was otherwise pleasant to get on a light and fast bike again :)

Jaskaran Singh: Hello World

Saturday 25th of May 2019 12:00:00 AM

I’ll be blogging about my activities in GSoC 2019 w/ Debian, and other stuff as well.

Jonathan Wiltshire: RC candidate of the day (4)

Friday 24th of May 2019 09:56:49 PM

Sometimes the list of release-critical bugs is overwhelming, and it’s hard to find something to tackle.

#928282 is a security issue in Filezilla with a patch upstream, a clean path to testing via unstable and no maintainer response so far this month.

Joey Hess: hacking water (teaser)

Friday 24th of May 2019 08:53:30 PM

Molly de Blanc: Enbies and women in FOSS Wikipedia edit-a-thon

Thursday 23rd of May 2019 08:24:31 PM

To be brief, I’ll be hosting a Wikipedia edit-a-thon on enbies and women in free and open source software, on June 2nd, from 16:00 – 19:00 EDT. I’d love remote participants, but if you’re in the Boston area you are more than welcome over to my place for pancakes and collaboration times.

Busy during that time? I recommend making some edits between now and then. Feel free to share them with me, so I can share your work with others!

For details and ideas, check out: this super cool etherpad!

Jonathan Wiltshire: RC candidate of the day (3)

Thursday 23rd of May 2019 05:00:21 PM

Sometimes the list of release-critical bugs is overwhelming, and it’s hard to find something to tackle.

Bug #929017 includes a patch which needs reviewing and, if it’s appropriate, uploading.

Michael Stapelberg: Optional dependencies don’t work

Thursday 23rd of May 2019 12:55:17 PM

In the i3 projects, we have always tried hard to avoid optional dependencies. There are a number of reasons behind it, and as I have recently encountered some of the downsides of optional dependencies firsthand, I summarized my thoughts in this article.

What is a (compile-time) optional dependency?

When building software from source, most programming languages and build systems support conditional compilation: different parts of the source code are compiled based on certain conditions.

An optional dependency is conditional compilation hooked up directly to a knob (e.g. command line flag, configuration file, …), with the effect that the software can now be built without an otherwise required dependency.
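As a rough sketch of what such a knob looks like (the article is about compile-time options in compiled projects; the Python fragment below is only an illustrative analogue, with a hypothetical feature flag, and reportlab standing in for whatever the optional dependency is):

import os

# Hypothetical knob: decided when the package is built/installed, not by the end user.
WITH_PDF_EXPORT = os.environ.get("WITH_PDF_EXPORT", "yes") == "yes"

if WITH_PDF_EXPORT:
    import reportlab  # the otherwise required dependency, only needed when the knob is on

def export_pdf(document, path):
    if not WITH_PDF_EXPORT:
        # surface the missing feature clearly instead of failing in a confusing way
        raise RuntimeError("this copy was built without PDF export support")
    ...

Two users running nominally the same program can end up with different behaviour here, depending solely on how the knob was set when their copy was packaged, which is exactly the inconsistency discussed below.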

Let’s walk through a few issues with optional dependencies.

Inconsistent experience in different environments

Software is usually not built by end users, but by packagers, at least when we are talking about Open Source.

Hence, end users don’t see the knob for the optional dependency, they are just presented with the fait accompli: their version of the software behaves differently than other versions of the same software.

Depending on the kind of software, this situation can be made obvious to the user: for example, if the optional dependency is needed to print documents, the program can produce an appropriate error message when the user tries to print a document.

Sometimes, this isn’t possible: when i3 introduced an optional dependency on cairo and pangocairo, the behavior itself (rendering window titles) worked in all configurations, but non-ASCII characters might break depending on whether i3 was compiled with cairo.

For users, it is frustrating to only discover in conversation that a program has a feature that the user is interested in, but it’s not available on their computer. For support, this situation can be hard to detect, and even harder to resolve to the user’s satisfaction.

Packaging is more complicated

Unfortunately, many build systems don’t stop the build when optional dependencies are not present. Instead, you sometimes end up with a broken build, or, even worse: with a successful build that does not work correctly at runtime.

This means that packagers need to closely examine the build output to know which dependencies to make available. In the best case, there is a summary of available and enabled options, clearly outlining what this build will contain. In the worst case, you need to infer the features from the checks that are done, or work your way through the --help output.

The better alternative is to configure your build system such that it stops when any dependency was not found, and thereby have packagers acknowledge each optional dependency by explicitly disabling the option.

Untested code paths bit rot

Code paths which are not used will inevitably bit rot. If you have optional dependencies, you need to test both the code path without the dependency and the code path with the dependency. It doesn’t matter whether the tests are automated or manual, the test matrix must cover both paths.

Interestingly enough, this principle seems to apply to all kinds of software projects (but it slows down as change slows down): one might think that important Open Source building blocks should have enough users to cover all sorts of configurations.

However, consider this example: building cairo without libxrender results in all GTK application windows, menus, etc. being displayed as empty grey surfaces. Cairo does not fail to build without libxrender, but the code path clearly is broken without libxrender.

Can we do without them?

I’m not saying optional dependencies should never be used. In fact, for bootstrapping, disabling dependencies can save a lot of work and can sometimes allow breaking circular dependencies. For example, in an early bootstrapping stage, binutils can be compiled with --disable-nls to disable internationalization.

However, optional dependencies are broken so often that I conclude they are overused. Read on and see for yourself whether you would rather commit to best practices or not introduce an optional dependency.

Best practices

If you do decide to make dependencies optional, please:

  1. Set up automated testing for all code path combinations.
  2. Fail the build until packagers explicitly pass a --disable flag.
  3. Tell users their version is missing a dependency at runtime, e.g. in --version.

François Marier: Installing Ubuntu 18.04 using both full-disk encryption and RAID1

Thursday 23rd of May 2019 04:30:00 AM

I recently set up a desktop computer with two SSDs using a software RAID1 and full-disk encryption (i.e. LUKS). Since this is not a supported configuration in Ubuntu desktop, I had to use the server installation medium.

This is my version of these excellent instructions.

Server installer

Start by downloading the alternate server installer and verifying its signature:

  1. Download the required files:

    wget http://cdimage.ubuntu.com/ubuntu/releases/bionic/release/ubuntu-18.04.2-server-amd64.iso
    wget http://cdimage.ubuntu.com/ubuntu/releases/bionic/release/SHA256SUMS
    wget http://cdimage.ubuntu.com/ubuntu/releases/bionic/release/SHA256SUMS.gpg
  2. Verify the signature on the hash file:

    $ gpg --keyid-format long --keyserver hkps://keyserver.ubuntu.com --recv-keys 0xD94AA3F0EFE21092
    $ gpg --verify SHA256SUMS.gpg SHA256SUMS
    gpg: Signature made Fri Feb 15 08:32:38 2019 PST
    gpg: using RSA key D94AA3F0EFE21092
    gpg: Good signature from "Ubuntu CD Image Automatic Signing Key (2012) <cdimage@ubuntu.com>" [undefined]
    gpg: WARNING: This key is not certified with a trusted signature!
    gpg: There is no indication that the signature belongs to the owner.
    Primary key fingerprint: 8439 38DF 228D 22F7 B374 2BC0 D94A A3F0 EFE2 1092
  3. Verify the hash of the ISO file:

    $ sha256sum --ignore-missing -c SHA256SUMS
    ubuntu-18.04.2-server-amd64.iso: OK

Then copy it to a USB drive:

dd if=ubuntu-18.04.2-server-amd64.iso of=/dev/sdX

and boot with it.

Manual partitioning

Inside the installer, use manual partitioning to:

  1. Configure the physical partitions.
  2. Configure the RAID array second.
  3. Configure the encrypted partitions last.

Here's the exact configuration I used:

  • /dev/sda1 is 512 MB and used as the EFI partition
  • /dev/sdb1 is 512 MB but not used for anything
  • /dev/sda2 and /dev/sdb2 are both 4 GB (RAID)
  • /dev/sda3 and /dev/sdb3 are both 512 MB (RAID)
  • /dev/sda4 and /dev/sdb4 use up the rest of the disk (RAID)

I only set /dev/sda1 as the EFI partition because I found that adding a second EFI partition would break the installer.

I created the following RAID1 arrays:

  • /dev/sda2 and /dev/sdb2 for /dev/md2
  • /dev/sda3 and /dev/sdb3 for /dev/md0
  • /dev/sda4 and /dev/sdb4 for /dev/md1

I used /dev/md0 as my unencrypted /boot partition.

Then I created the following LUKS partitions:

  • md1_crypt as the / partition using /dev/md1
  • md2_crypt as the swap partition (4 GB) with a random encryption key using /dev/md2

Post-installation configuration

Once your new system is up, sync the EFI partitions using DD:

dd if=/dev/sda1 of=/dev/sdb1

and create a second EFI boot entry:

efibootmgr -c -d /dev/sdb -p 1 -L "ubuntu2" -l \EFI\ubuntu\shimx64.efi

Ensure that the RAID drives are fully sync'ed by keeping an eye on /proc/mdstat and then reboot, selecting "ubuntu2" in the UEFI/BIOS menu.

Once you have rebooted, remove the following package to speed up future boots:

apt purge btrfs-progs

To switch to the desktop variant of Ubuntu, install these meta-packages:

apt install ubuntu-desktop gnome

then use debfoster to remove unnecessary packages (in particular the ones that only come with the default Ubuntu server installation).

Fixing booting with degraded RAID arrays

Since I have run into RAID startup problems in the past, I expected having to fix up a few things to make degraded RAID arrays boot correctly.

I did not use LVM since I didn't really feel the need to add yet another layer of abstraction on top of my setup, but I found that the lvm2 package must still be installed:

apt install lvm2

with use_lvmetad = 0 in /etc/lvm/lvm.conf.

Then in order to automatically bring up the RAID arrays with 1 out of 2 drives, I added the following script in /etc/initramfs-tools/scripts/local-top/cryptraid:

#!/bin/sh

PREREQ="mdadm"

prereqs()
{
    echo "$PREREQ"
}

case $1 in
prereqs)
    prereqs
    exit 0
    ;;
esac

mdadm --run /dev/md0
mdadm --run /dev/md1
mdadm --run /dev/md2

before making that script executable:

chmod +x /etc/initramfs-tools/scripts/local-top/cryptraid

and refreshing the initramfs:

update-initramfs -u -k all

Disable suspend-to-disk

Since I use a random encryption key for the swap partition (to avoid having a second password prompt at boot time), it means that suspend-to-disk is not going to work and so I disabled it by putting the following in /etc/initramfs-tools/conf.d/resume:

RESUME=none

and by adding noresume to the GRUB_CMDLINE_LINUX variable in /etc/default/grub before applying these changes:

update-grub
update-initramfs -u -k all

Test your configuration

With all of this in place, you should be able to do a final test of your setup:

  1. Shutdown the computer and unplug the second drive.
  2. Boot with only the first drive.
  3. Shutdown the computer and plug the second drive back in.
  4. Boot with both drives and re-add the second drive to the RAID array:

    mdadm /dev/md0 -a /dev/sdb3
    mdadm /dev/md1 -a /dev/sdb4
    mdadm /dev/md2 -a /dev/sdb2
  5. Wait until the RAID is done re-syncing and shutdown the computer.

  6. Repeat steps 2-5 with the first drive unplugged instead of the second.
  7. Reboot with both drives plugged in.

At this point, you have a working setup that will gracefully degrade to a one-drive RAID array should one of your drives fail.

Arthur Diniz: Welcome

Thursday 23rd of May 2019 12:00:00 AM

Welcome! In this blog I intend to present my contributions to the Debian Project and post updates about the Cloud Image Finder project from Google Summer of Code 2019.

More in Tux Machines

Running Deep Learning Models On Intel Hardware? It's Time To Consider A Different OS

Firstly, Intel has done extensive work to make the Xeon family of processors highly optimized for AI. The Intel Xeon Scalable processors outsmart GPUs in accelerating the training on large datasets. Intel is telling its customers that they don’t need expensive GPUs until they meet a threshold. Most of the deep learning training can be effectively done on CPUs that cost a fraction of their GPU counterparts.

Beyond the marketing messages and claims, Intel went on to prove that their deep learning stack performs better than an NVIDIA GPU-based stack. Recently, Intel published a benchmark to show its leadership in deep learning. Intel Xeon Scalable processors trained a deep learning network with 7878 images per second on ResNet-50, outperforming 7844 images per second on NVIDIA Tesla V100.

Intel’s performance optimization doesn’t come just from its CPUs. It is delivered by a purpose-built software stack that is highly optimized at various levels. From the operating system to the TensorFlow framework, Intel has tweaked multiple layers of software to deliver unmatched performance. To ease the process of running this end-to-end stack, Intel has turned to one of its open source projects called Clear Linux OS.

The Clear Linux project was started as a purpose-built, container-optimized, and lightweight operating system. It was started with the premise that the OS running a container doesn’t need to perform all the functions of a traditional OS. Container Linux, the OS developed by CoreOS (now a part of Red Hat), followed the same philosophy. Within a short span, Clear Linux gained popularity among open source developers. Intel kept improving the OS, making it relevant to run modern workloads such as machine learning training jobs, AI inferencing, analytics and edge computing.

Read more

Also: Intel Core i9 9900KS Allowing 5.0GHz All-Core, Icelake News Coming This Week

Games: Pathfinder: Kingmaker, MidBoss, CorsixTH, Railway Empire and Unbound: Worlds Apart

  • The RPG 'Pathfinder: Kingmaker' is getting a free Enhanced Edition update next month + new DLC
    Pathfinder: Kingmaker, the party-based RPG from Owlcat Games and Deep Silver is going to expand with a free Enhanced Edition and another DLC. They say it's going to include plenty of "gameplay-enriching content additions" along with the usual quality of life improvements to existing features, new abilities and ways to build your character, a new Slayer class, new items and weapons, improved balance especially in the beginning and last two chapters, an improved kingdom management system, an increased variety to the random encounters on the map and so on.
  • MidBoss, the unique body-snatching roguelike turns 2 with a big sale and future plans details
    MidBoss is a game we've covered here numerous times, mainly due to how unique it is. You take down enemies, take their body and it's pretty amusing. The developer, Kitsune Games, has supported Linux rather nicely and now that MidBoss is over two years old they've decided to put it on a big sale. Not just that, they've also announced a fancy sounding DLC that's coming along with a free update for everyone. The DLC will have brand new pixel-art for all of the monsters, which will include idle animations for them too so the DLC should make the game look a lot more interesting. Also being added in the DLC is a "randomizer mode", to make repeated runs in the game vastly different.
  • FOSS game engine 'CorsixTH' for Theme Hospital update 0.63 is out
    The first major release for the FOSS game engine in some time, CorsixTH 0.63 is out following the recent release candidate build. CorsixTH might not be "finished" but it's incredibly playable and does provide a better experience (mostly) over running the original Theme Hospital.
  • Railway Empire has another update and it's off to France in the latest DLC out now
    There appears to be no stopping this train, Railway Empire continues to see plenty of post-release support and extra optional content. Firstly, the latest "Community Update" is out taking feedback from (you guessed it) the community of players. They've introduced modding support to DLC scenarios, increased the total number of trains and stations you can have, new tooltips, you can skip the current music track using the new "P" hotkey, the train list will actually show problems employees have, new train list filtering options, train speed reduced if they're missing supplies and lots of other nice quality of life updates.
  • A Linux version of the mind-bending multi-dimensional 'Unbound: Worlds Apart' will come at release
    Unbound: Worlds Apart from Alien Pixel Studios is currently crowdfunding on Kickstarter, this hand-crafted puzzler looks like it could melt my mind with the portal system.

Linux 5.2-rc2

Hey, what's to say? Fairly normal rc2, no real highlights - I think most of the diff is the SPDX updates.

Who am I kidding? The highlight of the week was clearly Finland winning the ice hockey world championships.

So once you sober up from the celebration, go test,

Linus

Read more

Also: Linux 5.2-rc2 Kernel Released As The "Golden Lions"

Audiocasts/Shows: Linux Action News, Linux Gaming News Punch, Open Source Security Podcast and GNU World Order