Planet KDE

Krita 2019 Sprint

Monday 12th of August 2019 09:32:26 AM

Officially, on Friday the 2019 Krita Sprint was over. However, most people stayed until Saturday… It’s been a huge sprint! Almost a complete convention, a meeting of developers and artists.

All the sprinters together

The sprinters, artists, contributors and artists/contributors, all together. Photo taken by Krzyś.


On Monday, people started arriving. It was pretty great to meet again so many people who hadn't seen each other for a long time, and to see so many people who had never been to a Krita sprint before. We had rigged up an HDR test system in the sprint area, the 12th-century cellar underneath the Krita maintainer's house in the town centre of Deventer, probably for the last time, since the space is getting too small. Wolthera was kept busy all week giving introductions to painting in HDR; she is, after all, the first person in the world to have actually done creative painting in HDR.

There was other hardware to test as well, like the Android tablet with Sharaf Zaman's Android port of Krita. Sharaf couldn't come to the sprint; his visa was denied, probably because the Dutch authorities had been informed beforehand of the Indian government's intention to cancel Kashmir's special status. With a blanket shutdown of internet, mobile telephony and landline telephony, it was impossible to be in touch with Sharaf. We did some thorough testing, and we hope contact with him will be restored soon.


Since not everyone had arrived yet, we postponed the meeting until Thursday, so the day was spent hacking and discussing. We had many more artists joining us than at previous sprints, so the discussions were lively and the meetings were good, though more bugs were reported than fixed.


On Wednesday, we went on an expedition to the Openlucht Museum. With twenty-three people attending, it was more efficient to simply rent a bus for the lot of us. The idea behind the outing was to make sure that people who had never been to a sprint and people who had been to sprints before would mingle and get to know each other. That worked fine!

We had a somewhat disappointing guided tour. I had asked for a solid introduction to the social history of the Netherlands, but the guide still felt he needed to make endless inane and borderline sexist jokes that all fell very flat. Oh well, the buildings were great and the people inside the buildings gave quite interesting information, with the rosmolen (horse-powered mill) being a high point:

From David Revoy’s sketchbook

And as you can see, it gave the artists and developer/artists amongst us a chance to do some analog painting (although at least one sprint attendee tried to paint with Krita on a Surface tablet, unfortunately cut short by battery problems):

We followed up with dinner at an Indonesian restaurant, and went home tired but satisfied. There was still hacking and painting going on, though, until midnight.


Today we really had the core of our sprint. Some sprints are for coding, but this sprint was for bringing together people and gathering ideas. On Thursday, we discussed the future of Krita in quite a bit of detail.

In 2018/2019 the focus was fully on fixing bugs. Compared to this time last year, there are now two more full-time developers working on fixing bugs and improving stability, and both Boudewijn and Dmitry have dedicated all their coding time to fixing bugs as well. Weirdly enough, that doesn't seem to make much of a dent in our number of open bugs:

Bugs, open, new and closed, since April.

One reason is that we manage to introduce too many regressions. That's partly explained by our new hackers needing to learn the codebase, partly by an enormous increase in our user base (we're on track to break 2,500,000 downloads a year in 2019), but mostly by our changes not getting enough testing before release. So, taking things out of the order we discussed them at the meeting, let's report on our Bugs and Stability discussion first.

Bugs and Stability

As David Revoy reports, the 4.2 releases don't feel as stable as the 4.1 releases. As noted above, this is not unexpected, since we have two new full-time developers working on Krita who aren't that deep in the codebase yet. Another reason we have so much trouble with the 4.2 releases is that we updated to Qt 5.12, which seems to come with many regressions that we either have to fix in Qt (we do submit patches upstream, and they are getting accepted) or work around in Krita itself. On top of that, we merge bug fixes into our release branch until the last minute before a release, so those fixes get barely any testing. The lack of testing isn't something we can blame on our users; it's to a large extent our own fault.

Yet Raghukamath and Deevad noted that neither of them actually tests master or the stable branch from day to day anymore, because they are too busy actually using Krita, and the same goes for the other artists present. It's clear that the developers cannot do the regression testing themselves, and that our extensive set of unit tests (although most are technically more like integration tests) doesn't catch the regressions. We have to find a better way.

Coincidentally (or not…), Anna (who was not present) had started a discussion about this on Phabricator some time ago: T11021: Enhancements to quality assurance. There are many parts to that discussion, but one thing we concluded, based on the discussions during the sprint, is that we will try the following:

  • We will release once a month, at the end of the month (we already try to do that…)
  • We will merge bug fixes to the stable branch until the middle of the month: the merge window thus is two weeks, while master is always open to bug fixes.
  • We will publish an article telling our users what has landed in the next stable version. That article will show up in the news widget of every user's welcome screen, right inside Krita. It will contain links to download a portable version of Krita for every OS.
  • We will add a link to a survey (not bugzilla) in the welcome screen of those builds, and in the survey ask people for the results of testing the changes noted in the release article.
  • And then, two weeks later, we will release the next stable version of Krita, with only the fixes for regressions reported during the test period merged.

Slightly related: we also want to do a monthly update article on changes for master, but without the survey mechanism. However, that would make two development updates a month, which might be a bit much to digest, so we’re starting slowly, with the stable release system.

Development focus for 2019/2020

In October we will release a hopefully super-stable Krita 4.3, with a bunch of new features as well, but still focused on stability. Boudewijn is still working on fixing the resource handling system, but that is going really slowly and is really hard. It's also hard for the maintainer of the whole project to find time for big coding projects, and it gets harder the more management-like tasks there are.

Everyone in the meeting agreed that the text tool still needs much more work, maybe even another rewrite to get rid of the dependency on Qt's internal rich text editor: the conversion between QTextDocument and SVG is lossy and causes problems. We were all aware of the missing bits and the problems and bugs, so we didn't need to discuss this in detail. So one focus is:

  • Text Tool: it is still not usable for the primary goal, namely comic book text balloons. We need to make it suitable. We know what to do, we just don’t have the time while fighting incoming bugs all the time.

So it’s clear that we still need to work on…

  • Stability and performance. During the discussion some particular issues were noted:
    • Raghukamath reported that the move tool has become very slow. This definitely is a regression.
    • One year ago, at the 2018 Krita Sprint, we made a list of Unfinished Stuff: features missing after the vector rewrite, unimplemented layer style options, the half-implemented animated transform mask, missing undo/redo support in the scripting layer. Most of that is still relevant. See the original sprint report.
    • Dmitry noted he had done some experiments showing that we could make our brushes much faster by using newer versions of AVX, but this would only help people with newer laptops. Boudewijn wondered whether the brush engines are the true bottleneck: if the improvement only shows up in benchmarks, users won't notice much difference. We might want to do another round of measuring with Intel's VTune, if we can get another license for a year.
    • Resource handling is still being rewritten. When Steven noted that he cannot update a workspace, Boudewijn replied that this is part of the bigger problem with the current resource system. Boud is rewriting that, has been working on it for two years now, and the result is a huge merge request, big enough to almost bring GitLab to its knees. The rewrite might be too big to actually finish, and it's hard to distribute the work over multiple developers.

Since we had so many artists around whose views we had never before been able to canvass, we decided that digging into workflow issues might be the best thing to do; it would make a good theme for the next fundraiser, too. So:

    • Workflow issues
      • Stefan brought in multiple issues with animation, like editing multiple frames at the same time or finally getting the clones feature done. Most of these are already in bugzilla as wishes, but unfortunately, we don't have someone to work on animation full-time at the moment. Some progress was made and demonstrated during the sprint, though!
      • In the nineties and oughties, all desktop applications looked the same and followed more or less the same guidelines. That made sure that users knew they could investigate the contents of the application menus; it would be the first place to look for something relevant for their task at hand. That barely seems to happen anymore. So, discoverability of features in Krita is a problem that is getting worse. We made a list of things that were hard to find:
      • Our dockers are overcrowded and the contents hard to find; a docker hidden behind another docker isn’t going to be discovered by many users.
      • If there are tabs in a single docker, like with the transform tool, and some of those tabs aren’t visible because there are too many of them, like the liquify transform functionality, that functionality might as well not exist, it won’t be found.
      • Deevad suggested making each transform tool option into a separate tool; however, as Raghukamath noted, the objection to that is that we don’t want six new tools crowding the toolbox, nor having the weird pop-out tool selection buttons Photoshop has. This is a perennial discussion, and it’s next to impossible to come to a conclusion here.
    • Related to that is the question of the tool options docker. Right now, users can choose between putting the tool options in a docker or in a popup button on the toolbar. Another option would be to make a toolbar out of the tool options, with some pop-ups, like Corel Painter and Photoshop did. But none of these options is a real solution. We might want to do a survey, but from experience, users want to have at least the overview, tool options, color selector, layer docker and brush preset docker visible, as well as some others, and on most displays there just isn't the space for that…
    • Actually figuring out which dockers we have is pretty hard, too. Most people don’t seem to find the dockers submenu in the settings menu, or in the right-click menu on the docker titlebars, and if they find it, the list is too intimidating.
    • So, one thing we decided to do is create a tool to search through Krita's functionality. Other applications are apparently facing the same problem, and this is an easy and cheap solution. Of course, people started asking for this tool to also search e.g. layer names. That led to the thought that this is starting to look a bit like QtCreator's locator widget…
    • A workflow improvement Steven suggested was to autosave the current session (that is, open windows, views and images) and restore it when Krita is restarted.
    • Mariya said that the combination of clone layers and transform masks doesn’t work as well for her as Photoshop’s smart objects. After some discussion, it seems that we might want to rethink showing masks in the hierarchy if there’s only one mask of a certain type: it’s easier to show them as a toggle in the layer’s row in the layerbox.
    • Sara asked whether Krita had a screen recorder. This should only record the canvas, stroke by stroke, and export to PNG or JPG. The old screen recorder could do this, but it was very broken, was removed, and was never intentionally released. This would be a biggish, GSoC-sized project that needs to build on first finishing the port to the generic strokes system and then extending that system with recording. Another option would be to add a timer to incremental saving, and add an option to export incrementally in addition to saving incrementally.
    • Emmett and Eoin discussed working with painting assistants: it would be interesting to make a hierarchy with grouped assistants. The conclusion was that we might want to show the assistants in the layer docker on top of the layers; other suggestions were a separate docker or putting the treeview in the tool option widget. If the first solution is chosen, it would be useful to also show reference images and maybe even guides in the layer^Wimage structure docker.
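To give an idea of how cheap the functionality-search tool mentioned above can be, here is a sketch of a locator-style subsequence match over action names. This is purely an illustration of the idea, not Krita code; the action names are hypothetical.

```python
def matches(query, candidate):
    """Case-insensitive subsequence match, in the style of QtCreator's
    locator: every character of the query must occur in the candidate,
    in order, but not necessarily adjacently."""
    query, candidate = query.lower(), candidate.lower()
    pos = 0
    for ch in query:
        pos = candidate.find(ch, pos)
        if pos < 0:
            return False
        pos += 1
    return True

# Hypothetical action names, just for the demonstration.
actions = ["Transform Tool", "Liquify", "Toggle Assistants", "Flatten Layer"]
print([a for a in actions if matches("lqf", a)])  # → ['Liquify']
```

A real implementation would of course rank the matches and search more than just action names, but the core filter really is this small.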

Some of the workflow issues mentioned already sound like new features, and then there were a number of discussions about what really would be new features:

    • It would help a lot with user support if a statusbar indicator showed whether the stabilizer, canvas acceleration, instant preview (and others) are on or off. A screenshot would then immediately answer the questions we most often have to ask users who need support.
    • After so many months of bug fixes, Dmitry really wants to work on one or two new brush engines: the first has thin bristles that could each have their own color. Sara mentions she really misses a brush like this. The other engine would be more like a calligraphy tool. Dmitry estimates needing two months per engine, which is quite a bit of time.
    • Eoin wants to have a tool, or a brush engine, or at least, something that would make it very easy to paste images or designs and transform them before pasting the next one. The images should come from an ordered or random collection. Currently, people use pipe brushes for that, but that is not convenient. A new tool is suggested.
    • Steven notes that it would be much more convenient to create a pipe brush from an animation on the timeline than the current system based on layers. This is true – we just never thought of it.
    • Mariya really wants a font selector where she can mark a number of fonts as her favourites, and she created a wish bug for it. It turns out that Calligra doesn't have a widget like that, and the standard Qt font combobox doesn't support it either, nor does Scribus: it might be that such a thing has never been written in Qt.

Another thing that was discussed briefly was telemetry (we tried that; the project failed).

Marketing and Outreach

Our presence on Twitter, Mastodon, Tumblr, DeviantArt and Reddit is fine. On Facebook, unofficial groups are used more than Krita's own account (which is because Boud is the last maintainer standing, and he cannot stand Facebook). YouTube needs improvement, and we are absent from Instagram.


Sara Tepes volunteered to handle Krita's Instagram account (which we don't have yet). Sara also wants to run "competitions" on Instagram, with the prize being that a number of selected images are shown on either Krita's splash screen or Krita's welcome screen.

The splash screen is our main branding location, so we shouldn't put any images in there other than the holiday jokes.

We could redesign the welcome screen to include a spot for an image; it needs a redesign in any case, because it's too drab right now. We wanted it to be not in-your-face, but it has ended up a bit too subdued.

Once we have the image spot, getting and selecting images can be a problem, as it was for the art book and the main release announcements. Sara notes that Instagram provides easy tools for selecting images from a larger set; other platforms are not so good.

In any case, selecting images will be quite a bit of work, and we do need to make sure we’re not playing favorites or forgetting where we come from: free software, open culture.

Conclusion: we are going to try to run the competition on all social networks for which we have a maintainer (Instagram, Twitter, Mastodon, Reddit). We can always extend this later to other places. Each maintainer can propose two images + attribution info + links, which will be shown in rotation for a month.

The system for doing this should be ready for the 4.3 release in October.

Note: we have to make a page with a very clear text explaining the rules: we don't take ownership of the images, the images will be shown in Krita, there will be no licensing requirements for the images, and certain kinds of images cannot be used.

Note 2: Scott should ask Ben Cooksley how we can get the welcome screen news widget traffic information on a regular basis.


We have already started improving our presence on YouTube. We feature existing Krita-related channels, and we are working with Ramon to provide interesting videos. We could do more, but let’s give Ramon a chance to build up some momentum first.

Development Fund and Fundraiser

Financially, Krita is doing okay. We get between 2000 and 2700 euros a month in donations: that translates to one full-time developer (and no, we're not getting rich from working on Krita; these are not commercial rates). The Windows Store and Steam bring in enough for three to four extra full-time developers. It would be good to become less dependent on the Windows Store, though, since Microsoft is getting more and more aggressive in pushing people to get applications from the Windows Store.

Enter the Krita Development Fund. Like in most things, we try to look at what Blender is doing, and then try to find out whether that works for us. Often it does. We already have a notional Development Fund, but it’s basically a monthly paypal subscription or recurring bank transfer. We don’t have any feedback or extras for the subscribers, and the subscribers have no way to manage their subscription, or reach us other than in the usual way. We tried to implement a CiviCRM system for this, but that was way too complex for us to manage.

We need to reboot our Development Fund and migrate existing subscribers to the new fund. A basic list of requirements is:

      • A place on the website where people can subscribe and unsubscribe
      • A place where the names of people who want that are shown
      • A way to tell people what we are doing with the money and what we will be doing
      • Make sure companies and sponsors will also be able to join

And no doubt there will be other considerations and requirements. We should check Blender’s dev fund website, of course. We created a Phabricator task to track this, and it’s something we really want help with!

What wasn’t discussed

Interestingly, the new GitLab workflow seems to work for everyone. GitLab's UI is even less predictable and discoverable than Krita's, but we didn't need to discuss anything; people can work with it without much trouble.

Steam wasn't much of a discussion item either: Windows is doing fine on Steam, our macOS version of Krita still has too many niggles to make it worthwhile to put on Steam (or the Apple Store, even if that were possible license-wise), and the Linux market share is still too small to make it worth the time investment. Still, Emmet promised to contact Valve to see how we can get the AppImage onto Steam. At first glance, the problem seems to be the version of libc required, which might mean we'll have to figure out a way to build Qt 5.12 on older versions of Ubuntu or CentOS. But let's wait and see first.

Tangentially, we did discuss how to get more people involved in user support. Agata already has plans for giving more recognition to the people who are already trying to help others in places like Reddit and the forum. It was late, and the discussion soon degenerated into hilarity. Still, this is something to work on, since the core development team just doesn't have the capacity anymore to help every new Krita user with their teething problems.


To create: a task for rethinking what goes into dockers, and what goes somewhere else.


Friday was the real hacking day. Some people already started leaving, but many people were staying around and started hacking on the issues identified during the meeting, like the action search widget. Bugs were being fixed, regressions identified and blogs posted. And even later on, on Saturday and Sunday, there was still hacking, like on the detached canvas feature.

Step-by-Step Execution and Examples

Monday 12th of August 2019 12:00:00 AM

Last week I finished writing all the new examples for Rocs, together with a little description of each, commented at the beginning of the code. The following examples were implemented:

  • Breadth First Search;
  • Depth First Search;
  • Topological Sorting Algorithm;
  • Kruskal Algorithm;
  • Prim Algorithm;
  • Dijkstra Algorithm;
  • Bellman-Ford Algorithm;
  • Floyd-Warshall Algorithm;
  • Hopcroft-Karp Bipartite Matching Algorithm.

It is worth noting that while Prim's algorithm and BFS were already in Rocs, they were broken and could not be run. The following image is an example of a simple description of an algorithm:
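The Rocs examples themselves are JavaScript scripts written against the Rocs graph API, which I won't reproduce here. As a language-neutral sketch of what, say, the breadth-first search example computes, here is the same algorithm over a plain adjacency map (Python, purely for illustration):

```python
from collections import deque

def bfs_order(adj, start):
    """Return the order in which breadth-first search visits the nodes
    of a graph given as an adjacency map {node: [neighbours]}."""
    visited = [start]
    seen = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in adj.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                visited.append(neighbour)
                queue.append(neighbour)
    return visited

graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(bfs_order(graph, "a"))  # → ['a', 'b', 'c', 'd']
```

Each Rocs example follows this pattern: a short comment describing the algorithm, then the traversal itself over the graph the user has drawn.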

About the step-by-step execution, I am considering our options. My first idea was to take a look at the debugger for the QScriptEngine class, the QScriptEngineDebugger class. An instance of this class can be attached to our script engine, and it provides the programmer with an interface with all the necessary tools.

Although useful, I personally think Rocs doesn't need all of these tools (but they can be provided separately). There are three ways to stop the code execution using this debugger:

  • By an unhandled exception inside the JavaScript code;
  • By a call to the debugger statement, which automatically invokes the debugger interface;
  • By a breakpoint, which can be put on any line of the JavaScript code.

The first one is not really useful for us, as it halts the code execution. The second and the third can be really useful, coupled with a Continue command. But the second invokes the full debugger interface, which we don't really want.

So, by using the third one, we can stop the execution on any line of the JavaScript code and create a step button that uses the Continue command to resume execution. The only problem is how to add the breakpoints, as there is no direct function to add them; usually the programmer has to use the ConsoleWidget or the BreakpointsWidget interface to do this. The following image shows the Continue button, which is already working:

But the challenge of adding the breakpoints still remains. One of my ideas is to modify the code editor to accept a click on the line number bar, which triggers a signal to add or remove a breakpoint on that line. This seems a clean alternative to me. But for that I have to check whether KTextEditor has this type of signal, and create a way to add breakpoints in the code programmatically.

Instant Workstation

Sunday 11th of August 2019 10:00:00 PM

Some considerable time ago I wrote up instructions on how to set up a FreeBSD machine with the latest KDE Plasma Desktop. Those instructions, while fairly short (set up X, install the KDE meta-port, .. and that’s it) are a bit fiddly.

So – prompted slightly by a Twitter exchange recently – I’ve started a mini-sub-project to script the installation of a desktop environment and the bits needed to support it. To give it at least a modicum of UI, dialog(1) is used to ask for an environment to install and a display manager.

The tricky bits, pointed out to me after I started, are hardware support; still, a best effort is better than nothing, I think.

In any case, in a VirtualBox VM it's now down to running a single script and picking Plasma and SDDM to get a usable system for me. Other combinations have not been tested, nor has setup on real hardware. I'll probably maintain it for a while, and if I have time and energy it'll be tried with nVidia (those work quite well on FreeBSD) and AMD (not so much, in my experience) graphics cards when I shuffle some machines around.

Here is the script in my GitHub repository with notes-for-myself.

Installing FreeBSD is not like installing a Linux distribution. A Linux distro hands you something that will provide an operating system and more, generally with a selection of pre-installed packages and configurations. Take a look at ArcoLinux, which offers 14(?) different distribution ISOs depending on your preference for installed software.

FreeBSD doesn't give you that: you end up with a text-mode prompt (there are FreeBSD-based distributions that do some extra bits, but those are outside my normal field of view). So it's not really expected that you have a complete desktop experience post-installation (nor, for that matter, a complete GitLab server, or postfix mail relay, or any of the other specialised purposes for which you can use FreeBSD).

I could vaguely imagine bunging this into bsdinstall as a post-installation option, but that’s certainly not my call. Not to mention, I think there’s an effort ongoing to update the FreeBSD installer anyway, led by Devin Teske.

So to sum up: to install a FreeBSD machine with KDE Plasma, download the script and run it; other desktop environments might work as well.

One week to go!

Sunday 11th of August 2019 09:02:07 AM

There is one week left of the call for papers for the foss-north IoT and Security Day. The conference takes place on October 21 at WTC in Stockholm.

We've already confirmed three awesome speakers and will fill the day with more content in the weeks following the closing of the CfP, so make sure to get your submission in.

Patricia Aas

The first confirmed speaker is Patricia Aas, who will speak about election security: how to ensure transparency and reliability in the election system so that it can be trusted by all, including a less technologically versed public.

Also, this is the first stage in our test of the new foss-north conference administration infrastructure, and it seems to have worked this far :-). Big thanks goes to Magnus for helping out.

KDE Usability & Productivity: Week 83

Sunday 11th of August 2019 04:40:32 AM

This week in KDE’s Usability & Productivity initiative is massive, and I want to start by announcing a big feature: GTK3 apps with client-side decorations and headerbars using the Breeze GTK theme now respect the active KDE color scheme!

Pretty cool, huh!? This feature was written by Carson Black, our new Breeze GTK theme maintainer, and will be available in Plasma 5.17. Thanks Carson!

As you can see, the Gedit window still doesn't display shadows, at least not on X11. Shadows are displayed on Wayland, but on X11 it's a tricky problem to solve. However, I will say that anything's possible!

Beyond that, it’s also been a humongously enormous week for a plethora of other things too:

The changes fall into the usual categories: New Features, Bugfixes & Performance Improvements, and User Interface Improvements.

If you’re getting the sense that KDE’s momentum is accelerating, you’re right. More and more new people are appearing all the time, and I am constantly blown away by their passion and technical abilities. We are truly blessed by… you! This couldn’t happen without the KDE community–both our contributors for making this warp factor 9 level of progress possible, and our users for providing feedback, encouragement, and being the very reason for the project to exist. And of course, the overlap between the two allows for good channels of communication to make sure we’re on the right track.

Many of those users will go on to become contributors, just like I did once. In fact, next week, your name could be in this list! Not sure how? Just ask! I’ve helped mentor a number of new contributors recently and I’d love to help you, too! You can also check out, and find out how you can help be a part of something that really matters. You don’t have to already be a programmer. I wasn’t when I got started. Try it, you’ll like it! We don’t bite!

If you find KDE software useful, consider making a tax-deductible donation to the KDE e.V. foundation.

Digging through the past

Saturday 10th of August 2019 10:00:00 PM

As part of migrating this blog from a defunct hosting company and a Wordpress installation, to a non-defunct hosting company and Jekyll, I’m re-visiting a lot of old posts. Assuming the RSS generator is ok, that won’t bother the feed aggregators (the KDE planet in particular). The archives are slowly being filled in, and one entry from 2004 struck me:

Ok, my new machine is installed (an amd64 running FreeBSD -CURRENT, which puts me firmly at the forefront of things-unstable).

Not much has changed in 15 years, except maybe the “unstable” part. Oh, and I tend to run -STABLE now, because that’s more convenient for packaging.

Something else I spotted: in 2004 I was working on KPilot as a hobby project (alongside my PhD and whatever else was paying the bills then), so there’s lots of links to the old site.

Problem is, I let the domain registration expire long ago when Palm, Inc., the Palm Pilot, and KDE 4 ceased to be a going concern. So, that domain has been hijacked, or squatted, or whatever, with techno bla-bla-bla and recognizable scraps of text from the ancient website. Presumably downloading anything from there that pretends to be KPilot will saddle you with plenty of malware.

In any case it's a reminder that links from (very) old blog posts are not particularly to be trusted. Since the archives are being updated (from old Wordpress backups and from the Internet Archive), I'll try to fix links or point them somewhere harmless if I spot something, but no guarantees.

Krita Sprint 2019

Saturday 10th of August 2019 06:48:20 PM

The sprint officially ended yesterday and most of the participants have already left, except me, Ivan, Wolthera and Jouni. I would have left as planned, too, but I misread my flight times: my flight leaves three hours later than I thought.

Kate - More languages supported via LSP!

Saturday 10th of August 2019 04:41:00 PM

The default configuration for the Kate LSP client now supports more than just C/C++ and Python out of the box.

In addition to the recently added Rust support we now support Go and LaTeX/BibTeX, too.


The default supported servers are configured via a JSON settings file that we embed in our plugin resources.

Currently this looks like:

```json
{
  "servers": {
    "bibtex": { "use": "latex" },
    "c": {
      "command": ["clangd", "-log=error", "--background-index"],
      "commandDebug": ["clangd", "-log=verbose", "--background-index"],
      "url": ""
    },
    "cpp": { "use": "c" },
    "latex": { "command": ["texlab"], "url": "" },
    "go": {
      "command": ["go-langserver"],
      "commandDebug": ["go-langserver", "-trace"],
      "url": ""
    },
    "python": {
      "command": ["python3", "-m", "pyls", "--check-parent-process"],
      "url": ""
    },
    "rust": {
      "command": ["rls"],
      "rootIndicationFileNames": ["Cargo.lock", "Cargo.toml"],
      "url": ""
    }
  }
}
```

The file is located at kate.git/addons/lspclient/settings.json. Merge requests to add additional languages are welcome.
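To illustrate the shape of such a merge request, a new language would add one more object to the "servers" map. Everything below is a placeholder to show the structure, not a tested configuration:

```json
{
  "servers": {
    "somelanguage": {
      "command": ["some-language-server", "--stdio"],
      "commandDebug": ["some-language-server", "--stdio", "--verbose"],
      "url": "https://example.org/some-language-server"
    }
  }
}
```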

I assume we still need to improve what we allow to specify in the configuration.

Currently supported configuration keys

At the moment, the following keys inside the per-language object are supported:


use

Tell the LSP client to use the LSP server configured for the given language for this one, too. Useful to dispatch stuff to a server supporting multiple languages, like clangd for C and C++.


command

Command line to start the LSP server.


commandDebug

Command line to start the LSP server in debug mode. This is used by Kate if the LSPCLIENT_DEBUG environment variable is set to 1. If this variable is set, the LSP client itself will output debug information on stdout/stderr, and the commandDebug command line should try to trigger the same for the LSP server, e.g. using -log=verbose for clangd.


rootIndicationFileNames

For the Rust rls LSP server we added the possibility to specify a list of file names that indicate which folder is the root for the language server. Our client will search upwards for the given file names based on the file path of the document you edit. For Rust that means we first try to locate some Cargo.lock; if that fails, we do the same for Cargo.toml.


url

URL of the home page of the LSP server implementation. At the moment it is not used internally; later it should be shown in the UI to give people hints where to find further documentation for the matching LSP server (and how to install it).

Current State

For C/C++ with clangd the experience is already good enough for day-to-day working. What is possible can be seen in one of my previous posts, video included. I and some colleagues use the master version of Kate at work for daily coding. Sometimes Kate confuses clangd when saving files, but otherwise no larger hiccups occur.

For Rust with rls many things work, too. We now discover the root directory for it more easily thanks to hints to look for the Cargo files. We adapted the client to support the Hover message type rls emits, too.

For the other languages: besides some initial experiments verifying that the servers start and you get some completion/…, not much work has gone into them. Help is welcome to improve their configuration and our client code for a better experience.

Just give Kate from the master branch a test drive; here is our build-it how-to. We are open to feedback, and to patches directly.

Btw., if you think our how-to or other stuff on this website is lacking, patches are welcome for that, too! The complete page is available via our GitHub instance, to try changes locally, see our

Order your Akademy t-shirt *NOW*

Friday 9th of August 2019 11:46:40 PM

If you want an Akademy 2019 t-shirt you have until Monday 12th Aug at 11:00 CEST (i.e. in 2 days and a bit) to order it.

Head over to and get yourself one of the exclusive t-shirts with Jen's awesome design :)

[GSoC – 5] Achieving consistency between SDDM and Plasma

Friday 9th of August 2019 08:53:51 PM

Previously: 1st GSoC post, 2nd GSoC post, 3rd GSoC post, 4th GSoC post. In this GSoC entry I’ll mention two things implemented since the last blog post: syncing of scaling and NumLock settings. Aside from that, I’ll reflect on syncing of locally-installed files. Even though I thought scaling would require changes on the SDDM side… Continue Reading →

ASCII Transliteration without ICU or iconv

Friday 9th of August 2019 08:44:59 PM

So far, most of my blog postings that appeared on Planet KDE were release announcements for KBibTeX. Still, I had always planned to write more about what happens on the development side of KBibTeX. Well, here comes my first try to shed light on KBibTeX's internal workings …

Active development of KBibTeX happens in its master branch. There are other branches created from time to time, mostly for bug fixing, i. e. allowing bug reporters to compile and test a bug fix before the change is merged into master or a release branch. Speaking of release branches, those get forked from master every one to three years. At the time of writing, the most recent release branch is kbibtex/0.9. Actual releases, including alpha or beta releases, are tagged on those release branches.

KBibTeX is developed on Linux; personally I use the master branch on Gentoo Linux and Arch Linux. KBibTeX compiles and runs on Windows with the help of Craft (master better than kbibtex/0.9). It is on my mental TODO list to configure a free Windows-based continuous integration service to build binary packages and installers for Windows; suggestions and support are welcome. Craft supports macOS, too, to some extent, so I gave KBibTeX a shot on this operating system (I happen to have access to an old Mac from time to time). Running Craft and installing packages caused some trouble, as macOS is the least tested platform for Craft. Also, it seems to be more difficult to find documentation on how to solve compilation or linking problems on macOS than it is for Windows (let alone Linux). However, with the help of the residents in #kde-craft and related IRC channels, I was eventually able to start compiling KBibTeX on macOS (big thanks!).

The main issue that came up when crafting KBibTeX on macOS was the problem of linking against ICU (International Components for Unicode). This library is shipped on macOS as it is used in many other projects, but seemingly even if you install Xcode, you don't get any headers or other development files. Installing a different ICU version via Craft doesn't seem to work either. However, I am no macOS expert, so I may have gotten the details wrong …

Discussing in Craft's IRC channel how to get KBibTeX installed on macOS despite its dependency on ICU, I was asked why KBibTeX needs to use ICU in the first place, given that Qt ships QTextCodec, which covers most text encoding needs. My particular need is to transliterate a given Unicode text like ‘äåツ’ into a 7-bit ASCII representation. This is used, among others, to rewrite identifiers for BibTeX entries from whatever the user wrote or an imported BibTeX file contained into an as-close-as-possible 7-bit ASCII representation (which is usually the lowest common denominator supported on all systems) in order to reduce issues if the file is fed into an ancient bibtex or shared with people using a different encoding or keyboard layout.

Such a transliteration is also useful in other scenarios, for example if filenames are supposed to be based on a person's name but still must be transcribed into ASCII to be accessible on any filesystem and for any user irrespective of keyboard layout. For example, if a filename needs to bear some resemblance to the Scandinavian name ‘Ångström’, the name's transliteration could be ‘Angstrom’, thus a file could be named Angstrom.txt.

So, if ICU is not available, what are the alternatives? Before I adopted ICU for the transliteration task, I had used iconv. Now, my first plan to avoid hard-depending on ICU was to test for both ICU and iconv during the configuration phase (i. e. when cmake runs) and use ICU if available and fall back to iconv if no ICU was available. Depending on the chosen alternative, paths and defines (to enable or disable specific code via #ifdefs) were set.
See commit 2726f14ee9afd525c4b4998c2497ca34d30d4d9f for the implementation.

However, using iconv has some disadvantages which motivated my original move to ICU:

  1. There are different iconv implementations out there and not all support transliteration.
  2. The result of a transliteration may depend on the current locale. For example, ‘ä’ may get transliterated to either ‘a’ or ‘ae’.
  3. Typical iconv implementations know less Unicode symbols than ICU. Results are acceptable for European or Latin-based scripts, but for everything else you far too often get ‘?’ back.
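For illustration, the iconv-based approach looks roughly like the sketch below (the function name is made up, and KBibTeX has since dropped iconv support). Whether "//TRANSLIT" is honoured at all, and what ‘ä’ becomes, depends on the iconv implementation and the current locale, which is exactly the problem described above.

```cpp
#include <iconv.h>
#include <string>

// Ask iconv for a UTF-8 to ASCII conversion with transliteration.
// Characters the implementation cannot map may come back as '?'.
std::string transliterateToAscii(const std::string &utf8Text)
{
    iconv_t cd = iconv_open("ASCII//TRANSLIT", "UTF-8");
    if (cd == reinterpret_cast<iconv_t>(-1))
        return std::string(); // this iconv does not offer the conversion
    std::string out(utf8Text.size() * 4 + 8, '\0');
    char *in = const_cast<char *>(utf8Text.data());
    size_t inLeft = utf8Text.size();
    char *dst = &out[0];
    size_t outLeft = out.size();
    iconv(cd, &in, &inLeft, &dst, &outLeft); // best-effort conversion
    iconv_close(cd);
    out.resize(out.size() - outLeft); // keep only the bytes actually written
    return out;
}
```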

Is there a third option? Actually, yes. Qt's Unicode support covers only the first 2^16 symbols anyway, so it is technically feasible to maintain a mapping from Unicode character (essentially a number between 0 and 65535) to a short ASCII string like AE for ‘Æ’ (0x00C6). This mapping can be built offline with the help of a small program that does link against ICU, queries this library for a transliteration for every Unicode code point from 0 to 65535, and prints out a C/C++ source code fragment containing the mapping (almost like in the good old days with X PixMaps). This source code fragment can be included into KBibTeX to enable transliteration without requiring or depending on either ICU or iconv on the machines where KBibTeX is compiled or run. Disadvantages include the need to drag along this mapping as well as to update it from time to time in order to keep up with updates in ICU's own transliteration mappings.
See commit 82e15e3e2856317bde0471836143e6971ef260a9 where the mapping got introduced as the third option.

The solution I eventually settled with is to still test for ICU during the configuration phase and make use of it in KBibTeX as I did before. However, in case no ICU is available, the offline-generated mapping will be used to offer essentially the same functionality. Switching between both alternatives is a compile-time thing, both code paths are separated by #ifdefs.

Support for iconv has been dropped as it became the least complete solution (see commit 47485312293de32595146637c96784f83f01111e).

Now, what does this generated mapping look like? In order to minimize the data structure's size I came up with the following approach: First, there is a string called const char *unidecode_text that contains each occurring plain ASCII representation just once, for example only one single a that can be used for ‘a’, ‘ä’, ‘å’, etc. This string is about 28800 characters long for 65536 Unicode code points, where a code point's ASCII representation may be several characters long. So, quite efficient.

Second, there is an array const unsigned int unidecode_pos[] that holds a number for each of the 65536 Unicode code points. Each number encodes both a position and a length, telling which substring to extract from unidecode_text to get the ASCII representation. As the observed ASCII representations' lengths never exceed 31, the array's unsigned ints contain the representations' lengths in their lower (least significant) five bits; the remaining, more significant bits contain the positions. For example, to get the ASCII representation for ‘Ä’, use the following approach:

const char16_t unicode = 0x00C4; ///< 'A' with two dots above (diaeresis)
const int pos = unidecode_pos[unicode] >> 5;
const int len = unidecode_pos[unicode] & 31;
const char *ascii = strndup(unidecode_text + pos, len);

If you want to create a QString object, use this instead of the last line above:

const QString ascii = QString::fromLatin1(unidecode_text + pos, len);

If you would go through this code step-by-step with a debugger, you would see that unidecode_pos[unicode] has value 876481 (this value may change if the generated source code changes). Thus, pos becomes 27390 and len becomes 1. Indeed and not surprisingly, in unidecode_text at this position is the character A. BTW, value 876481 is not just used for ‘Ä’, but also for ‘À’ or ‘Â’, for example.
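The packing arithmetic can be verified with a few lines of code. The helper names below are purely illustrative (the real table is generated, not hand-written), but the value 876481 is the one from the example: position 27390, length 1.

```cpp
#include <cassert>
#include <cstdint>

// The generator stores (position << 5) | length; the lookup code
// recovers both halves with >> 5 and & 31.
constexpr std::uint32_t packEntry(std::uint32_t pos, std::uint32_t len)
{
    return (pos << 5) | len; // len must fit into the lower 5 bits (0..31)
}

constexpr std::uint32_t entryPos(std::uint32_t packed) { return packed >> 5; }
constexpr std::uint32_t entryLen(std::uint32_t packed) { return packed & 31; }

// Round-trip check against the example value from the text.
static_assert(packEntry(27390, 1) == 876481);
static_assert(entryPos(876481) == 27390);
static_assert(entryLen(876481) == 1);
```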

The above solution can easily be adjusted to work with plain C99 or modern C++. It is in no way specific to Qt or KDE, so it could serve as a potential contribution to musl (a libc implementation) to implement a //TRANSLIT feature in their iconv implementation (I have not checked their code whether that is possible at all).


Fixes for recent KDE desktop vulnerability

Friday 9th of August 2019 08:18:10 AM

As you may have been made aware by some news articles, blogs, and social media posts, a vulnerability in the KDE Plasma desktop was recently disclosed publicly. This occurred without the KDE developers/security team or distributions being informed of the discovered vulnerability, or being given any advance notice of the disclosure.

KDE has responded quickly and responsibly and has now issued an advisory with a ‘fix’ [1].

Kubuntu is now working on applying this fix to our packages.

Packages in the Ubuntu main archive are having updates prepared [2], which will require a period of review before being released.

Consequently, if users wish to get fixed packages sooner, packages with the patches applied have been made available in our PPAs.

Users of Xenial (out of support, but we have provided a patched package anyway), Bionic and Disco can get the updates as follows:

If you have our backports PPA [3] enabled:

The fixed packages are now in that PPA, so all that is required is to update your system by your normal preferred method.

If you do NOT have our backports PPA enabled:

The fixed packages are provided in our UPDATES PPA [4].

sudo add-apt-repository ppa:kubuntu-ppa/ppa
sudo apt update
sudo apt full-upgrade

As a precaution to ensure that the update is picked up by all KDE processes, after updating their system users should at the very least log out and in again to restart their entire desktop session.


Kubuntu Team

[1] –
[2] –
[3] –
[4] –

Git Alligator

Thursday 8th of August 2019 10:00:00 PM

This is a short description of a workflow I apply in git repositories that I “own”; it mostly gets applied to Calamares, the Linux installer framework, because I spend most of my development hours on that. But it also goes into ARPA2 projects and home experiments.

It’s a variation on “always summer in master”, and I call it the Git Alligator because when you draw the resulting tree in ASCII-art, horizontally (I realise that’s a pretty niche artform), you get something like this:

    /-o-o-\   /-o-o-o-\   /-o-\
o--o-------o-o---------o-----o--o

To me, that looks like the bumps on an alligator’s back. If I were a bigger fan of Antoine de Saint-Exupéry, I would probably see it as a python that has eaten multiple elephants.

Anyway, the idea is twofold:

  • master is always in a good state
  • I work on (roughly) one thing at a time

For each thing that I work on, I make a branch; if it’s attached to a Calamares issue, I’ll name it after the issue number. If it’s a different bit of work, I’ll name it more creatively. The branch is branched off of master (which is always in a good state). Then I go and work on the branch – commit early, commit often – until the issue is resolved or the feature implemented or whatever.

In a codebase where I’m the only contributor, or the gatekeeper for it so that I know that master remains unchanged, I know a merge can go in painlessly. In a codebase with more contributors, I might merge upstream master into my branch right at the end as a sanity check (right at the end because most of these branches are short-lived, a day or two at most for any given issue).

The alligator effect comes in when merging back to master: I always use --no-ff and I try to write an additional summary description of the branch in the merge commit.
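The merge step described above can be demonstrated in a throw-away repository (the branch name "issue-1234" is made up for illustration, and `git init -b` assumes git 2.28 or newer):

```shell
# Work happens on a topic branch; master only ever receives --no-ff merges.
git init -q -b master alligator-demo
cd alligator-demo
git config user.email "demo@example.com" && git config user.name "Demo"
git commit -q --allow-empty -m "initial: master starts out green"
git checkout -q -b issue-1234            # branch off master for one thing
git commit -q --allow-empty -m "work on the issue (commit early, often)"
git checkout -q master
git merge --no-ff -q -m "issue-1234: summary of what the branch did" issue-1234
```

Because of --no-ff, the last command always creates a real merge commit (one bump on the alligator's back), even though master could have been fast-forwarded.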

Here’s a screenshot of Calamares history, from qgit, turned on its side like an alligator crawling to the right (cropped a little so you don’t see where I don’t follow my own precepts, and annotated with branch names).

Aside from the twofold ideas of “always summer in master” and “focus on one thing” I see a couple of other benefits:

  • History if desired; this approach preserves history (all the little steps, although I do rebase and fixup and amend stuff as I go along, I don’t materially squash things).
  • Conciseness when needed; having all the history is nice, but if you follow the “alligator’s tummy branch” (that is, master, along the bottom of the diagrams) you get only merge nodes with a completed bugfix or feature and a little summary: in other words, following that line of commits gives you a squashed view of what happened.
  • Visual progress; each “bump” on the alligator’s back is a unit of progress. If I were to merge without --no-ff the whole thing would be smooth like a garter snake, and then it’s much harder to see the “things” that I’ve done. Instead I’d need to look at the log and untangle commit messages to see what I was working on. This has a “positivity” benefit: I can point and say “I did a thing!”

I won’t claim this approach works for everybody, or for larger teams, but it keeps me happy most days of the week, and as a side benefit I get to think about ol’ Albert the Alligator.

Finally, I’m back… more or less

Thursday 8th of August 2019 08:38:52 PM
…and I’ve made a new wallpaper!

Yes, finally I’m back on my favourite application, Inkscape.

Hope this is a cool presentation

I called this wallpaper Mountain, because … well, there are mountains, with a sun made from the KDE Neon logo. Hope you like it!

You can find it HERE

See you soon with other wallpapers …



Google Code-in 2018 trip report

Thursday 8th of August 2019 02:09:17 PM

Hello! In June, I had the opportunity to be the mentor representing KDE on the Google Code-in (GCi) 2018 trip in San Francisco, California. For those who don't know what GCi is, it is basically a competition organized by Google for students aged 13-17 that introduces them to contributing to open source … Continue reading Google Code-in 2018 trip report

Kate - Initial Rust LSP support landed!

Wednesday 7th of August 2019 07:02:00 PM

Initial support for the rls Rust LSP server has landed in kate.git master. The matching rls issue about this can be found here.

Given I am no Rust expert, I can only verify that at least some operations seem to work for me if the Rust toolchain is set up correctly ;=)

The current experience isn’t as nice as with clangd; for example, I get no symbols outline here. What is possible with clangd can be seen in one of my previous posts, video included.

Any help to improve this is welcome. The patch that landed can be viewed here, lend a hand if you know how to fix up stuff!

Technical vision for Qt 6

Wednesday 7th of August 2019 12:05:46 PM

Seven years ago, Qt 5 was released. Since then, a lot of things have changed in the world around us, and it is now time to define a vision for a new major version. This blog post captures the most important points that can and should define Qt 6.

Qt 6 will be a continuation of what we have been doing in the Qt 5 series and should as such not be disruptive to our users. But a new major version will give us a higher degree of freedom to implement new features, functionality and better support the requirements of today and tomorrow than we currently can within the Qt 5 series. As described in more detail below, Qt 6 will aim for a large degree of compatibility with the Qt 5 series. We are also still working on new versions of Qt 5, and we’re aiming to bring some of the features that will define Qt 6 in a slightly reduced form to Qt 5.14 and Qt 5.15 LTS. With the feature freeze of Qt 5.14, more R&D focus will shift towards Qt 6, and we’re aiming to have Qt 6.0 ready for a first release by the end of 2020. Before we dive into all the things that will be new, let’s also remember some of the core values of Qt for our users, to define the things we don’t want to change.

What makes Qt valuable to our users?

Qt is a horizontal product that is being used in many different markets. The core values Qt has for our customers and users are:

  1. Its cross-platform nature, allowing users to deploy their applications to all desktop, mobile and embedded platforms using one technology and from a single code base
  2. Its scalability from low-end, single-purpose devices to high-end, complex desktop applications or connected systems
  3. World-class APIs, tools and documentation, simplifying the creation of applications and devices
  4. Maintainability, stability, and compatibility, making it possible to maintain large code bases with minimal effort
  5. A large developer ecosystem with more than 1 million users

A new version of Qt needs to adjust our product to new market demands while keeping the 5 items above at the heart of what we’re doing.

The desktop market is at the root of our offering and is a strong and important market for Qt. It is where most of our users get their first contact with Qt, and it forms the basis of our tooling. Keeping it healthy and growing is a prerequisite for being able to grow in other markets as well.

Embedded and connected devices are where we have our biggest growth. Touch screens are coming to an exponentially increasing number of devices, but there is strong pressure on the price point of the hardware for these devices. Low-end chipsets and microcontrollers, combined with small to medium-sized touch screens, will be used everywhere. Most of those devices will have relatively simple functionality but require polished and smooth user interfaces. Large volumes of such devices will be created, and we need to ensure we can target that space with our offering to be able to live up to our scalability promise.

At the same time, user interfaces at the high end of the device spectrum will continue to increase in complexity, containing thousands of different screens and many applications. Merging 2D and 3D elements into one user interface will be common, as will be the usage of augmented and virtual reality.

Elements of artificial intelligence will be more commonly used in applications and devices, and we will need to have easy ways to integrate with those.

The strong growth in the number of connected devices being created as well as much higher requirements on user experience makes it more important for us to focus on world-class tooling to simplify the creation of applications and devices. Integrating UX designers into the development workflow is one of our goals, but there will be many other areas where we need to try to simplify the lives of our users.

Qt 6 will be a new major version for Qt. The main goal with such a new major version is to prepare Qt for the requirements coming in 2020 and beyond, clean up our codebase and make it easier to maintain. As such the focus will be on those items that require architectural changes within Qt and cannot be done without breaking some level of compatibility with Qt 5.x.

Below are some of the key changes we need to make in Qt to make it fit for the next years to come.

Next-generation QML

QML and Qt Quick have been the main technologies fueling our growth over the last years. The intuitive ways of creating User Interfaces using those technologies are a unique selling point of our offering.

But QML, as it was created for Qt 5, has some quirks and limitations. This, in turn, means that there is the potential for significant enhancements, that we are planning to implement with Qt 6. The main changes planned here are:

Introduce strong typing. Weak typing makes it hard for our users to apply large changes to their codebases. A strong type system allows for IDEs and other tools to support our users in this task and dramatically ease the maintenance. Also, we will be able to generate much better-performing code and reduce overhead.

Make JavaScript an optional feature of QML. Having a full JavaScript engine when using QML can complicate things and is an overhead especially when targeting low-end hardware such as microcontrollers. It is however extremely useful in many use cases.

Remove QML versioning. By simplifying certain lookup rules in QML and changing the way context properties work, we can remove the need for versioning in QML. This, in turn, will lead to large simplifications in the QML engine, greatly reduce our workload of maintaining Qt Quick, and simplify the usage of QML and Qt Quick for our users.

Remove the duplication of data structures between QObject and QML. Currently, quite a few data structures are duplicated between our meta-object system and QML, degrading startup performance and increasing memory usage. By unifying those data structures, we will be able to cut away most of that overhead.

Avoid runtime generated data structures. This relates to the point before, where many of those duplicated data structures are currently being generated at runtime. It should be perfectly possible to generate most of them at compile time.

Support compiling QML to efficient C++ and native code. With strong typing and simpler lookup rules we can convert QML to efficient C++ and native code, significantly increasing runtime performance.

Support hiding implementation details. ‘Private’ methods and properties have been a long-time requirement to be able to hide data and functionality in QML components.

Better tooling integration. Our current code model for QML is often incomplete, making refactoring, and detection of errors at compile time difficult to impossible. With the above changes, it should be possible to offer compile-time diagnostics that can compete with C++, as well as much improved refactoring support.

Next-generation graphics

A lot of things have changed in the graphics area since we did Qt 5.0, leading to us having to do significant changes to our graphics stack to stay competitive.

With Qt 5, we used OpenGL as the unified API for 3D graphics. Since then a host of new APIs have been defined. Vulkan is the designated successor of OpenGL on Linux, Apple is pushing for Metal, and Microsoft has Direct3D. This means that Qt will in the future have to seamlessly work with all those APIs. To make that possible, a new layer abstracting the graphics APIs (like QPA for the platform integration) called the Rendering Hardware Interface (RHI) has to be defined. We will need to base all our rendering infrastructure (QPainter, the Qt Quick Scenegraph, and our 3D support) on top of that layer.

The set of different graphics APIs also leads to us having to support different shading languages. The Qt Shader Tools module will help us to cross-compile shaders both at compile and at runtime.

3D is playing a more and more important role, and our current offering doesn’t have a unified solution for creating UIs that contain both 2D and 3D elements. Integrating QML with content from Qt 3D or 3D Studio is currently cumbersome and causes some performance overhead. In addition, it is impossible to sync animations and transitions on a frame by frame level between 2D and 3D content.

The new integration of 3D content with Qt Quick is aiming to solve this problem. In this case, a full new renderer will allow rendering 2D and 3D content together and support arbitrary nesting between the two. This will turn QML into our UI definition language for 3D UIs and remove the need for the UIP format. We will provide a technology preview of the ‘new’ Qt Quick with 3D support already with Qt 5.14, more information will come in a separate blog post.

Finally, the new graphics stack needs to be supported by a decent pipeline for graphical assets that allows preparing those at compile time for the target hardware and use cases in question: converting PNG files to compressed textures, compiling many of them into texture atlases, converting shaders and meshes into optimized binary formats, and more.

We also aim to bring a unified theming/styling engine to Qt 6, which will allow us to get a native look & feel on Desktop and mobile platforms to both Qt Widgets and Qt Quick.

Unified and consistent tooling

Our graphical tooling to create user interfaces has been split in two, with Qt 3D Studio and Qt Design Studio. Additionally, Qt 3D Studio is slightly disconnected from the rest of Qt, leading to quite a lot of duplicated effort.

We will unify those by merging the required functionality from Qt 3D Studio back into Design Studio. Design Studio shares a lot of code and the application/plugin framework with Qt Creator allowing for a great design experience and giving us the tools to bridge the gap between designers and developers.

The Design tooling also needs good integration with content creation tools such as Photoshop, Sketch, Illustrator, Maya, 3D Max, and others.

The developer tooling needs a lot of focus and attention so that we can offer the best in class support for C++, QML, and Python. A unified tooling offering also implies that developers can easily use the design functionality from within Qt Creator and that UX designers can benefit from features of the developer tooling such as compiling a project or on-device testing.

QMake as the build system used in Qt 5 has lots of quirks and limitations. For Qt 6, we aim to use CMake as a standard 3rd party build system to build Qt itself. CMake is by far the most widely used build system in the C++ world, and better integration with it is sorely needed. We will continue to support our users on QMake, but not develop it further or use it to build the Qt framework itself.

Enhancing our C++ APIs

C++ has changed a lot over the last years. While we had to base Qt 5.0 on C++98, we can now rely on C++17 for Qt 6. This implies that C++ offers a lot more functionality out of the box that wasn’t available when we did Qt 5. Our goal with Qt 6 has to be to better integrate with this functionality, without losing backward compatibility.

For Qt 6, we aim to make some of the functionality introduced with QML and Qt Quick available from C++. We work towards introducing a new property system for QObject and related classes, integrate the binding engine from QML into the core of Qt and make it available from C++. The new property system and the binding engine will lead to a significant reduction in runtime overhead and memory consumption for bindings and make them accessible for all parts of Qt, not only Qt Quick.

Language support

With Qt 5.12, we introduced support for Python, and we also added the browser as a new platform through Qt for WebAssembly. Keeping and further extending that cross-platform focus will be an important part of the Qt 6 series after 6.0 has been released.

Compatibility with Qt 5 and incremental improvements

Compatibility with older versions is extremely important and is a major requirement when we develop Qt 6. There are billions of lines of code written using our framework and any incompatible change we do will thus have a cost for our users. Furthermore, the more work the change to Qt 6 requires from our users the slower the adoption will be, which leads to more cost on our side to maintain the last version of Qt 5.

As such, we should aim to avoid breaking Qt in a way that triggers compile-time or runtime errors in our users’ codebase. If we must break compatibility, a compile-time error is preferable over a silent breakage at runtime (as those are much harder to detect).

While we do need to remove certain deprecated parts of Qt, we need to ensure that our users have the functionality they require. That implies that key functionality, such as Qt Widgets and other parts used by a large portion of our users, will, of course, stay available.

We are planning for many incremental improvements to our core classes and functionality that we could not do in the Qt 5 series. The aim is to keep full source compatibility, but as we can break binary compatibility with Qt 6, we can do quite a lot of cleanups and improvements that couldn’t be done within Qt 5.

Nevertheless, we need to move forward, and some house cleaning is required with Qt 6. We will remove most functionality (functions, classes or modules) that have been deprecated in Qt 5. This house cleaning will help free up our developers’ time in the longer term and allow us to have more focus on the maintained and current codebase.

Porting away from those deprecated parts does however need to be as simple as possible and our users can ideally do this incrementally using Qt 5.15 LTS. Our goal should be that Qt 6 is compatible enough with Qt 5.15 LTS so that one can easily maintain a large code base that can compile against both versions at the same time.

Marketplace & technical product structure

In addition to improving the Qt framework and tools, we aim to create a new marketplace for components and development tools. The marketplace will be focused on our direct users developing and designing applications and embedded devices, not targeted at consumers. As such it will be a central rallying point for the Qt ecosystem. It will give 3rd parties a place to publish their additions to Qt, allowing for both free and paid content.

Qt has been growing a lot over the last years, to the point where delivering a new version of it is a major undertaking. With Qt 6 there is an opportunity to restructure our product offering and have a smaller core product that contains the essential frameworks and tooling. We will use the marketplace to deliver our add-on frameworks and tools, rather than shipping them as a tightly coupled bundle with the core Qt product. This will give us additional flexibility in when and how we deliver things and allows us to decouple release schedules for some add-ons.

Give us your feedback and get involved

The technical vision will evolve further until the first release of Qt 6. While I believe that this document captures many of the most important points for the next version of Qt, it is certainly not complete. If you have any further ideas, please get involved in the development of Qt 6 and the discussions around it through Qt’s open governance model.

The post Technical vision for Qt 6 appeared first on Qt Blog.

KDevelop 5.4 released

Tuesday 6th of August 2019 10:00:00 AM

KDevelop 5.4 released

We are happy to announce the availability of KDevelop 5.4 today, featuring support for a new build system, a new scratchpad feature, and analyzer support via Clang-Tidy, plus a bunch of bug fixes and small improvements.


Projects using the new rising star in the build system scene, Meson, can now also be managed with KDevelop, thanks to the work of Daniel Mensinger.

Current features are:

  1. Native support for Meson projects (configuring, compiling, installing)

  2. Support for KDevelop code autocompletion (plugin reads Meson introspection information)

  3. Initial support for the Meson rewriter: modifying basic aspects of the project (version, license, etc.)

Support for adding / removing files from a build target will follow in future releases of KDevelop.


Thanks to the work of Amish Naidu, there is now a tool to keep "scratches" of code or text, to experiment with or quickly run something without the need to create a full project.

The plugin adds a new tool view, which maintains a list of your scratches which you can compile and run. The data from scratches is managed and stored by KDevelop internally but is presented as regular documents in the editor giving all the editing convenience of e.g. code-completion and diagnostics. Commands used to run the scratches are saved for each scratch, while new scratches are pre-set with the last command used for that file type.


The plugin for Clang-Tidy had so far been developed and released independently, but starting with version 5.4 it is part of KDevelop's default plugins. Learn more about the plugin on Friedrich Kossebau's blog.


More work was done on stabilizing and improving our C++ language support, which uses a Clang based backend. Notable fixes include:

  • Add working directory to clang parser. (commit. code review D22197)

  • Clang Plugin: Report some problems from included files. (commit. code review D18224)

  • Make it possible to select -std=c++2a for our language support. (commit)

  • Rename c++1z to C++17. (commit)

  • Clang CodeCompletion: No auto-completion for numbers. (commit. code review D17915)

  • Add assistant to generate header guards. (commit. code review D17370)

  • Always set maximum file size for internal parse job. (commit)

  • Bypass the 5 MB maximum file size limit for the phpfunctions.php internal file. (commit)

  • Fix linking with ld.lld. (commit)


The developers have been concentrating on fixing bugs; those fixes have already been added to the 5.3 series.

There are no new features compared to 5.3.

Other Changes
  • [Documentation] Set size policy of providers combobox to AdjustToContents (commit)

  • Contextbrowser: Remove 'separated by only whitespace' possibility for showing the problem tooltip. (commit)

  • Contextbrowser: Minor improvement to tooltip showing behavior. (commit)

  • CMake plugin: Also show an error message if the CMake configuration becomes invalid due to a change, and add an instruction to reload the project manually. (commit)

  • CMake plugin: Show a message box if configuration fails. (commit)

  • Projectfilter: Include .clang-format by default. (commit)

  • Add a predefined clang-format custom script formatter. (commit)

  • Fix code completion for nameless structs/unions with the same member. (commit. fixes bug #409041. code review D22455)

  • Support newer kdebugsettings .categories file format. (commit)

  • Show session name in the Delete Session confirmation dialog. (commit. code review D22456)

  • Remove invalid check from test_projectload test. (commit. code review D22350)

  • Document tree view close on middle button. (commit. code review D22160)

  • Follow KTextEditor changes for hidpi rendering of icon border. (commit)

  • Note visibility tag also with signature of friend-declared method. (commit)

  • Guard against crashes when IStatus object gets destroyed at bad times. (commit)

  • Attempt to fix a crash on shutdown. (commit)

  • Astyle: support the system astyle library. (commit. code review D17760)

  • Renovate kdevelop bash completion file. (commit)

  • Fix deadlock exception in FileManagerListJob. (commit)

  • DVCS Branch Manager with filtering and sorting proposal. (commit. code review D20142)

  • Also find clang include path based on runtime libclang library path. (commit)

  • TestFile: On destruction, close associated document if open and stop the background parser. (commit. code review D18567)

  • CMake: discover more unit tests. (commit. fixes bug #405225. code review D19673)

  • Be less restrictive with failures while searching for LLVM. (commit)

  • Allow the maximum file size of parse jobs to be configurable. (commit)

  • Optimize CMakeBuildDirChooser::buildDirSettings(). (commit. code review D18857)

  • [Sessions Runner] Use icon name. (commit. code review D19159)

  • Don't eat the backspace event when no alt modifier is set. (commit)

  • "Reparse Entire Project" action for the ProjectController. (commit. code review D11934)

  • Introduce QuickOpenEmbeddedWidgetCombiner. (commit)

  • Add 'back' to QuickOpenEmbeddedWidgetInterface. (commit)

  • Update documentation: the keyboard shortcuts use ALT not SHIFT. (commit)

  • Fix up/down keyboard navigation for 'Show documentation' links. (commit)

  • Lock duchain in AbstractIncludeNavigationContext::html. (commit)

  • Don't crash when background listing outlasts file manager list job. (commit)

  • Don't crash when project is closed before it was fully opened. (commit)

  • Make sure we use the same compiler settings as the project is by default. (commit. code review D11136)

  • Debugger plugin fixes. (commit. code review D18325)

  • Kdevelop-msvc.bat finds VS-2017 based on a registry key on Windows. (commit. code review D17908)

  • CMakeBuildDirChooser: avoid calling deprecated KUrlRequester::setPath(). (commit. code review D18856)

  • Flatpak+cmake: put the cmake build directories into .flatpak-builder. (commit)

  • Allow KDEV_DEFAULT_INSTALL_PREFIX to specify a default install prefix. (commit)

  • Flatpak: Improve runtime title. (commit)

  • Adapt indentation mode after a new project was opened. (commit)

  • Flatpak: Fix listing runtimes. (commit)

  • Workaround the bug found by ASan, which can be seen on FreeBSD CI. (commit. code review D18463)

  • Properly cleanup FileManagerListJob when folder items are deleted. (commit. fixes bug #260741)

  • Provide debugger name and pid when registering a debugger to DrKonqi. (commit. code review D18511)

  • Support for indent-after-parens astyle option. (commit. code review D18371)

  • Fix bug 389060 (Heaptrack analysis keeps firing /usr/bin/plasmoidviewer). (commit. fixes bug #389060. code review D15565)

  • Contextbrowser: Ability to show combined problems and decl tooltip. (commit. code review D18229)

  • Properly display argument names of template functions. (commit. code review D18218)

  • Show size and alignment information in tooltips for typedef or alias. (commit. code review D18097)

  • GrepView: Extend default file extensions to search. (commit. Implements feature #402207. code review D17892)

  • Fix crash in documentation view. (commit. fixes bug #402026)

  • [clang-tidy] Fix context-menu crash for files not in a project. (commit. fixes bug #401917)

  • Polish Flatpak integration. (commit)

  • Don't add 'override' specifier for non-modern project settings. (commit. fixes bug #372280. code review D16773)

  • [clang-tidy] Disable/Block Run actions in projects without buildsystem manager. (commit)

  • Add VcsAnnotationItemDelegate, for control of rendering and tooltip. (commit. code review D8709)

  • Qmljs: Update qmljs from QtCreator v4.7.2. (commit)

  • LoadedPluginsDialog: Fix initial size. (commit)

  • Place cursor after opening brace for function implementation. (commit. code review D16386)

  • Replace leading typed text when completing function implementation. (commit. fixes bug #384710. code review D16326)

  • Fix crashes when document gets destroyed directly after load. (commit)

  • Prevent QWebEngine from overriding signal handlers. (commit. code review D16188)

  • Add missing break in QmlJs code completion. (commit)

  • CMake: fix missing addition of policies to documentation index. (commit. code review D15882)

  • Create action to jump to the current execution line in debug mode. (commit. Implements feature #361411. code review D14618)

  • Fix segfaults in OutputWidget. (commit. fixes bug #398615. code review D15326)

  • Fix double delete bug in OutputWidget. (commit. code review D15241)

  • Cleanup Perforce test case, and thereby its output a little. (commit. code review D14959)

Get it

Together with the source code, we again provide a pre-built one-file-executable for 64-bit Linux as an AppImage. You can find it on our download page.

The 5.4.0 source code and signatures can be downloaded from

Should you find any issues in KDevelop 5.4, please let us know on the bug tracker.

kossebau Tue, 2019/08/06 - 12:00 Category News Tags release

Interview with Ray Waysider

Tuesday 6th of August 2019 09:09:49 AM
Could you tell us something about yourself?

I’m uncomfortable with this question from which you may surmise that I’m quite introverted or that I’m conscious of my tendency to overshare. Both are true. I’m a white, heterosexual male but I do keep bees, so…

Do you paint professionally, as a hobby artist, or both?

Mostly as a hobby unfortunately. My day job is graphic design so I can only paint in the evenings and at weekends but I’d love to spend more time doing illustration. One day I’d like to illustrate a children’s book.

What genre(s) do you work in?

I don’t consciously work within genres. I guess I’m influenced by fantasy, cartoons, horror… I’m not over-serious or precious about art. Most of what I do is rather light-hearted (though I do put a serious amount of work into it). I do enjoy dark humour, which may be evident in some of my work. I did work as an artist in the games industry for a few years, but that was a long time ago, when game characters were sixteen-by-thirty-two-pixel sprites, so really before game art had the chance to develop into what it is today. But I do enjoy looking at all the game art on Artstation… check it out if you’re into elfies.

Perhaps if I’d been fanatical about a particular genre I’d have devoted the work and effort to developing a style that would fit that genre and been more successful, but I have more of a generalist approach, taking influences in an eclectic way from anything that grabs my attention.

I’m not snobby or judgemental about art. I find pleasure in all genres, in fine art and popular culture, though I do find the animé/manga style of depicting highly sexualised bodies with prepubescent-looking faces morally troubling, to say the least.

Whose work inspires you most — who are your role models as an artist?

Earliest influences were definitely cartoons: Disney, Warner Bros etc. Then comic books (Marvel, DC), then the album art of the seventies and eighties – people like Roger Dean and Hipgnosis. This sparked an interest in surrealism – Dali, Magritte, Kahlo. From there, a general interest in art from the Renaissance to Dada. I’m as happy studying Goya’s etchings as I am getting into Tracey Emin’s bed.

Lately I’m into the pulp fiction novel cover art of the 50s but I also admire Illustrators Norman Rockwell, Frazetta, Vallejo and going really old school, Sir John Tenniel, Arthur Rackham and Aubrey Beardsley.

How and when did you get to try digital painting for the first time?

In the 90s, when I bought my first computer but the lack of a tablet was rather limiting. I remember I’d scan pencil drawings and paint over them using the mouse. Drawing with a mouse is like trying to roll a cigarette wearing oily boxing gloves.

What makes you choose digital over traditional painting?

The freedom to adjust just about anything at any stage. Digital painting allows you to be experimental on a piece you’ve invested a lot of time in without fear of ruining it by making mistakes. Also no need to clean the brushes.

How did you find out about Krita?

Through a YouTube review by the excellent Mr Borodante.

What was your first impression?


What do you love about Krita?

The brush engines and the interface make it easy to edit the brushes. The perspective tools (assistants) are really well designed for artists. I also like how customisable the colour selector is. The multibrush tool is great for rotational symmetry and the wrap-around mode is perfect for creating tiling patterns. Honestly, I like pretty much everything!

What do you think needs improvement in Krita? Is there anything that really annoys you?

I’d love it if you could specify which perspective guide the brush follows more easily. Sometimes if I have guides running at similar angles it snaps to the wrong one. I can disable the one I don’t want, but not as easily as I’d like.

What sets Krita apart from the other tools that you use?

It’s free (obviously) but apart from that it’s also a very intuitive program. So rich in features and it’s easy to set it up to your own personal taste. I have Photoshop installed on my PC as well as Krita but I just find Krita so comfortable to use and so capable of doing anything I want that I’ve hardly used Photoshop since I installed Krita.

If you had to pick one favourite of all your work done in Krita so far, what would it be, and why?

It’s usually my last piece, which right now is a painting of a character called Ambrose because on this occasion the finished result is something like what I was originally aiming at when I first conceived the piece. Also people have commented saying it made them feel uncomfortable which pleased me immensely.

What techniques and brushes did you use in it?

I don’t use many brushes. Mostly I use a basic brush with pressure-sensitive opacity, varying the size with a button on my tablet (I use a GAOMON PD1560 screen monitor). I use the blending brush sparingly, and sometimes the soft round airbrush, particularly for atmospheric effects. I got the initial reference for the pose in this painting using an Android app called Easy Poser, but this was supplemented by quite a few reference images from online and some taken on my phone of my own hands and feet. I do mess around with the filter masks a lot, particularly the colour adjustment and levels. I also often use a layer set either to multiply or burn to enhance shadows, or a layer set to screen or dodge for highlights. I sometimes use a layer set to colour mode to correct or change hues.

Where can people see more of your work?

Anything else you’d like to share?

They say you should do one thing everyday that frightens you so I clean my toothbrush in the toilet.

About deprecation of QFontMetrics::width()

Monday 5th of August 2019 09:54:40 PM

With any new version of the Qt toolkit comes some clean-up of its APIs to keep it clean, consistent, and future-proof. Part of this clean-up is to rename API functions to make it more clear what they actually do.

Starting with Qt 5.11, the QFontMetrics::width() function is deprecated. You can still compile code that uses this function, but since it is marked obsolete, you are encouraged to port away from it.

So what is unclear or inconsistent about it? The function name and signature suggest that you can retrieve the width of a text string (or a single character), taking into account the metrics of a specific font. Such metrics are needed to create widget layouts that automatically adapt to user-specified fonts or different system display DPIs.

Reading the API description, however, shows that the result is actually not the width. The graphic in the documentation illustrates that there is a difference between the horizontal advance, i.e. the number of pixels from one character to the next, and the bounding rectangle width, which is needed to encompass all pixels, including the so-called bearings that can overlap the next or the previous character.

Since it was not clear from the confusingly named QFontMetrics::width() that it actually returned the horizontal advance rather than the bounding width, this method is now obsolete. You must port to either QFontMetrics::horizontalAdvance() or QFontMetrics::boundingRect().width().

Please make sure you are aware of the difference, and do not port blindly. I am pretty sure that in most cases QFontMetrics::boundingRect() is what you want, unless you are writing custom text shaping/layout code. Using the wrong function can cause clipped text, or text that suddenly wraps to the next line despite calculating the width it needs.

More in Tux Machines

Games: Smith and Winston, 7 Billion Humans Sale

Servers: Ampere Computing, SUSE and Red Hat

  • Ampere Computing Is Keeping Close Track Of The Linux Performance For Their ARM Servers

    Hardware vendor Ampere Computing, with their impressive ARM servers, is doing a great job of closely following their hardware's Linux performance as part of a rigorous, fully automated continuous-testing regimen for ensuring quality, compatibility, and stability. Ampere Computing's Travis Lazar talked at this week's Linux Foundation events in San Diego about the importance of continuous regression testing for software and hardware development, describing their internal workflow and the software in place. Their internal system is the "Totally Automated Regression System", or TARS for short. TARS makes use of various open-source components, including the Phoronix Test Suite and its vast collection of benchmarks for providing comprehensive test coverage, plus Ampere's own "extensions" to the Phoronix Test Suite. TARS also incorporates the provisioning/configuration responsibilities as well as analysis of the data.

  • [SUSE] Learn how the Multimodal OS can benefit your organization.
  • From ProdOps to DevOps: Surviving and thriving

    For many of us in Production Operations (ProdOps), change is the enemy. If something changes, there is now an opportunity for things that were working just fine to experience problems. It is like a game of Jenga. When will the tower fall because a seemingly minor change unbalances the whole stack of pieces? ProdOps teams hate change so much, that countless frameworks have been invented to "manage" changes; in reality, these frameworks make the procedure for effecting a change so onerous that most people give up and accept the status quo. Actually, that statement is a bit unfair. These frameworks are an attempt to wrap planning and consensus around production changes, thus minimizing potential downtime caused by random or rogue changes (see Why the lone wolf mentality is a sysadmin mistake).

  • Meet Red Hat at VMworld

    As Red Hat’s Ashesh Badani said in his blog post about the reference architecture for OpenShift on VMware’s SDDC stack “… this is just the first step — Red Hat OpenShift 4 brings optimized installation capabilities to a variety of infrastructures and for this, the companies are working towards a VMware Validated Design. We are excited that VMware is working closely with Red Hat to deliver a simplified experience there in the coming months.”

Late Coverage of Confidential Computing Consortium

  • Microsoft Partners With Google, Intel, And Others To Form Data Protection Consortium

    The software maker joined Google Cloud, Intel, IBM, Alibaba, Arm, Baidu, Red Hat, Swisscom, and Tencent to establish the Confidential Computing Consortium, a group committed to providing better private data protection, promoting the use of confidential computing, and advancing open source standards among members of the technology community.

  • #OSSUMMIT: Confidential Computing Consortium Takes Shape to Enable Secure Collaboration

    At the Open Source Summit in San Diego, California on August 21, the Linux Foundation announced the formation of the Confidential Computing Consortium. Confidential computing is an approach using encrypted data that enables organizations to share and collaborate, while still maintaining privacy. Among the initial backers of the effort are Alibaba, Arm, Baidu, Google Cloud, IBM, Intel, Microsoft, Red Hat, Swisscom and Tencent. “The context of confidential computing is that we can actually use the data encrypted while programs are working on it,” John Gossman, distinguished engineer at Microsoft, said during a keynote presentation announcing the new effort. Initially there are three projects that are part of the Confidential Computing Consortium, with an expectation that more will be added over time. Microsoft has contributed its Open Enclave SDK, Red Hat is contributing the Enarx project for Trusted Execution Environments and Intel is contributing its Software Guard Extensions (SGX) software development kit. Lorie Wigle, general manager, platform security product management at Intel, explained that Intel has had a capability built into some of its processors called software guard which essentially provides a hardware-based capability for protecting an area of memory.

Graphics: Mesa Radeon Vulkan Driver and SPIR-V Support For OpenGL 4.6

  • Mesa Radeon Vulkan Driver Sees ~30% Performance Boost For APUs

    Mesa's RADV Radeon Vulkan driver just saw a big performance optimization land to benefit APUs like Raven Ridge and Picasso, that is, systems with no dedicated video memory. The change by Feral's Alex Smith puts the uncached GTT type at a higher index than the visible vRAM type for these configurations without dedicated vRAM, namely APUs.

  • Intel Iris Gallium3D Is Close With SPIR-V Support For OpenGL 4.6

    This week saw OpenGL 4.6 support finally merged for Intel's i965 Mesa driver; it will be part of the upcoming Mesa 19.2 release. Not landed yet but coming soon is OpenGL 4.6 support for the newer Intel "Iris" Gallium3D driver as well. Iris Gallium3D has been at OpenGL 4.5 and is now quite close to OpenGL 4.6, thanks to the shared NIR support and more with the rest of the Intel open-source graphics stack. It's looking less likely that OpenGL 4.6 support will be back-ported to Mesa 19.2 for Iris, but we'll see.