Planet KDE - http://planetKDE.org/

Connect your Android phone with your Mac via KDE Connect

Tuesday 16th of July 2019 12:20:23 PM

Have you ever heard of Continuity, Apple’s solution that provides a seamless experience between your iPhone and your Mac?

You may be thinking, “Woohoo, it’s amazing, but I use my OnePlus along with my Mac.” With my GSoC 2019 project, you can connect your Mac and your Android phone with KDE Connect!

And you can even connect your Mac with your Linux PC or Windows PC (thanks to Piyush, who is working on optimizing the KDE Connect experience on Windows).

Installation instructions
  1. You can download the KDE Connect Nightly Build for macOS from the KDE Binary Factory: https://binary-factory.kde.org/view/MacOS/job/kdeconnect-kde_Nightly_macos/. Note that it’s not yet a stable version, and it requires permission to run applications from a non-certified developer. We’ll release a stable one next month, in August.

  2. Otherwise you can build your own version. Please follow the instructions on the KDE Connect Wiki. If you’re using macOS 10.13, macOS 10.12 or below, we recommend building KDE Connect yourself, because our Binary Factory only builds applications for macOS 10.14 or above.

Either way, you’ll end up with a DMG image file.

Just click on it to mount it, and drag kdeconnect-indicator into the Applications folder.

Open kdeconnect-indicator and your magic journey with KDE Connect for macOS begins!

Use

After installation, you can find the kdeconnect-indicator icon in Launchpad.

Click it to open. If everything is OK, you will see a KDE Connect icon in your system tray.

Click the icon -> Configure to open the configuration window. Here you can see discovered devices and paired devices.

You can enable or disable functions in this window.

Currently, you can do the following with your Android phone and your Mac:

  • Run predefined commands on your Mac from connected devices.
  • Check your phone’s battery level from the desktop.
  • Ring your phone to help find it.
  • Share files and links between devices.
  • Control the volume of your Mac from the phone.
  • Keep your Mac awake when your phone is connected.
  • Receive your phone’s notifications on your desktop computer (this feature is implemented but not yet released; you can follow another article to enable it manually).

I’m trying to make more plugins work on macOS. Good luck to my GSoC project :)

Acknowledgement

Thanks to the KDE Community and Google, I could start this Google Summer of Code project this summer.

Thanks to the members of the KDE Connect development team. Without them, I couldn’t have understood the mechanism and gotten it working on macOS so quickly :)

Conclusion

If you have any questions, the KDE Connect Wiki may be helpful, and you can find a bug tracker there.

Don’t hesitate to join our Telegram group or IRC channel if you’d like to bring more exciting functions into KDE Connect:

  • Telegram
  • IRC (#kdeconnect)
  • matrix.org (#freenode_#kdeconnect:matrix.org)

I hope you enjoy the seamless experience provided by KDE Connect for macOS and your Android phone!

Plasma sprint, 2019 edition; personal updates

Tuesday 16th of July 2019 01:49:52 AM
KDE Project:

In June, I had a great time at a series of KDE events held in the offices of Slimbook, makers of fantastic Neon-powered laptops, on the outskirts of Valencia, Spain. Following on from a two-day KDE e.V. board of directors meeting, the main event was the 2019 edition of the Plasma development sprint. The location proved to be quite ideal for everything. Slimbook graciously provided us with two lovely adjacent meeting rooms for Plasma and the co-located KDE Usability & Productivity sprint, allowing the groups to mix and separate as our topics demanded - a well-conceived spatial analog for the tight relationship and overlap between the two.


The Plasma team walked the gorgeous Jardí del Túria almost every day during their sprint week to stay healthy and happy.

As always during a Plasma sprint, we used this opportunity to lock down a number of important development decisions. Release schedules, coordinating the next push on Plasma/Wayland and a new stab at improving the desktop configuration experience stand out to me, but as the Dot post does a fine job providing the general rundown, I'll focus on decisions made for the Task Manager widgets I maintain.

On one of the sprint mornings, I led a little group session to discuss some of the outstanding high-level problems with the two widgets (the regular Task Manager and the Icons-only Task Manager), driven by frequent user reports:

  • Poor experience performing window management on groups of windows
  • Unnecessary duplication in the UI displaying window group contents
  • Unintuitive behavior differences between the two widgets

To address these, we came up with a list of action items to iteratively improve the situation. Individually they're quite minor, but there are many of them, and they will add up to smooth out the user experience considerably. In particular, we'll combine the currently two UIs showing window group contents (the tooltip and the popup dialog) into just one, and we'll make a new code path to cycle through windows in a group in most recently used order on left click the new default. The sprint notes have more details.

Decision-making aside, a personal highlight for me was a live demo of Marco Martin's new desktop widget management implementation. Not only does it look like a joy to use, it also improves the software architecture of Plasma's home screen management in a way that will help Plasma Mobile and other use cases equally. Check out his blog post for more.


I got a new laptop. Slimbook founder Alejandro López made it a proper computer by attaching a particularly swanky metal KDE sticker during the preceding KDE e.V. board sprint.

In KDE e.V. news, we briefly took over one of the sprint rooms for a convenient gathering of most of our Financial Working Group, reviewing the implementation of the organization's annual budget plan. We also had a chance to work with the Usability goal crew (have you heard about KDE goals yet?) on a plan for the use of their remaining budget -- it's going to be exciting.

As a closing note, it was fantastic to see many new faces at this year's sprint. It's hard to believe for how many attendees it was their first KDE sprint ever, as it couldn't have been more comfortable to have them on board. It's great to see our team grow.

See you next sprint. :)

In more personal news, after just over seven years at the company I'm leaving Blue Systems GmbH at the end of July. It's been a truly fantastic time working every day with some of the finest human beings and hackers. The team there will go on to do great things for KDE and personal computing as a whole, and I'm glad we will keep contributing together to Plasma and other projects we share interests and individual responsibilities in.

As a result, the next ~10 weeks will see me very busy moving continents from Seoul back to my original home town of Berlin, where I'll be starting on a new adventure in October. More on that later (it's quite exciting), but my work on the KDE e.V. board of directors or general presence in the KDE community won't be affected.

That said -- between the physical and career moves, board work and personal preparations for Akademy, I'll probably need to be somewhat less involved and harder to reach in the various project trenches during this quarter. Sorry for that, and do poke hard if you need me to pick up something I've missed.

And of course:


KDE Applications 19.08 branches created

Monday 15th of July 2019 07:22:21 PM

Make sure you commit anything you want to end up in the KDE Applications 19.08 release to them.

We're already past the dependency freeze.

The Freeze and Beta are this Thursday, 18 July.

More interesting dates
August 1, 2019: KDE Applications 19.08 RC (19.07.90) Tagging and Release
August 8, 2019: KDE Applications 19.08 Tagging
August 15, 2019: KDE Applications 19.08 Release

https://community.kde.org/Schedules/Applications/19.08_Release_Schedule

Kate LSP Client Continued

Sunday 14th of July 2019 01:33:00 PM

The new LSP client by Mark Nauwelaerts made nice progress since the LSP client restart post last week.

Reminder: the plugin is not compiled by default; you can turn it on via:

cmake -DCMAKE_INSTALL_PREFIX="your prefix" -DENABLE_LSPCLIENT=ON "kate src dir"

The code can still be found in kate.git master; see lspclient in the addons directory.

What is new?

  • Diagnostics support: A tab in the LSP client toolview will show the diagnostics, grouped by file, with links to jump to the locations. Issues will be highlighted in the editor view, too.

  • Find references: Find all references to some variable/function in your complete program. Like the diagnostics, they are listed grouped per file in an extra tab.

  • Improved document highlight: Highlight all occurrences of a variable/… inside the current document. Besides highlighting the reads/writes/uses, you get a jump list in a tab, like for the other features.

A feature I forgot to show last time:

  • Hover support: Show more meta info about a code location, like the proper type, useful e.g. for almost-always-auto C++ programming.

We already received two patches for the fresh plugin:

Both aim to improve the support for the Rust LSP server. As you can see, they have already been reviewed and merged.

Feel welcome to show up on kwrite-devel@kde.org and help out! All development discussions regarding this plugin happen there.

If you are already familiar with Phabricator, post your patches directly on KDE’s Phabricator instance.

You want more LSP servers supported? You want to have feature X? You have seen some bug and want it to vanish? => Join!

KDE Usability & Productivity: Week 79

Sunday 14th of July 2019 04:01:27 AM

After a somewhat light week, we’re back with week 79 in KDE’s Usability & Productivity initiative, and there’s a ton of cool stuff for you!

New Features

Bugfixes & Performance Improvements

User Interface Improvements

Next week, your name could be in this list! Not sure how? Just ask! I’ve helped mentor a number of new contributors recently and I’d love to help you, too! You can also check out https://community.kde.org/Get_Involved, and find out how you can help be a part of something that really matters. You don’t have to already be a programmer. I wasn’t when I got started. Try it, you’ll like it! We don’t bite!

If you find KDE software useful, consider making a tax-deductible donation to the KDE e.V. foundation.

KDE Craft Packager on macOS

Saturday 13th of July 2019 08:43:27 AM

In Craft, to create a package, we can run craft --package <blueprint-name> after compiling and installing a library or an application with the given blueprint name.

On macOS, MacDMGPackager is the packager used by Craft. MacDylibBundler is used by MacDMGPackager to handle the dependencies.

In this article, I’ll give a brief introduction to the two classes and the improvements I’ve made for my GSoC project.

MacDMGPackager

MacDMGPackager is a subclass of CollectionPackagerBase. Its most important method is createPackage.

First of all,

self.internalCreatePackage(seperateSymbolFiles=packageSymbols)
Initialisation of directory variables

Here we get the definitions, the path of the application which we want to pack, and the path of the archive.
The appPath should be the root of an application package with the .app extension. Following the conventions of macOS applications, targetLibdir points to the library directory of the application.
During compilation and installation, the application directory contains only a .plist file and a MacOS subdirectory. So next, the library directory is created for later use.

defines = self.setDefaults(self.defines)
appPath = self.getMacAppPath(defines)
archive = os.path.normpath(self.archiveDir())
# ...
targetLibdir = os.path.join(appPath, "Contents", "Frameworks")
utils.createDir(targetLibdir)
Moving files to correct directories

Then, we define a list of (source, destination) pairs of directories and move the files to their destinations. The destinations are the correct directories for libraries, plugins and resources in a macOS application package.

moveTargets = [
    (os.path.join(archive, "lib", "plugins"), os.path.join(appPath, "Contents", "PlugIns")),
    (os.path.join(archive, "plugins"), os.path.join(appPath, "Contents", "PlugIns")),
    (os.path.join(archive, "lib"), targetLibdir),
    (os.path.join(archive, "share"), os.path.join(appPath, "Contents", "Resources"))]

if not appPath.startswith(archive):
    moveTargets += [(os.path.join(archive, "bin"), os.path.join(appPath, "Contents", "MacOS"))]

for src, dest in moveTargets:
    if os.path.exists(src):
        if not utils.mergeTree(src, dest):
            return False
Fixing dependencies using MacDylibBundler

After the moving, we create an instance of MacDylibBundler with appPath. Inside the with block, all the code is executed with the DYLD_FALLBACK_LIBRARY_PATH=<package.app>/Contents/Frameworks:<Craft-Root>/lib environment variable set.

For further reading about this environment variable, please refer to this question on Stack Overflow.

dylibbundler = MacDylibBundler(appPath)
with utils.ScopedEnv({'DYLD_FALLBACK_LIBRARY_PATH': targetLibdir + ":" + os.path.join(CraftStandardDirs.craftRoot(), "lib")}):
    # ...
Fixing dependencies of main binary

Here, we first create a Path object. It points to the executable of the macOS package.

Note that, although we use the same name for both the macOS application package and the executable here, this is not mandatory. The name of the executable is defined by CFBundleExecutable in the .plist file, so reading it from the .plist file might be a better solution.
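
For illustration, reading the executable name from the bundle’s Info.plist with Python’s plistlib could look roughly like this (a standalone sketch, not Craft code; the bundle path is just an example):

import plistlib
from pathlib import Path

# Example bundle path, not something hard-coded in Craft
appPath = Path("kdeconnect-indicator.app")

with open(appPath / "Contents" / "Info.plist", "rb") as f:
    info = plistlib.load(f)

# CFBundleExecutable names the binary inside Contents/MacOS
mainBinary = appPath / "Contents" / "MacOS" / info["CFBundleExecutable"]
print(mainBinary)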

Then, the method bundleLibraryDependencies is used to copy libraries and fix dependencies for the executable in the package.

A brief introduction of this method:

  1. Call utils.getLibraryDeps to get the list of dependencies. This is done using otool -L (a simplified sketch of this step follows below).
  2. Copy missing dependencies into Contents/Frameworks, and update the library references in the executable.
    I’ll analyse this in detail in the next chapter.
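
As an illustration of step 1, listing the dependencies of a Mach-O binary with otool -L could be done roughly like this (a simplified standalone sketch, not the actual utils.getLibraryDeps implementation):

import subprocess

def listLibraryDeps(binaryPath: str) -> list:
    # otool -L prints the binary name on the first line, then one dependency
    # per line, e.g. "/usr/lib/libSystem.B.dylib (compatibility version ...)"
    output = subprocess.check_output(["otool", "-L", binaryPath]).decode("utf-8")
    deps = []
    for line in output.splitlines()[1:]:
        line = line.strip()
        if line:
            deps.append(line.split(" (")[0])
    return deps
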
CraftCore.log.info("Bundling main binary dependencies...")
mainBinary = Path(appPath, "Contents", "MacOS", defines['appname'])
if not dylibbundler.bundleLibraryDependencies(mainBinary):
    return False
Fixing dependencies in Frameworks and PlugIns

And then, we try to fix all the dependencies of libraries in Contents/Frameworks and Contents/PlugIns.

# Fix up the library dependencies of files in Contents/Frameworks/
CraftCore.log.info("Bundling library dependencies...")
if not dylibbundler.fixupAndBundleLibsRecursively("Contents/Frameworks"):
    return False
CraftCore.log.info("Bundling plugin dependencies...")
if not dylibbundler.fixupAndBundleLibsRecursively("Contents/PlugIns"):
    return False
Fixing dependencies using macdeployqt

macdeployqt is used to fix up the Qt libraries used by the application. Craft installed it while compiling and installing Qt, but don’t worry, it is not in your system path.

I haven’t yet looked into what macdeployqt does exactly; it would be nice to have a look at its source code.

if not utils.system(["macdeployqt", appPath, "-always-overwrite", "-verbose=1"]):
    return False
Removing files in blacklist

If macdeployqt added some files which we don’t want, they are removed here.

# macdeployqt might just have added some explicitly blacklisted files
blackList = Path(self.packageDir(), "mac_blacklist.txt")
if blackList.exists():
    pattern = [self.read_blacklist(str(blackList))]
    # use it as whitelist as we want only matches, ignore all others
    matches = utils.filterDirectoryContent(appPath, whitelist=lambda x, root: utils.regexFileFilter(x, root, pattern), blacklist=lambda x, root: True)
    for f in matches:
        CraftCore.log.info(f"Remove blacklisted file: {f}")
        utils.deleteFile(f)
Fixing dependencies again after macdeployqt

After macdeployqt, there may be some extra libraries or plugins added by it. So we do the dependency fixing once again.

But I doubt whether we really need to fix the dependencies twice. I’ll update this post after I figure out what happens if we only fix them after macdeployqt.

# macdeployqt adds some more plugins so we fix the plugins after calling macdeployqt
dylibbundler.checkedLibs = set()  # ensure we check all libs again (but
                                  # we should not need to make any changes)
CraftCore.log.info("Fixing plugin dependencies after macdeployqt...")
if not dylibbundler.fixupAndBundleLibsRecursively("Contents/PlugIns"):
    return False
CraftCore.log.info("Fixing library dependencies after macdeployqt...")
if not dylibbundler.fixupAndBundleLibsRecursively("Contents/Frameworks"):
    return False
Checking dependencies

Then, we use MacDylibBundler to check all dependencies in the application package. If there is any bad dependency, the packaging process fails.

# Finally sanity check that we don't depend on absolute paths from the builder
CraftCore.log.info("Checking for absolute library paths in package...")
found_bad_dylib = False  # Don't exit immeditately so that we log all the bad libraries before failing:
if not dylibbundler.areLibraryDepsOkay(mainBinary):
    found_bad_dylib = True
    CraftCore.log.error("Found bad library dependency in main binary %s", mainBinary)
if not dylibbundler.checkLibraryDepsRecursively("Contents/Frameworks"):
    CraftCore.log.error("Found bad library dependency in bundled libraries")
    found_bad_dylib = True
if not dylibbundler.checkLibraryDepsRecursively("Contents/PlugIns"):
    CraftCore.log.error("Found bad library dependency in bundled plugins")
    found_bad_dylib = True
if found_bad_dylib:
    CraftCore.log.error("Cannot not create .dmg since the .app contains a bad library depenency!")
    return False
Creating DMG image

If everything has gone well up to this point, we can create a DMG image for the application.

name = self.binaryArchiveName(fileType="", includeRevision=True)
dmgDest = os.path.join(self.packageDestinationDir(), f"{name}.dmg")
if os.path.exists(dmgDest):
    utils.deleteFile(dmgDest)
appName = defines['appname'] + ".app"
if not utils.system(["create-dmg", "--volname", name,
                     # Add a drop link to /Applications:
                     "--icon", appName, "140", "150", "--app-drop-link", "350", "150",
                     dmgDest, appPath]):
    return False

CraftHash.createDigestFiles(dmgDest)

return True

Here is an example of such a DMG image: users can drag the application into the Applications directory to install it.

MacDylibBundler

Constructor

def __init__(self, appPath: str):
    # Avoid processing the same file more than once
    self.checkedLibs = set()
    self.appPath = appPath

In the constructor, a set is created to store the libraries that have already been checked, and the appPath passed in by the caller is stored.

Methods

The methods bundleLibraryDependencies and _addLibToAppImage are the most important ones in this class, but they are too long to show in full, so I’ll only give a brief introduction to them.

_addLibToAppImage checks whether a library is already in Contents/Frameworks. If the library isn’t there yet, it copies it into that directory and tries to fix its references to use relative paths.

def _addLibToAppImage(self, libPath: Path) -> bool:
    # ...

bundleLibraryDependencies checks the dependencies of fileToFix. If there are dependencies with absolute paths, it copies them into Contents/Frameworks by calling _addLibToAppImage, and then calls _updateLibraryReference to update the library references.

def bundleLibraryDependencies(self, fileToFix: Path) -> bool:
    # ...

As its docstring says, fixupAndBundleLibsRecursively removes absolute references and bundles all dependencies for all dylibs.

It traverses the directory and, for each file that is not a symlink, checks whether the name ends with “.so” or “.dylib”, contains “.so.”, or whether the full path contains “.framework” and the file is a macOS binary. If that is the case, it calls bundleLibraryDependencies to bundle it into the .app package.

def fixupAndBundleLibsRecursively(self, subdir: str):
    """Remove absolute references and budle all depedencies for all dylibs under :p subdir"""
    # ...
    for dirpath, dirs, files in os.walk(os.path.join(self.appPath, subdir)):
        for filename in files:
            fullpath = Path(dirpath, filename)
            if fullpath.is_symlink():
                continue  # No need to update symlinks since we will process the target eventually.
            if (filename.endswith(".so")
                    or filename.endswith(".dylib")
                    or ".so." in filename
                    or (f"{fullpath.name}.framework" in str(fullpath) and utils.isBinary(str(fullpath)))):
                if not self.bundleLibraryDependencies(fullpath):
                    CraftCore.log.info("Failed to bundle dependencies for '%s'", os.path.join(dirpath, filename))
                    return False
    # ...

areLibraryDepsOkay checks all the dependencies. If a library is not referenced via @rpath, @executable_path or a system library path, the dependency cannot be satisfied on every Mac; it may happen to work depending on the environment, but it is a big problem for distribution.

def areLibraryDepsOkay(self, fullPath: Path):
    # ...
    for dep in utils.getLibraryDeps(str(fullPath)):
        if dep == libraryId and not os.path.isabs(libraryId):
            continue  # non-absolute library id is fine
        # @rpath and @executable_path is fine
        if dep.startswith("@rpath") or dep.startswith("@executable_path"):
            continue
        # Also allow /System/Library/Frameworks/ and /usr/lib:
        if dep.startswith("/usr/lib/") or dep.startswith("/System/Library/Frameworks/"):
            continue
        if dep.startswith(CraftStandardDirs.craftRoot()):
            CraftCore.log.error("ERROR: %s references absolute library path from craftroot: %s", relativePath,
                                dep)
        elif dep.startswith("/"):
            CraftCore.log.error("ERROR: %s references absolute library path: %s", relativePath, dep)
        else:
            CraftCore.log.error("ERROR: %s has bad dependency: %s", relativePath, dep)
        found_bad_lib = True

In checkLibraryDepsRecursively, we traverse the directory and check all the dependencies of the libraries (.dylib or .so files).

def checkLibraryDepsRecursively(self, subdir: str):
    # ...
    for dirpath, dirs, files in os.walk(os.path.join(self.appPath, subdir)):
        for filename in files:
            fullpath = Path(dirpath, filename)
            if fullpath.is_symlink() and not fullpath.exists():
                CraftCore.log.error("Found broken symlink '%s' (%s)", fullpath,
                                    os.readlink(str(fullpath)))
                foundError = True
                continue

            if filename.endswith(".so") or filename.endswith(".dylib") or ".so." in filename:
                if not self.areLibraryDepsOkay(fullpath):
                    CraftCore.log.error("Found library dependency error in '%s'", fullpath)
                    foundError = True
    # ...
Static methods in class

The _updateLibraryReference method uses the install_name_tool -change command to change a dynamic library reference in a macOS/BSD binary.

@staticmethod
def _updateLibraryReference(fileToFix: Path, oldRef: str, newRef: str = None) -> bool:
    if newRef is None:
        basename = os.path.basename(oldRef)
        newRef = "@executable_path/../Frameworks/" + basename
    with utils.makeWritable(fileToFix):
        if not utils.system(["install_name_tool", "-change", oldRef, newRef, str(fileToFix)], logCommand=False):
            CraftCore.log.error("%s: failed to update library dependency path from '%s' to '%s'",
                                fileToFix, oldRef, newRef)
            return False
    return True

The _getLibraryNameId method uses otool -D to get the identity (install name) of a dynamic library in a macOS/BSD binary.

@staticmethod
def _getLibraryNameId(fileToFix: Path) -> str:
    libraryIdOutput = io.StringIO(
        subprocess.check_output(["otool", "-D", str(fileToFix)]).decode("utf-8").strip())
    lines = libraryIdOutput.readlines()
    if len(lines) == 1:
        return ""
    # Should have exactly one line with the id now
    assert len(lines) == 2, lines
    return lines[1].strip()

The _fixupLibraryId method uses install_name_tool -id to try to fix an absolute identity of a dynamic library in a macOS/BSD binary.

@classmethod
def _fixupLibraryId(cls, fileToFix: Path):
    libraryId = cls._getLibraryNameId(fileToFix)
    if libraryId and os.path.isabs(libraryId):
        CraftCore.log.debug("Fixing library id name for %s", libraryId)
        with utils.makeWritable(fileToFix):
            if not utils.system(["install_name_tool", "-id", os.path.basename(libraryId), str(fileToFix)],
                                logCommand=False):
                CraftCore.log.error("%s: failed to fix absolute library id name for", fileToFix)
                return False
    # ...
Conclusion

This is a magic class which handles almost everything we need on macOS.

But the code style is a little confusing, and the parameter types are not consistent: some methods use str to represent a path, others use Path.

Maybe this can be also improved in the future.

Anyway, it’s really a helpful class.

Improvement

During my bonding period, I found that a library named qca-qt5 was not fixed appropriately, which caused a crash.

Locating the problem

After analyzing the crash log, I found that the library qca-qt5 was loaded twice. Two libraries with the same dynamic library id caused this crash.

qca-qt5 (0) <14AD33D7-196F-32BB-91B6-598FA39EEF20> /Volumes/*/kdeconnect-indicator.app/Contents/Frameworks/qca-qt5
(??? - ???) <14AD33D7-196F-32BB-91B6-598FA39EEF20> /Users/USER/*/qca-qt5.framework/Versions/2.2.0/qca-qt5

One is in the .app package, the other is in CraftRoot/lib.

As far as I know, qca-qt5 searches for its plugins in certain paths. The copy in the package was not fixed, so it searched for plugins in the CraftRoot/lib directory. The plugins there refer to the qca-qt5 in that directory. So two libraries with the same name were loaded, and the application crashed.
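
To confirm such a clash, one can compare the install names of the two copies with otool -D. A rough sketch (the paths below are placeholders, not the exact ones from the crash report):

import subprocess

def libraryId(path: str) -> str:
    # otool -D prints the file name on the first line and the library id on the second
    lines = subprocess.check_output(["otool", "-D", path]).decode("utf-8").splitlines()
    return lines[1].strip() if len(lines) > 1 else ""

bundled = "kdeconnect-indicator.app/Contents/Frameworks/qca-qt5"              # placeholder path
external = "/path/to/CraftRoot/lib/qca-qt5.framework/Versions/2.2.0/qca-qt5"  # placeholder path
print(libraryId(bundled) == libraryId(external))  # True would indicate the id clash described above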

Cause

With a good understanding of MacDylibBundler, we can improve it to fix the bug. And this will be helpful to other applications and libraries in Craft.

I noticed that all the plain .dylib libraries are handled correctly. The problem lies in the libraries inside .framework bundles: it seems that Craft cannot handle the dynamic libraries in a .framework correctly.

And we can see that, in checkLibraryDepsRecursively, only .so and .dylib files are checked. So this bug was deeply hidden.

CRAFT: ➜ MacOS otool -L kdeconnectd
kdeconnectd:
/Volumes/Storage/Inoki/CraftRoot/lib/libkdeconnectcore.1.dylib (compatibility version 1.0.0, current version 1.3.3)
/Volumes/Storage/Inoki/CraftRoot/lib/libKF5KIOWidgets.5.dylib (compatibility version 5.0.0, current version 5.57.0)
/Volumes/Storage/Inoki/CraftRoot/lib/libKF5Notifications.5.dylib (compatibility version 5.0.0, current version 5.57.0)
/Volumes/Storage/Inoki/CraftRoot/lib/qca-qt5.framework/Versions/2.2.0/qca-qt5 (compatibility version 2.0.0, current version 2.2.0)
...

In the _addLibToAppImage method, the library in the framework is copied directly to the Contents/Frameworks. For example, lib/qca-qt5.framework/Versions/2.2.0/qca-qt5 becomes Contents/Frameworks/qca-qt5.

Then, during the fix in the fixupAndBundleLibsRecursively method, according to the following code, it will not be fixed: although it originally lives in a .framework directory and is a binary, after _addLibToAppImage it is no longer inside a .framework directory, so the framework condition no longer matches.

if (filename.endswith(".so")
        or filename.endswith(".dylib")
        or ".so." in filename
        or (f"{fullpath.name}.framework" in str(fullpath) and utils.isBinary(str(fullpath)))):
    if not self.bundleLibraryDependencies(fullpath):
        CraftCore.log.info("Failed to bundle dependencies for '%s'", os.path.join(dirpath, filename))
        return False

Fixing it!

To fix it, I think a good approach is to copy the whole .framework directory and keep its structure.

First I add a check in the _addLibToAppImage method. For example, if qca-qt5 is in the qca-qt5.framework subdirectory, we change libBasename to qca-qt5.framework/Versions/2.2.0/qca-qt5, so targetPath is also computed correctly.

libBasename = libPath.name

# Handle dylib in framework
if f"{libPath.name}.framework" in str(libPath):
    libBasename = str(libPath)[str(libPath).find(f"{libPath.name}.framework"):]

targetPath = Path(self.appPath, "Contents/Frameworks/", libBasename)
if targetPath.exists() and targetPath in self.checkedLibs:
    return True
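
To make the slicing above concrete, here is a tiny standalone example (the CraftRoot path is made up):

from pathlib import Path

libPath = Path("/path/to/CraftRoot/lib/qca-qt5.framework/Versions/2.2.0/qca-qt5")  # example path
# old behaviour: only the file name was kept
oldBasename = libPath.name
# new behaviour: keep everything from '<name>.framework' onwards
newBasename = str(libPath)[str(libPath).find(f"{libPath.name}.framework"):]
print(oldBasename)   # qca-qt5
print(newBasename)   # qca-qt5.framework/Versions/2.2.0/qca-qt5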

After several checks, the important part is copying the library. I add some code to check whether the library is in a .framework directory. If it is, I copy the entire framework directory into Contents/Frameworks. So for qca-qt5, it becomes Contents/Frameworks/qca-qt5.framework/Versions/2.2.0/qca-qt5.

if not targetPath.exists():
    if f"{libPath.name}.framework" in str(libPath):
        # Copy the framework of dylib
        frameworkPath = str(libPath)[:(str(libPath).find(".framework") + len(".framework"))]
        frameworkTargetPath = str(targetPath)[:(str(targetPath).find(".framework") + len(".framework"))]
        utils.copyDir(frameworkPath, frameworkTargetPath, linkOnly=False)
        CraftCore.log.info("Added library dependency '%s' to bundle -> %s", frameworkPath, frameworkTargetPath)
    else:
        utils.copyFile(str(libPath), str(targetPath), linkOnly=False)
        CraftCore.log.info("Added library dependency '%s' to bundle -> %s", libPath, targetPath)

After copying, another important point is in _updateLibraryReference. If a library is in a .framework directory, the new reference should be @executable_path/../Frameworks/*.framework/....

if newRef is None:
    basename = os.path.basename(oldRef)
    if f"{basename}.framework" in oldRef:
        # Update dylib in framework
        newRef = "@executable_path/../Frameworks/" + oldRef[oldRef.find(f"{basename}.framework"):]
    else:
        newRef = "@executable_path/../Frameworks/" + basename

After the fix, the executable can be launched without crashing.

CRAFT: ➜ MacOS otool -L kdeconnectd
kdeconnectd:
@executable_path/../Frameworks/libkdeconnectcore.1.dylib (compatibility version 1.0.0, current version 1.3.3)
@executable_path/../Frameworks/libKF5KIOWidgets.5.dylib (compatibility version 5.0.0, current version 5.57.0)
@executable_path/../Frameworks/libKF5Notifications.5.dylib (compatibility version 5.0.0, current version 5.57.0)
@executable_path/../Frameworks/qca-qt5.framework/Versions/2.2.0/qca-qt5 (compatibility version 2.0.0, current version 2.2.0)
...
CRAFT: ➜ MacOS ./kdeconnectd
kdeconnect.core: KdeConnect daemon starting
kdeconnect.core: onStart
kdeconnect.core: KdeConnect daemon started
kdeconnect.core: Broadcasting identity packet
Conclusion

In software development, there are always cases that we fail to consider. Open Source gives us the possibility of gathering intelligence from people all over the world to handle such cases.

That’s also why I like Open Source so much.

Today is the first day of the coding period; I hope all goes well for the community and all GSoC students :)

KDE Itinerary - Vector Graphic Barcodes

Saturday 13th of July 2019 07:45:00 AM

I have previously written about why we are interested in barcodes for the KItinerary extractor. This time it’s more about the how, specifically how we find and decode vector graphic barcodes in PDF files, something KItinerary wasn’t able to do until very recently.

Raster Graphics

While PDF is a vector graphics format, most barcodes we encounter in there are actually stored as images. Technically this might not be the cleanest or most efficient way, but it makes KItinerary’s life very easy: We just iterate over all images found in the PDF, and feed them into the barcode decoder.

It’s of course a bit more complicated to make this as efficient as possible, but conceptually you could script this with Poppler’s pdfimages command line tool and ZXing with just a few lines of code.
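
To sketch that idea (assuming poppler-utils is installed; the zbar decoder is used here via pyzbar purely as a stand-in for ZXing, and the input file name is made up):

import glob
import subprocess
import tempfile

from PIL import Image
from pyzbar.pyzbar import decode  # any barcode decoder would do here

def barcodesInPdf(pdfPath: str) -> list:
    results = []
    with tempfile.TemporaryDirectory() as tmp:
        # Extract all raster images embedded in the PDF as PNG files.
        subprocess.check_call(["pdfimages", "-png", pdfPath, tmp + "/img"])
        for png in sorted(glob.glob(tmp + "/img*.png")):
            for barcode in decode(Image.open(png)):
                results.append(barcode.data.decode("utf-8", "replace"))
    return results

print(barcodesInPdf("boardingpass.pdf"))  # example input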

Vector Graphics

There are also providers that use vector graphics to represent barcodes in their PDF documents, for example Iberia, easyJet, Ryanair and Aer Lingus, enough to make this a relevant problem for KItinerary. The basic idea would be to render the relevant area of the document into an image and feed that into the barcode decoder. The rendering part is straightforward since Poppler has API for that, but how do we know where to look for a vector graphics barcode?

Answering that required a bit of digging into the PDF files, to understand how the barcodes are actually represented. Lacking a “GammaRay for PDF”, Inkscape turned out to be of great help. Importing PDF files there gives you both a graphical and a “textual” (via the generated SVG) representation of the PDF content. This showed three different variants:

  1. A single complex filled path for the entire barcode.
  2. A set of small filled paths (typically quads), for each line or dot of the barcode.
  3. A set of interrupted line strokes with a sufficiently wide pen, drawing the barcode as “scanlines”.

Case (1) is the easiest one: path fill operations with a solid black brush and hundreds or more path elements within a bounding box of just a few centimeters are very rare for anything else, even more so when filtering out paths with curve elements.
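
As a schematic illustration of that heuristic (not KItinerary code; the element-count and size thresholds are made-up values):

from dataclasses import dataclass

@dataclass
class FillPath:
    fill_is_solid_black: bool
    element_count: int       # number of path elements (moves, lines, ...)
    has_curves: bool
    bbox_width_mm: float
    bbox_height_mm: float

def looks_like_vector_barcode(path: FillPath) -> bool:
    # Solid black fill, no curve elements, hundreds of path elements,
    # and a bounding box of only a few centimeters.
    return (path.fill_is_solid_black
            and not path.has_curves
            and path.element_count >= 200
            and path.bbox_width_mm <= 80
            and path.bbox_height_mm <= 80)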

The other two cases are much harder to detect without properly grouping all the involved drawing operations though. Here again Inkscape helped, as in all cases the barcodes were represented as an SVG group there, and Inkscape’s PDF import code contained the necessary hints on how to replicate that grouping in KItinerary.

So in the end we iterate over groups of path fill and line stroke operations found in the document, check them for being plausible barcodes by looking at brush or pen properties, path complexity, output size, etc., and then render them to a raster image. The last two steps are expensive, so it's important that we discard as many false positives as possible before we get there.

As a result all remaining PDF documents with previously undetected barcodes in my sample collection now work, with minimal extra runtime cost.

Poppler’s Private API

While I’m quite happy with the result, it unfortunately comes at a cost, in form of a much stronger dependency on Poppler’s private API. KItinerary is already using Poppler’s private API for iterating over the images in a document, which makes distributors understandably very unhappy. For this dependency we had a plan on how to address it by adding the necessary features to Poppler’s public API (at the cost of processing the same document twice, once for text and once for images).

The new code however heavily relies on access to the low-level stream of drawing operations, which is a much much larger API surface to expose from Poppler than just iterating over image assets. Seeing that Inkscape has the same problem, maybe that is actually necessary though?

Contribute

This work heavily relies on access to a large variety of sample documents, to make sure we support all relevant cases. So if you encounter an airline boarding pass PDF file that isn’t detected as such with the current master branch or the upcoming 19.08 release, I’d be very interested in that test case :)

Popular licenses in OpenAPI

Friday 12th of July 2019 10:00:00 AM

Today I was wondering what the most commonly used license in OpenAPI specifications is, so I went and did a quick analysis.

Results

The top 5 (with count in brackets):

  1. Apache-2.0 (421) [1]
  2. CC-BY-3.0 (250)
  3. MIT (15)
  4. “This page was built with the Swagger API.” (8)
  5. “Open Government License – British Columbia” (6)

The struck-out entries are the ones that I would not really consider a proper license.

The license names inside quotation marks are the exact copy-paste from the field. The rest are de-duplicated into their SPDX identifiers.

After the top 5, the long tail quickly drops to only one license per listed API. Several of those seem very odd as well.

Methodology

Note: Before you start complaining, I realise this is probably a very sub-optimal solution code-wise, but it worked for me. In my defence, I did open up my copy of the Sed & Awk Pocket Reference before my eyes went all glassy and I hacked up the following ugly method. Also note that the shell scripts are in Fish shell and may not work directly in a 100% POSIX shell.

First, I needed to get a data set to work on. Hat-tip to Mike Ralphson for pointing me to APIs Guru as a good resource. I analysed their APIs-guru/openapi-directory repository [2], where in the APIs folder they keep a big collection of public APIs, most of them following the OpenAPI (previously Swagger) specification.

git clone https://github.com/APIs-guru/openapi-directory.git
cd openapi-directory/APIs

Next I needed to list all the licenses found there. For this I assumed the name: tag in YAML [4] (the one including the name of the license) to be in the very next line after the license: tag [3] – I relied on people writing OpenAPI files in the same order as it is laid out in the OpenAPI Specification. I stored the list of all licenses, sorted alphabetically, in a separate api_licenses file:

grep 'license:' **/openapi.yaml **/swagger.yaml -A 1 --no-filename | \
    grep 'name:' | sort > api_licenses

Then I generated another file called api_licenses_unique that would include only all names of these licenses.

grep 'license:' **/openapi.yaml **/swagger.yaml -A 1 --no-filename | \
    grep 'name:' | sort | uniq > api_licenses_unique

Because I was too lazy to figure out how to do this properly [5], I simply wrapped the same one-liner into a script to go through all the unique license names and count how many times they show up in the full (non-deduplicated) list of all licenses found.

for license in (grep 'license:' **/openapi.yaml **/swagger.yaml -A 1 \
        --no-filename | grep 'name' | sort | uniq)
    grep "$license" api_licenses --count
end

In the end I copied the console output of this last command, opened api_licenses_unique, and pasted said output in the first column (by going into Block Selection Mode in Kate).
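
For comparison, parsing the YAML properly instead of relying on line order would avoid the caveat discussed below; a minimal sketch, assuming PyYAML is installed and the script is run from the APIs directory:

from collections import Counter
from pathlib import Path

import yaml

counts = Counter()
for spec in list(Path(".").rglob("openapi.yaml")) + list(Path(".").rglob("swagger.yaml")):
    with open(spec, encoding="utf-8") as f:
        try:
            doc = yaml.safe_load(f)
        except yaml.YAMLError:
            continue  # skip files that fail to parse
    info = (doc or {}).get("info") or {}
    lic = info.get("license") or {}
    if lic.get("name"):
        counts[lic["name"]] += 1

for name, count in counts.most_common():
    print(count, name)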

Clarification on what I consider “proper license” and re-count of Creative Commons licenses (12 July 2019 update)

I was asked what I considered as a “proper license” above, and specifically why I did not consider “Creative Commons” as such.

First, if the string did not even remotely look like a name of a license, I did not consider that as a proper license. This is the case e.g. with “This page was built with the Swagger API.”.

As for the string “Creative Commons”, it – at best – indicates a family of licenses, which spans a vast spectrum from CC0-1.0 (basically public domain) on one end to CC-BY-NC-CA-4.0 (basically, you may copy this, but not change anything, nor get money out of it, and you must keep the same license) on the other. For reference, on the SPDX license list, you will find 32 Creative Commons licenses. And SPDX lists only the International and Universal versions of them [7].

Admittedly – and this is a caveat in my initial method above – it may be that an actual license follows in the lines after the “Creative Commons” string … or, as it turned out to be true, that the initial 255 count of name: Creative Commons licenses also included valid CC license names such as name: Creative Commons Attribution 3.0.

So, obviously I made a boo-boo, and therefore went and dug deeper ;)

To do so, and after looking at the results a bit more, I noticed that the url: entries of the name: Creative Commons licenses seem to point to actual CC licenses, so I decided to rely on that. Luckily, this turned out to be true.

I broadened the initial search by one extra line, to include the url: line, narrowed the next search down to name: Creative Commons, and in the end filtered only the url: lines:

grep 'license:' **/openapi.yaml **/swagger.yaml -A 2 --no-filename | \
    grep 'name: Creative Commons' -A 1 | grep 'url' | sort > api_licenses_cc

Next, I searched for the most common license – CC-BY-3.0:

grep --count 'creativecommons.org/licenses/by/3.0' api_licenses_cc

The result was 250, so for the remaining 5 [6] I just opened the api_licenses_cc file and counted them manually.

Using this method, the list of all “Creative Commons” licenses turned out to be as follows:

  1. CC-BY-3.0 (250, of which one was specific to Australian jurisdiction)
  2. CC-BY-4.0 (3)
  3. CC-BY-NC-4.0 (1)
  4. CC-BY-NC-ND-2.0 (1)

In this light, I am amending the results above, and removing the bogus “Creative Commons” entry. Apart from removing the bogus entry, it does not change the ranking, nor the counts, of the top 5 licenses.

hook out → not proud of the method, but happy with having results

  1. This should come as no surprise, as Apache-2.0 is used as the official specification’s example

  2. At the time of this writing, that was commit 506133b

  3. I tried it also with 3 lines, and the few extra results that came up were mostly useless.

  4. I did a quick check and the repository seems to include no OpenAPIs in JSON format. 

  5. I expected for license in api_licenses_unique to work, but it did not. 

  6. The result of wc -l api_licenses_cc was 255. 

  7. Prior to version 4.0 of Creative Commons licenses each CC license had several versions localised for specific jurisdictions. 

The new userbase wiki

Friday 12th of July 2019 09:35:35 AM

I’m happy to announce that the userbase wiki is getting a new theme and an updated MediaWiki version.

New theme - Aether

The old userbase theme was called Neverland and looked a bit antiquated. A new theme was created with a similar look to kde.org.

The new theme features light and dark modes using the new prefers-color-scheme: dark CSS media query. The new theme is also mobile friendly.

I think this is quite an improvement over this:

I am confident that Claus_Chr and I found most of the visual glitches, but if you do find a glitch, please report it to me on my talk page.

The new theme is hosted on KDE’s GitLab instance. Contributions are welcome.

New MediaWiki version

We jumped MediaWiki from the obsolete version 1.26 to 1.31, the latest LTS version. This should fix some of the long-standing bugs and allow us to get all security updates with minimal maintenance needs.

What’s next?

A similar update for the community and techbase wikis should be coming soon™. The only things that we still need to work on are an update of the configuration files and some testing to make sure nothing broke during the update. A preview version of the community wiki can already be tested at wikisandbox.kde.org.

Contribute to Userbase

When you find a kool feature in KDE software, you can write a small tutorial or just a short paragraph about it, and the KDE Userbase Wiki is the right place to publish it. You don’t need to know how to code, have perfect English or know how MediaWiki’s formatting works to contribute. We also need translators.

Admire my GIMP skills ;)

Thanks to Blumen Herzenschein and Paul Brown for proofreading this blog post and to Ben Cooksley for pointing me to the right direction.

Discussion: Reddit or Mastodon

Guest post: Coloring book & wall art created with Krita

Friday 12th of July 2019 08:54:05 AM

On July 6th we launched Dream Ripple, an art studio located in Minneapolis, MN. We’d like to share a bit about who we are and how Krita aided us in creating our launch project – Wandering: a coloring book and wall art collection that features 50 hand-drawn illustrations of peculiar line-organisms.

We formed Dream Ripple out of a desire to create artwork with the hope to inspire curiosity in others. For a long time, Joe had been experimenting with an unusual abstract line style for doodles, fun drawings, and cards. After wandering through a craft store together, we got really inspired by how creative and fun the coloring books were and it motivated us to try and create one!

We found Krita online after looking for software focused on drawing, illustration, & painting. After a bit of experimenting, it quickly became apparent that Krita provided the toolset needed for our hand-drawn style. Our process was fairly straightforward: we started with pencil sketches, scanned them into Krita, and used a combination of the Stabilizer Brush and Bezier Curve Tool to create crisp uniform lines while still trying to retain the organic feel of the hand-drawn sketch. We’d then print out the illustrations, mark up design adjustments with a red pen, and revise in Krita over and over until we were happy with it.

For the wall art color variations, we used the Fill Tool to color the areas between the lines. Since we used flat colors, we were able to add an additional 200 color variations to the 50 illustrations fairly quickly.

Also, being free and open source software, Krita allowed us to take time to work without the pressure of a subscription service. That accessibility is something we think is valuable to allow artists to take time to learn their craft without worry of a financial burden.

Here are links to our website, the specific project pages, and one of our wall art stores to see all 50 designs and the 200 color variations:

https://www.dreamripple.com/
https://www.dreamripple.com/wandering-coloring-book/
https://www.dreamripple.com/wandering-wall-art/
https://dream-ripple.pixels.com/art

You can follow us on:

Instagram: @dreamripple
Facebook: @dreamripple
Pinterest: @dreamripple
Twitter: @dreamripple_

Thanks for reading!

Kayla & Joe

Kdenlive 19.04.3 is out

Friday 12th of July 2019 04:00:26 AM

While the team is out for a much-deserved summer break, the last minor release of the post-refactoring series is out with another huge batch of fixes. The highlights include fixing compositing and speed effect regressions, thumbnail display issues of clips in the timeline, and many Windows fixes. With this release we have finished polishing the rough edges, and now we can focus on adding new features while fixing the other small details left. As usual you can get the latest AppImage from our download page.

Speaking of that, the next major release is less than a month away and it already has some cool new features implemented, like changing the speed of a clip with Ctrl + resize, and pressing Shift while hovering over a clip’s thumbnail in the Project Bin to preview it. We’ve also bumped the Qt version to 5.12.4 and updated to the latest MLT. You can grab it from here to test it. Also planned are finishing the 3 point editing workflow and improvements to the speed effect. Stay tuned for more info soon.

Bugfixes:

  • Fix tools cursor when hovering a clip in timeline. Commit.
  • Ensure we don’t put a video stream in audio streams in mp3. Commit.
  • Fix loading .mlt playlist can corrupt project profile. Commit.
  • When opening a project file with missing proxy and clip, don’t remove clips from timeline. Commit.
  • Improve main item when grabbing. Commit.
  • Fix reloading of title clips and others. Commit. Fixes bug #409569
  • Update Appdata for 19.04.3 release. Commit.
  • Fix opening of project files with special character. Commit. Fixes bug #409545
  • Fix reloading playlist doesn’t update out. Commit.
  • Don’t leak Mlt repository on first run (attempt to fix Windows fail on first run). Commit.
  • Warn and try fixing clips that are in timeline but not in bin. Commit.
  • Fix timeline tracks config button only showing menu when clicking its arrow. Commit.
  • Fix lambda not called regression. Commit.
  • Don’t hardcode width of clip/composition resize handles. Commit.
  • Fix missing luma error on project opening with AppImage. Commit.
  • Fix reloading clip doesn’t update duration. Commit.
  • Fix overwrite/insert drop leaving audio on wrong track. Commit.
  • Fix error in mirror track calculation. Commit.
  • Fix overwrite clip with speed change. Commit.
  • Fix keyframe corruption on project opening (was creating unexpected keyframe at 0). Commit.
  • Fix keyframes corruption on dragging effect onto another clip. Commit.
  • Fix composition cannot be added after deletion / if another composition is placed just after current pos. Commit.
  • Fix fades broken on speed change. Commit. Fixes bug #409159
  • Fix speed job overwrites without warning. Commit.
  • Fix incorrect crash message on rendering finished. Commit.
  • Fix timeline preview when fps != 25. Commit.
  • Fix tests. Commit.
  • Effectstack: don’t display keyframes that are outside of clip. Commit.
  • Cleanup in clip/composition resize UI update. Commit.
  • Fix thread/cache count causing concurrency crashes. Commit.
  • Don’t trigger unnecessary refresh on clip resize. Commit.
  • Fix crash deleting last track. Commit.
  • Fix duplicate clip with speed change on comma locales. Commit.
  • Don’t allow undo/redo while dragging a clip in timeline. Commit.
  • Fix crash on cutting group with a composition. Commit.
  • Fix crash on group cut. Fixes #256. Commit.
  • Fix playlist duration in bin. Commit.
  • Fix crash loading playlist with different fps. Commit.
  • Fix thumbs not displayed in all thumbs view. Commit. See bug #408556
  • Ensure no empty space between thumbs on all thumbs view in timeline. Commit.
  • Some cleanup in audio thumbs. Fix recent regression and bug where audio thumbs were not displayed after extending a clip in timeline. Commit.
  • I18n fixes. Commit.
  • Use i18n for QML. Commit.
  • Fix monitor image hidden after style change. Commit.
  • Fix resize failure leaving clip at wrong size. Commit.
  • Fix XML translation for Generators. Commit.
  • Fix some effects default params on locales with comma. Commit.
  • Fix crash after undo composition deletion. Commit.
  • Fix i18n for QML. Commit.
  • Fix various selection regressions. Commit.
  • Don’t export metadata as url encoded strings. Commit. Fixes bug #408461
  • Fix crash on project close, see #236. Commit.
  • Fix zone rendering with updated MLT. Commit.
  • After undoing deletion, item should not show up as selected. Commit.
  • Fix disable clip broken regression. Commit.
  • Move zoom options to Timeline, remove Duplicate View. Commit.
  • Fix crash on item deletion. Fixes #235. Commit.
  • Fix fade out moving 1 frame right on mouse release. Commit.
  • Major speedup in clip selection that caused several seconds lag on large projects. Commit.
  • Fix changing composition track does not replug it. Commit.
  • Update appdata version(late again sorry). Commit.
  • Fix freeze when moving clip introduced in previous commit. Commit.
  • Fix typo that may prevent display of transcode menu. Commit.
  • Don’t check duration each time a clip is inserted on project load,. Commit.
  • Show progress when loading a document. Commit.
  • Make it possible to assign shortcut to multitrack view. Commit.
  • Allow resizing item start/end on clip in current track if no item is selected. Commit.
  • Fix profile change not applied if user doesn’t want to save current project. Commit. Fixes bug #408372
  • Fix crash on changing project’s fps. Commit. Fixes bug #408373
  • Add .kdenlive project files to the list of allowed clips in a project. Commit. Fixes bug #408299
  • Correctly save and restore rendering properties for the project. Commit.
  • Workaround MLT consumer scaling issue #453 by using multi consumer. Commit. See bug #407678
  • Fix groups keeping keyboard grab state on unselect,. Commit.
  • Fix the remaining compositing issues reported by Harald (mimick the 18.x behavior). Commit.
  • Don’t warn about missing timeline preview chunks on project opening. Commit.
  • Fix forced track composition should indicate state in timeline (yellow background + track name). Commit.
  • Save track compositing mode in project to restore it on load. Commit. Fixes bug #408081

`make -j5 kritaflake`

Thursday 11th of July 2019 02:57:41 AM

At the end of June I finished copy-on-write vector layers. From the very beginning, I have been researching possibilities to make kritaflake implicitly sharable. In that post I mentioned the approach Sean Parent uses for Photoshop, and adapted it for the derived d-pointers in Flake.

Derived d-pointers

TL;DR: We got rid of it.

As I mentioned in the task page, the derived d-pointers originally used in Flake are a barrier to implicit sharing. One of the reasons is that we need to write more code (either a KisSharedDescendent wrapper class, or repeated code for virtual clone functions). Also, derived d-pointers do not actually encapsulate the data in the parent classes – for example, the members in KoShapePrivate are all accessible by descendants of KoShape, say, KoShapeContainer. That is probably not how encapsulation should work. So in the end we decided to get rid of derived d-pointers in Flake.

This leads to one problem, however, in the class KoShapeGroup. KoShapeGroup is a descendent of KoShapeContainer, which owns a KoShapeContainerModel that can be subclassed to control the behaviour when a child is added to or removed from the container. KoShapeGroup uses ShapeGroupContainerModel which performs additional operations specific to KoShapeGroup.

After I merged my branch into master, it was reported that the Flake tests failed under the address sanitizer (ASan). I took a look and discovered that there was a use-after-free in the class KoShapeGroup, namely a use of its d-pointer. The use happens in the destructor of KoShapeContainer, which calls KoShapeContainerModel::deleteOwnedShapes(), which removes individual shapes from the container, which then calls KoShapeGroup::invalidateSizeCache(). The original situation was:

  1. destructor of KoShapeGroup was called;
  2. members defined in KoShapeGroup got deleted (nothing, because everything is in the derived d-pointer which is defined in KoShape);
  3. destructor of KoShapeContainer was called, which calls d->model->deleteOwnedShapes();
  4. then that of KoShape, which deletes all the private members.

But after the derived d-pointers are converted to normal ones, the calling sequence upon destruction becomes:

  1. destructor of KoShapeGroup was called;
  2. members defined in KoShapeGroup got deleted (its own d-pointer);
  3. destructor of KoShapeContainer was called, which calls d->model->deleteOwnedShapes();
  4. d->model is a ShapeGroupContainerModel, which will call KoShapeGroup::invalidateSizeCache();
  5. that last function accesses the d-pointer of KoShapeGroup, USE AFTER FREE.

In order to solve this problem we have to manually call model()->deleteOwnedShapes() in the destructor of KoShapeGroup, at which time the d-pointer is still accessible.

q-pointers

TL;DR: We also got rid of it.

q-pointers are a method used in Qt to hide private methods from the header files, in order to improve binary compatibility. q-pointers are stored in the *Private classes (the ds), indicating the object that owns this private instance. But this, of course, conflicts with the principle of “sharing”, because the situation now is that multiple objects can own the same data. The q-pointers in Flake are rather confusing under such circumstances, since the private data cannot know which object is the caller.

To avoid this confusion, there are multiple ways:

  1. to move all the functions regarding q-pointers to the public classes;
  2. to pass the q-pointer every time when calling those functions in private classes; or
  3. to add another layer of “shared data” in the d-pointer and keep the q-pointers in the unshared part.
implicit sharing

To enable implicit sharing for the KoShape hierarchy, the only thing left to be done is to change the QScopedPointer<Private> d; in the header file to QSharedDataPointer<Private> d; and make the private classes inherit QSharedData. This step is rather easy; then just run the tests to make sure it does not break anything. Hooray!

It is coming alive

Thursday 11th of July 2019 12:55:42 AM

After digging for around a month and a half, I can finally do some selections with the Magnetic Lasso tool, which I wrote with utter laziness as I would say.

New unit tests for the new code

Wednesday 10th of July 2019 08:43:53 PM
Hello everyone,

today I want to present the test system for Cantor's worksheet.
The worksheet is the most central, prominent and important part of the application where the most work is done.

So, it is important to cover this part with enough tests to ensure the quality and stability of this component in future.

At the moment, this system contains only ten tests and all of them cover only the functionality for the import of Jupyter notebooks that was added recently to Cantor (I mentioned this in my first post).
However, this test infrastructure is of a generic nature and can easily be used for testing Cantor's own file format, too.

The test system checks that a worksheet/notebook file is loaded successfully, tests the backend type and validates the overall worksheet structure and the content of its entries.

Some content, for example image content, is deliberately not validated. Validating it would increase the complexity of the tests and slow down their execution without adding much value for quality assurance.
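To give an idea of what such a load-and-validate test can look like, here is a hypothetical sketch using QtTest. The Worksheet struct and loadWorksheet() helper below are invented stand-ins for whatever Cantor's real test harness provides; only the QtTest macros are the real API:

    #include <QtTest>
    #include <QScopedPointer>
    #include <QString>
    #include <QStringList>

    struct Worksheet {                       // stand-in for the real worksheet API
        QString backendName;
        QStringList entryTexts;
    };

    static Worksheet *loadWorksheet(const QString &path)
    {
        // The real tests would parse the .ipynb file here; this stub only
        // exists so the sketch is self-contained.
        Q_UNUSED(path)
        return new Worksheet{QStringLiteral("python"),
                             {QStringLiteral("2+2"), QStringLiteral("4")}};
    }

    class ImportNotebookTest : public QObject
    {
        Q_OBJECT
    private Q_SLOTS:
        void testLoadPythonNotebook()
        {
            // 1. Loading the file must succeed at all.
            QScopedPointer<Worksheet> ws(loadWorksheet(QStringLiteral("data/example.ipynb")));
            QVERIFY(!ws.isNull());

            // 2. The right backend must have been detected.
            QCOMPARE(ws->backendName, QStringLiteral("python"));

            // 3. The overall structure: number of entries ...
            QCOMPARE(ws->entryTexts.size(), 2);

            // 4. ... and spot-checks of entry content (image data is skipped).
            QCOMPARE(ws->entryTexts.first(), QStringLiteral("2+2"));
        }
    };

    QTEST_GUILESS_MAIN(ImportNotebookTest)
    #include "importnotebooktest.moc"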

This new infrastructure has already proven helpful. When writing the first tests for the worksheet I found a couple of bugs in the implementation of the Jupyter notebook import. Having fixed them, and now having these additional safeguards in place, I'm more confident about the implementation and can say with more certainty that the import of Jupyter notebooks works fine.

In a previous post I mentioned some issues with the performance of the renderer used for mathematical expressions in Cantor. It turned out this problem is not as easy to solve as I first assumed. But now, having finished a substantial part of the work planned for this GSoC project, I can give more attention to the remaining problems, including the performance of the renderer.
In the next post I plan to show a better implementation of the math renderer in Cantor.

KMyMoney 5.0.5 released

Wednesday 10th of July 2019 01:32:32 PM

The KMyMoney development team today announces the immediate availability of version 5.0.5 of its open source Personal Finance Manager.

After three months it is now ready: KMyMoney 5.0.5 comes with some important bugfixes. As usual, problems have been reported by our users and the development team worked hard to fix them in the meantime. The result of this effort is the brand new KMyMoney 5.0.5 release.

Despite even more testing we understand that some bugs may have slipped past our best efforts. If you find one of them, please forgive us, and be sure to report it, either to the mailing list or on bugs.kde.org.

From here, we will continue to fix reported bugs and work to add many requested additions and enhancements, as well as further improve performance.

Please feel free to visit our overview page of the CI builds at https://kmymoney.org/build.php and maybe try out the latest and greatest by using a daily crafted AppImage built from the stable branch.

The details

Here is the list of the bugs which have been fixed. A list of all changes between v5.0.4 and v5.0.5 can be found in the ChangeLog.

  • 368159 Report Transactions by Payee omits transactions lacking category
  • 390681 OFX import and unrecognized <FITID> tag
  • 392305 Not all Asset accounts are shown during OFX import
  • 396225 When importing a ofx/qif file, it does not show me all my accounts
  • 396978 Stable xml file output
  • 400761 Cannot open files on MacOS
  • 401397 kmymoney changes group permissions
  • 403745 in import dialog, newly-created account doesn’t appear in pulldown menu
  • 403825 Transaction validity filter is reset when re-opening configuration
  • 403826 Transactions without category assignment are not shown in report
  • 403885 Buying / selling investments interest / fees round to 2 decimal places even when currency is to 6 decimal places
  • 403886 No way to set/change investment start date in investment wizard
  • 403955 After an action, the cursor returns to top of page and does not remain in a similar position to when action was started
  • 404156 Can’t select many columns as memo
  • 404848 Crash on “Enter Next Transcation”
  • 405061 No chart printing support
  • 405329 CPU loop reconciling if all transactions are cleared
  • 405817 CSV importer trailing lines are treated as absolute lines
  • 405828 Budget problems
  • 405928 Loss of inserted data in transaction planner
  • 406073 Change of forecast method is not reflected in forecast view
  • 406074 Unused setting “Forecast (history)” for home view
  • 406220 Crash when deleting more than 5000 transactions at once
  • 406509 “Find Transaction…” dialog focus is on “Help” button instead of “Find”
  • 406525 Subtotals are not correctly aggregated when (sub-)categories have the same name
  • 406537 Encrypted file cannot be saved as unencrypted
  • 406608 Custom report based on Annual Budget incorrectly getting Actuals
  • 406714 Home view shows budget header twice

Here is the list of the enhancements which have been added:

  • 341589 Cannot assign tag to a split

Beware of some of the Qt 5.13 deprecation porting hints

Tuesday 9th of July 2019 11:18:17 PM

QComboBox::currentIndexChanged(QString) used to carry (as of Qt 5.13.0) a deprecation warning that said "Use currentTextChanged() instead".

That has recently been reverted, since the two are not totally equivalent. Sure, you can probably "port" from one to the other, but to me the "use" wording suggests "this is the same", and they are not.
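To illustrate why a blind swap can change behavior, here is a small sketch connecting to both signals. The two do not fire on exactly the same set of events (the details depend, among other things, on whether the combo box is editable), which is exactly the point of the warning above:

    #include <QComboBox>
    #include <QDebug>
    #include <QObject>

    void connectBoth(QComboBox *box)
    {
        // Old signal: emitted when the current *index* changes; the argument
        // is the text of the newly current item.
        QObject::connect(box,
                         QOverload<const QString &>::of(&QComboBox::currentIndexChanged),
                         box, [](const QString &text) {
            qDebug() << "currentIndexChanged:" << text;
        });

        // Suggested replacement: emitted when the current *text* changes,
        // which is not always the same event.
        QObject::connect(box, &QComboBox::currentTextChanged,
                         box, [](const QString &text) {
            qDebug() << "currentTextChanged:" << text;
        });
    }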

Another one of those is QPainter::initFrom, which initializes a painter's pen, background and font to the same values as the given widget's. It is deprecated, probably rightly so ("what is the pen of a widget?"), but the deprecation warning says "Use begin(QPaintDevice*)", and again, if you look at the implementation, the two don't really do the same thing. I still need to find time to complain to the Qt developers and get that fixed.

Anyhow, as usual: when porting, make sure you do a correct port and don't just make blind changes.

Usability & Productivity Sprint 2019

Tuesday 9th of July 2019 01:09:30 PM

In June 2019 I went to the Usability & Productivity Goal Sprint in the beautiful city of Valencia! As I’m a relatively new KDE contributor, this was my very first sprint experience, and it was awesome. At the same time the Plasma Sprint took place, and it felt more like one big sprint than two separate events. We were kindly hosted by Slimbook, which also organized a bus that took us to their office in the morning and back to the hotel in the evening. A big thank you to them!

In the first part of the sprint I mainly worked on continuing to improve Spectacle. You don’t know Spectacle? It is our screenshotting application with many settings: for example, to control what should be captured, whether to include the mouse cursor, or to set a delay between pressing the button and the screenshot actually being taken.

Showing the remaining time in the task manager

The first feature I worked on is based on a cool idea by Felix Ernst. Spectacle now shows the time remaining until a screenshot is taken in the task manager, just like when you copy a file in Dolphin or download a file. You no longer have to wonder how much time is left to arrange everything for your screenshot, or whether Spectacle is still running or has crashed somewhere in the background (even if that does not happen very often).

Configuring shortcuts inside Spectacle

Together with David Edmundson, I finished porting Spectacle’s shortcuts configuration from the KHotkeys infrastructure to KGlobalAccel. This means the shortcuts are no longer located in the Custom Shortcuts part of the System Settings and duplicated under KDE Daemon in Global Shortcuts, but reside in their own Spectacle category in the global shortcut settings. Even more importantly, we can now show a configuration dialog for the shortcuts inside Spectacle! You don’t have to fear that your carefully assigned hotkeys will be reset to the defaults during this transition - they are carefully migrated to the new system. The migration program is longer than the actual required code changes and runs automatically when you receive the update, thanks to kconf_update.
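For context, registering a shortcut directly with KGlobalAccel looks roughly like the sketch below. The action name and key are made up, and this is not Spectacle’s actual code:

    #include <KGlobalAccel>
    #include <QAction>
    #include <QKeySequence>
    #include <QList>
    #include <QObject>

    void registerCaptureShortcut(QObject *parent)
    {
        // The action needs a stable objectName; KGlobalAccel uses it as the key
        // under which the shortcut is stored and later restored.
        auto *action = new QAction(QObject::tr("Capture Rectangular Region"), parent);
        action->setObjectName(QStringLiteral("RectangularRegionScreenShot"));

        // Registers both the default and the active shortcut with the global
        // shortcut service; it then shows up in the global shortcut settings.
        KGlobalAccel::setGlobalShortcut(action, QList<QKeySequence>{QKeySequence(Qt::Key_Print)});
    }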

Continuing to work with David, we investigated why persistently copying screenshots to the clipboard didn’t work - after all, it was supposed to work after recent changes in Spectacle and Klipper. It turned out there was a bug in Klipper, but you can now paste screenshots into your favorite image editor or chat program after closing Spectacle - even if your clipboard history is set to ignore images (of course, they will not appear in the history in that case)!

I spent the last days of the sprint adding the option to show your wallpaper slideshow in a particular order in addition to the current random sequence. For now I implemented sorting alphabetically and by the time the pictures were modified, but extending it with other orderings is straightforward. In doing so I simplified the code a bit (it hadn’t been touched in a long time) and reduced some duplication by using the same model to show the images in the configuration dialog and in the actual slideshow (before, this was done in two different code paths). I didn’t quite finish it during the sprint, but you can have a sneak peek at it over at Phabricator.

Another bigger change I started is porting Spectacle from its hand-rolled configuration-managing class to a KConfig XT based approach. You write an XML file and it generates the code that manages the settings and their defaults for you (notice that currently Spectacle only has OK and Cancel buttons, but no Defaults or Apply button). The main settings already work, but I still need to wire some things up, like the new shortcuts settings.
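To illustrate the approach, this is roughly how application code uses a class generated by kconfig_compiler from the XML file. The Settings class name and the CaptureDelay option are hypothetical, not Spectacle’s real settings:

    #include "settings.h"   // header generated by kconfig_compiler (hypothetical name)

    void rememberDelay(int seconds)
    {
        Settings *cfg = Settings::self();   // generated singleton accessor
        cfg->setCaptureDelay(seconds);      // setter generated from the <entry> in the XML
        cfg->save();                        // persists the values to the config file
    }

    int currentDelay()
    {
        // Getter falls back to the default declared in the XML file.
        return Settings::self()->captureDelay();
    }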

Aside from that, I also investigated and fixed some bugs, as always. For example, the action buttons now fit into their respective list elements inside the virtual desktop settings, and you can no longer get trapped inside Spectacle’s region selection.

However, the great thing about a sprint is that it is not all hacking: you can discuss bigger changes and directions for the future, share ideas and brainstorm together in person. We had a big discussion about the discoverability of widget settings and agreed on having a global edit mode where everything on your desktop will be configurable. Other points of discussion were the right-click menu of the plasmoids in your panel, which can be confusing when it includes multiple very similar entries, the behavior of the task manager, and the multi-screen configuration. One last thing we talked about is the future of the usability goal. We have awesome news to share, so stay tuned and keep an eye on Nate Graham’s blog as always.

But a sprint is not only about working with others; it is also about meeting the people whose names you know and with whom you may have only interacted textually, talking to them and getting to know each other. And I had a great time, from the first day hacking together in a single hotel room with slow Wi-Fi, to the last day when the four of us with the latest flights walked to the beach. So thank you to KDE e.V. for making this possible for me, to Slimbook again, to Aleix Pol who organized it, and lastly to all the nice people who attended and made this a great experience. See you soon.

Plasma Sprint 2019 in Valencia

Tuesday 9th of July 2019 12:00:30 PM

Last month the Plasma team met in Spain for their annual developer sprint. It was kindly hosted by Slimbook in their offices on the outskirts of Valencia. This time it was co-located with the Usability sprint and it was great to meet so many new faces there.

View from the rooftop bar at the hotel

Continued improvements in notifications

On 11 June we released Plasma 5.16 with a completely redesigned notification center. In the weeks since, I have received numerous suggestions on how to improve the system even further, and I started working on them. Since this technology is relatively new, there is also a lot of activity and many changes being made to the “stable” branch of the notification code, i.e. the one feeding subsequent 5.16 bugfix releases. Let’s talk about some of those changes:

  • Fixed job progress reporting when using Latte Dock: in the old system there was a dedicated process, kuiserver, managing job tracking (the info popup when you copy a file) and forwarding the information to every interested party. However, apart from plasmashell nobody else was making use of it, and it seemed like a huge waste to duplicate the DBus traffic that happens while copying a file. Or so I thought :) I killed kuiserver and moved its logic into plasmashell. However, since Latte Dock also uses Task Manager progress reporting, it became random who claimed the service on login. This is now resolved using some clever DBus magic. If you still have issues with progress reporting in conjunction with Latte Dock with the latest updates installed, please let me know!
  • More reliable popup placement: the popup should no longer fly all over the place on Wayland.
  • Ignore duplicates: When an application sends the same notification multiple times in quick succession, the additional requests are ignored.
  • Improved notifications for app bundles: the notification center identifies applications based on the desktop-entry hint they send. However, in the case of bundled apps, such as Flatpak and Snap, the desktop file in the bundle might be different from the one the application was originally built with. To address this, I now also take into account the X-Flatpak-RenamedFrom key and the BAMF_DESKTOP_FILE_HINT environment variable when trying to identify an application (a rough sketch of the idea follows this list).
  • No unidentified apps in history: another side-effect of a failure to identify an application is that we cannot relate any user preferences to it. This means that you cannot prevent an application from flooding your history. Since 5.16.2 unidentified applications no longer show up in history. This is admittedly a stark behavior change for a stable release and I do apologize for breaking someone’s workflow. However, I needed a quick mitigation for the spam problem and will consider making it an option in Plasma 5.17.
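As a rough sketch of the idea behind the Flatpak fallback (not the actual plasmashell code), checking the X-Flatpak-RenamedFrom key could look something like this:

    #include <KConfigGroup>
    #include <KDesktopFile>
    #include <QString>
    #include <QStringList>

    bool matchesNotificationHint(const QString &desktopFilePath, const QString &desktopEntryHint)
    {
        const QString expected = desktopEntryHint + QStringLiteral(".desktop");

        // Direct match on the installed desktop file name.
        if (desktopFilePath.endsWith(expected))
            return true;

        // Flatpak may have renamed the desktop file at export time; the original
        // name(s) are typically preserved in the X-Flatpak-RenamedFrom key.
        KDesktopFile df(desktopFilePath);
        const QStringList renamedFrom =
            df.desktopGroup().readEntry("X-Flatpak-RenamedFrom", QStringList());
        return renamedFrom.contains(expected);
    }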

Do not disturb while screens are mirrored

Speaking of Plasma 5.17, I worked on additional notification features for the October Plasma feature release: since we had a projector in the meeting room and I was tasked with running through the agenda in the mornings, I realized that having a way to automatically enter do-not-disturb mode when mirroring screens could be useful. Since I had already used our KScreen library in PowerDevil (which by default will not suspend your laptop when you close the lid with an external monitor connected), a patchset was quickly created.

Quick reply. Application gets to choose placeholder text and submit button appearance

For a long time I have been craving a quick reply feature where you get a text field inside the notification. In fact, this has been on the notification master plan since 2016. While implementing the feature itself was relatively straightforward, keyboard focus is an issue still to be resolved, especially on Wayland: notification windows never get focus, so they can’t steal it away from other applications. However, conditionally granting focus in this particular case is a lot harder than it sounds.

Shaping up the next Plasma Browser Integration release

Plasma Browser Integration is one of the projects I’m most proud of. In case you didn’t know, there’s a browser extension for Firefox and Chromium-based browsers that bridges the gap between browser and desktop. It lets you share links, find browser tabs in KRunner, and control music and video playback anytime from Plasma, or even your phone using KDE Connect!

A crossed icon indicates it’s not running and the popup gives some advice about it

For the next feature release I first of all worked on better error handling. Right now, when the bridge application acts up or isn’t installed, a popup is shown. This is especially annoying when you have the extension synced across devices to computers that may not be able to run it. I now make use of a so-called browser action to place an icon in the toolbar that indicates status.

WebShare API in action

Furthermore, I added support for the Web Share API, so websites can trigger a share dialog from Purpose, our content sharing framework used throughout our applications. This feature also got added to the context menu, so you can send links not only to your phone via KDE Connect but to any registered application. What I’d love to see is a Purpose plug-in for KDE Itinerary so I could store boarding passes directly from airline booking pages. :)

Automatic dark mode, maybe?

I also toyed around with Media Queries Level 5 to support “dark mode” CSS media queries. While I managed to have it query the current system color scheme to determine dark or light mode, the media queries are currently applied by tampering with the website CSS and installing new rules with the media query unset. This seems to work well, but it is not something I feel very confident about shipping. Let’s hope this feature request for letting extensions enforce a color scheme goes somewhere, or maybe they could just start reading the gtk-application-prefer-dark-theme setting in the future.

Finally, the plan is to enable enhanced media controls by default now that I made it less invasive and more resilient. With this you’ll get more detailed track information, album covers, and more playback controls for websites using the Media Session API. Luckily, more and more websites are starting to make use of that API.

Please do me a favor and enable “Enhanced Media Controls” in the extension settings right now and report any websites that might misbehave, so we can fix that!

GammaRay 2.11.0 Release

Tuesday 9th of July 2019 08:40:32 AM

We have released version 2.11.0 of our Qt application monitoring tool GammaRay. GammaRay allows you to observe behavior and data structures of Qt code inside your program live at runtime.

GammaRay 2.11 comes with a new inspection tool for Qt’s event handling, providing even more insight into the inner workings of your application. Besides showing the events and their properties as they occur, the event monitor visualizes event propagation as it happens for Qt Quick or Qt Widgets input handling.

GammaRay event monitor log view

Additionally, the event monitor provides statistics on how often each type of event occurred, as well as fine-grained filtering options to find the events of interest to you even in a huge dataset.

GammaRay event type view

Another major new feature is the network operation inspector. This allows you to observe the HTTP operations triggered via QNetworkAccessManager and helps to optimize network interactions, identify leaked QNetworkReply objects and ensure that all operations are encrypted.
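As a reminder of what a “leaked QNetworkReply” means in practice: Qt never deletes replies for you, so the usual pattern is to call deleteLater() once the reply has finished. A minimal, self-contained example of the correct pattern:

    #include <QCoreApplication>
    #include <QDebug>
    #include <QNetworkAccessManager>
    #include <QNetworkReply>
    #include <QNetworkRequest>
    #include <QUrl>

    int main(int argc, char *argv[])
    {
        QCoreApplication app(argc, argv);
        QNetworkAccessManager nam;

        QNetworkReply *reply = nam.get(QNetworkRequest(QUrl(QStringLiteral("https://kde.org"))));
        QObject::connect(reply, &QNetworkReply::finished, [reply, &app]() {
            qDebug() << "finished:" << reply->errorString();
            reply->deleteLater();   // forgetting this line is the classic leak
            app.quit();
        });

        return app.exec();
    }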

GammaRay network operation inspector

In addition, GammaRay 2.11 gained support for more data types (such as the QJson* classes), a new thread affinity checker for the problem reporter, and of course compatibility with the just-released Qt 5.13. Behind the scenes we also did some work on performance, improving responsiveness on large and/or busy inspected applications.

GammaRay 2.11 is available as part of the just-released Qt Automotive Suite 5.13, including Qt Creator integration and professional support, or GPL-licensed on GitHub.

About KDAB

KDAB is a consulting company offering a wide variety of expert services in Qt, C++ and 3D/OpenGL, and providing training courses in these areas.

KDAB believes that it is critical for our business to contribute to the Qt framework and C++ thinking, to keep pushing these technologies forward to ensure they remain competitive.

The post GammaRay 2.11.0 Release appeared first on KDAB.

OpenExpo 2019 After Dark

Monday 8th of July 2019 07:22:26 PM

OpenExpo is an event aimed at businesses and the public sector. Top topics usually revolve around cloud computing, big and open data, IoT, and as of late, blockchain technologies. 2019 was its sixth edition, held on the 20th of June in “La Nave” on the outskirts of Madrid.

Organisers tell us that 2800 visitors attended this year’s event. There were about 120 speakers and 70 exhibitors with booths. From what we could garner, most visitors were representatives of public institutions, consulting companies, and software development companies, especially from the field of cloud computing.

The Booth

KDE’s booth was right next to the entrance, on the right as you went in, in an area called the “Innovation & Community Village”. We were one of five exhibitors in the area. On our right was the FSFE; I happened to know one of the people staffing their booth, which was nice.

KDE’s booth at OpenExpo 2019.

Behind us was a father-and-son outfit showing 3D printers. Apart from owning a shop, they apparently run courses in their neighbourhood, and that is what earned them a spot in the “Community Village”.

Then there were some people with a DIY go-kart/scooter/tricycle thingy (?). They opened a big, colourful box full of interesting-looking pieces, didn’t do anything with them, and then left.

Finally, on the other side of our table was a company/community that virtualised desktops in the browser. Interesting stuff.

There were six tables and it was first come, first served. I was first, so I picked a front-facing table. Each table was 180 by 80 cm, which is big compared to what we often get in other events, and gave us plenty of space to set up our things. There was a space for our banner in a corner, as you can see in the photograph. We added a screen on a stand behind us that ran videos showcasing Plasma, Plasma Mobile (PlaMo), Kirigami and Applications on a loop. You can see the screen in the background of the photo.

On the table, we laid out the following items:

From left to right: a Nexus 5X phone running Plasma Mobile, a Raspberry Pi with a touch screen also running Plasma Mobile, the Pinebook 14” $99 netbook, and a KDE Slimbook II.

We also had 100 stickers: 50 stickers of Katie using a phone and with the Plasma Mobile URL, and 50 Konqi stickers with KDE.org URL. The Konqi ones ran out first.

Katie and Konqi stickers.

The aim of our table spread was three-fold. First, we wanted to show people “shopping” for software that Plasma and other KDE applications are “end-user ready”. Secondly, we intended to show how Plasma is light and can work on a wide variety of devices, including devices usually used in setups where embedded electronics are required (the Raspberry Pi); low-powered, ARM-based netbooks (the Pinebook); and a potential mobile environment (the Nexus 5X). Finally, we wanted to demonstrate how applications, thanks to Kirigami, can adapt to different hardware and screen configurations.

The overarching aim was to see if we could convince administrators of large deployments (for example, schools) that Plasma and KDE Applications would be a good choice for their users. We were also seeking contributors and sponsors for KDE, and looking to convince companies that KDE has good solutions for developing graphical applications.

What I did

To attract and engage visitors, I used several tactics I had used in the past, and that seem to work well. I stood outside the booth and approached visitors that showed interest in our spread.

I found out where the visitors were coming from and adapted my spiel to that. I demoed Plasma on laptops for administrators of large deployments, showing off features and pointing out how it was fast and snappy even on low-spec hardware.

I showed the proof-of-concepts of Plasma Mobile on Yocto (Raspberry Pi) and on postmarketOS (Nexus 5) to managers of companies that developed for several platforms. They could check for themselves how Kirigami could let them create cross-platform applications, including for Android (I had my own phone on hand for this), and how it would allow them to create applications that would adapt to different sizes of screens.

At the end of each demonstration, I encouraged visitors to scan the QR codes so they could leave with more information to research for themselves.

The thing that most attracted the visitors’ attention was the Pinebook – when they read it cost 99 USD. That sparked interest in the underlying hardware, and in what software would run on an underpowered device. A lot of people also picked up the SBC for some reason. The Pine64 I had brought along was only there to show what kind of hardware was in the Pinebook, but it seems that… er… naked electronics are inherently fascinating to visitors at these kinds of events.

After the Pinebook, the most popular devices were the phone and the Raspberry Pi with its touchscreen. A lot of visitors asked if the phone was already for sale, thinking that a pure GNU + Linux phone was already a thing and they had somehow missed it. Even though I had to burst their bubble, they were satisfied that at least some progress was going on, both in the realms of mobile phones and vehicle infotainment systems.

Visitors

The scanning application provided by the organisers of the event was very useful, and I scanned 54 people in total, but, of course, I talked to more than that. By my calculations, about 50% more, which puts the number of people I interacted with one-on-one at between 75 and 80. Four or five times while I was delivering my spiel, a small crowd of 5 to 10 people congregated around me, so a conservative total number of people I talked to would be around 100.

Many of them were system administrators specialised in cloud computing, one of the main topics of the event. Others managed large networks of end-user machines for schools, libraries and other public institutions. There were also plenty of CEOs, CTOs and other C*Os, both attending for the talks and “shopping” for new open source development software. They are the people who found things like Kirigami interesting.

There were Linux desktop end-users in the mix, too. Many of them did not use Plasma (a few did), and they were under the impression that Plasma was heavy. The Pinebook disproved that, but this (that KDE software is bloated) is something we have seen before, and we clearly must continue to work towards dispelling this notion.

I tried to make sure that visitors to the booth walked away with something to remember us by: stickers with KDE.org URLs on them, until they ran out; my card, in case they needed more information; or, at the very least, links to more information in the browsers on their phones, as I encouraged people to scan the QR codes associated with each item on the table.

Mission(s) Accomplished?

One of the things I set out to do was to generate some publicity for KDE in the mainstream media, since it was announced that journalists from some big Spanish newspapers, radio and TV stations would be there. Unfortunately, I did not see any of them.

However, I was not disappointed with the day, since we achieved other things on the list. We made contacts within several Madrilenian institutions, like the leaders of the MAX Linux distribution, which is deployed in many Madrid schools. They are currently using MATE for their desktop, but after reviewing our spread, they said they would give Plasma a try. I will be following up with them.

Continuing with public institutions, we also talked to the people who manage the libraries in Alcorcón, sysadmins from the Congreso de los Diputados and the Ministerio de Economía y Hacienda, and developers from Correos, the Spanish post office. There were representatives from several universities, both students and professors. All visitors were impressed by Plasma’s feature set, performance and flexibility, and were excited about trying it out at work and at home.

The students from the LibreLabUCM of the Universidad Complutense de Madrid later wrote to me and asked how they could contribute. They were especially interested in contributing to Plasma Mobile.

We had a mixed bag when it came to visitors from private enterprises. There were both coders and managers among the people who came to the booth, as well as freelancing consultants. Many of the managers, including CEOs, CTOs and product managers, and all the consultants seemed to be “shopping” for FLOSS to boost productivity (the former) or to add to their portfolio (the latter). Although they were mainly after infrastructure-like software, like cloud management systems, they would often become interested when I demoed Kirigami-based software and showed them it was possible to create good-looking, graphical applications for most platforms that would adapt to different screen sizes and shapes.

From the bigger, more recognisable companies, we had visitors from IBM, Oracle, BT, Atos, Allfunds Bank and Wacom. From smaller, Spanish joints we met people from VASS, Zylk, Zendata and Certelia.

Lessons Learnt

The first lesson I learnt was not to try to do this alone again. Over twelve hours of standing and greeting visitors is not good for an unfit, overweight 53-year-old. Being alone also meant I had to rely on the kindness of the people at the FSFE booth when I had to go foraging for water and food, or when I needed a bathroom break (thanks, Pablo and Erik!).

But, seriously, next time we should show off some “naked” electronics. This fascinates attendees for some reason. We should maybe acquire the RISC-V board we showed at FOSDEM. These kinds of things attract visitors like a magnet.

I noticed many visitors looking over the booth from afar, trying to figure out who we were before approaching. As the roll-up banner was to one side, it was not always obvious that it was associated with us. A solution would be to always make sure we have a tablecloth or a prominent flag with our logo, name, and URL handy. We had both at the booth at FOSDEM, and I’m pretty sure that helped.

The stickers ran out rather quickly. By two o’clock there were none left. It wasn’t a big issue, because the event wasn’t the type that attracted merch scavengers, and most people were more interested in what we had on display than in stockpiling goodies. But it would still have been nice to have had more. Also, vinyl die-cut stickers are expensive: 60 euros for 100 stickers.

Speaking of printed merch, maybe we should make attractive flyers with coloured pictures, snappy explanatory bites, shortened URLs and no marketing speak, relevant to what is on show at the booth. Not everybody has QR scanning software on their phones, and a printed guide explaining what we were showing at the booth would’ve helped and served as a reminder if attendees could’ve taken it with them.

Was it worth it?

Yes. We made a lot of contacts with companies and institutions that would have been difficult to get in touch with any other way. We also heard about problems they have, and we can use that to see what solutions we can offer. Both things will ultimately help grow the number of companies that use KDE technologies (like Kirigami) in their products, as well as help us convince institutions to deploy our software (like Plasma and Applications) for their users.
