Road Ahead


Road ahead

CC BY 2.0 by Nicholas A. Tonelli

I just realized that on June 1 it was exactly four years since I joined ownCloud Inc. That’s a perfect opportunity to look back and to tell you about some upcoming changes. I will never forget how all this got started. It was FOSDEM 2012 when I met Frank; we already knew each other from various Free Software activities. I told him that I was looking for new job opportunities and he told me about ownCloud Inc., the new company around the ownCloud initiative which he had just started together with the help of others. I was immediately sold on the idea of ownCloud, and a few months later I was employee number six at ownCloud Inc.

This was a huge step for me. Before joining ownCloud I worked as a researcher at the University of Stuttgart, so this was the first time I was working as a full-time software engineer on a real-world project. I also hadn’t written any noteworthy PHP code before. But thanks to an awesome community I got up to speed really fast and could ramp up my contributions. During the following years I worked on many different aspects of ownCloud, from sharing and file versioning over the deleted files app up to a complete re-design of the server-side encryption. I’m especially happy that I could contribute substantial parts to a feature called “Federated Cloud Sharing”, from my point of view one of the most important features to move ownCloud to the next level. Today it is not only possible to share files across various ownCloud servers but also with other cloud solutions like Pydio.

But the technical part is only a small subset of the great experience I had over the last four years. Working with a great community is just amazing. It is important to note that by community I mean everyone, from co-workers and students to people who contributed great stuff to ownCloud in their spare time. We are all ownCloud; there should be no distinction! We not only worked together in a virtual environment but met regularly in person at hackathons, various conferences and at the annual ownCloud conference. I met many great people during this time whom I can truly call friends today. I think this explains why ownCloud was never just a random job to me and why I spent substantial parts of my spare time going to conferences, giving talks or helping at booths. ownCloud combined all the important parts for me: People, Free Software, Open Standards and Innovation.

Today I have to announce that I will move on. May 25 was my last working day at the ownCloud company. This is a goodbye and thank you to ownCloud Inc. for all the opportunities the company provided to me. But it is in no way a goodbye to all the people and to ownCloud as a project. I’m sure we will stay in contact! That’s one of many great aspects of Free Software: if it is done right, an initiative is much more than any company which might be involved. Leaving a company doesn’t mean that you have to leave the people and the project behind.

Of course I will continue to work on Free Software and with great communities, and I certainly have no plans to leave the ownCloud community. Actually I hope that I can even re-adjust my Free Software and community focus in the future… Stay tuned.

Free Software in the Baden-Württemberg Coalition Agreement


State coat of arms of Baden-Württemberg

On March 13 a new state parliament was elected in Baden-Württemberg. For the next five years, politics will be shaped by a coalition of Bündnis 90/Die Grünen and the CDU. Last weekend the coalition agreement was confirmed by both parties. I took this as an opportunity to have a closer look at the agreement, especially with regard to Free Software. I found relevant passages in several places.

The section “Chance zur Entbürokratisierung” (“Opportunity for Reducing Bureaucracy”) states:

We will develop the state’s e-government guidelines and IT procurement further in the direction of Open Source.

This is very welcome. Organizations such as the Free Software Foundation Europe (FSFE) and the Open Source Business Alliance (OSBA) have long demanded that publicly financed software be released under a Free Software license and that Free Software be given more consideration in public tenders. This year in particular, both organizations want to become even more active in this area.

The same section continues:

We also welcome and support the provision of free software and open educational resources (OER) by the Landesmedienzentrum.

Especially in schools, where the next generation comes into systematic contact with software and educational resources for the first time, it is of great importance that an understanding is developed from the very beginning of how to create and publish knowledge and information sustainably in the information age. How could this happen better than through the practical use of Free Software and free learning materials?

In the section “Allianz Wirtschaft 4.0 für die Digitalisierung im Mittelstand” (“Alliance Economy 4.0 for the Digitalization of Small and Medium-Sized Businesses”), an entire paragraph is even dedicated to Free Software:

Small and medium-sized IT companies in the state are particularly active in the development of free, open source software (Open Source) and the services connected with it. Open Source, like free standards and open formats, offers great opportunities for a vendor-independent software ecosystem. We want to support these approaches.

Here the advantage of Free Software for strengthening the local economy is rightly acknowledged. Free Software makes it possible to support local companies and to keep both knowledge and economic output in the state. Beyond that, the importance of a vendor-independent software ecosystem is pointed out. It will be interesting to see what the concrete support and funding will look like over the next few years.

The section “DIGITAL@BW: Schulen mit Digitalisierung und Medienkompetenz” (“Schools with Digitalization and Media Literacy”) goes into the role of Free Software and Open Educational Resources (OER) in even more detail:

We will push ahead with the pedagogically supervised use of e-learning programs in the classroom and tap their potential for individual support tailored precisely to each student. Digital media are helpful across subjects as well as in subject-specific teaching. What matters is less the technology than the pedagogical concept. We advocate that schools increasingly be able to use free learning and teaching materials (Open Educational Resources and Free Software).

Particularly with the ongoing digitalization of schools there is a danger that the use of proprietary software turns lessons into early product training instead of teaching concepts. Furthermore, teaching can quickly be used, more or less directly, as advertising for individual companies and products. The lock-in effect should not be underestimated either. If students have learned over many years to work with one particular piece of software and have created many documents in proprietary formats, switching later becomes much harder. This risk can be reduced if schools make sure that documents are created and provided in openly standardized formats.

It is welcome if the use of Free Software and Open Standards prevents, or at least reduces, dependence on individual programs or companies. Of course this only succeeds if the teaching is structured accordingly. But choosing free tools and open educational resources already creates good preconditions.

The statements on Free Software, Open Standards and open educational resources sound thoroughly positive. As we know from past coalition agreements, this does not always mean that everything will be implemented accordingly. So it remains exciting to watch what happens with regard to Free Software in Baden-Württemberg over the next five years. I will follow it with great interest and look forward to reporting on concrete implementations along the way.

Installing Wallabag 2 on a Shared Web Hosting Service


Wallabag 2.0.1

Wallabag describes itself as a self-hostable application for saving web pages. I have been using Wallabag for quite some time now and I really enjoy using it to store my bookmarks, organize them by tags and access them through many different clients like the web app, the official Android app or the Firefox plug-in.

Yesterday I updated my Wallabag installation to version 2.0.1. The basic installation was quite easy by following the documentation. I had only one problem: I run Wallabag on a shared hosting service, so I couldn’t adjust the Apache configuration to redirect requests to the right sub-directory as described in the documentation. I solved the problem with a small .htaccess file which I added to the root folder:

<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^links\.schiessle\.org$ [NC]
    RewriteRule !^web/ /web%{REQUEST_URI} [L,NC]
</IfModule>

I also noticed that Wallabag has a “register” button which allows people to create a new account. There already exists a feature request to add an option to disable it. Because I don’t want random people to register an account on my Wallabag installation, I disabled it by adding the following additional lines to the .htaccess file:

<FilesMatch ".*register$">
    Order Allow,Deny
    Deny from all
</FilesMatch>
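A note for newer setups: `Order`/`Deny` is the old Apache 2.2 access control syntax. If your hoster already runs Apache 2.4, the equivalent rule (a sketch I haven’t tested on my shared hoster) would use the `Require` directive instead:

    <FilesMatch ".*register$">
        Require all denied
    </FilesMatch>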

Guake Terminal Improvement for Multi-Monitor Setups


Guake Terminal

Guake is a top-down “Quake-style” terminal. I use it on a daily basis on the Xfce desktop. The only drawback: Guake doesn’t work the way I want on a multi-monitor setup. There the terminal always opens on the main (left) monitor. But for many people, including myself, the left monitor is the small laptop monitor. Therefore many people prefer to open the terminal on the secondary (right) monitor. If you search for “Guake multi-monitor” you can find many patches to achieve this behavior.

For me it is not enough that the terminal always starts on the right monitor. I want the terminal to always open on the currently active monitor, the monitor which contains the mouse pointer. Luckily Guake is written in Python, which makes it quite easy to patch without the need to re-compile and re-package it. Together with the patches already available on the Internet and a short look at the Gtk documentation I found a solution. To always show the terminal on the currently active monitor you have to edit /usr/bin/guake and replace the method get_final_window_rect(self) with the following code:

    def get_final_window_rect(self):
        """Gets the final size of the main window of guake. The height
        is the window_height property, width is window_width and the
        horizontal alignment is given by window_alignment.
        """
        screen = self.window.get_screen()
        height = self.client.get_int(KEY('/general/window_height'))
        width = 100
        halignment = self.client.get_int(KEY('/general/window_halignment'))
 
        # get the rectangle from the currently active monitor
        x, y, mods = screen.get_root_window().get_pointer()
        monitor = screen.get_monitor_at_point(x, y)
        window_rect = screen.get_monitor_geometry(monitor)
        total_width = window_rect.width
        window_rect.height = window_rect.height * height / 100
        window_rect.width = window_rect.width * width / 100
 
        if width < total_width:
            if halignment == ALIGN_CENTER:
                window_rect.x = (total_width - window_rect.width) / 2
                if monitor == 1:
                    # opened on the second monitor: shift x to the right
                    # by the width of the first (left) monitor
                    left_window_rect = screen.get_monitor_geometry(0)
                    window_rect.x += left_window_rect.width
            elif halignment == ALIGN_LEFT:
                window_rect.x = 0
            elif halignment == ALIGN_RIGHT:
                window_rect.x = total_width - window_rect.width
        window_rect.y = 0
        return window_rect

This patch is based on Guake 0.4.4. The current stable version is already at 0.8.4 and no longer contains the method shown above. Still, version 0.4.4 is what ships with the current Debian stable release (Jessie), so I thought it might be useful for more people than just me.

Federated Sharing – What’s new in ownCloud 9.0


Privacy, control and freedom have always been among the main reasons to run your own cloud instead of storing your data on a proprietary and centralized service. Only if you run your own cloud service do you know exactly where your data is stored and who can access it. You are in control of your data. But this also introduces a new challenge: if everyone runs their own cloud service, it inevitably becomes harder to share pictures with your friends or to work together on a document. That’s the reason why we at ownCloud are working on a feature called Federated Cloud Sharing. The aim of Federated Cloud Sharing is to close this gap by allowing people to connect their clouds and easily share data across different ownCloud installations. For the user it should make no difference whether the recipient is on the same server or not.

What we already had

The first implementation of Federated Cloud Sharing was introduced with ownCloud 8.0. Back then it was mainly an extension of the already existing feature to share a file or folder with a public link. People can create a link and share it with their friends or colleagues. Once they open the link in a browser they will see a button called “Add to your ownCloud” which enables them to mount the share as a WebDAV resource on their own cloud.

add-to-your-owncloud

With ownCloud 8.1 we moved on and added the Federated Cloud ID as an additional way to initiate a remote share. The nice thing is that it basically works like an email address. Every ownCloud user automatically gets an ID which looks similar to john@myOwnCloud.org. Since ownCloud 8.2 the user’s Federated Cloud ID is shown in the personal settings.

federated-cloud-id

To share a file with a user on a different ownCloud you just need to know his Federated Cloud ID and enter it into the ownCloud share dialog. The next time the recipient logs in to his ownCloud he will get a notification that he received a new share. The user can now decide whether he wants to accept or decline the remote share. To make it easier to remember a user’s Federated Cloud ID, the Contacts app allows you to add the ID to your contacts. The share dialog will automatically search the address books to auto-complete Federated Cloud IDs.
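Conceptually a Federated Cloud ID is just `user@server`, where the server part may include a sub-directory if ownCloud is not installed in the web root. A minimal sketch of splitting such an ID (my own illustration for this post, not ownCloud’s actual parser, which handles more edge cases):

```python
def split_federated_cloud_id(cloud_id):
    """Split 'user@host[/path]' into (user, remote).

    The user name itself may contain an '@' (e.g. when an email
    address is used as login name), so we split at the last '@'.
    """
    user, _, remote = cloud_id.rpartition('@')
    if not user or not remote:
        raise ValueError("not a valid Federated Cloud ID: %s" % cloud_id)
    return user, remote
```

For example, `split_federated_cloud_id("john@myOwnCloud.org")` yields `("john", "myOwnCloud.org")`.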

What’s new in ownCloud 9.0

With ownCloud 9.0 we made it even easier to exchange Federated Cloud IDs. Below you can see the administrator settings for the new Federation App, which is enabled by default.

federation

The option “Add server automatically once a federated share was created successfully” is enabled by default. This means that as soon as a user creates a federated share with another ownCloud, either as a recipient or as a sender, ownCloud will add the remote server to the list of trusted ownClouds. Additionally you can predefine a list of trusted ownClouds. While it is technically possible to use plain http, I strongly recommend using https for all federated share operations to protect your users and their data.

What does it mean that two ownClouds trust each other? ownCloud 9.0 automatically creates an internal address book which contains all user accounts. If two ownClouds trust each other they will start to synchronize their system address books. In order to synchronize the system address books and to keep them up to date we use the well-known and widespread CardDAV protocol. After a successful synchronization ownCloud will know all users from the trusted remote servers, including their Federated Cloud ID and their display name. The share dialog will use this information for auto-completion. This allows you to share files across friendly ownClouds without knowing more than the user’s name. ownCloud will automatically find the corresponding Federated Cloud ID and suggest the user as a recipient of your share.

The screenshot of the new Federation App shows a status indicator for each server with three different states: green, yellow and red. Green means that both servers are connected and the address book was synced at least once. In this state auto-completion should work. Yellow means that the initial synchronization is still in progress. Creating a secure connection between two ownCloud servers and syncing the users happens in the background. This can take some time, depending on the background job settings of your ownCloud and the settings of the remote server. If the indicator turns red, something went wrong in a way that can’t be fixed automatically. ownCloud will not try to re-establish a connection to the given server. To reconnect to the remote server you have to remove the server and add it again.

If the auto-add option is enabled, the network of known and trusted ownClouds will expand every time a user on your server establishes a new federated share. The boundaries between local users and remote users will blur. Each user stays in control of his data, stored on his personal cloud, but from a collaborative point of view everything will work as smoothly as if all users were on the same server.

What will come next? Of course we don’t want to stop here. We will continue to make it as easy as possible to stay in control of your data and at the same time share your files with all the other users and clouds out there. Therefore we work hard to document and standardize our protocols and invite other cloud initiatives to join us in creating a Federation of Clouds, not only across different ownCloud servers but also across otherwise completely different cloud solutions.

The next Generation of Code Hosting Platforms


Source Code

CC BY-SA 2.0 by Christiaan Colen

Over the last few weeks there have been a lot of rumors about GitHub. GitHub is a code hosting platform which tries to make it as easy as possible to develop software and collaborate with people. GitHub’s main achievement is probably that it moved the social part of software development to a completely new level. As more and more Free Software initiatives started using GitHub it became really easy to contribute a bug fix or a new feature to a third-party library or application you use. With a few clicks you can create a fork, add your changes and send them back to the original project as a pull request. You don’t need to create a new account, don’t need to learn the tools used by the project, etc. Everybody is on the same platform and you can contribute immediately. In many cases this improves collaboration between projects a lot. Also, the ability to easily mention the developers of other projects in your pull request or issue improves social interactions between developers and makes collaboration across different projects the default.

That’s the good part of GitHub, but there are also bad parts. GitHub is completely proprietary, which makes it impossible to fix or improve stuff yourself or to run it on your own. Benjamin Mako Hill already argued back in 2010 why this is a problem and why Free Software needs free tools. More and more people seem to realize that this can create serious problems, and a large group of active and influential GitHub users sent a letter to GitHub which ends with:

“Hopefully none of these are a surprise to you as we’ve told you them before. We’ve waited years now for progress on any of them. If GitHub were open source itself, we would be implementing these things ourselves as a community — we’re very good at that!”

I can’t stress this argument enough. The Free Software community is a community of people who are used to making stuff, not just consuming it. If we use a third-party library and find a bug or need a feature, we don’t just complain; instead we look at the code, try to fix it and provide a patch upstream. We could do the same for the tools we use. But we need to be able to do it. It has to be Free Software.

Now a lot of rumors and discussions revolve around the news that GitHub is undergoing a full-blown overhaul as execs and employees depart. Some people even predict that this will be the end of GitHub.

Wait for it. Three months from now, GitHub introduces "features" no-one wants or needs. 12 months from now, the exodus.

— Pieter Hintjens (@hintjens) February 7, 2016

It seems that many people underestimated the lock-in effect of the new hosting platforms such as GitHub for a long time. Now they start to realize that it might be easy to export the git repository, but what about the issue tracker, the wiki, the CI integration, all the social interaction and collaboration between projects, all the useful scripts written for the GitHub API? You can’t clone all this stuff easily and move on.

I don’t want to go deeper into the discussion about what’s going on at GitHub and what will happen next. There are plenty of articles and discussions about it, you can read some of them if you follow the links in this blog.

At the moment the ESLint initiative is discussing the option to move away from GitHub, and by reading the comments you can get an idea of the lock-in effect I’m talking about. With the growing dissatisfaction, and with people realizing that they are sitting in a “golden cage”, I have the feeling that we might have an opportunity to think about the next generation of code hosting platforms and what they should look like.

Some of you may remember how Git, the tool which is used as the underlying technology of GitHub, came into existence. Ironically, Git was born for reasons quite similar to those for which the next generation of source code hosting platforms might arise. Before Git, the Linux kernel developer community used BitKeeper, a proprietary source control management system. The developers decided to use it because from a technical point of view BitKeeper was so much better than what we had until then, mainly SVN and CVS. The developers enjoyed the tool and didn’t think about the problems such a dependency could create. At some point the copyright holder of BitKeeper withdrew gratis use of the product after claiming that Andrew Tridgell had reverse-engineered the BitKeeper protocols. The Linux kernel community had to move on, and Linus Torvalds wrote Git.

Back to the next generation of source code hosting and collaboration platforms. It is easy to find Free Software to run your own git repository, an issue tracker and a wiki. But in 2016 I think that this is no longer enough. As described before, the crucial part is to connect software initiatives and developers and to make the interaction between them as easy as possible. That’s why traditional code hosting platforms like, for example, Savannah are no longer a real option for many projects. I think the next generation code hosting platform needs to work in a decentralized way. Every project should be able to either host its own platform or choose a provider freely without losing the connection to other software initiatives and developers. This development, from proprietary and centralized solutions to centralized Free Software solutions to federated Free Software solutions, is something we already saw in the area of social networks and cloud services. Maybe it is worth looking at what they already achieved and how they did it.

To make the same transition happen for code hosting platforms we need implementations based on Free Software, Open Standards and protocols which enable this kind of federation. The good news is that we already have most of them. Git by itself is already a distributed revision control system and doesn’t need a central server for collaboration. What’s missing is a nice web interface to glue all these parts together: an issue tracker, a wiki, good integration with Free Software CI tools, good APIs and of course Git. This would enable us to fork projects across servers, send pull requests, interact with other developers and comment on issues no matter whether they are on the same server or not. Chances are high that we will find a suitable protocol by looking at the large number of federated social networks. By choosing an existing protocol of an established federated social network we could even provide tight integration with traditional social networks, which could provide additional benefits beyond what we already have. The hard part will be to pull all this together. Will it happen? I don’t know. But I hope that after we have seen the rise and fall of SourceForge, Google Code and maybe at some point GitHub, we will move on to create something more sustainable instead of building the next data silo and waiting until it fails again.

Integrate ToDo.txt into Claws Mail


I have been using Claws Mail for many years now. I like to call it “the mutt mail client for people who prefer a graphical user interface”. Like Mutt, Claws is really powerful and allows you to adjust it exactly to your needs. During the last year I began to enjoy managing my open tasks with ToDo.txt, a powerful but still simple way to manage your tasks based on text files. This allows me not only to manage my tasks on my computer but also to keep them in sync with my mobile devices. But there is one thing I always missed: often a task starts with an email conversation, and I always wanted to be able to easily turn a mail into a task in such a way that the task links back to the original mail conversation. Finally I found some time to make it happen, and this is the result:

To integrate ToDo.txt into Claws Mail I wrote the Python program mail2todotxt.py. You need to pass the path of the mail you want to add as a parameter. By default the program will create a ToDo.txt task which looks like this:


<task_creation_date> <subject_of_the_mail> <link_to_the_mail>

Additionally you can call the program with the parameter “-i” to switch to the interactive mode. The program will then ask you for a task description and use the provided description instead of the mail subject. If you don’t enter a description, the program falls back to the mail subject as the task description. To use the interactive mode you need to install the Gtk3 Python bindings.
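The generated line follows the usual ToDo.txt convention of putting the creation date first. A rough sketch of what mail2todotxt.py has to assemble (the function name and the link format are my illustration here, not the actual code):

```python
from datetime import date

def make_task(subject, mail_link, description=None):
    """Build a ToDo.txt line: '<creation_date> <text> <link>'.

    If no description is given (non-interactive mode), fall back
    to the mail subject, as described above.
    """
    text = description or subject
    return "%s %s %s" % (date.today().isoformat(), text, mail_link)
```

Calling `make_task("Re: meeting", "claws://inbox/42")` would produce something like `2016-05-30 Re: meeting claws://inbox/42`, with today’s date in front.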

To call this program directly from Claws Mail you need to go to Configuration->Actions and create an action to execute the following command:


/path_to_mail2todotxt/mail2todotxt.py -i %f &

Just skip the -i parameter if you always want to use the subject as the task description. Now you can execute the program for the selected mail by calling Tools->Actions-><The_name_you_chose_for_the_action>. Additionally you can add a shortcut if you wish; e.g. I use “Ctrl-t” to create a new task.

Now that I’m able to turn a mail into a ToDo.txt item I also want to get back to the mail while looking at my open tasks. For this I use the “open” action from Sebastian Heinlein, which I extended with a handler to open Claws Mail links. After you have added this action to your ~/.todo.action.d you can start Claws Mail and jump directly to the referred mail by typing:


t open <task_number_which_refers_to_a_mail>

The original version of the “open” action can be found at Gitorious. The modified version you need to open the Claws-Mail links can be found here.

The ownCloud Public Link Creator


ownCloud Share Link Creator - Context Menu

Holiday season is the perfect time to work on some items from your personal ToDo list. ownCloud 6 introduced a public REST-style Share-API which allows you to call various share operations from external applications. Since I started working on the Share-API I have thought about having a simple shell script in my file manager to automatically upload a file and generate a public link for it… Here it is!

I wrote a script which can be integrated into the Thunar file manager as a “custom action”. The program may also work with other file managers which provide similar capabilities, e.g. Nautilus, but so far I have only tested and used it with Thunar. If you try the script with a different file manager I would be happy to hear about your experience.

ownCloud Share Link Creator - File Upload

If you configure the “custom action” in Thunar, make sure to pass the paths of all selected files to the program using the “%F” parameter; the program expects absolute paths. In the “Appearance and Conditions” tab you can activate all file types and directories. Once the custom action is configured you can execute the program from the right-click context menu. The program works for all file types and also for directories. Once the script gets executed it will first upload the files/directories to your ownCloud and afterwards generate a public link to access them. The link will be copied directly to your clipboard; additionally a dialog will inform you about the URL. If you uploaded a single file or directory, then the file/directory will be created directly below your default target folder as defined in the shell script. If you selected multiple files, then the program will group them together in a directory named after the current timestamp.

This program already does almost everything I want. As said, it can upload multiple files and even directories. One thing I want to add in the future is the ability to detect an ownCloud sync folder on the desktop. If the user selects a file in the sync folder, the script should skip the upload and create the share link directly.
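For those curious what the script does under the hood: it is essentially a WebDAV upload followed by one call to the OCS Share-API. A hedged sketch of building that share request in Python (endpoint and parameters to the best of my knowledge; check the official Share-API documentation before relying on them):

```python
def build_share_request(server, path):
    """Build the URL and POST fields to create a public link via the
    OCS Share-API introduced with ownCloud 6.

    shareType=3 requests a public link share; 'path' is the file or
    folder relative to the user's root.
    """
    url = server.rstrip('/') + '/ocs/v1.php/apps/files_sharing/api/v1/shares'
    data = {'path': path, 'shareType': 3}
    return url, data
```

The script then sends this as an authenticated POST and extracts the public URL from the XML response.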

Edit: In the meantime I got feedback that the script also works nicely with Dolphin, Nautilus and Nemo.

Introduction to the new ownCloud Encryption App


Last weekend we released a first preview version of the new encryption app. This wouldn’t have been possible without the work done by Sam Tuke and Florin Peter. Thanks a lot for all your work! Let me take the opportunity to tell you some details about the app, what it does and how it works.

The encryption app for ownCloud 5 was a complete re-write. We moved from the relatively weak Blowfish algorithm to the more secure AES algorithm. The complete encryption is built on top of OpenSSL, a well-known and tested encryption library. Further, the encryption app is integrated into ownCloud seamlessly. This means that encryption and decryption happen transparently, so that you can still use all the other features of ownCloud like sharing, the different viewer apps, WebDAV access etc.

To make this possible, we decided to perform the encryption server-side. Still, the architecture allows us to implement client-side encryption as an additional option later. Server-side encryption is especially interesting for users who also use the external storage app. Combining the external storage app with the encryption app allows you to use external storage without giving any 3rd-party provider access to your data.

ownCloud uses the user’s log-in password for encryption. This means that you should choose a strong password in order to protect your data. It is important to know that by default a user will lose access to his data if he loses his log-in password. As an additional feature the administrator can generate a recovery key which allows him to recover user data. Once this feature is activated in the administrator settings, every user can enable the recovery key in his personal settings. By default the recovery key is disabled; every user can decide for himself whether he wants this additional protection against password loss or not. Since we are using server-side encryption this feature does not reduce the security: keep in mind that your ownCloud administrator is always able to intercept your data anyway, because everything gets encrypted and decrypted at the server. Since ownCloud is Free Software you can choose a trustworthy administrator freely, or decide to be your own administrator if you wish.

Let’s talk about some technical details and how the encryption works. The encryption is based on three different kinds of keys: every user has a private/public key-pair, every file has a file-key, and to give multiple users access to a file we have share-keys.

Every user has an asymmetric 4096-bit key-pair consisting of a private and a public key. The private key is encrypted with the user’s log-in password; AES-128 is used for this encryption. Additionally, there are up to two system-wide key-pairs: one for public link shares, which allows ownCloud to decrypt files shared as a public link, and, if enabled, the recovery key-pair.
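The per-user key-pair can be sketched in Python with the `cryptography` package. This is an illustration only, not the app’s actual code (the app does this through PHP’s OpenSSL bindings, and the password below is a placeholder); it shows the idea of storing a 4096-bit private key encrypted under the user’s log-in password:

```python
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives import serialization

# one 4096-bit RSA key-pair per user
private_key = rsa.generate_private_key(public_exponent=65537, key_size=4096)

# the private key never gets stored in the clear: it is serialized
# encrypted under the user's log-in password (placeholder value here)
pem = private_key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.BestAvailableEncryption(b"log-in password"),
)

# the public key can be stored in the clear for others to encrypt to
public_pem = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)
```

Anyone without the log-in password only ever sees the encrypted PEM blob.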

In order not to have to re-encrypt and decrypt large files all the time, we introduced file-keys, which are 183-byte ASCII keys. The file-key is used to encrypt the user’s file symmetrically with AES-128. Then the file-key gets encrypted with the public keys of all users with access to the file. This means that if a user is added to or removed from a file, we only have to re-encrypt the small file-key instead of the whole file.
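The symmetric file-key step can be sketched like this (again illustrative, not the app’s code: the app uses 183-byte ASCII keys and OpenSSL’s AES-128, while this sketch uses a 128-bit binary key and GCM mode for brevity):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# a random per-file key; the file is encrypted exactly once with it,
# no matter how many users it is later shared with
file_key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)

ciphertext = AESGCM(file_key).encrypt(nonce, b"report.pdf contents", None)

# anyone holding the file-key can decrypt the file again
plaintext = AESGCM(file_key).decrypt(nonce, ciphertext, None)
```

Because the expensive symmetric encryption happens only once per file, sharing becomes a matter of handing out copies of the small file-key.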

Every time a file-key gets encrypted to multiple users, OpenSSL generates an additional share-key for each user. Only the combination of a user’s private key with the corresponding share-key enables that user to decrypt the given file again.
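The share-key idea amounts to wrapping the one small file-key separately for each recipient. A minimal sketch, with hypothetical users and RSA-OAEP standing in for whatever envelope scheme OpenSSL applies internally:

```python
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

OAEP = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# hypothetical users; ownCloud would load their stored public keys
users = {name: rsa.generate_private_key(public_exponent=65537, key_size=2048)
         for name in ("alice", "bob")}

file_key = os.urandom(16)  # the symmetric file-key that encrypted the file

# one "share-key" per user: the file-key wrapped with that user's public key
share_keys = {name: key.public_key().encrypt(file_key, OAEP)
              for name, key in users.items()}

# unsharing with bob only means touching the tiny wrapped key,
# not re-encrypting the large file itself
del share_keys["bob"]

# alice combines her private key with her share-key to recover the file-key
recovered = users["alice"].decrypt(share_keys["alice"], OAEP)
```

Only the matching private key opens a given share-key, which is exactly why adding or removing a user is cheap.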

Everybody is welcome to test the new encryption app and report issues on our mailing list or, preferably, directly on GitHub. But keep in mind that this is a preview version: you should always have a backup of your unencrypted data!

140 Characters – A Human Rights Violation?


I just read a report on Zeit-Online about a matinee hosted by the Zeit publishing house titled “Demokratie 2.0”. There, Claudia Roth of Bündnis90/Die Grünen and Bernd Schlömer of the Piratenpartei met for the first time. During the discussion, Ms. Roth answered the question of whether she tweets with “No, because to me it is almost a human rights violation to always have to communicate in only 140 characters”.

Even if Ms. Roth was probably not aware of it in that moment, she formulated a very important insight that goes far beyond Twitter and its character limit. The Internet and computers offer a wealth of new possibilities. They are revolutionizing the way we communicate, learn and work. But we have to pay attention to who controls these media and thereby sets the rules under which we will pursue these activities in the future. Whoever controls these new media also gets to decide who can communicate with whom and in what form, what our computers can do and, not least, who has access to our data. In the case of Twitter, this is the quoted 140 characters: no matter how much someone wants to formulate a more detailed answer, or how urgent it seems, the rules are fixed and allow no exception.

While reading the report I was also reminded of an example by Lawrence Lessig from his book “Code and Other Laws of Cyberspace”. There he describes chat rooms at AOL (America Online) that were limited to 23 people. But why exactly 23? Why not 22? Or 24? Couldn’t the number of participants simply be left open? With decisions like these, one can determine very precisely which possibilities of discussion, participation and dissemination of information exist. Such decisions can make a tool very powerful and useful, or restrict it so severely that it can hardly be used in any meaningful way. The power that comes from being able to set such rules is enormous.

Our communication today depends more and more on computers, the Internet and thus, ultimately, on software. Whoever controls this software decides about the access, form and possibilities of our communication. We as a society should not leave this control to individual companies. Control over this central infrastructure of the information society belongs in the hands of society, and when choosing our tools we should pay very close attention to who actually controls them. Only then can we preserve societal values like equal opportunity, democracy and, indeed, the human rights that Ms. Roth rightly invoked, in the digital age. This is not abstract theory; there are already projects that have set themselves exactly this goal: to give society back control over its (digital) life. They range from computer systems like GNU/Linux, to free and decentralized alternatives to Twitter like Status.Net, to decentralized and free social networks, all the way to free so-called “cloud” solutions, for example in the form of ownCloud.

We have the opportunity and the responsibility to choose the right tools with which we move through the digital world. I am firmly convinced that we can only carry our hard-won values into the information age if we make sure that our tools meet all three requirements:

  • Free Software, so that everyone can understand, use, share and adapt the software.
  • Open Standards together with decentralized structures, so that we can connect and exchange with each other independently.
  • Full control over the data we provide, share and archive online.

If we all pay attention to this together, then Ms. Roth, along with all of us, can move completely freely through the digital world, take part in discussions and decisions and, to close the circle, decide entirely for herself how many characters she needs to do so.