# rC3 talk notes continued

✍️ → Written on 2021-03-21 in 1147 words. Part of cs IT-security


I continued watching rC3 talks and wanted to share my notes here.


## The Elephant in the background: Empowering users against browser fingerprinting

Julian Fietkau delivered a wonderful talk on their research, which started one year ago. They wanted to evaluate how widespread fingerprinting technology is on websites. The first step was identifying the functionalities used for fingerprinting. They take a quantitative approach, observing which functionalities are requested in sequence: if many features are requested one after another, this is considered fingerprinting technology.

As a result, they identified 115 JavaScript functions and classified them into 40 fingerprinting features. They implemented this quantitative fingerprinting model in a Google Chrome browser extension called “FPMON”. Then they looked at the data fetched by websites. For example, wikileaks.org uses 0 features, but metacafe.com uses 95 of the 115 JS functions, 38 of the 40 features, and 17 of the 18 aggressive features. A 50% score was taken as an estimated baseline. They went on to evaluate the Alexa 10,000 most popular websites (by visiting each landing page for 60 seconds). 500 pages don’t use any features at all, and 38 of 40 features was the maximum, attained by breitbart.com, foursquare.com, and politifact.com. 57% of pages use 11 ± 4 features. Part of their research was to identify the major fingerprinting networks. Here, Moatads is pointed out as particularly aggressive.
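As I understand the approach, a page’s score is essentially the fraction of the 40 known features its scripts touch. A minimal sketch in Python, with a made-up function-to-feature mapping (FPMON’s actual classification is more elaborate):

```python
# Hypothetical sketch of FPMON-style quantitative scoring: observed JS
# calls are mapped to fingerprinting features, and a page's score is the
# fraction of the 40 known features it touches. The mapping below is
# illustrative, not FPMON's real classification.

# illustrative subset of a function -> feature mapping
FEATURE_OF_FUNCTION = {
    "navigator.userAgent": "user_agent",
    "navigator.plugins": "plugins",
    "screen.width": "screen_size",
    "screen.height": "screen_size",
    "CanvasRenderingContext2D.getImageData": "canvas",
    "AudioContext.createOscillator": "audio",
}
TOTAL_FEATURES = 40  # FPMON classifies 115 functions into 40 features

def fingerprinting_score(observed_calls):
    """Return (distinct features used, score) for observed JS calls."""
    features = {FEATURE_OF_FUNCTION[c] for c in observed_calls
                if c in FEATURE_OF_FUNCTION}
    return len(features), len(features) / TOTAL_FEATURES

used, score = fingerprinting_score([
    "navigator.userAgent", "screen.width", "screen.height",
    "CanvasRenderingContext2D.getImageData",
])
print(used, round(score, 3))  # 3 distinct features -> 3 0.075
```

Note how the two screen dimensions collapse into one feature, which is the point of the function-to-feature classification.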

They conclude that font fingerprinting has grown by a factor of 10 within 7 years. The tools EFF Privacy Badger, DuckDuckGo PE, Firefox Strict, and Apple Safari were examined as well. The first three simply use blacklisting strategies to block fingerprinting networks. However, this can break website functionality. In contrast, Safari takes a different approach based on unification and herd immunity: it unifies the values of the fingerprinted features, so individual users cannot easily be distinguished. Safari was presented as much more effective than the others.
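The herd-immunity idea can be illustrated with a toy entropy calculation: unifying the reported values collapses the fingerprint distribution and removes the bits that distinguish users. A sketch with invented numbers:

```python
# Sketch of why value unification helps: if every user reports the same
# (unified) values, the entropy of the fingerprint distribution drops and
# users become indistinguishable. The fingerprints below are made up.
from collections import Counter
from math import log2

def entropy(fingerprints):
    """Shannon entropy (bits) of a list of fingerprint strings."""
    counts = Counter(fingerprints)
    n = len(fingerprints)
    return 0.0 - sum((c / n) * log2(c / n) for c in counts.values())

# diverse fingerprints: every user distinguishable
diverse = [f"fonts={i}" for i in range(8)]
# unified fingerprints: everyone reports the same value
unified = ["fonts=standard"] * 8

print(entropy(diverse))  # 3.0 bits: 8 distinct users
print(entropy(unified))  # 0.0 bits: an indistinguishable herd
```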

I expected this to be a boring talk about fingerprinting theory, but it was well-delivered, the browser extension is very useful for research, and the research output is very interesting to me.

## Kein Filter für Rechts

keinfilterfuerrechts.de is a journalistic project by CORRECTIV to evaluate the network of right-wing users on social platforms. Specifically, Instagram was analyzed: users and posts were taken into account, but user stories are only stored for 24 hours and thus were not accessible to the journalists. First of all, they defined four dimensions that contribute data points per user:

  1. metadata & profile information

  2. connection data

  3. text data

  4. qualitative observations

Then they tried to find a proper sampling method for their purpose (simply following all followers of some person would quickly lead to unmanageable exponential growth). They used a variant of exponential discriminative snowball sampling: they created a fake account which followed other right-wing accounts to build up a network. 86 non-right-wing complementary accounts were followed too. 281 origin accounts established the network analyzed:

  1. This network (the 281 accounts form the “origin set”) follows >58,000 accounts. Thus, a methodology to reduce this number had to be defined.

  2. A follower remains in the list only if it follows at least 3 accounts in the origin set ⇒ 4,532 accounts

  3. Then an additional point-based system with web science criteria filters the >58,000 accounts down to 10,805 accounts

  4. Then a manual categorization of accounts into topic clusters was performed

  5. Private accounts were filtered out, because their content is inaccessible

  6. Finally, 4,501 accounts remain (with 331,956 connections and 838,505 posts)
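The sampling and the step-2 filter above can be sketched roughly as follows, assuming hypothetical `follows`/`followers` lookups and invented account names:

```python
# Sketch of exponential discriminative snowball sampling combined with the
# step-2 filter: an account is only kept (and expanded) if it follows at
# least `min_links` accounts from the origin set. Data shapes and account
# names are hypothetical, not the journalists' actual pipeline.
def keeps(account, follows, origin_set, min_links=3):
    """Step-2 criterion: account follows >= min_links origin accounts."""
    return len(follows(account) & origin_set) >= min_links

def snowball_sample(origin_set, follows, followers, max_rounds=2):
    """Expand from the origin set, keeping only discriminated accounts."""
    sampled = set(origin_set)
    frontier = set(origin_set)
    for _ in range(max_rounds):
        next_frontier = set()
        for account in frontier:
            for candidate in followers(account):
                if candidate not in sampled and keeps(candidate, follows, origin_set):
                    sampled.add(candidate)
                    next_frontier.add(candidate)
        frontier = next_frontier
    return sampled

# toy data: "keep" follows 3 origin accounts, "drop" follows only 1
origin = {"o1", "o2", "o3"}
follows_map = {"keep": {"o1", "o2", "o3", "x"}, "drop": {"o1"}}
followers_map = {"o1": {"keep", "drop"}}

result = snowball_sample(
    origin,
    follows=lambda a: follows_map.get(a, set()),
    followers=lambda a: followers_map.get(a, set()),
)
print(sorted(result))  # ['keep', 'o1', 'o2', 'o3']
```

The discriminating predicate is what keeps the frontier small: only accounts that already look embedded in the origin network get expanded further.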

Then, they analyzed the distribution of topics and generated nice colorful images of the network.

Furthermore, they visualized the network by defining accounts as vertices and interactions as edges. The edge weights are derived from {mutual following, following the same accounts, mutually used hashtags, tagging in images, comments on posts of other accounts}. Two web science features they looked at are:

  1. Eigenvector centrality (a measure for the social relevance of a vertex) is high for AfD accounts and representatives.

  2. Betweenness centrality (a measure for the connection relevance of a vertex) is high for meme accounts and right-wing singers.
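Both measures are easy to reproduce on a toy graph, e.g. with networkx (the accounts and weights below are invented; edge weights stand for aggregated interaction strength):

```python
# Toy illustration of the two centrality measures on a small weighted
# interaction graph using networkx. Account names and weights are made up.
import networkx as nx

G = nx.Graph()
# edge weights aggregate interactions (mutual follows, shared hashtags, ...)
G.add_weighted_edges_from([
    ("party_account", "politician", 5.0),
    ("party_account", "meme_page", 2.0),
    ("politician", "meme_page", 1.0),
    ("meme_page", "singer", 3.0),
    ("singer", "fan", 1.0),
])

eig = nx.eigenvector_centrality(G, weight="weight")   # social relevance
btw = nx.betweenness_centrality(G, weight="weight")   # connector role

# meme_page bridges the political cluster and the music cluster, so its
# betweenness is the highest in this toy graph
print(max(btw, key=btw.get))  # meme_page
```

This mirrors the talk’s observation: the accounts with the highest betweenness are not the most followed ones, but the ones gluing separate clusters together.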

Gephi was used for the data analysis. In general, the images of this journalistic work summarize its goal: they give you a rough idea of how the right-wing network works and which topics relate to each other in which way.

## How to digitale Barrierefreiheit

Carola and Robert Köpferl discuss (in German) digital accessibility in Germany and in documents.

  • 12.77 million people have some form of disability

  • 7.5 million people have a severe disability

  • about 18 million people are over the age of 65

First, Carola defines digital participation:

  • fair, safe, secure access to digital infrastructure, and

  • access to technology like computer/smartphone

  • cheap or free internet access

  • use of technologies

  • across all groups of the population

  • comprehensible thanks to user-specific offers

There are many norms, including EN 301 549, DIN EN ISO 9241, and the EU Web Accessibility Directive 2016/2102. In the end, it is a good idea to try out a device like a screen reader once. A Braille keyboard is of interest, as is the Jabbla Zingui plus as a speech generating device.

Then they provide a table:

| disability | devices |
| --- | --- |
| blindness | screen reader, tab order |
| deaf & blind | refreshable braille display, tab order |
| deafness | audio transcription to sign language |
| visual impairment | higher contrast, magnification, cursor aids |
| red/green blindness | avoid red & green combination |
| light sensitivity | dark mode |
| motor impairment | sensor input, joystick |
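The “higher contrast” advice in the table can be quantified with the WCAG 2.x contrast ratio, a small sketch of which follows (WCAG AA requires at least 4.5:1 for normal text):

```python
# WCAG 2.x contrast-ratio computation: per-channel gamma expansion, the
# relative-luminance weighting, then the ratio (L_light + 0.05) / (L_dark + 0.05).
def _channel(c):
    """Linearize one sRGB channel value in 0..255."""
    c /= 255
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    r, g, b = (_channel(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0 (max)
print(contrast_ratio((255, 0, 0), (0, 128, 0)))  # pure red on green: far below 4.5
```

The second line also shows why the table warns against red & green combinations: besides being indistinguishable for red/green-blind users, typical red/green pairs fail the contrast requirement outright.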