This project investigates the emerging decentralised web, focusing on Mastodon, a decentralised alternative to Twitter. To support this, a web collection tool is being developed to gather and analyse data from Mastodon. The output will be a large-scale characterisation of user behaviour and of resilience threats to the platform's infrastructure.
Explaining the science
The project will contribute to two main aspects of computer science. First, through the collection of real-time information about infrastructure availability, it will be possible to study the operations of an 'in-the-wild' decentralised platform. The aim is to reveal patterns that allow the platform to be streamlined and its operations improved.
Second, through the collection of user behaviour patterns, understanding will be gained about the needs of people using such a decentralised platform. The goal is then to integrate these two themes, making it possible to build superior decentralised platforms that reflect user needs.
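The first theme, real-time monitoring of infrastructure availability, could be sketched as periodic probing of each instance's public metadata endpoint. The sketch below is illustrative only: the `/api/v1/instance` endpoint is part of Mastodon's public REST API, but the probing and summarising logic here is an assumption about how such a monitor might be built, not the project's actual tooling.

```python
import json
import urllib.request
from datetime import datetime, timezone

def probe_instance(domain, timeout=10):
    """Probe one Mastodon instance's metadata endpoint and record the outcome.

    Returns a dict noting whether the instance responded, when it was
    probed, and (if available) the server version it reports.
    """
    ts = datetime.now(timezone.utc).isoformat()
    try:
        url = f"https://{domain}/api/v1/instance"
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            meta = json.load(resp)
        return {"domain": domain, "up": True, "time": ts,
                "version": meta.get("version")}
    except OSError:
        # Covers DNS failures, timeouts, and HTTP errors alike.
        return {"domain": domain, "up": False, "time": ts, "version": None}

def availability(probes):
    """Fraction of successful probes, e.g. over a day of polling one instance."""
    if not probes:
        return 0.0
    return sum(p["up"] for p in probes) / len(probes)
```

Repeated over time and across many instances, such probes yield the availability time series from which 'in-the-wild' operational patterns could be characterised.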
The project's aims are to:
- Devise a scalable methodology for gathering a Mastodon dataset across a wide range of instances, and make the data available to the Turing.
- Perform a statistical analysis that characterises usage of Mastodon.
- Design techniques to help Mastodon instances identify malicious behaviour and resilience threats to the system.
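The first aim, large-scale data gathering, might look something like the following minimal sketch. The `/api/v1/timelines/public` endpoint and its `limit`, `local`, and `max_id` parameters are part of Mastodon's public REST API; the instance list and paging depth are illustrative assumptions, and a real collector would add rate limiting, retries, and persistent storage.

```python
import json
import urllib.request

# Hypothetical seed list; a real crawl would bootstrap from an instance directory.
INSTANCES = ["mastodon.social", "fosstodon.org"]

def fetch_public_timeline(instance, max_id=None, limit=40):
    """Fetch one page of an instance's local public timeline."""
    url = f"https://{instance}/api/v1/timelines/public?limit={limit}&local=true"
    if max_id is not None:
        url += f"&max_id={max_id}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def next_max_id(statuses):
    """Compute the max_id for the next (older) page, or None when exhausted.

    Mastodon returns statuses newer than max_id is excluded, so paging
    backwards uses the smallest id seen on the current page.
    """
    if not statuses:
        return None
    return min(int(s["id"]) for s in statuses)

def crawl(instance, pages=3):
    """Page backwards through an instance's local public timeline."""
    collected, max_id = [], None
    for _ in range(pages):
        page = fetch_public_timeline(instance, max_id=max_id)
        if not page:
            break
        collected.extend(page)
        max_id = next_max_id(page)
    return collected
```

Running `crawl` across the seed list, and expanding the list with newly discovered instances, is one plausible route to the wide-coverage dataset the aims describe.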
The work will result in a large-scale Mastodon dataset being made available to the public. This will be beneficial to researchers working in the space of distributed systems and social media, as it will cover aspects of both social interaction and infrastructure availability. By identifying issues and fixes in Mastodon's operations, the project is also intended to benefit the open source community (Mastodon being a large open source project).