From 51d8f04fc4b91c7fb1997013e34fa717782025df Mon Sep 17 00:00:00 2001
From: PC-Admin
Date: Wed, 16 Aug 2023 08:05:37 +0800
Subject: [PATCH] add description of user_may_join_room spam checker interface

---
 technical_spec.md | 7 +++++++
 1 file changed, 7 insertions(+)

diff --git a/technical_spec.md b/technical_spec.md
index 89a861d..7a1d7c4 100644
--- a/technical_spec.md
+++ b/technical_spec.md
@@ -26,6 +26,13 @@ Imagine if you have 100 room_ids and you know 1 is abusive, well you could us an
 
 The redlight system attempts to be a solution to this problem.
 
+## user_may_join_room
+
+Since redlight works by restricting access to the sections of the network that contain abusive material, the most suitable "spam checker" interface in Synapse is [user_may_join_room](https://matrix-org.github.io/synapse/latest/modules/spam_checker_callbacks.html#user_may_join_room), a callback that is invoked whenever a user attempts to join a room.
+
+By using this interface we can prevent users from entering these rooms at all, which guarantees that the harmful content in those rooms is never synced to the redlight client homeserver.
+
+
 ## Chain of Trust
 
 If distributing a hash-list openly is dangerous, the simplest way to make it safe to close up the distribution of it.
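
To illustrate the callback this patch describes, a minimal Synapse module might look roughly like the sketch below. Only `ModuleApi.register_spam_checker_callbacks` and the `user_may_join_room` callback shape come from the Synapse module documentation linked in the patch; the module name `RedlightChecker`, the `redlisted_room_hashes` config option, and the `_room_is_redlisted` helper are hypothetical and not the actual redlight implementation.

```python
# Sketch of a Synapse module registering the user_may_join_room spam-checker
# callback. RedlightChecker, redlisted_room_hashes and _room_is_redlisted are
# hypothetical names used for illustration only.
import hashlib
from typing import Any, Dict

from synapse.module_api import ModuleApi


class RedlightChecker:
    def __init__(self, config: Dict[str, Any], api: ModuleApi):
        self._api = api
        # Hypothetical config option: SHA-256 hashes of abusive room IDs, so
        # the raw room IDs themselves never need to be distributed.
        self._redlisted_hashes = set(config.get("redlisted_room_hashes", []))

        # Register the spam-checker callback documented by Synapse.
        api.register_spam_checker_callbacks(
            user_may_join_room=self.user_may_join_room,
        )

    def _room_is_redlisted(self, room_id: str) -> bool:
        # Hash the room ID and compare it against the hash list.
        digest = hashlib.sha256(room_id.encode("utf-8")).hexdigest()
        return digest in self._redlisted_hashes

    async def user_may_join_room(
        self, user: str, room: str, is_invited: bool
    ) -> bool:
        # Returning False rejects the join, so an abusive room is never
        # synced to this homeserver; True lets the join proceed as normal.
        return not self._room_is_redlisted(room)
```

On recent Synapse releases the callback can also return `synapse.module_api.NOT_SPAM` or a `synapse.module_api.errors.Codes` value instead of a bare bool, which gives the joining user a clearer rejection error.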