Note that X, the supposed “free speech platform”, provides information on platform users to the governments of EU member states in connection with not just illegal speech — and, yes, national legislation in EU countries includes many “speech crimes” — but also legal speech that is deemed “harmful”. This was the real innovation of the EU’s Digital Services Act (DSA): It created an obligation for platforms to take action in the form of “content moderation” against not just illegal content, but also ostensibly “harmful” content such as “disinformation”.
Note that in the period covered by X’s latest “Transparency Report” to the EU on its “content moderation” efforts, nearly 90% of such requests for information on the purveyors of ostensibly “illegal or harmful speech” came from just one country: Germany. See the chart below.

Note that X also takes action against posts or accounts for “illegal or harmful speech” reported to it by EU member states or the European Commission. Such action may involve deletion or geo-blocking (“withholding”) of content. But, as the “enforcement options” linked in the report make clear, it can also involve various forms of “visibility filtering” or restrictions on engagement. Per the report, action was taken against 226,350 items during the reporting period, but the action in question is specified as “deletion” or “withholding” in fewer than half of the cases.
Here again, Germany is top of the table, having submitted 42% of all the reports to X on “illegal or harmful speech” and nearly 50% of the reports from member states. See the chart below. Germany submitted nearly twice as many reports as any other member state (France finished a distant second) and more than ten times as many as comparably sized Italy. The European Commission submitted around 15% of the reports.

It is also notable that Germany submitted by far the most reports on content entailing “negative effects on civic discourse or elections”, yet another category of speech that is clearly not illegal per se but that is deemed “harmful” enough under the DSA regime to require suppression. (Hence, while the content is not per se illegal, it would be illegal for platforms under the DSA not to suppress it. This ambiguity is at the very heart of the DSA censorship regime.) Germany submitted well over half of all such reports and over 60% of the reports from member states.
Finally, it is worth noting, at least in passing, that the overwhelming majority of these reports and the related “enforcement actions” undoubtedly involve English-language content. This can be gleaned from the fact that nearly 90% of X’s “content moderation team” consists of English speakers: the “primary language” of fully 1,535 of the team’s 1,726 members is English, as can be seen in the chart below.

But why should Germany or the EU be accorded any jurisdiction over English-language discourse? Needless to say, Germans are not, as a rule, native English speakers, and only 1.5% of the total EU population has English as its mother tongue.
In any case, two things are very clear from X’s “Transparency Report”. One is that Elon Musk’s “free speech platform” is anything but that, and is in fact devoting enormous resources, in the form of both “trained” human censors and programming, to complying with the EU’s censorship regime. The other is that Germany is the EU’s, and hence undoubtedly the world’s, undisputed online censorship champion.
Should readers, by the way, have trouble reconciling the foregoing with the viral kerfuffle between Elon Musk and Thierry Breton and the “proceedings” against X that were initiated under Breton’s leadership, please see the helpful account of the “preliminary findings” of the EU Commission’s investigation here. The EU’s “case” against X, as it now stands, has nothing to do with “content moderation”, but only concerns other, rather arcane, aspects of the DSA.
Republished from Robert Kogon’s Newsletter