K.S. Park spoke on the history of platform regulation on June 24, 2025, in Bangkok at a conference hosted by Thammasat University: https://www.facebook.com/100064076590615/posts/the-faculty-of-law-thammasat-university-will-host-a-2025-symposium-on-%F0%9D%91%B5%F0%9D%92%82%F0%9D%92%97%F0%9D%92%8A%F0%9D%92%88%F0%9D%92%82%F0%9D%92%95%F0%9D%92%8A%F0%9D%92%8F%F0%9D%92%88/1093638062782070/


He positioned the DSA within the long history of regulators trying to strike a balance between supporting the information revolution and combating illegal content created by bad actors. He began with the intermediary liability safe harbors of the US DMCA Section 512, the EU E-Commerce Directive, and Japan’s Provider Liability Law as the prevailing standard, juxtaposing them with the *mandatory* notice-and-takedown rules of China, South Korea, and others. Intermediary liability safe harbors require notice and takedown only as a condition for the liability shield, meaning that intermediaries may retain the noticed content unless they themselves are aware of its illegality. Mandatory notice and takedown, by contrast, is a positive liability-imposing regime and therefore incentivizes intermediaries to take down even lawful content (i.e., to err on the side of removal) for fear that courts may later find the content illegal. He discussed actual ‘false positives’ in those countries that have hampered online civic discourse.
He then discussed the political changes of 2016 following the Trump election, when concerns about online disinformation became pronounced, leading to the passage of the 2017 NetzDG, which was in essence a *mandatory* notice-and-takedown system whose congruence with the intermediary liability safe harbor was in question, and which inspired poorly crafted adaptations into administrative censorship laws in Southeast Asia. Under the new laws in Vietnam, Indonesia, Thailand, Malaysia, and Singapore, the failure to take down upon notice itself became a basis for intermediaries’ liability or even punishability, regardless of whether the harms or illegality of the content were known to the intermediaries, let alone proven at all. Admittedly, NetzDG was interpreted as imposing liability only for known illegality, but the Southeast Asian copycats did not require such knowledge when imposing civil and criminal liability on intermediaries. As a result, content “becomes illegal” when government content moderation bodies (newly repurposed from previously largely technical communication ministries) declare it illegal, rather than its inherent illegality driving intermediary liability.
The European Union then sought to clear up the muddy waters by passing the DSA in 2022, which re-asserted the intermediary liability safe harbor but strengthened and specified the notice-and-takedown procedure that intermediaries must afford in order to benefit from the liability shield, and also imposed transparency obligations (on all online platforms) and risk management obligations (on the big techs only). K.S. Park urged the Thai regulators on site to study the DSA closely, as it could serve as a good replacement for the existing Thai Computer Crimes Act, which imposes administrative censorship.
On the last point, he emphasized that internet freedom is essential for modernizing the economy and society, and that the super-charged intermediary liability safe harbor under the DSA strikes a better balance between platform accountability and the information revolution than the administrative censorship regimes currently prevalent in Southeast Asia.