Online Harm Rules

A recent report by Gallup and the Knight Foundation found that while users disagree over how much responsibility governments or social media companies should bear for content moderation, most want more choice and control over their own experience. While two-thirds of children in the UK say they have experienced online bullying or predatory behaviour, fewer than 20% report such abuse. And it's not just children: nearly two-thirds of adults under 30 say they've been bullied online. Giving users the ability to control who can send them messages and to restrict the content they see online would provide more privacy and protection for all vulnerable groups.

The rules are issued under Pakistan's Telecommunication (Re-organization) Act and the Prevention of Electronic Crimes Act (PECA), and they impose numerous compliance requirements on social media platforms. Under Rule 1(2), the rules come into force “immediately”, which is problematic in itself, since social media platforms must adapt on an ad hoc basis without the benefit of a transitional period. While the rules establish a powerful office, the qualifications, responsibilities, and selection process of the National Coordinator remain questionable. The power to search and seize data without judicial authorization raises privacy concerns for companies and individuals alike. It is worth noting that PECA allows the Pakistan Telecommunication Authority (PTA) to direct the removal of online content through the agencies mentioned in Section 29, and the law also provides legal oversight of unauthorized actions by those agencies. The rules, however, contain no comparable safeguards against the powers of the National Coordinator, in contrast to the provisions of PECA. In addition, the rules impose onerous obligations on service providers to filter “illegal content” on their platforms, even though Section 38 of PECA expressly shields these platforms from such obligations, particularly for user-generated content such as “live streaming”.

Safety by Design holds online platforms and services accountable for user safety by assessing, treating, and mitigating potential harms before they occur.

The policy measures require platforms to review the impact of their design and algorithmic recommendations and to make the results available for audits, thereby increasing accountability and incentivizing platforms to integrate features designed to improve user well-being. Internal risk assessments and independent audits are a common feature of safety policy proposals, including the Kids Online Safety Act (KOSA) in the US Senate, the Digital Services Oversight and Safety Act (DSOSA) in the US House of Representatives, the EU Digital Services Act (DSA), and the UK Online Safety Act. While the United Kingdom relies on a communications regulator to carry out audits, the EU and the US hand responsibility for audits to third-party organisations, a role that large consulting firms are already preparing for.

Content removal: The rules require social media intermediaries to remove, suspend, or disable access to unlawful online content within 24 hours of receiving a notice from the Pakistan Telecommunication Authority (PTA). The decision on whether content is unlawful rests with the PTA or the National Coordinator (appointed by the Minister of Information Technology and Telecommunication). In emergency situations (as determined by the National Coordinator), the intermediary must remove the content within six hours. In Saiyid Abul A'la Maudoodi and others v. The Government of West Pakistan, the Court held that access to justice is a fundamental right of all citizens and that restricting the right to freedom of expression is improper where the necessary safeguards against the arbitrary exercise of power are absent.

However, subsection 4(2) places the National Coordinator above any community norm, rule, or guideline, a provision that risks violating the rights of individuals.