Different jurisdictions have varied definitions of what content and behaviour are illegal. Some content and behaviours are considered illegal in all jurisdictions (such as child abuse material), while others vary from country to country (such as defamation). When content is posted online for the world to see, both the content itself and access to it may be subject to the disparate regulations of multiple jurisdictions.
One of the earlier and most frequently quoted cases exemplifying the problem of multiple jurisdictions is the 2001 Yahoo! case originating in France. It was prompted by a breach of French law, which prohibits the exhibition and sale of Nazi objects, even though the website offering these items – the Yahoo.com auction website – was hosted in the USA, where the display of such materials was, and still is, legal. The case was resolved through a technical solution (geo-location software and access filtering): Yahoo! was required to identify users accessing the site from France and block their access to web pages showcasing Nazi objects.
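The geo-location filtering used in the Yahoo! case can be pictured in a minimal sketch. The IP-to-country prefix table and the list of restricted pages below are hypothetical stand-ins for a real geolocation database and a real content policy; they only illustrate the principle of jurisdiction-dependent access control.

```python
# Illustrative sketch of geo-location filtering: map a client IP address to a
# country and block jurisdiction-restricted pages for users from that country.
# GEO_DB is a hypothetical stand-in for a real IP geolocation database.
GEO_DB = {
    "203.0.113.": "FR",   # documentation IP range, hypothetical mapping
    "198.51.100.": "US",
}

# Hypothetical policy: pages blocked per jurisdiction.
BLOCKED = {"FR": {"nazi-memorabilia-auction"}}

def country_of(ip: str) -> str:
    """Return the country code for an IP via longest-prefix-style lookup."""
    for prefix, country in GEO_DB.items():
        if ip.startswith(prefix):
            return country
    return "UNKNOWN"

def may_access(ip: str, page: str) -> bool:
    """True if the page is not blocked in the user's jurisdiction."""
    return page not in BLOCKED.get(country_of(ip), set())

print(may_access("203.0.113.7", "nazi-memorabilia-auction"))   # French user: False
print(may_access("198.51.100.9", "nazi-memorabilia-auction"))  # US user: True
```

Real deployments rely on commercial geolocation databases and are imperfect (proxies and VPNs can evade them), which is precisely why such technical fixes were debated as a remedy in the case.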
Similarly, the right to be forgotten judgment in the EU (Google Spain and Google v AEPD and Mario Costeja González) imposed on search engines the obligation to consider requests from European users to remove certain results from searches.
A more recent example is the judgment of the Court of Justice of the European Union (CJEU) in case C-18/18 Eva Glawischnig-Piesczek v Facebook Ireland Limited. Ms Glawischnig-Piesczek, an Austrian politician, demanded through the national court that Facebook delete defamatory statements about her, as well as equivalent statements, worldwide. The Austrian Supreme Court asked the CJEU to rule on the interpretation of the Directive on e-commerce, specifically on the obligation of the host provider to remove, or to disable access to, illegal information as soon as it becomes aware of it. In its decision, the CJEU concluded that Facebook, as the host provider, can be ordered by a national court to remove information with content identical or equivalent to the illegal information worldwide, or to block such information from being posted in the first place (through filters). In the case of removing equivalent content, the host provider should not be required to carry out an independent assessment of the content (it may rely on automated search tools and technologies).