Technology Law and Data Privacy Updates
Edition II - February 2025
INDEX
A. SUMMARY
India
- ANI Media v. OpenAI: Indian Music Industry joins Legal Battle over AI & Copyright
- Advisory issued for OTT Platforms on Compliance with Content Regulation
- Government Addresses Web Scraping Concerns for AI Training in Rajya Sabha
- Advisory issued to Social Media and App Hosting Platforms on Telecom Fraud
- Draft Bill on Urban Lending Regulation introduced for Public Consultation
European Union
- Guidelines issued on Definition of ‘AI System’ under AI Act
- EU Parliament highlights Legal Uncertainty over AI Bias Detection Under GDPR and AI Act
- CJEU clarifies Pseudonymisation and GDPR Applicability in EU Institutions
- Switzerland – Guidelines for Use of Cookies Released
- Austria – Preliminary Ruling holds that Companies must balance AI Transparency and Trade Secret Protections under GDPR
United States of America
- Class Action against Amazon over Alleged Unlawful Geolocation and Health related Data Collection
- Home Depot loses Appeal over Insurance Coverage for Data Breach Losses
- Data Law Amendment mandates 15 Days’ Notification for Breach
- Lawsuit filed over Unauthorized Sale of Consumer Driving Data
Australia
- Western Australia enacts Privacy and Information Sharing Law
- AML/CFT Regulations amended to Tighten Oversight of Digital Asset Transactions
United Kingdom
- Updated Guidelines on Employment Data Collection and Retention Released
- Draft Guidelines addressing ‘Online Gender-Based Harms’ Released
Others
- Hong Kong – Court rules on DAO Governance and Liability of Companies in Financial Oversight
SUMMARY
Welcome to the latest edition of Fountainhead Legal’s Data Privacy and Technology Law newsletter.
As technology rapidly reshapes industries, legal and regulatory frameworks around the world are evolving just as swiftly to keep pace. This edition of our newsletter brings you critical updates at the intersection of law, media, technology, and finance, each shaping the future of digital governance and compliance.
India is witnessing a pivotal shift in its regulatory framework as courts and policymakers address the challenges posed by AI, digital content, and financial practices. The ANI Media v. OpenAI case, now joined by the Indian Music Industry, highlights growing concerns over AI’s use of copyrighted material, potentially shaping India’s stance on AI-generated content and intellectual property rights. Meanwhile, the Government’s response on web scraping reinforces stricter enforcement under the DPDP Act, ensuring compliance with data protection laws in AI training. Simultaneously, regulatory scrutiny on OTT platforms and social media is tightening, with advisories emphasizing compliance with content laws and the removal of unlawful or deceptive material. In another push for digital security, the DoT’s directive on telecom fraud aims to curb identity tampering and fraudulent practices on communication platforms. Beyond technology and media, financial regulation is also evolving, with the draft bill on urban lending activities extending oversight to digital lenders, ensuring ethical lending practices and borrower protection. These developments signal a decisive move towards a more structured regulatory environment, balancing innovation with legal safeguards.
On the global front, in the European Union, the regulatory landscape continues to evolve, with the European Commission issuing guidelines on AI system definitions under the AI Act. These guidelines clarify the key criteria for determining whether a system qualifies as AI, ensuring regulatory consistency for businesses and developers. Meanwhile, a European Parliament report highlights legal uncertainties surrounding AI bias detection due to conflicts between the AI Act and GDPR. The CJEU also ruled on pseudonymisation, reinforcing that pseudonymised data remains subject to GDPR if re-identification is possible. Additionally, Austria’s courts weighed in on AI transparency vs. trade secrets, ruling that businesses must disclose key parameters of AI-driven decisions without revealing proprietary algorithmic details. In Switzerland, new cookie guidelines mandate clearer disclosures, informed consent, and safeguards against deceptive dark patterns.
In the United States, significant legal actions are shaping data privacy and cybersecurity regulations. Amazon faces a class action lawsuit over its alleged unauthorized collection of geolocation and health-related data for advertising purposes. Meanwhile, Home Depot lost its appeal for insurance coverage of cyberattack-related financial losses, with the court ruling that general liability policies do not cover electronic data breaches. Strengthening data protection laws, New York has amended its data breach notification law, requiring businesses to inform affected individuals within 15 days and report breaches to multiple state agencies. Additionally, the State of Arkansas has sued General Motors and OnStar over allegations that they secretly sold consumer driving data to brokers without consent. These developments signal heightened scrutiny on AI, data privacy, and cybersecurity, requiring businesses to align with evolving compliance frameworks to mitigate risks.
Moving forward, Western Australia has enacted the Privacy and Responsible Information Sharing Act 2024, strengthening data governance through a Chief Data Officer, mandatory data de-identification, and enhanced transparency measures. Additionally, Australia's AML/CFT regulations have been tightened, extending compliance to lawyers, accountants, and real estate agents, expanding digital asset oversight, and enforcing the ‘Travel Rule’ for transaction monitoring. Further, in the UK, the Information Commissioner’s Office released updated guidelines on employment data collection and retention under UK GDPR, while Ofcom’s draft guidelines under the Online Safety Act 2023 propose stricter measures for tackling gender-based online harms, including AI moderation and enhanced enforcement. In Hong Kong, the Court of First Instance ruled that DAOs remain subject to financial accountability despite decentralization, setting a precedent for regulatory oversight of decentralized financial structures.
As global regulatory landscapes continue to evolve, businesses must stay ahead of emerging compliance requirements in data privacy, AI governance, cybersecurity, and financial oversight. From India’s proactive stance on AI and digital content regulation to the EU’s clarifications on AI systems and GDPR interplay, jurisdictions worldwide are striving to strike a balance between innovation and legal safeguards. The heightened scrutiny in the US, UK, Australia, and Hong Kong further underscores the growing need for businesses to implement robust compliance strategies, ensuring transparency, accountability, and risk mitigation. As we navigate this ever-changing digital landscape, staying informed and adaptable will be key to achieving long-term regulatory resilience and operational success.
Fountainhead Legal is committed to supporting organizations on this journey. With our deep expertise in data privacy compliance and a strong understanding of regulatory nuances, we offer tailored solutions for each client’s unique needs. From drafting privacy policies and developing data protection frameworks to advising on cross-border data transfers and facilitating employee training programs, our team is equipped to guide clients through every stage of their compliance strategy.
We hope you enjoy our latest updates!
NATIONAL
1. ANI Media v. OpenAI: Indian Music Industry joins Legal Battle Over AI & Copyright[1]
The Delhi High Court has issued fresh directions in the ongoing ANI Media Pvt Ltd v. OpenAI Inc. & Anr [CS(COMM) 1028/2024] case. To revisit the facts, ANI had accused OpenAI of the unauthorized use of its copyrighted news content by ‘ChatGPT’, OpenAI’s large language model, alleging that the AI system stored and reproduced its material for training and responses, in violation of the Copyright Act, 1957. ANI sought an interim injunction to prevent OpenAI from storing, reproducing, or utilizing its content, while also requesting that access to its copyrighted works be blocked. OpenAI countered that ANI’s domain had already been blacklisted in October 2024, ensuring its exclusion from future AI training. The Court had acknowledged this but continues to examine broader copyright concerns.
Now, adding to the case’s significance, the Indian Music Industry—representing Super Cassettes Industries Pvt. Ltd. and Saregama India Ltd.—has sought intervention, aiming to present legal arguments on the impact of AI on copyright protection in the music and entertainment sector. The Court admitted the intervention application and has directed OpenAI to file a reply within 2 weeks.[2] Additionally, the Court also partly heard the submissions by the Amicus Curiae—Prof. Arul George Scaria and Mr. Adarsh Ramanujan. The Court has kept the matter for further hearings scheduled on March 10, 2025, and March 18, 2025.[3]
This case could set a legal precedent on how AI models interact with copyrighted content in India. As AI-generated content becomes more widespread, courts and regulators will need to strike a balance between innovation and intellectual property rights. With major media and entertainment stakeholders now involved, this case is poised to influence copyright law, AI governance, and digital content protection in the years ahead.
2. Advisory issued for OTT Platforms on Compliance with Content Regulation[4]
On February 19, 2025, MIB issued an advisory to OTT platforms and their self-regulatory bodies, urging strict compliance with Indian laws and the ‘Code of Ethics’ prescribed under the IT Rules. The advisory reminds platforms of their responsibilities under the ‘Code of Ethics’, emphasizing the prohibition of unlawful content, enforcement of age-based classification, and restriction of adult-rated content through proper access controls. Additionally, OTT platforms are required to exercise caution and discretion while publishing sensitive content to ensure compliance with legal and ethical standards. The advisory highlights that OTT platforms must comply with laws such as the IRWA, BNS, POCSO, and IT Act, which prohibit the publication of obscene or harmful content, and warns that violations could lead to legal consequences, reinforcing the need for strict adherence to the IT Rules.
The advisory comes at a time when FIRs have been filed against popular influencers for allegedly posting obscene content on social media, sparking concerns over the regulation of digital content. These incidents highlight the growing scrutiny on online platforms and the legal risks of publishing content that violates public decency laws.
3. Government Addresses Web Scraping for AI Training in Rajya Sabha[5]
In response to a Rajya Sabha query, MeitY clarified the regulatory framework governing web scraping of publicly available data for AI model training. ‘Web scraping’ refers to the automated extraction of data from websites, often used for market research, analytics, and AI development. However, concerns have been raised regarding unauthorized data collection by social media platforms and AI companies.
MeitY emphasized that web scraping is regulated under the IT Act and the rules framed thereunder, as well as the DPDP Act, 2023. Section 43 of the IT Act penalizes unauthorized access to computer systems, while the IT Rules mandate that social media platforms prevent unauthorized collection and misuse of user data. Additionally, the DPDP Act requires entities to obtain user consent before processing personal data and may impose penalties of up to INR 250 crore for non-compliance.
While the IT Act already penalizes unauthorized web scraping, enforcement will become significantly stricter under the DPDP Act. The DPDP Act mandates explicit user consent before processing personal data and introduces severe financial penalties for violations. Once fully implemented, companies engaged in web scraping will face tighter regulatory scrutiny and must align their practices with stringent data protection norms. Businesses should take these obligations seriously to avoid legal risks and ensure compliance with evolving data privacy laws.
4. Advisory issued to Social Media and App Hosting Platforms on Telecom Fraud[6]
DoT has issued an advisory directing social media and app hosting platforms to remove content and applications that facilitate telecom identity tampering, such as Caller Line Identification (CLI) spoofing, IP address masking, and International Mobile Equipment Identity (IMEI) manipulation.
The move follows reports of an influencer sharing methods to alter caller identification, violating the telecom regulations, as per which telecom identity tampering is a punishable offense, carrying fines of up to INR 50 lakh and imprisonment of up to 3 years. Additionally, abetting such offenses is also punishable. Social media and app hosting platforms are required to remove such content and submit compliance reports by February 28, 2025.
The advisory signals stricter enforcement of digital security laws, placing greater responsibility on platforms to curb fraudulent telecom practices. Failure to comply could result in legal action under the Telecom Act, marking a tougher stance on digital fraud prevention.
5. Draft Bill on Urban Lending Activities introduced for Public Consultation[7]
The MoF has released a draft Bill on Urban Lending Activities, 2024 (“BULA”), seeking public feedback. BULA aims to regulate urban money lending practices, ensuring transparency, borrower protection, and financial stability. It addresses concerns over predatory lending, unregulated credit practices, and unfair interest rates.
BULA mandates that all lending entities obtain authorization from the RBI and prohibits unauthorized lending. It establishes strict penalties, including imprisonment of up to 7 years and fines up to INR 10 million for unregulated lending and up to ten years imprisonment for harassment in loan recovery. BULA also targets misleading loan offers, imposing up to 5 years of imprisonment and fines up to INR 1 million for deceptive claims. To enhance transparency, it proposes an online public database of regulated lenders to help borrowers verify legitimate lenders. BULA covers both traditional and digital lending platforms, ensuring fair interest rates, clear loan terms, and ethical recovery methods. The Government had invited public feedback until February 13, 2025, signalling a push towards stronger consumer protection and ethical lending standards in India’s urban financial sector.
BULA expands upon the RBI’s Digital Lending Guidelines, 2022 (“Guidelines”) by extending regulation beyond banks and NBFCs to cover all digital and urban lenders. While the Guidelines are focused on transparency, data privacy, and fair lending for RBI-regulated entities, BULA introduces a licensing framework for all lenders, including those previously unregulated. It also establishes criminal penalties for unauthorized lending, deceptive loan offers, and coercive recovery methods, ensuring stronger enforcement. Additionally, BULA mandates a public database of authorized lenders, enhancing borrower protection and promoting ethical lending practices across both digital and traditional platforms.
INTERNATIONAL
EUROPEAN UNION
6. Guidelines issued on Definition of ‘AI System’ under AI Act[8]
The European Commission has released guidelines to clarify the definition of an ‘AI system’ under the AI Act, which came into force on August 1, 2024. These guidelines aim to assist AI developers, businesses, and regulators in determining whether a system qualifies as an AI system under Article 3(1) of the AI Act, ensuring effective application and enforcement of the law.
The AI Act defines an AI system as a machine-based system designed to operate with varying levels of autonomy and, in some cases, adaptiveness. The guidelines outline 7 core elements that determine whether a system qualifies as ‘AI’, covering aspects like autonomy, inference capabilities, and interaction with the environment. While the definition of an AI system became applicable on February 2, 2025, the guidelines are yet to be adopted. However, they serve as an essential reference for AI providers and regulatory bodies across the EU. These guidelines also work alongside the European Commission’s guidelines on prohibited AI practices, ensuring clarity on the types of AI systems and their compliance obligations.
By providing a structured interpretation of AI, these guidelines help businesses assess whether their AI models are subject to regulatory oversight. This ensures greater legal certainty for AI developers while promoting responsible innovation. Companies operating in the AI sector must now carefully evaluate their systems against these criteria to determine regulatory obligations under the AI Act.
7. EU Parliament highlights Legal Uncertainty over AI Bias Detection under GDPR & AI Act[9]
A recent EPRS report highlights legal challenges in addressing AI bias due to conflicting provisions in the AI Act and GDPR. The conflict between the AI Act and GDPR arises from how each regulation treats the processing of special categories of personal data, such as ethnicity, biometric data, and racial origin, which are crucial for detecting and mitigating AI bias.
While the AI Act allows data processing to correct bias, the GDPR imposes heavy restrictions, making it legally uncertain when and how AI systems can process sensitive data. The core issue is whether AI bias correction qualifies as a ‘substantial public interest’ exception under GDPR, as no clear legal basis currently exists. This ambiguity could create legal risks for businesses deploying AI systems while trying to comply with both laws. As such, the report suggests that guidance from regulators or legislative reforms may be needed to clarify how AI developers can legally process sensitive data for bias detection without violating GDPR.
8. CJEU clarifies Pseudonymisation and GDPR Applicability in EU Institutions[10]
In European Data Protection Supervisor v. Single Resolution Board [Case C-413/23 P], the CJEU examined whether pseudonymised data remains ‘personal data’ under GDPR when different entities hold the key to re-identification. The dispute arose over whether pseudonymised information processed by one entity ceases to be personal data if another entity retains the means to reverse the pseudonymisation.
The CJEU ruled that pseudonymised data remains personal data if any entity—whether the processor or another party—can re-identify it. The court emphasized that pseudonymisation enhances data security but does not remove the data from GDPR’s scope if re-identification is feasible.
9. Switzerland – Guidelines for Use of Cookies Released[11]
In Switzerland, Guidelines on Data Processing Using Cookies (“Guidelines”) were released to establish a proper framework on the use of cookies for data collection and data processing. The Guidelines apply to website operators and third-party services handling user data and emphasize compliance with principles such as transparency, proportionality, and informed consent.
Key provisions require clear disclosures on cookie usage, distinguishing between essential and non-essential cookies, and mandating opt-in consent and assessment through DPIA for high-risk profiling or sensitive data processing. Additionally, to counter dark patterns, websites must ensure that rejecting cookies is as simple as accepting them, preventing misleading user interface elements such as pre-ticked checkboxes, that manipulate user choices. Further, a ‘two-click solution’ is recommended to block third-party tracking scripts before user approval.
10. Austria – Preliminary Ruling holds that Companies must balance AI Transparency and Trade Secret Protections under GDPR[12]
In CK v. Dun & Bradstreet Austria GmbH [Case C-203/22], the CJEU examined the balance between GDPR transparency obligations and trade secret protections. The case arose after a consumer was denied a mobile contract based on a credit rating and requested access to the logic behind the decision under Article 15(1)(h) GDPR, which grants individuals the right to access their personal data and understand how automated decisions affecting them are made. However, the company refused, citing trade secrets, and the matter was eventually referred to the CJEU.
The Advocate General’s opinion, issued on September 12, 2024, states that companies must disclose key parameters, weighting factors, and personal data used in automated decisions but may withhold technical algorithmic details if they qualify as trade secrets. Businesses cannot, however, refuse transparency entirely and must provide sufficient explanation for individuals to assess the decision’s fairness. As this is an opinion in a preliminary ruling procedure, the CJEU’s final judgement is yet to come.
UNITED STATES OF AMERICA
11. Class Action against Amazon over Alleged Unlawful Geolocation and Health related Data Collection[13]
A class action lawsuit has been filed against Amazon.com, Inc. and Amazon Advertising, LLC (“Amazon”), accusing the tech giant of unlawfully collecting and monetizing users’ location data that also unveiled users’ visits to medical facilities or health service providers, thereby revealing potential health conditions or concerns without their consent. The lawsuit alleges that Amazon’s Ads Software Development Kit (“SDK”)—integrated into thousands of third-party mobile apps—secretly collected users’ real-time location data, which, in turn, enabled access to sensitive health-related information such as visits to a cancer clinic, fitness routines, dietary habits, and even social determinants of health by indicating where a person lives and works.
The complaint alleges that Amazon used this data for targeted advertising and also sold it to third parties for profit, all without properly informing users. It further alleges that Amazon failed to provide transparency regarding its data collection practices, misleading users into believing their personal information was secure and as a result, has violated multiple state and federal laws. The lawsuit seeks monetary damages, restitution, and injunctive relief to prevent further data collection without consent.
This case could have far-reaching implications for data privacy in digital advertising, particularly regarding the use of SDKs by major tech companies to track user behaviour. If the court rules in favour of the plaintiffs, it may set a new precedent for stricter data privacy regulations and enforcement. Companies relying on user data for targeted advertising may need to reassess their data collection practices to ensure compliance with evolving legal standards.
12. Home Depot loses Appeal over Insurance Coverage for Data Breach Losses[14]
The U.S. Court of Appeals for the Sixth Circuit has ruled against Home Depot (the “Company”), a leading home improvement retailer, in Home Depot, Inc. v. Steadfast Insurance Company and Great American Assurance Company [No. 23-3720]. The case involved the Company’s attempt to claim insurance coverage for financial losses resulting from a massive data breach that exposed millions of customers’ payment card details.
Back in 2014, the Company suffered a cyberattack when hackers stole customer payment card information from its self-checkout terminals. Financial institutions were forced to cancel and reissue compromised cards, monitor fraud risks, and compensate for fraudulent transactions, leading to substantial losses. After settling these claims for USD 170 million, the Company’s cyber insurance covered USD 100 million, but the Company sought additional coverage under its commercial general liability policies (“Policy”) with Steadfast and Great American. The insurers denied coverage, arguing that the Policy excluded losses related to electronic data breaches and only applied to tangible property damages. They also refused to defend the Company in lawsuits filed by banks, prompting the Company to challenge the decision in court.
The court agreed with the insurers that electronic data does not qualify as tangible property under the policy terms, making the loss of customer payment information ineligible for coverage. Additionally, the court found that the Policy explicitly excluded damages resulting from electronic data breaches, including claims related to the loss of use of such data. As a result, since the claim fell outside the scope of coverage, the insurers had no legal obligation to defend the Company in lawsuits filed by financial institutions seeking compensation for fraud-related losses.
13. Data Law Amendment mandates 15 days’ Notification for Breach[15]
The New York State Senate has amended Section 899-aa of the General Business Law, 2012, strengthening data breach notification requirements for businesses handling personal information of New York residents. Businesses are now required to notify affected individuals within 15 days of discovering a breach, replacing the previous requirement of notifying ‘in the most expedient time possible’. This amendment also applies to third-party data holders.
In addition to data breach notifications to affected individuals, businesses must now report data breaches to four state agencies: the New York Attorney General, the Department of State, the Division of State Police, and the Department of Financial Services. The bill also clarifies that reporting to authorities must not delay notifications to affected individuals.
14. Lawsuit over Unauthorized Sale of Consumer Driving Data[16]
The State of Arkansas has filed a lawsuit against General Motors LLC (“GM”) and OnStar LLC (“OnStar”) in the Circuit Court of Phillips County, alleging violations of the ADTPA. The lawsuit accuses GM and OnStar of secretly collecting and selling consumer driving data to third-party data brokers without proper consent. The complaint states that GM’s telematics system tracked data such as trip start and end times, speed, high-speed driving percentage, late-night driving, acceleration and braking habits, and location information, even for drivers who did not enrol in OnStar services. This data was allegedly sold to brokers, including Verisk, LexisNexis, and Wejo, who then resold it to insurance companies, potentially affecting consumers’ insurance premiums, coverage decisions, or policy cancellations. GM reportedly continued these practices for years before ending its agreements with LexisNexis and Verisk following an investigation into the matter by the New York Times.
The lawsuit alleges multiple legal violations, including deceptive trade practices, failure to disclose material facts, unconscionable business practices, and unjust enrichment. It claims that GM misled consumers by failing to clearly disclose in its lengthy privacy policies—some over 36 pages long—that their driving data could be sold to third parties. Additionally, GM is accused of using deceptive interface designs to obscure its data collection practices. The complaint seeks an injunction to stop GM and OnStar from collecting and selling driver data without informed consent, civil penalties of up to USD 10,000 per violation under the ADTPA, and the disgorgement of profits obtained from the sale of consumer driving data.
AUSTRALIA
15. Western Australia enacts Privacy and Information Sharing Law[17]
The Privacy and Responsible Information Sharing Act 2024 (“PRIS Act”) has been enacted in Western Australia, introducing a legal framework for handling personal information by public entities, ministers, and service providers. It aims to balance privacy protection with responsible data sharing, ensuring secure and transparent management of personal data. It establishes a ‘Chief Data Officer’ to oversee data governance and privacy safeguards, ensuring compliance with the law.
The PRIS Act also mandates data de-identification measures to protect individuals’ privacy when sharing personal information. Additionally, it amends ‘Freedom of Information’ laws to enhance public access to Government-held data, promoting transparency while maintaining strong privacy protections.
Australia is rapidly advancing its cybersecurity and data governance framework, demonstrating a commitment to stronger regulatory oversight. Following the introduction of three new cyber law bills, the passage of the PRIS Act marks another step toward enhancing data privacy, transparency, and responsible information use. By creating structured policies for data sharing, security, and public access, Australia is positioning itself as a well-regulated jurisdiction, balancing innovation with privacy protections to safeguard citizens’ data in an evolving digital landscape.
16. AML/CFT Regulations amended to Tighten Oversight of Digital Asset Transactions[18]
Australia has passed the Anti-Money Laundering and Counter-Terrorism Financing Amendment Bill 2024 to enhance its financial crime prevention framework. The amendment expands AML/CTF obligations to previously unregulated sectors, including lawyers, accountants, and real estate agents. Key reforms introduce stricter customer due diligence, enhanced reporting structures, and increased AUSTRAC enforcement powers. The amendment also tightens rules on VASPs and information-sharing mechanisms.
Key legal changes also focus on implementing the ‘Travel Rule’ which mandates the sharing of transaction information between parties for digital asset transfers, ensuring better transparency and monitoring. Additionally, the scope of ‘digital assets’ has expanded to include Non-fungible Tokens (NFTs), stablecoins, and governance tokens, requiring VASPs to apply AML/CTF measures to a broader range of assets. These amendments align with international standards and aim to combat illicit activities such as money laundering and terrorism financing in the rapidly evolving digital asset space. The implementation will be phased, with major provisions taking effect between 2026 and 2027.
UNITED KINGDOM
17. Updated Guidelines on Employment Data Collection and Retention Released[19]
The UK Information Commissioner’s Office has issued Employment practices and data protection: keeping employment records (“Guidelines”) on collecting and maintaining employment records under UK GDPR and the DPA. The Guidelines outline employers’ responsibilities regarding the lawful collection, processing, storage, and retention of employee data. Covered data includes personnel records, payroll details, performance evaluations, disciplinary records, and health-related information, with additional safeguards required for sensitive categories such as biometric and criminal records data.
The Guidelines apply to all UK employers, including public organizations, requiring them to establish a lawful basis for data collection, limit retention periods, ensure transparency, and implement security measures to protect employee privacy. Employers must also uphold employees’ rights over their data, including access, correction, and erasure requests. The Guidelines emphasize responsible data management, pushing organizations to review internal policies and avoid potential regulatory penalties.
18. UK Draft Guidelines addressing ‘Online Gender-Based Harms’ Released
Ofcom’s draft guidance, A Safer Life Online for Women and Girls[20] (“Draft Guidelines”) published on February 25, 2025, is part of the implementation of the Online Safety Act 2023 (“OS Act”) and sets out mandatory and recommended actions for online service providers to tackle online gender-based harms. The Draft Guidelines identify 4 main categories of online gender-based harms – ‘Online Misogyny’, ‘Pile-ons and Harassment’, ‘Online Domestic Abuse’, and ‘Image-based Sexual Abuse’ – and outline 9 key actions for service providers covering risk assessments, safer platform design, reporting mechanisms, and enforcement against violators. The Draft Guidelines take a safety-by-design approach, requiring companies to implement pre-set security measures, AI-based content moderation, and effective reporting tools. Once finalized, companies will be assessed for compliance 18 months after implementation, with enforcement actions where necessary. This regulatory framework marks a major shift in accountability for online platforms, setting new safety standards to address gender-based harms.
The public consultation period remains open until May 23, 2025.
19. Court rules on DAO Governance and emphasizes Liability of Companies in matters of Financial Oversight[21]
The Hong Kong Court of First Instance, in Mantra DAO Inc. & RioDefi Inc. v. John Mullin and Others (HCA 749/2022, [2024] HKCFI 2099), ruled on a dispute concerning financial control, governance, and asset misappropriation within a Decentralized Autonomous Organization (“DAO”) finance platform. The plaintiffs accused the defendants of unlawfully taking control of the DAO, withholding financial records, and misusing project funds; the defendants countered that DAO governance is decentralized and managed by token holders, not by any single entity. The central issue was whether, in withholding financial records and misusing project funds, the defendants had breached their contractual and fiduciary duties.
The court acknowledged the complex governance structure of DAOs but held that decentralization does not exempt entities from financial accountability. It granted an ‘Accounts Disclosure Order’, requiring the defendants to provide financial records from January 1, 2021, onwards. The ruling sets a key precedent, reinforcing that DAOs, despite their decentralized nature, are still subject to legal and financial oversight.
- ADTPA – Arkansas Deceptive Trade Practices Act, 2023
- AI Act – Artificial Intelligence Act, 2024
- AML/CFT – Anti-Money Laundering and Countering the Financing of Terrorism
- BNS – Bharatiya Nyaya Sanhita, 2023
- CJEU – Court of Justice of the European Union
- CrPC – Code of Criminal Procedure, 1973
- DoT – Department of Telecommunications
- DPA – Data Protection Act, 2018
- DPDP Act – Digital Personal Data Protection Act, 2023
- DPIA – Data Protection Impact Assessment
- EPRS – European Parliamentary Research Service
- IRWA – Indecent Representation of Women (Prohibition) Act, 1986
- IT Act – Information Technology Act, 2000
- IT Rules – Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021
- MIB – Ministry of Information and Broadcasting
- MoF – Ministry of Finance
- POCSO – Protection of Children from Sexual Offences Act, 2012
- Telecom Act – Telecommunications Act, 2023
- VASP – Virtual Asset Service Provider
Authors:
- Rashmi Deshpande
- Aarushi Ghai
- Shriya Haridas
[1] https://delhihighcourt.nic.in/court/judegment_orders?pno=1247719
[2] https://dhcappl.nic.in/dhcorderportal/GetOrder.do?ID=abl/2025/760734241739880403109_7561_10282024.pdf
[3] https://dhcappl.nic.in/dhcorderportal/GetOrder.do?ID=abl/2025/145729751740464415393_95672_10282024.pdf
[4] https://mib.gov.in/sites/default/files/2024-02/Advisory%20to%20Digital%20News%20Publishers%20and%20OTT%20Platforms%2003.10.2022%20%281%29%20%281%29.pdf
[5] https://sansad.in/getFile/annex/267/AU558_hhtT2g.pdf?source=pqars
[6] https://dot.gov.in/sites/default/files/Advisory%20to%20Social%20media%20platform%20and%20application%20hosting%20platform%20dated%2018022025.pdf
[7] https://www.fidcindia.org.in/wp-content/uploads/2024/12/MOF-BULA-DRAFT-BILL-13-12-24.pdf
[8] https://digital-strategy.ec.europa.eu/en/library/commission-publishes-guidelines-ai-system-definition-facilitate-first-ai-acts-rules-application
[9] Algorithmic discrimination under the AI Act and the GDPR
[10] Case C 413/23 P https://curia.europa.eu/juris/document/document.jsf?text=&docid=295078&pageIndex=0&doclang=en&mode=req&dir=&occ=first&part=1&cid=28925561
[11] https://backend.edoeb.admin.ch/fileservice/sdweb-docs-prod-edoebch-files/files/2025/02/26/0c422c14-fef7-4bba-8460-5e92ec00f07f.pdf
[12] C-203/22 https://curia.europa.eu/juris/document/document.jsf?text=&docid=290022&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=732215
[13] https://storage.courtlistener.com/recap/gov.uscourts.wawd.344509/gov.uscourts.wawd.344509.1.0.pdf
[14] https://www.govinfo.gov/content/pkg/USCOURTS-ca6-23-03720/pdf/USCOURTS-ca6-23-03720-0.pdf
[15] https://www.nysenate.gov/legislation/bills/2023/S2659/amendment/A
[16] Case No. 54-CV-25-77, doc04129820250226084129.pdf
[17] https://www.legislation.wa.gov.au/legislation/prod/filestore.nsf/FileURL/mrdoc_48019.pdf/$FILE/Privacy%20and%20Responsible%20Information%20Sharing%20Act%202024%20-%20%5B00-a0-01%5D.pdf?OpenElement
[18] https://www.aph.gov.au/Parliamentary_Business/Bills_Legislation/bd/bd2425/25bd018
[19] https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/employment/employment-practices-and-data-protection-keeping-employment-records/collecting-and-keeping-employment-records/#bases
[20] Consultation on draft Guidance: A safer life online for women and girls – Ofcom
[21] https://legalref.judiciary.hk/lrs/common/ju/loadPdf.jsp?url=https://legalref.judiciary.hk/doc/judg/word/vetted/other/en/2022/HCA000749_2022.doc&mobile=N