Technology Law and Data Privacy Updates

Annual Edition 2025

INDEX 

A. FOUNDER’S NOTE

B. NATIONAL UPDATES 

C. INTERNATIONAL UPDATES

United States of America

European Union

Others

D. ABBREVIATIONS

FOUNDER’S NOTE

As we look back at the year gone by, one thing is unmistakably clear. Across jurisdictions, lawmakers, regulators, and courts have been unusually active in shaping the rules governing data privacy, digital assets, and emerging technologies. India continued to recalibrate its digital governance framework, while globally, countries accelerated efforts to bring coherence and accountability to data-driven business models. What stood out was not merely the volume of regulatory activity, but its intent. The focus has been on replacing uncertainty with structure, and experimentation with responsibility. From privacy and crypto to platform governance, the year marked a decisive shift toward rule-based innovation rather than reactive compliance.

In India, this transition was most visible in the steady progress toward operationalising the Digital Personal Data Protection Act. The release of draft rules and implementation timelines underscored that data protection is no longer a future concern but an imminent compliance reality. Alongside this, regulators across financial services, telecom, and gaming sharpened their focus on cybersecurity, user protection, and accountability. Courts too played a significant role, clarifying intermediary obligations, reinforcing duties of care, and recognising digital assets and interests within established legal frameworks. Together, these developments reflected India’s broader attempt to balance innovation with safeguards, even as the DPDP regime continues to evolve.

Globally, the momentum was equally strong. In the United States, comprehensive data privacy laws took effect in eight states, further fragmenting, but also deepening, the country’s privacy landscape. The European Union continued refining its mature framework, clarifying how data protection obligations intersect with competition law and platform regulation, while enforcement actions reinforced that compliance failures now carry tangible consequences. In the United Kingdom, the passage of the Data (Use and Access) Act signalled a recalibration of the post-Brexit data regime, aimed at enabling responsible data use without diluting core protections. Vietnam, notably, emerged as a jurisdiction to watch, with comprehensive digital and data-related legislation reflecting a clear shift from restraint to regulated participation in the digital economy.

Crypto regulation also moved out of its tentative phase. Globally, policymakers sought to bring stability and legitimacy to digital asset markets, most notably through landmark legislative initiatives such as the GENIUS Act in the United States, which aims to provide clearer guardrails for stablecoins and market participants. In India, judicial decisions involving a prominent Indian crypto exchange tackled issues on custody, user rights, and the legal character of digital assets. These developments, taken together, suggest that crypto is increasingly being viewed not as a regulatory anomaly, but as an asset class that must fit within existing legal and fiduciary principles.

Beyond regulation, the year was equally transformative for artificial intelligence. Governments and regulators across jurisdictions began grappling with the practical realities of AI deployment, moving from abstract principles to concrete obligations. New laws, guidelines, and advisories addressed AI safety, transparency, synthetic content, and accountability, reflecting a growing recognition that AI systems shape markets, speech, and individual rights at scale. At the same time, courts and regulators started defining the boundaries of permissible AI development, from data sourcing to fair use, signalling that the AI ecosystem will be governed as much by law as by code.

As we step into the year ahead, we hope it brings clarity, resilience, and perhaps fewer consent banners, though we won’t promise that just yet. We wish you a productive and rewarding year ahead and look forward to continuing this journey through an evolving digital and regulatory landscape!

NATIONAL 

1. Timelines released for enforcement of Digital Personal Data Protection Framework[1]

MeitY notified the DPDP Rules, bringing into force the operational framework under the DPDP Act. The DPDP Rules prescribe detailed compliance obligations for data fiduciaries and data processors, including requirements relating to notice and consent, grievance redressal, personal data breach reporting, data retention and deletion, cross-border transfer of personal data, and the identification and obligations of Significant Data Fiduciaries. The Data Protection Board of India (“DPB”) was constituted with effect from November 13, 2025, marking the institutional foundation of India’s new privacy regime. Further, key enforcement provisions, including penalty powers, will come into force only from May 13, 2027, providing an 18-month transition window for organisations. The DPDP Rules also provide for a separate timeline for Consent Managers, whose registration framework will become operational from November 13, 2026. In the interim, companies are expected to focus on updating privacy notices, restructuring consent flows, implementing baseline security safeguards, and reviewing data-retention practices.

A key feature of the DPDP Rules is the operationalisation of the DPB as the primary enforcement and adjudicatory authority under the Act. The Rules set out the composition, powers, and procedure of the DPB, including the manner of inquiry, issuance of directions, imposition of monetary penalties, and handling of complaints by data principals. They also provide clarity on digital processes for filings and hearings, reinforcing the DPB’s role as a technology-driven regulator with quasi-judicial functions.

With the DPDP Rules now in force, organisations processing personal data in India will be required to move from high-level preparedness to implementation readiness. Data fiduciaries should begin reviewing their consent flows, privacy notices, breach response plans, vendor arrangements, and cross-border data transfer mechanisms to align with the prescribed formats and timelines. The notification of the Rules marks a decisive shift from principle-based obligations to rule-based compliance under India’s data protection regime, signalling the start of active regulatory enforcement in the data privacy landscape.

2. Supreme Court recognised ‘Right to Digital Access’ as Fundamental Right[2]

The Supreme Court recognised the right to digital access as an integral part of the right to life and personal liberty under Article 21 of the Constitution of India. In a landmark judgment delivered in May 2025, the court held that access to digital systems, platforms, and services must be inclusive and accessible to persons with disabilities. The ruling arose in the context of barriers faced by disabled persons in accessing online Government portals, welfare schemes, and digital public infrastructure. The court emphasised that as the State increasingly relies on digital modes for governance and service delivery, exclusion from such systems effectively results in denial of basic rights and entitlements. The apex court directed the Government and public authorities to ensure that digital platforms are designed and maintained in compliance with accessibility standards, including compatibility with assistive technologies, alternative formats, and user-friendly interfaces.

This judgment significantly expands the scope of Article 21 by expressly extending constitutional protection to digital inclusion and accessibility. It also places a positive obligation on the State to proactively design digital governance systems that do not marginalise persons with disabilities, reinforcing the principle that technological advancement must be accompanied by constitutional safeguards for equality and dignity.

3. Government enacted First Ever Online Gaming Legislation[3]

The Online Gaming Act, 2025 was enacted, establishing a national statutory framework to regulate online gaming platforms and services. The legislation seeks to bring uniformity and legal certainty to the sector by distinguishing permissible online games from prohibited online money games, while formally recognising and legitimising e-sports and skill-based gaming. It introduces a licensing and compliance regime for gaming intermediaries, mandates user verification and grievance redressal mechanisms, and imposes obligations relating to transparency, advertising standards, and consumer protection. The Act also empowers the Central Government to designate a regulatory authority and prescribe additional rules for registration, monitoring, and enforcement. Pursuant to the enactment of the legislation, MeitY also released the Draft Promotion and Regulation of Online Gaming Rules, 2025, which operationalise the statutory framework by providing for registration of online games, establishment of a national registry, and oversight by a proposed Online Gaming Authority of India. The Rules prescribe timelines for certification, conditions for suspension or cancellation of registration, and mandatory grievance redressal and compliance mechanisms for gaming platforms.

By providing a clear legal framework, the Online Gaming Act, 2025 marks a significant shift from fragmented state-level regulation to centralised oversight of the online gaming ecosystem. The legislation aims to curb gambling and financial misuse disguised as gaming, while simultaneously fostering innovation and growth in legitimate digital gaming and e-sports. For gaming platforms, the Act signals heightened regulatory scrutiny and compliance expectations, while for users it promises greater safeguards against addictive practices, financial harm, and opaque platform operations.

4. MeitY released Draft Guidelines to curb AI Misuse[4]

MeitY released AI Governance Guidelines (“Guidelines”) in November 2025, setting out a pro-innovation and human-centric framework for the development and deployment of artificial intelligence systems. The Guidelines adopted a principle-based approach anchored in the concept of “Do No Harm”, emphasising accountability, transparency, safety, and fairness across the AI lifecycle. They outlined expectations for risk identification, bias mitigation, data quality, human oversight, and responsible deployment, while consciously avoiding prescriptive compliance obligations that could hinder innovation.

The Guidelines reflect India’s intent to balance technological advancement with safeguards against misuse and societal harm. By positioning the framework as voluntary and adaptive, MeitY has sought to encourage responsible self-regulation by developers and deployers of AI systems, particularly in high-impact use cases. The release of the Guidelines is expected to influence future sector-specific regulation and serve as a reference point for organisations designing AI governance frameworks aligned with emerging global standards for safe and trusted AI.

5. CERT-In released Elemental Cyber Defence Controls for MSMEs[5]

CERT-In released updated cybersecurity standards and audit guidelines titled Elemental Cyber Defence Controls for MSMEs, setting out a baseline security framework for micro, small, and medium enterprises. Issued in September 2025, the guidelines outline practical and scalable cybersecurity controls covering asset management, access control, data protection, network security, incident response, and system monitoring. The document is intended to address the growing vulnerability of MSMEs to cyberattacks and ransomware incidents, particularly in light of increased digitisation and reliance on cloud-based and online services.

The updated framework aims to promote cyber resilience by encouraging MSMEs to adopt minimum security hygiene standards proportionate to their size and risk exposure. While not framed as mandatory, the guidelines are expected to serve as a reference point for regulatory assessments, contractual security obligations, and cybersecurity audits across sectors. For MSMEs, the publication provides a structured roadmap to strengthen cyber defences, reduce systemic risk, and align internal security practices with national cybersecurity expectations.

6. Government tightened Aadhaar Rules on Offline Verification[6]

UIDAI notified the Aadhaar Authentication and Offline Verification (Amendment) Regulations, 2025 (“Regulations”), introducing significant changes to the framework governing offline and remote Aadhaar verification. The Regulations expand the permissible modes of offline verification and refine conditions for remote authentication, with an increased focus on data minimisation, purpose limitation, and user consent. The Regulations also update technical and procedural safeguards applicable to requesting entities, including restrictions on storage, reuse, and sharing of Aadhaar-related information obtained through offline verification mechanisms.

These Regulations are expected to reshape how Aadhaar-based verification is carried out across sectors such as banking, fintech, telecom, and digital services. By strengthening safeguards around offline and remote verification, the updated framework seeks to reduce the risk of misuse while enabling secure and privacy-conscious identity verification. For regulated entities, the changes signal the need to reassess existing Aadhaar verification workflows and align them with the revised compliance and security expectations notified by UIDAI.

7. NCLAT mandated User Consent for all Data Sharing between WhatsApp and Affiliates[7]

NCLAT clarified that WhatsApp LLC (“WhatsApp”) and Meta Platforms Inc. (“Meta”) must obtain express and informed user consent for all forms of data sharing, including both advertising and non-advertising purposes. In WhatsApp LLC v. Competition Commission of India, decided in December 2025, NCLAT upheld the Competition Commission of India’s findings that WhatsApp’s 2021 privacy policy failed to provide users with a genuine choice and transparency regarding the extent and purpose of data sharing with Meta group entities. The NCLAT emphasised that user consent cannot be presumed or bundled and must be freely given, specific, and unambiguous, particularly where data sharing has competitive and consumer impact.

The ruling reinforces the principle that dominant digital platforms cannot rely on take-it-or-leave-it consent mechanisms to legitimise extensive intra-group data sharing. By drawing a clear distinction between user communication services and data monetisation practices, the Tribunal underscored that even non-advertising data sharing requires independent user approval. This decision has significant implications for large technology platforms operating in India, especially in the context of upcoming enforcement under the DPDP Act, as it signals heightened scrutiny of consent frameworks and data sharing arrangements across digital ecosystems.

8. DPIIT released Draft Policy on AI and Copyright[8]

DPIIT released the first part of its working paper examining copyright challenges posed by generative artificial intelligence systems. The draft paper analysed how existing copyright principles apply to AI training, authorship, and ownership of AI-generated outputs, and evaluated global approaches adopted by jurisdictions such as the European Union, United States, and United Kingdom. It also identified key areas of concern, including the use of copyrighted works for model training, the scope of fair use or fair dealing exceptions, liability for infringing outputs, and the balance between innovation and protection of creators’ rights.

The release of the working paper marked an initial policy step toward developing an Indian framework for regulating AI-related copyright issues. By inviting stakeholder feedback, DPIIT has signalled its intent to adopt a consultative and incremental approach to lawmaking in this area. The paper is expected to inform future legislative or regulatory interventions, particularly as disputes relating to AI training datasets and ownership of AI-generated content gain prominence before Indian courts and policymakers.

9. Supreme Court called for Safeguards against Digital Arrest Scams[9]

The Supreme Court took suo motu cognizance of the growing incidence of cyber frauds involving so-called ‘digital arrest’ scams, observing serious institutional and systemic lapses in the manner such cases are addressed. In Re: Victims of Digital Arrest Related to Forged Documents, the Court highlighted the failure of timely coordination between banks, law enforcement agencies, and digital intermediaries, which often results in irreversible financial loss to victims. The Court noted that delayed freezing of accounts, inadequate real-time alerts, and lack of standard operating procedures have significantly weakened the response to sophisticated frauds carried out through impersonation, forged documents, and digital communication channels. Emphasising the need for immediate and preventive action, the Court called for the establishment of robust alert and response mechanisms, including real-time transaction monitoring, faster interbank coordination, and victim-centric recovery frameworks.

The observations signal heightened judicial scrutiny of the role and responsibility of financial institutions and authorities in preventing and responding to cyber enabled fraud. The proceedings are expected to influence future regulatory directions and institutional protocols aimed at strengthening consumer protection and trust in India’s digital financial ecosystem.

10. Courts shaped Crypto Exchange Litigation[10]

The year saw significant judicial developments in the litigation surrounding the crypto exchange ‘WazirX’ and its operator Zanmai Labs Pvt. Ltd. (“Zanmai”), following the 2024 cyberattack that impacted user assets on the platform. Proceedings unfolded across multiple forums, including Indian courts and the Singapore High Court, where the exchange’s parent entity sought restructuring relief. In India, users and business partners approached courts seeking protection of crypto assets, interim safeguards, and clarity on the legal character of cryptocurrency holdings and the obligations of exchanges. These cases brought into focus questions of custodianship, fiduciary responsibility, and the contractual allocation of risk between exchanges, users, and affiliated entities.

Indian courts, particularly the Bombay and Madras High Courts, adopted a rights-oriented approach by recognizing cryptocurrencies as property and acknowledging that exchanges may act as custodians of user assets, thereby owing a duty of care. At the same time, the Singapore High Court approved a restructuring framework that sought to socialize losses arising from the cyberattack, subject to safeguards preserving accountability for fraud or willful misconduct.

Taken together, this litigation reflects an evolving judicial stance on crypto asset protection, cross-border insolvency, and exchange accountability, and is likely to influence future regulatory and contractual standards in India’s digital asset ecosystem.

INTERNATIONAL 

UNITED STATES OF AMERICA

11. GENIUS Act enacted to regulate stablecoins[11]

The United States enacted the Guiding and Establishing National Innovation for U.S. Stablecoins Act, commonly referred to as the GENIUS Act, marking the country’s first comprehensive federal legislation governing stablecoins. The Act establishes a regulatory framework for the issuance, backing, and oversight of stablecoins pegged to the US dollar. It introduces requirements relating to reserve backing, disclosure, auditability, and operational resilience, while assigning supervisory responsibilities to federal financial regulators to oversee stablecoin issuers and related infrastructure.

The enactment of the Act reflects a significant shift in the United States’ approach to digital assets, moving from fragmented regulatory guidance toward statutory oversight. By recognising stablecoins as a critical component of national financial infrastructure, the legislation seeks to address systemic risk, consumer protection, and financial stability concerns while allowing space for innovation in digital payments. The Act is expected to influence global regulatory approaches to stablecoins and may serve as a reference point for jurisdictions considering dedicated legislative frameworks for crypto-based payment instruments.

12. Eight State Privacy Laws became effective[12]

Eight comprehensive state-level privacy laws became effective across the United States, further expanding the country’s fragmented but steadily growing data protection landscape. These states are Delaware, Iowa, Nebraska, New Hampshire, New Jersey, Tennessee, Minnesota, and Maryland, with effective dates spread between January and October 2025. The laws introduced baseline consumer rights such as access, correction, deletion, and opt-out of targeted advertising and data sales, alongside obligations on businesses relating to transparency, purpose limitation, and reasonable data security safeguards.

The entry into force of these statutes underscores the accelerating momentum of state-led privacy regulation in the absence of a comprehensive federal data protection law. While largely aligned in structure, the laws differ in scope, thresholds, enforcement models, and exemptions, creating a complex compliance environment for organisations operating across multiple states. This cohort of laws reinforces the need for scalable and harmonised privacy governance frameworks capable of accommodating divergent state requirements while maintaining consistent consumer protections.

13. TikTok faced Continued Regulatory Scrutiny over Data Privacy Practices[13]

TikTok Inc. (“TikTok”) remained under sustained regulatory and judicial scrutiny in 2025 over concerns relating to its data privacy practices, cross-border data transfers, and handling of user information. Authorities in multiple jurisdictions examined whether personal data of users was being accessed or transferred in ways inconsistent with local data protection laws, particularly in the context of links to its parent entity and foreign jurisdictions. Investigations and regulatory actions focused on issues such as transparency in privacy disclosures, lawful basis for data processing, data localization commitments, and safeguards against unauthorized access to sensitive user data.

The continued focus on TikTok highlights broader regulatory unease around large social media platforms that operate across borders while processing vast volumes of personal data. For regulators, the platform has become a test case for enforcing data protection principles relating to accountability, purpose limitation, and cross-border transfers in a global digital economy. The developments in 2025 signal that platform scale and popularity do not dilute regulatory expectations, and that heightened scrutiny of data governance practices for global technology companies is likely to persist.

14. New York enacted regulation to strengthen AI safety and accountability[14]

The Governor of New York signed the Responsible Artificial Intelligence Safety and Enforcement Act, commonly referred to as the RAISE Act (“Act”), establishing a comprehensive state-level framework for artificial intelligence safety. Enacted in December 2025, the Act introduced mandatory risk management and safety obligations for developers and deployers of AI systems across all scales, including requirements relating to system testing, documentation, and ongoing monitoring. A key feature of the Act is the obligation to report significant AI-related incidents, including system failures, harmful outcomes, and misuse, to designated state authorities within prescribed timelines.

The Act positions New York as one of the most proactive US states in regulating AI risks through binding statutory obligations rather than voluntary guidelines. By extending safety and incident reporting requirements beyond high-risk use cases, the Act reflects a precautionary regulatory approach aimed at ensuring transparency and accountability across the AI lifecycle. The Act is expected to influence broader policy discussions at the federal and state level and adds to the growing patchwork of AI governance frameworks emerging across the United States.

15. California enacted regulation establishing unified data broker opt out mechanism[15]

California enacted the Delete Act (“Act”), a landmark privacy law mandating the creation of the Data Broker Requests and Opt-Out Platform, commonly referred to as DROP. The law requires registered data brokers to participate in a centralized system that allows California residents to submit a single verified request to delete their personal information held by all registered data brokers. The platform is scheduled to go live on January 1, 2026 and is intended to simplify the exercise of deletion rights that were previously fragmented across multiple entities under the California Consumer Privacy Act framework.

The Act significantly strengthens consumer control over personal data by shifting the burden of compliance from individuals to data brokers. By introducing a mandatory, state-operated opt-out mechanism, California has addressed long-standing concerns around ineffective and repetitive deletion requests. The Act also increases accountability for data brokers by reinforcing registration, verification, and response obligations.

EUROPEAN UNION

16. Crypto regulation took full effect across EU Member States[16]

The European Union’s Markets in Crypto-Assets Regulation (“MiCA”) became fully effective across Member States during 2025, completing the transition to a harmonised regulatory framework for crypto assets not previously covered under existing financial services laws. MiCA establishes uniform requirements for the issuance, offering, and trading of crypto assets, including obligations relating to authorisation, governance, disclosure, consumer protection, and operational resilience. The regulation applies to a wide range of crypto asset service providers and issuers, replacing divergent national approaches with a single EU-wide regime.

The full applicability of MiCA represents a significant step toward regulatory clarity and legal certainty for the crypto industry in the European Union. By introducing consistent standards across jurisdictions, the framework aims to enhance consumer confidence, reduce regulatory arbitrage, and strengthen market integrity. For crypto businesses operating in or targeting the EU market, MiCA’s entry into full effect necessitates alignment of licensing, compliance, and risk management practices with the new harmonized requirements.

17. Council adopted procedural reforms for cross-border GDPR enforcement[17]

The Council of the European Union adopted new procedural rules aimed at expediting the handling of cross-border data protection complaints under the GDPR. Adopted in November 2025, the rules seek to address long-standing delays in multi-state enforcement by introducing harmonised timelines, clearer cooperation mechanisms, and structured information sharing between lead supervisory authorities and concerned national regulators. The new rules are intended to improve consistency and predictability in cross-border GDPR cases, particularly those involving large multinational technology companies operating across several EU Member States.

The procedural rules are expected to strengthen the effectiveness of GDPR enforcement by reducing bottlenecks in coordination and decision making. By streamlining cooperation between data protection authorities, the new rules aim to ensure faster resolution of complaints, quicker remedies for data subjects, and more timely regulatory outcomes. For organisations operating across the European Union, the changes signal a move toward more efficient and coordinated enforcement, reinforcing the importance of proactive compliance and early engagement with supervisory authorities in cross-border matters.

18. AI Act implementation advanced with new compliance milestones[18]

The EU’s AI Act, which entered into force in August 2024, saw the rollout of additional implementation milestones in 2025, activating key compliance obligations for artificial intelligence systems across the Union. During the year, governance and transparency requirements applicable to general-purpose AI models were operationalised, including documentation, risk disclosure, and downstream information-sharing obligations. Regulators also issued early guidance restricting certain AI uses in sensitive contexts such as employment decision-making, online content moderation, website analytics, and law enforcement, reflecting a cautious approach toward high-impact and intrusive AI applications.

These developments marked the transition of the AI Act from a legislative framework to an enforceable regulatory regime. By progressively activating obligations through phased implementation and guidance, the EU has sought to provide legal certainty while allowing organisations time to adapt compliance frameworks. The 2025 milestones signal increased regulatory scrutiny of general-purpose and high-risk AI systems and are expected to shape supervisory priorities, enforcement strategies, and industry practices across the European digital ecosystem.

19. Health Data Space Regulation entered into force[19]

The EU’s European Health Data Space Regulation (“EHDS”) entered into force in March 2025, establishing a comprehensive framework for the governance, use, and sharing of electronic health data across Member States. The regulation creates a harmonised system for both primary use of health data in healthcare delivery and secondary use for purposes such as research, innovation, public health, and policy making. It introduces common standards for electronic health records, interoperability requirements, and cross-border access mechanisms, while reinforcing patient rights over access, portability, and control of their health data.

A central feature of the EHDS framework is the creation of strict safeguards governing secondary access to health data, including purpose limitation, secure processing environments, and oversight by designated health data access bodies. The regulation seeks to balance enhanced data availability with strong privacy and security protections, ensuring that sensitive health information is used responsibly and lawfully. By providing greater individual control alongside regulated access for public interest uses, the EHDS represents a significant step in the EU’s effort to build a trusted and innovation-friendly health data ecosystem.

20. Data Act became fully applicable across the digital economy[20]

The EU’s Data Act (“Act”) became fully applicable from September 12, 2025, marking a significant milestone in the regulation of non-personal and mixed data generated by connected products and digital services. The Act establishes clear rights for users and businesses to access, use, and share data generated through Internet of Things devices, smart products, and related services. It also introduces obligations on manufacturers, service providers, and data holders to make such data available under fair, reasonable, and non-discriminatory terms, while setting safeguards for trade secrets, security, and interoperability.

The Act is designed to unlock data-driven innovation and competition by reducing data silos and rebalancing control over industrial and machine-generated data. By extending access and portability rights beyond personal data governed by the GDPR, the regulation complements the EU’s broader digital regulatory framework alongside the Digital Markets Act and Digital Services Act. For businesses operating in the EU, full applicability of the Data Act signals the need to review contractual arrangements, technical interfaces, and data governance practices to ensure compliance with the new data access and sharing standards.

21. EDPB issued guidelines on blockchain and data protection compliance[21]

EDPB issued guidelines in April 2025 clarifying the application of the GDPR to the processing of personal data through blockchain and distributed ledger technologies. The guidelines addressed the inherent tension between blockchain features, such as immutability, decentralisation, and transparency, and core GDPR principles, including data minimisation, purpose limitation, and storage limitation. They provided interpretative guidance on identifying data controllers and processors in decentralised environments and assessed the circumstances in which on-chain data may qualify as personal data.

A key focus of the guidelines is compliance by design and by default, requiring organisations to assess data protection implications at the architecture stage. The EDPB emphasised the use of off-chain storage, cryptographic techniques, and governance models that enable effective exercise of data subject rights. The guidelines also highlighted the importance of conducting data protection impact assessments for blockchain-based use cases involving high risks to individuals, reinforcing the need for careful design choices when deploying decentralised systems under the GDPR framework.
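The off-chain storage approach favoured by the EDPB can be illustrated with a short sketch: personal data is kept in a deletable off-chain store, and only a salted hash (a commitment) is anchored on the immutable ledger, so that honouring an erasure request removes the data and its salt while leaving the on-chain entry practically unlinkable to the individual. The snippet below is a hypothetical illustration under these assumptions; the class and variable names are our own and do not come from the guidelines.

```python
import hashlib
import os
import uuid

class OffChainStore:
    """Mutable store for personal data; records can be erased on request."""
    def __init__(self):
        self._records = {}  # record_id -> (salt, personal_data)

    def put(self, personal_data: bytes) -> tuple[str, bytes]:
        record_id = str(uuid.uuid4())
        salt = os.urandom(16)
        self._records[record_id] = (salt, personal_data)
        # Commitment = SHA-256(salt || data); only this goes on-chain.
        commitment = hashlib.sha256(salt + personal_data).digest()
        return record_id, commitment

    def verify(self, record_id: str, commitment: bytes) -> bool:
        salt, data = self._records[record_id]
        return hashlib.sha256(salt + data).digest() == commitment

    def erase(self, record_id: str) -> None:
        # Deleting the record (and its salt) leaves the on-chain hash
        # irreversible on its own, approximating effective erasure.
        del self._records[record_id]

# Usage: the ledger (here a stand-in list) receives only the commitment.
chain = []
store = OffChainStore()
rid, commitment = store.put(b"name=Alice; email=alice@example.com")
chain.append(commitment)

assert store.verify(rid, commitment)
store.erase(rid)  # data subject's erasure request: off-chain copy removed
```

Whether such hashing renders on-chain data non-personal depends on the circumstances, which is precisely the case-by-case assessment the guidelines call for.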

22. Member States faced Action over Weak Implementation of the Digital Services Act[22]

Five EU Member States were brought before the European Court of Justice for failure to adequately implement and enforce key obligations under the Digital Services Act (“DSA”). The proceedings were initiated by the European Commission (“Commission”), which alleged that the concerned Member States had not designated competent authorities, empowered Digital Services Coordinators, or established effective enforcement mechanisms as required under the DSA. The Commission emphasized that delayed or incomplete national implementation undermines the uniform application of platform accountability, content moderation, and systemic risk mitigation obligations across the Union.

The action underscores the EU’s commitment to ensuring that the DSA operates as a harmonized and enforceable regulatory framework rather than a fragmented national regime. By invoking infringement proceedings, the Commission has signaled that Member States themselves will be held accountable for regulatory inaction, not merely online platforms. The outcome of the cases is expected to reinforce supervisory consistency and accelerate operational readiness across jurisdictions, strengthening the DSA’s role as a cornerstone of EU digital governance.

23. Italy became First Member State to enact Comprehensive National AI law[23]

Italy became the first European Union Member State to pass a comprehensive national law regulating artificial intelligence, adopting a statutory framework that aligns with the EU AI Act while introducing additional domestic enforcement measures. Enacted in September 2025, the legislation imposes obligations relating to ethical AI use, transparency, human oversight, and risk management, with heightened restrictions on high-risk applications such as biometric surveillance, deepfakes, and AI-enabled fraud. The law also established national supervisory authorities empowered to monitor compliance and investigate violations alongside EU-level oversight mechanisms.

While closely aligned with the EU AI Act’s risk-based approach, the Italian law goes further by introducing national penalties and sentencing provisions for harmful or abusive AI use. This includes significant fines and potential criminal consequences in cases involving deception, identity manipulation, or large-scale societal harm.

Italy’s move signals a growing willingness among Member States to supplement EU-wide AI regulation with domestic enforcement frameworks, and may influence how other jurisdictions balance harmonized EU obligations with national policy priorities in the governance of artificial intelligence.

24. Commission proposed Simplification of Digital Regulatory Framework[24]

The European Commission (“Commission”) proposed a package of measures aimed at simplifying and streamlining the EU’s digital regulatory landscape, including targeted revisions affecting the implementation of the GDPR and the AI Act. Announced in November 2025, the proposal seeks to reduce compliance burdens, particularly for small and medium enterprises and emerging technology companies, by revisiting overlapping obligations, reporting requirements, and phased implementation timelines. The Commission indicated that the objective is not to dilute substantive protections, but to improve regulatory coherence and ease of compliance across multiple digital laws.

The proposal reflects growing concern within the EU about regulatory fragmentation and compliance fatigue resulting from the rapid rollout of digital legislation. By recalibrating implementation timelines and clarifying procedural obligations, the Commission aims to strike a balance between effective digital governance and economic competitiveness. If adopted, the changes could provide operational relief to regulated entities while preserving the core principles of data protection, accountability, and risk based AI regulation underpinning the EU’s digital policy framework.

OTHERS

25. Major Data Use and Data Protection Framework Enacted in UK[25]

The Data (Use and Access) Act 2025 (“Act”) was enacted, formally replacing earlier legislative proposals such as the Data Protection and Digital Information Bill and marking a major overhaul of the UK’s data protection and data use framework following Brexit. The Act, which received Royal Assent on June 19, 2025, seeks to modernise the UK’s data regime by introducing reforms aimed at facilitating responsible data sharing, reducing administrative burdens, and supporting innovation while maintaining core privacy protections. It amends key aspects of the UK GDPR framework and associated data laws to enable more flexible data use across public and private sectors.

The Government has positioned the Act as a growth-oriented reform, estimating that the changes could contribute up to ten billion pounds to the UK economy over the next decade.

By recalibrating compliance requirements and promoting data-driven innovation, the Act reflects the UK’s broader strategy of differentiating its post-Brexit data governance model while seeking to retain adequacy status with the European Union. The legislation is expected to significantly influence how organisations manage data access, sharing, and compliance in the UK going forward.

26. Canada tightened Data Protection Rules in Hiring Practices[26]

The Commission d’accès à l’information (“CAI”) in Quebec, Canada, issued new guidance establishing stricter limits on the collection of personal data during recruitment. At the initial stage, employers may gather only information essential to assessing a candidate’s suitability, such as name, contact details, and work or academic history. Requests for references, medical information, or background checks must be deferred to later stages and tied directly to the role’s requirements, and written candidate consent is mandatory before verifying references, qualifications, or criminal history.

Employers using AI or psychometric tools must disclose their use upfront and ensure that assessments remain relevant to job performance. The guidelines also restrict social media screening to professional profiles and require prompt destruction of applicant data once hiring concludes. The CAI emphasized that necessity, not convenience, remains the standard, and that over-collection or irrelevant questioning may result in enforcement action. These rules promote fairness, transparency, and data minimization throughout the hiring process.

27. Vietnam enacted Digital Technology Law Recognising Cryptocurrency Assets[27]

Vietnam enacted the Law on Digital Technology Industry (“Law”) in June 2025, becoming one of the first countries globally to adopt standalone legislation explicitly recognising and regulating cryptocurrency assets. The Law marked a significant departure from Vietnam’s earlier restrictive approach by providing legal recognition to digital assets and establishing a regulatory framework for their issuance, trading, custody, and use. It also addressed broader aspects of the digital technology ecosystem, including blockchain infrastructure, digital innovation, and technology-driven financial services, positioning crypto assets within a formal legal and economic framework.

The Law represents a strategic policy shift aimed at fostering innovation while introducing regulatory oversight and safeguards. By moving from prohibition to regulated recognition, Vietnam has signalled its intent to attract investment and talent in the digital asset and blockchain space while mitigating risks associated with fraud, money laundering, and market instability. The Law positions Vietnam as an emerging regional hub for digital assets and may influence regulatory approaches across Southeast Asia as jurisdictions reassess their stance on cryptocurrency and digital technology governance.

ABBREVIATIONS

AI – Artificial Intelligence
CERT-In – Indian Computer Emergency Response Team
DPA – Data Protection Authority
DPB – Data Protection Board
DPDP Act – Digital Personal Data Protection Act, 2023
DPDP Rules – Digital Personal Data Protection Rules, 2025
EDPB – European Data Protection Board
GDPR – General Data Protection Regulation (EU) 2016/679
MeitY – Ministry of Electronics and Information Technology
MSMEs – Micro, Small and Medium Enterprises
NCLAT – National Company Law Appellate Tribunal
UIDAI – Unique Identification Authority of India

Authors:

  • Rashmi Deshpande
  • Aarushi Ghai
  • Vaibhav Gupta

[1] https://www.meity.gov.in/static/uploads/2025/11/c56ceae6c383460ca69577428d36828b.pdf
[2] https://api.sci.gov.in/supremecourt/2024/17879/17879_2024_13_1501_61229_Judgement_30-Apr-2025.pdf
[3] https://www.meity.gov.in/static/uploads/2025/08/4f673438a686e3fa81dd2d277b445f42.pdf
[4] https://static.pib.gov.in/WriteReadData/specificdocs/documents/2025/nov/doc2025115685601.pdf
[5] https://www.cert-in.org.in/PDF/Elemental_Cyber_Defense_Controls_for_MSME.pdf
[6] https://uidai.gov.in/en/about-uidai/legal-framework/regulations/19549-aadhaar-authentication-and-offline-verification-amendment-regulations-2025.html
[7] WhatsApp LLC v. Competition Commission of India, I.A. No. 6817 of 2025, NCLAT New Delhi
[8] https://www.dpiit.gov.in/static/uploads/2025/12/ff266bbeed10c48e3479c941484f3525.pdf
[9] Re: Victims of Digital Arrest Related to Forged Documents, SUO MOTO WRIT PETITION (CRIMINAL) No(s).  3/2025
[10] Zanmai Labs Pvt Ltd. v. Bitcipher Labs LLP, Commercial Arbitration Petition (L) No. 11646 of 2025 and Zanmai Labs Pvt ltd v. Nextgendev Solutions Pvt. Ltd., Commercial Arbitration Petition (L) No. 11975 of 2025
[11] https://www.congress.gov/bill/119th-congress/senate-bill/1582
[12] https://mgaleg.maryland.gov/mgawebsite/Legislation/Details/hb0567?ys=2024RS and https://pub.njleg.state.nj.us/Bills/2022/S0500/332_R6.PDF
[13] https://www.supremecourt.gov/opinions/24pdf/24-656_ca7d.pdf
[14] https://www.governor.ny.gov/news/governor-hochul-signs-nation-leading-legislation-require-ai-frameworks-ai-frontier-models
[15] https://www.congress.gov/bill/119th-congress/house-bill/2612/all-info
[16] https://www.esma.europa.eu/esmas-activities/digital-finance-and-innovation/markets-crypto-assets-regulation-mica
[17] https://www.consilium.europa.eu/en/press/press-releases/2025/11/17/council-adopts-new-eu-law-to-speed-up-handling-cross-border-data-protection-complaints/
[18] https://www.europarl.europa.eu/RegData/etudes/ATAG/2025/772906/EPRS_ATA(2025)772906_EN.pdf
[19] https://health.ec.europa.eu/ehealth-digital-health-and-care/european-health-data-space-regulation-ehds_en
[20] https://digital-strategy.ec.europa.eu/en/policies/data-act
[21] https://www.edpb.europa.eu/our-work-tools/documents/public-consultations/2025/guidelines-022025-processing-personal-data_en
[22] https://ec.europa.eu/commission/presscorner/detail/en/ip_25_1081
[23] https://www.gazzettaufficiale.it/eli/gu/2025/09/25/223/sg/pdf
[24] https://ec.europa.eu/commission/presscorner/detail/en/ip_25_2718
[25] https://www.gov.uk/guidance/data-use-and-access-act-2025-data-protection-and-privacy-changes#:~:text=The%20Data%20(Use%20and%20Access)%20Act%202025%20(%E2%80%9C%20DUAA,the%20subject%20of%20this%20page.
[26] https://www.cai.gouv.qc.ca/protection-renseignements-personnels/sujets-et-domaines-dinteret/operation-recrutemement-emploi
[27] https://www.vietnam.vn/en/tai-san-so-va-cach-tiep-can-dac-biet-cua-viet-nam
