Technology Law and Data Privacy Updates

Monthly Edition - November 2025

FOUNDER’S NOTE

Welcome to this edition of Fountainhead Legal’s newsletter!

If there is one theme that unmistakably runs through this month's developments, it is that regulation is no longer catching up with technology; it is beginning to shape it. Across India and key global jurisdictions, regulators and courts are signalling a clear shift from reactive enforcement to structured oversight, accountability by design, and preparedness as a baseline expectation.

In India, the staggered implementation of the DPDP Act signals a pragmatic regulatory approach, giving organisations time to prepare while leaving little doubt about the direction of regulatory expectations. The focus is no longer on compliance in theory, but on how organisations actually handle personal data, manage vendors, respond to incidents, and embed accountability into everyday operations. This shift is part of a broader legislative trend. Alongside digital regulation, India has also introduced four consolidated labour codes, replacing a complex web of legacy employment laws with simplified, principle-based frameworks aimed at ease of compliance and stronger worker protections. Together, these reforms reflect a move towards modern, operationally focused regulation. At the same time, courts are showing a growing willingness to adapt legal tools to digital realities, whether by scrutinising platform-led content moderation, examining global-turnover-based antitrust penalties, or granting dynamic injunctions to tackle fast-moving online piracy.

What is equally striking is how similar conversations are unfolding globally. California's moves on health data and youth safety reflect a concern that many businesses will recognise: sensitive risks now arise from apps, algorithms, and analytics, not just traditional regulated sectors. Europe's efforts to streamline GDPR enforcement and its early judicial signals on AI-generated copyright infringement show a determination to ensure that scale and automation do not become shields against accountability. The UK's proposed cyber resilience reforms add another layer, placing preparedness and supply-chain responsibility at the heart of national security.

Taken together, these developments point to a broader message for leadership teams: compliance is no longer a back-office exercise. Decisions about product design, data use, vendor relationships, and automation are increasingly legal decisions as much as commercial ones. Organisations that treat this phase as an opportunity to build trust, resilience, and clarity, rather than as a regulatory burden, will be far better placed to navigate what comes next.

We hope you find these updates insightful and informative!

NATIONAL 

1. Government lays down Phased Roadmap for Digital Privacy Regime[1]

The Ministry of Electronics and Information Technology ("MeitY") has outlined a staggered implementation plan for operationalising the DPDP Act through the Digital Personal Data Protection Rules, 2025 ("DPDP Rules"). The Data Protection Board of India has been constituted with effect from November 13, 2025, marking the institutional foundation of India's new privacy regime. Further, key enforcement provisions, including penalty powers, will come into force only from May 13, 2027, providing an 18-month transition window for organisations. The DPDP Rules also provide for a separate timeline for Consent Managers, whose registration framework will become operational from November 13, 2026. In the interim, companies are expected to focus on updating privacy notices, restructuring consent flows, implementing baseline security safeguards, and reviewing data-retention practices.

This staggered approach signals a regulatory intent to prioritise preparedness over immediate enforcement. Organisations that proactively undertake data mapping, rationalise retention practices, and strengthen security controls during this transition period will be better positioned to manage future compliance and regulatory scrutiny, particularly if classified as Significant Data Fiduciaries. Importantly, these DPDP Rules represent only the first layer of India's data protection framework, and several operational clarifications are still awaited, such as the mechanism for implementing the right to nominate, the standards applicable to data audits, and the categories of Data Fiduciaries to be classified as Significant Data Fiduciaries. Businesses should therefore treat this phase as a foundational compliance exercise, with flexibility built in to adapt to further regulatory guidance as it emerges.

2. Government dilutes Proposal on Mandatory Pre-Installation of Sanchar Saathi App[2]

The Department of Telecommunications ("DoT") had issued directions under the Telecommunication Cyber Security Rules, 2024, empowering the Central Government to require mobile manufacturers to assist in tackling tampered devices and fraudulent IMEI numbers. As part of this initiative, manufacturers were asked to pre-install the Sanchar Saathi application on new handsets to enable device verification and reporting of lost or stolen phones.

While the move was positioned as a consumer-protection measure, it also raised concerns around user autonomy, transparency of data collection, and the implications of mandating a Government-linked application on personal devices by default. In a subsequent clarification, the Government confirmed that pre-installation is not mandatory for manufacturers as of December 03, 2025, easing immediate compliance pressure. The episode reflects a balancing act between cybersecurity objectives and growing sensitivities around privacy and user choice in the digital ecosystem.

3. Supreme Court signals Shift towards Structured Oversight in Social Media Regulation[3]

During proceedings concerning harmful and irresponsible online content, the Supreme Court recorded the Central Government’s submission that revised social media content guidelines are at an advanced stage and will be placed in the public domain for consultation within four weeks. The Government clarified that these guidelines are being developed by the Ministry of Information and Broadcasting to evolve a more effective mechanism for addressing offensive, misleading, and socially harmful content circulating on digital platforms.

Importantly, the court’s observations went beyond timelines and underscored institutional concerns with the current model of platform-centric moderation. During the hearing, the court engaged with suggestions that content governance cannot rely solely on private platforms exercising discretionary takedown powers, and that a more structured, neutral mechanism may be required to deal with serious digital harms. The discussion reflects judicial unease with ad-hoc moderation and the absence of consistent accountability frameworks.

As the proposed guidelines move towards public consultation, this order indicates that future regulation may lean towards greater State oversight or hybrid regulatory structures, rather than self-regulation alone. For intermediaries, creators, and digital platforms, the evolving framework could materially impact intermediary liability standards, takedown obligations, and procedural safeguards, making the forthcoming consultation phase a critical inflection point for India's content governance regime.

4. Delhi High Court examines Legality of Global-Turnover Based Antitrust Penalties[4]

Technology giant Apple Inc. (“Company”) has approached the Delhi High Court challenging India’s revised antitrust penalty framework, under which the CCI is empowered to impose penalties based on an enterprise’s global turnover. This power flows from Section 27(b) of the Competition Act, 2002, as amended by the Competition (Amendment) Act, 2023, which permits penalties of up to 10% of the global turnover of an enterprise found to have engaged in anti-competitive conduct. Company has questioned the legality and proportionality of applying this provision to conduct allegedly confined to the Indian market. At the initial hearing, the court issued notice to the Union of India and the CCI and directed them to file their responses, indicating that the challenge raises substantive questions of law. Company’s petition has been filed in the backdrop of ongoing CCI proceedings examining its App Store policies, including restrictions on in-app payment mechanisms and developer access. Company has argued that the amended penalty provision significantly expands enforcement exposure for multinational companies and risks imposing disproportionate penalties disconnected from domestic economic impact.

The case is expected to be a key judicial test of the amended penalty regime and will likely shape how India applies its strengthened competition law framework to global technology companies going forward.

5. DoT confirms Amended Telecom Cybersecurity Rules Continue to Apply[5]

The Department of Telecommunications has clarified that the amended Telecommunication Cyber Security Rules, 2025 (“TCS Rules 2025”) remain fully in force, despite the withdrawal of a duplicate Gazette notification that was issued in error. The original notification, first issued in October 2025, continues to be valid and enforceable.

The TCS Rules 2025 strengthen protections around telecom identifiers such as mobile numbers and device identifiers, addressing growing risks of misuse and fraud. They also enhance mechanisms for verification and tracking of devices involved in unlawful activities.

6. Supreme Court releases White Paper on Use of AI in Judiciary[6]

The Supreme Court of India has released a comprehensive white paper titled 'White Paper on Artificial Intelligence and Judiciary' ("White Paper"), examining how artificial intelligence can be responsibly integrated into the functioning of the judicial system. Prepared by the court's Centre for Research and Planning, the White Paper outlines potential uses of AI to support tasks such as case categorisation, document analysis, and administrative efficiency. It also highlights the opportunities AI presents for reducing delays, improving access to legal information, and assisting judges with high-volume workloads. At the same time, the White Paper stresses that AI applications must remain assistive in nature and cannot replace core judicial functions that require human reasoning and constitutional judgment.

The White Paper also stresses the importance of transparency, accountability, and data-protection safeguards, recommending ethical and regulatory frameworks to prevent bias and maintain public trust as courts explore responsible AI adoption.

7. Delhi High Court grants ‘Dynamic+ Injunction’ to Combat Digital Piracy[7]

The Delhi High Court has granted a Dynamic+ injunction in favour of JioStar India Private Limited ("Company") in its ongoing battle against large-scale digital piracy. The injunction empowers Company to identify and block access to pirated copies of its copyrighted content across a wide range of online platforms and internet services without the need to return to court for each individual instance.

The court acknowledged that traditional takedown mechanisms are often ineffective against piracy networks that activate new domains minutes before live events and deactivate them immediately after. Emphasising the 'hydra-headed' nature of such platforms, the court held that copyright enforcement must be real-time, visible, and effective to prevent rights from being rendered illusory in the digital environment. Accordingly, the injunction empowers the plaintiff to notify domain name registrars and internet service providers directly for immediate blocking, without approaching the court for fresh orders each time.

This development signals increasing judicial support for rights-holders seeking effective remedies against widespread online infringement in the digital age.

INTERNATIONAL 

UNITED STATES OF AMERICA

8. Banking Regulator clarifies Limited Crypto-Asset Holding by Banks[8]

The Office of the Comptroller of the Currency (“OCC”) has clarified that national banks may, in limited circumstances, hold crypto-assets on their own balance sheets where such holdings are incidental and necessary for permissible banking activities. This clarification specifically addresses situations where banks need to hold crypto-assets to pay transaction or network fees required to operate on blockchain-based systems. The OCC has indicated that banks may maintain crypto-assets in amounts that are reasonably connected to supporting authorised activities, including testing, operating, or facilitating distributed ledger platforms. However, such activities must continue to comply with existing prudential standards, including safety and soundness requirements, robust risk management, and applicable compliance obligations.

While the clarification does not amount to a broader endorsement of proprietary crypto-trading by banks, it removes an important operational barrier for institutions engaging with blockchain infrastructure. The move signals a measured regulatory approach allowing functional use of crypto-assets for infrastructure and settlement purposes, while maintaining strict supervisory oversight over exposure and risk.

9. California Strengthens Privacy Regulations Concerning Health Data[9]

California has strengthened protections for health-related personal information through Assembly Bill 45 (AB 45) ("Amendment"), which amends California's Confidentiality of Medical Information Act ("CMIA"). The Amendment broadens the scope of what qualifies as "medical information" to include health-adjacent data collected by non-traditional entities such as mobile apps, digital platforms, wearables, and online services, even where the data does not originate from a hospital, physician, or health insurer.

The Amendment recognises that modern health insights are increasingly derived from behavioural data, wellness metrics, reproductive-health activity, and geolocation information linked to medical visits. By bringing such data within the CMIA framework, California has extended healthcare-grade confidentiality obligations to businesses operating outside the regulated healthcare ecosystem, addressing gaps that previously allowed sensitive health inferences to be commercially exploited.

Recent enforcement actions have highlighted how health-adjacent data is increasingly exploited outside traditional healthcare settings, as seen in the Flo Health matter, where sensitive reproductive and fertility data was shared with third parties contrary to user assurances, and in the scrutiny of Gravy Analytics for collecting and selling precise geolocation data capable of revealing visits to medical facilities. The Amendment responds directly to these risks by extending CMIA-level confidentiality and use restrictions to digital platforms, apps, and data brokers that derive health insights indirectly, making clear that the absence of a formal healthcare relationship does not dilute obligations where data can expose medical conditions, treatment decisions, or reproductive choices.

10. Telecommunications Provider penalized over Vendor Data Breach[10]

The Federal Communications Commission ("FCC") has imposed a USD 1.5 million penalty on Comcast Cable Communications LLC ("Company"), a major telecommunications service provider, following an investigation into a data breach originating from a third-party vendor responsible for handling online customer activation and account-related services. The breach resulted in unauthorised access to sensitive subscriber information, including names, phone numbers, account identifiers, and, in some instances, partial Social Security numbers.

The incident arose after the Company's vendor failed to secure systems containing legacy customer data. An unauthorised actor gained access to the vendor's network, exposing personal information of a significant number of current and former subscribers. Notably, the compromised data had been retained by the vendor even after the commercial relationship had ended, highlighting gaps in data minimisation, retention controls, and post-termination oversight of third-party processors. The FCC found that Company had insufficient vendor-management safeguards, inadequate monitoring of third-party security practices, and weaknesses in breach-response escalation, including delays in notification. Under the settlement, the Company must strengthen vendor risk-management frameworks, implement enhanced technical and organisational security measures, improve internal risk assessments, and comply with ongoing FCC monitoring requirements, underscoring regulatory expectations that service providers remain accountable for subscriber data even when handled by external vendors.

This enforcement action highlights that vendor due diligence is no longer a procedural formality but a core compliance obligation, particularly where third parties handle large volumes of personal data. Weak oversight of legacy data, retention practices, and breach-response protocols can expose organisations to regulatory liability even when the incident occurs outside their own systems. As India begins implementing the DPDP Act, Indian companies should treat third-party risk management, contractual safeguards, and continuous vendor monitoring as foundational compliance measures rather than post-incident controls.

11. California introduces Measures addressing Youth Safety and Privacy on Social Media[11]

California has passed Assembly Bill 56 (AB 56) ("Legislation") as part of its continued effort to enhance protections for children and teenagers using social-media platforms. The Legislation responds to growing concerns around teen mental health, behavioural profiling, and the influence of algorithm-driven content, and will come into force from January 01, 2027.

The Legislation places heightened obligations on social-media and online platforms that are reasonably likely to be accessed by minors, requiring them to reassess how data collection, recommendation systems, and engagement-driven design choices impact young users. A key feature of the Legislation is increased platform accountability, particularly around design and algorithmic choices that can harm minors. This includes curbing addictive features such as endless scrolling, autoplay of videos, and frequent push notifications that encourage prolonged screen time, as well as addressing risks from automated content curation, where algorithms repeatedly promote certain videos or posts based on past behaviour, potentially exposing children to inappropriate, harmful, or distressing content. The Legislation aims to ensure that engagement-driven technology does not come at the cost of children's mental health and well-being.

By mandating stronger privacy and safety standards for minors by default, the Legislation reinforces California's position that digital platforms must prioritise child well-being over engagement metrics. The Legislation is expected to have wide-reaching implications for global technology companies, particularly those offering interactive, algorithm-driven services to minors.

EUROPEAN UNION

12. German Court rules AI-generated Song Lyrics breached Copyright Law[12]

A German regional court has held that OpenAI Inc. ("Company") models violated copyright law by generating song lyrics that closely mirrored protected works without permission. The case was initiated by the German music industry, which argued that Company's systems were trained on copyrighted material and subsequently reproduced lyrics in ways that were 'substantially similar' to the originals. The court found that such outputs constituted unauthorised reproductions under German copyright rules, particularly because the lyrics are creative works afforded strong protection. The ruling marks one of the earliest European judicial findings directly concluding that an AI system generated infringing output, and it highlights the growing legal pressure on AI developers to ensure transparency and compliance in their training practices. The court emphasised that training models on copyrighted works without licences, when it leads to recognisable reproductions, breaches Germany's copyright framework. Although the Company argued that model training should fall under permissible text-and-data-mining exceptions, the court rejected this defence.

The ruling is expected to influence ongoing EU-wide debates on AI regulation and copyright obligations, particularly as the EU AI Act and updated copyright guidance take shape. Notably, a similar copyright infringement dispute is currently pending in India against the same company, concerning the alleged unauthorised use of news content by a television news channel. While not binding, the German court's reasoning could serve as a persuasive reference point for Indian courts in assessing liability for AI-driven or automated reuse of copyrighted news content.

13. Council adopts New Rules to Streamline Cross-border GDPR Enforcement[13]

The Council of the European Union has adopted the Regulation on Additional Procedural Rules Relating to the Enforcement of the GDPR in Cross-Border Cases ("GDPR Additional Procedural Regulation"), designed to make enforcement of the GDPR more efficient when complaints involve data processing that crosses national borders. The GDPR Additional Procedural Regulation harmonises key procedural elements between national data protection authorities, including common criteria for admissibility of cross-border cases, clear rights for complainants and investigated entities, and cooperative mechanisms that reduce administrative complexity. It also introduces targeted deadlines for investigations, with standard procedures to be completed within 15 months and simpler cooperation cases wrapped up within 12 months, helping avoid the lengthy delays that have historically slowed cross-border enforcement.

By setting uniform processes across EU Member States, the reform aims to strengthen consistency in how privacy complaints are handled and give individuals and companies greater predictability and transparency in the GDPR enforcement landscape. GDPR Additional Procedural Regulation will enter into force 20 days after its publication in the Official Journal of the European Union and will become applicable 15 months later. This marks the final legislative step in a long-standing effort to improve cooperation between national data protection authorities and to support more effective protection of privacy rights across the single market.

14. Spanish Data Protection Authority issues Practical Guide to Help Freelancers and SMBs Secure Personal Data[14]

Spain's data protection authority, the Agencia Española de Protección de Datos ("AEPD"), has published the Encryption Guide for Freelancers and SMBs ("Guide"), aimed at helping freelancers and small and medium-sized enterprises improve how they handle personal information. The Guide is designed for organisations with limited resources and focuses on everyday data-processing activities such as customer management, billing, marketing, and online presence. It explains key data protection principles in simple terms and highlights common risks faced by smaller businesses when managing personal data. The Guide places strong emphasis on basic but effective security measures, including access controls, secure storage, password hygiene, data minimisation, and incident response planning. It also encourages small and medium businesses and independent professionals to assess their data practices regularly and adopt a proactive approach to compliance rather than treating data protection as a one-time obligation. By issuing this Guide, the AEPD aims to promote better privacy practices at the grassroots level and reduce avoidable data breaches through practical, easy-to-follow recommendations.

UNITED KINGDOM

15. Cyber Security and Resilience Bill introduced[15]

The UK Government has introduced the Cyber Security and Resilience (Network and Information Systems) Bill ("Bill"), a wide-ranging legislative proposal aimed at strengthening the country's ability to prevent, withstand, and respond to cyber threats. The Bill is designed to modernise the existing Network and Information Systems (NIS) framework, which has increasingly struggled to keep pace with the scale, complexity, and interdependence of today's digital infrastructure and cyber risks.

A key feature of the Bill is its expanded scope, which brings managed service providers, data centres, and other critical digital supply-chain actors under direct regulatory oversight, recognising their central role in supporting essential services. These organisations would be required to identify and regularly assess cyber risks, implement proportionate technical and organisational security measures, and maintain clear governance structures for cyber resilience. The Bill also strengthens incident detection and reporting obligations, requiring timely notification of significant cyber incidents that could disrupt services, along with cooperation with regulators during investigations. In addition, regulators are empowered to issue binding directions, require remedial action, and intervene where systemic or emerging cyber risks threaten national infrastructure, shifting the focus from reactive breach response to ongoing cyber preparedness and resilience.

If approved, the Bill will significantly raise compliance expectations for organisations operating in or supporting essential services. Businesses may be required to adopt more structured cyber-risk governance, improve supplier and third-party security controls, and implement faster and more detailed incident-reporting processes. For many organisations, particularly technology providers and service vendors previously outside direct regulation, the Bill signals a shift towards greater accountability for cyber resilience as a core operational responsibility, rather than a purely technical or reactive function.

Authors:

  • Rashmi Deshpande
  • Aarushi Ghai
  • Vaibhav Gupta
