Fair terms for data access
We're fighting for public interest platform researchers—and there are model data-sharing terms you can use right now
For years, platforms have set the rules for how their data is shared—often leaving users and the public interest research community with little say. Mozilla Foundation asked the data rights experts at AWO to dig into the problem. AWO also built a prototype that shows what more balanced, research-ready terms could look like.
Data is everywhere—but for researchers trying to study social media platforms, it can feel like none of it is on offer.
Data brokers profit from our personal information. AI companies sweep up the web, often without regard for consent or ownership. Meanwhile, public interest researchers are left navigating unclear processes and restrictive gatekeeping when they try to examine the social impacts of the world’s largest platforms. In many cases, those same platforms argue that researcher access could infringe user privacy—which we find a bit rich, given their own data practices.
At Mozilla Foundation, we believe privacy-protecting research is essential to understanding harmful data practices at scale. Researchers and watchdogs need access to scrutinize corporate behaviour beyond promises or PR. This kind of access is foundational to maintaining trust—and to building better technology ecosystems.
Under Article 40(12) of the EU’s Digital Services Act (DSA), the largest online platforms must now create processes for granting researchers access to data. But today, that access is usually granted through strict contractual terms written by the platforms themselves. These agreements are dense, complex, and often unworkable. TikTok’s terms have been called a “minefield,” and other platforms restrict even basic research practices—combining datasets, bringing new colleagues onto a project, submitting work for peer review, or publishing independently.
This analysis reviews the data-sharing agreements of X, TikTok, Meta (Facebook and Instagram), YouTube, and LinkedIn to show how poorly suited these terms are for truly independent research. Through focused discussions with researchers who navigate these agreements every day, AWO also surfaces how these terms create a chilling effect on using the DSA’s hard-won data access rights.
The analysis finds that all five platforms impose unfair terms on researchers. These agreements can block projects outright, delay critical work, or add costs and burdens that have nothing to do with privacy or safety. In the most severe cases, they even prevent researchers from replicating their own findings. Critically, many of these conditions go beyond what the DSA and GDPR actually require.
Through in-depth conversations with platform researchers, AWO documented how these terms play out in practice. The report includes case studies showing their real-world impact, and outlines the most frequent, most consequential ways current terms hinder research.
The most common research constraints found in the terms relate to:
- Overly strict qualification criteria for researchers
- Access method restrictions
- Use restrictions
- Platform rights to monitor and terminate researcher access
- Rate and quota restrictions
- Scraping restrictions
- Onward sharing restrictions
- Excessive data management obligations
- Publication-related obligations and rights that curtail independence
- Indemnity and liability clauses that discourage research
Not all platforms follow the exact same approach. This inconsistency creates yet another hurdle: researchers must assess each agreement one by one, weighing risk and feasibility before a project even begins.
Download the analysis, and explore the prototype of fairer model terms, to help secure data access that strengthens research, protects privacy, and supports a healthier digital ecosystem.
Get the full picture
Towards Fairer Terms for Data Access under Article 40(12) DSA
How to strike a better balance?
Today, platforms typically write the terms—and researchers are asked to accept them. But what if we flipped that script and started from what researchers actually need to conduct independent, privacy-protecting work?
Mozilla Foundation asked AWO to design a prototype data-sharing agreement that reflects what the DSA and GDPR truly require, without imposing conditions that unnecessarily constrain research. The result is a model contract that both platforms and researchers can adapt—one that centers the realities of public interest platform research while respecting user privacy and legitimate business interests.