Transparency Report
The most salient feature of ByteDance’s transparency disclosure is that overseas disclosure far exceeds domestic disclosure. TikTok is among the fastest-iterating VLOPs on transparency reporting globally, while inside China, Doubao, Coze, and Volcano Engine publish essentially no standalone transparency reports: domestic disclosure is mediated by the CAC filing roster, on a path of “national archiving” (国家档案化) rather than “corporate transparency.”
1. The disclosure matrix, in full
| Report | Publisher | Frequency | Start | Jurisdiction |
|---|---|---|---|---|
| TikTok Community Guidelines Enforcement Report | TikTok Global | Quarterly | 2019 | Global |
| TikTok Government Requests Report | TikTok Global | Semi-annual | 2019 | Global (broken out by country) |
| TikTok Intellectual Property Removal Report | TikTok Global | Semi-annual | 2020 | Global |
| TikTok DSA Transparency Report (Art. 24 / 15) | TikTok EU | Semi-annual | 2023-10 | EU |
| TikTok DSA Systemic Risk Assessment (Art. 34) | TikTok EU | Annual | 2024-11 | EU |
| TikTok USDS Transparency (under Project Texas) | TikTok USDS | Irregular | 2023 | U.S. |
| CapCut Community Guidelines Enforcement Report | CapCut Global | Semi-annual | 2024 | Global |
| ByteDance Corporate Social Responsibility Report | ByteDance Group | Annual | 2019 | China |
| Doubao / Coze / Volcano Engine transparency report | — | None | — | China |
2. TikTok Community Guidelines Enforcement Report
Published quarterly since 2019. Most recent public version as of 2026-04: the 2025 Q4 Report (published 2026-03).
Typical disclosure fields:
- Total videos removed + removal rate (with a breakdown of AI-automated vs. human review)
- Breakdown by violation category (hate, minor safety, disinformation, synthetic / manipulated media, etc.)
- Breakdown by country / region
- Accounts banned, appeal restoration rates
- Ad-violation disclosures
- AI-generated content section (added 2024 Q1): synthetic-content detection rates, proactive detection of unlabeled synthetic content, and coordination with the C2PA ecosystem
Sector position: TikTok’s reports were criticized as “below Meta / YouTube” in 2020–2022, were substantially strengthened in 2023–2024 under DSA pressure, and by 2025 had approached the top tier of EU VLOPs. Ranking Digital Rights (RDR)’s 2025 Corporate Accountability Index shows TikTok’s score rising for a third consecutive year, though still below Google, Meta, and Microsoft.
3. EU DSA Transparency Reports (core battleground since 2023)
Art. 24 / 15: semi-annual structured disclosure
Under DSA Art. 15 / 24, TikTok has published a DSA Transparency Report every six months since 2023-10. Versions to date:
- 2023 H2 (published 2024-02)
- 2024 H1 (published 2024-09)
- 2024 H2 (published 2025-03)
- 2025 H1 (published 2025-09)
- 2025 H2 (published 2026-03)
Mandatory fields (hard requirements under DSA Art. 15):
- Orders from Member States (number and disposition of government takedown orders from each Member State)
- Notice and action (notices submitted under Art. 16 by users / entities / trusted flaggers)
- Content moderation decisions (by language, category, automated vs. human)
- Statements of Reasons (SoR) submitted to the DSA Transparency Database
- Average monthly active recipients of the service (broken out by the 27 EU Member States)
- Out-of-court dispute settlement statistics
- Automated decision-making accuracy disclosure
As of this snapshot date, TikTok’s cumulative SoR submissions to the DSA Transparency Database (the European Commission’s official database at https://transparency.dsa.ec.europa.eu/) place it in the top tier of VLOPs, at the billion scale (exact figure per live database statistics). This means TikTok maintains a structured, machine-readable record for every content-moderation decision, a mechanism layer well beyond anything publicly disclosed inside China.
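To make the “machine-readable record per decision” point concrete, here is a minimal sketch of what processing such Statement-of-Reasons records looks like. The field names and values below are approximations modeled on the DSA Transparency Database’s public schema, not its exact API contract, and the sample records are invented for illustration:

```python
import json
from collections import Counter

# Two invented SoR-style records; field names loosely follow the DSA
# Transparency Database schema (illustrative, not the official contract).
RAW = """
[
  {"platform_name": "TikTok",
   "decision_ground": "DECISION_GROUND_INCOMPATIBLE_CONTENT",
   "category": "STATEMENT_CATEGORY_ILLEGAL_OR_HARMFUL_SPEECH",
   "automated_decision": "AUTOMATED_DECISION_FULLY"},
  {"platform_name": "TikTok",
   "decision_ground": "DECISION_GROUND_ILLEGAL_CONTENT",
   "category": "STATEMENT_CATEGORY_PROTECTION_OF_MINORS",
   "automated_decision": "AUTOMATED_DECISION_NOT_AUTOMATED"}
]
"""

def tally_automation(records):
    """Count moderation decisions by automation level, the breakdown a
    DSA Art. 15 report must disclose (automated vs. human review)."""
    return Counter(r["automated_decision"] for r in records)

records = json.loads(RAW)
print(tally_automation(records))
```

Because every decision carries structured fields like these, aggregate disclosures (automated-vs-human ratios, per-category totals) can be recomputed by any third party directly from the database, which is what distinguishes this layer from narrative-only reporting.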
Art. 34 / 35: Systemic Risk Assessment (SRA)
Published annually and co-issued with an independent third-party auditor (TikTok selected Kroll + law firm A&O Shearman in 2024). The 2025 SRA notably discloses:
- Detection and takedown of pro-Russia manipulation accounts around the 2024 European Parliament election, plus a post-mortem of the Romanian presidential election
- The prevalence and mitigation of AI-generated synthetic media in political contexts
- Algorithmic-recommendation adjustments for content related to minors’ mental health
The European Commission’s opening of formal proceedings under DSA Art. 66 against TikTok Lite in 2024-04 forced the SRA’s disclosure depth to step up: a paradigmatic case of ByteDance’s international-side transparency being pulled up by “hard law.”
4. TikTok Government Requests Report
Semi-annual since 2019. Discloses:
- Information requests (number of user-information requests from law-enforcement agencies in each country, and fulfillment rate)
- Legal removal requests (non-DSA takedown orders)
- National security / emergency requests (U.S. FISA and other countries’ emergency-request disclosure norms)
There is an ongoing tension in the U.S. government-requests disclosures: TikTok has committed that, under the Project Texas architecture, U.S. user data is hosted by Oracle and operated independently by USDS (U.S. Data Security), but that architecture’s actual transparency has been questioned by CFIUS, Congress, and independent researchers. After PAFACA (the Protecting Americans from Foreign Adversary Controlled Applications Act) was passed in 2024, the credibility of Project Texas was fully aired in TikTok v. Garland at the U.S. Supreme Court.
5. Project Clover / Project Texas: transparency architectures vs. legal shutdown
| Project | Jurisdiction | Objective | Transparency element |
|---|---|---|---|
| Project Clover | Europe / UK | European user-data localization + independent third-party oversight | Three European data centers (two in Dublin, one in Hamar, Norway); NCC Group as independent security auditor |
| Project Texas | U.S. | U.S. user data hosted by Oracle + USDS independent operation | Oracle source-code review; USDS independent governance; the architecture was not fully accepted by CFIUS |
PAFACA enforcement pressure in 2024–2025 meant that even after the Supreme Court upheld the law in TikTok v. Garland, Project Texas could not prevent the divest-or-ban mandate from taking effect, and ByteDance was pushed into back-and-forth negotiations between “selling TikTok’s U.S. assets” and “shutdown.” It is a case study in how no amount of transparency engineering can overcome geopolitical risk.
Rebecca Arcesati (MERICS) comments that TikTok’s transparency investment is among the most intensive of any global social platform, but that when the regulator’s goal is geopolitical security rather than information transparency, transparency engineering cannot produce the desired trust.
6. CapCut / Doubao / Coze: structural disclosure differences
CapCut
Has published a semi-annual Community Guidelines Enforcement Report since 2024, with narrower scope than TikTok’s but with a dedicated AI-generated-content section (significant because CapCut is the world’s largest AI-video creation tool, giving the disclosure public-interest value).
Doubao / Coze / Volcano Engine (domestic)
No standalone transparency reports have been published. Disclosure reaches the public only indirectly:
- CAC algorithm-filing roster: list of filed products (filing number, entity, filing date, without content).
- CAC typical-case roster: in 2024-11 the CAC published for the first time “Typical Cases of Generative AI Algorithms,” listing ByteDance’s Doubao as one of the “compliance exemplars” (but without public case details).
- MIIT cybersecurity compliance notices: periodic enforcement announcements occasionally touching on ByteDance products, but without naming specific features.
- ByteDance Corporate Social Responsibility Report: covers content-moderation totals, minor protection, rural-education investment, etc., but disclosure on “algorithmic governance / model safety” is shallow.
This is what Kendra Schaefer and Matt Sheehan call “national archiving” rather than “public transparency”: the primary audience for disclosure is the regulator, with the public only a secondary audience.
7. Academic critique
International scholarship
- Ranking Digital Rights (RDR): TikTok’s Corporate Accountability Index score has risen significantly relative to 2022, but it still lags Google, Meta, and Microsoft on “Governance / Freedom of Expression” (precise scores per RDR’s annual report).
- Knight Institute (Columbia): in its TikTok v. Garland amicus brief (2024), it held a reserved position on whether “TikTok’s transparency is sufficient to dispel national-security concerns,” pointing to insufficient audit independence in Project Texas.
- EFF, in its Who Has Your Back?-style evaluations, rates TikTok’s government-request disclosures significantly improved from prior years, but the structural transparency of USDS remains low (precise scores per EFF’s annual report).
- EDRi (European Digital Rights): TikTok’s DSA report is exemplary in the machine-readable handling of Statements of Reasons, but SRA independence—dependent on a single third-party auditor (Kroll)—is still insufficient.
- Helen Toner / Paul Triolo: the domestic transparency of Chinese frontier-model firms falls below international peers both because of the regulatory architecture (filing in place of reporting) and because firms themselves prefer not to increase legal exposure.
Chinese scholarship
- Zhang Linghan: the legal sources of domestic “platform transparency duties” are scattered (Algorithm Recommendation Provisions Art. 12, Personal Information Protection Law Art. 24, Data Security Law Art. 21, etc.), and no statute mandates an independent transparency report, so firms’ disclosure incentives are far weaker than under the EU DSA.
- Dai Xin: proposes the dichotomy “regulatory transparency vs. public transparency”: the former is oriented to what regulators can see, the latter to what the public can see. China currently sits closer to the former.
- Zhu Yue: the CAC’s “typical cases roster” is a state-led exemplary disclosure, analogous to Western third-party rankings, but with the selection authority held by the regulator rather than by society.
- Wu Hong: for ByteDance, strengthening domestic transparency is not necessarily in the company’s interest— no domestic peer discloses aggressively, so unilaterally raising the bar effectively creates a compliance baseline for itself.
8. Operational insight: why is ByteDance “overseas > domestic”?
Three compounding causes:
- Different legal intensity: DSA Art. 15 / 24 / 34 / 35 + U.S. FISA + California AB 587 + various state laws form a dense hard-law regime; domestically there is no hard-law mandate for transparency reports (filing in place of reporting).
- Different disclosure risk: publishing specific moderation data inside China may trigger “improper disclosure of national content-management information”; legal risk to the firm is high, while publishing internationally is compliance-necessary and reputationally positive.
- Capability spillover already exists: TikTok Trust & Safety has built the machine-readable SoR capability; the Doubao China edition could reuse it at low cost even without a legal mandate. The tech is ready; the institutions are not.
9. 2025–2026 Q1 developments
- 2025-09: TikTok published its 2025 H1 DSA Transparency Report, which for the first time reported AI-generated content misidentification rates to two decimal places.
- 2025-11: TikTok published its 2025 Systemic Risk Assessment, independently audited by Kroll, covering a detailed post-mortem of the 2024 European election period and AI-generated synthetic-media risk.
- 2025-12: TikTok reached a first-batch draft agreement with the European Commission on researcher data access under DSA Art. 40.
- 2026-02: CapCut published its H2 2025 Enforcement Report, including a dedicated section on AI-effects abuse.
- 2026-03: ByteDance Group published its 2025 Corporate Social Responsibility Report, with no standalone AI-governance chapter.
- 2026-03: TikTok published its 2025 H2 DSA Transparency Report; cumulative SoR submissions to the DSA Transparency Database remain at the VLOP top tier (precise figure per the database).
- 2026-04: as a result of the Anthropomorphic Interactive Services Measures, Doubao is expected to produce the first semi-public domestic governance disclosure in 2026 H2 (in the form of filing changes + typical cases).
10. Related index
- Top-level rules: Generative AI Interim Measures · Algorithm Recommendation Provisions · EU DSA (external)
- Peer comparison: Anthropic / transparency-report · OpenAI / transparency-report · Google / transparency-report
- Adjacent company pages: usage-policy · safety-framework · red-team-disclosures