C4C’s Perspective on the EU AI Act: Copyright in Real Life is Messy and AI Discussions Are Not Helping

Cross-posted from C4C’s LinkedIn page – see the original article

The trilogue black box effect combined with end-of-term pressure

The AI Act has reached that black box moment referred to as the trilogue, where the three EU institutions enter a room to make a deal. Two elements might affect this process, and not necessarily in a way that allows for a reasonable and proportionate outcome:

  • One, as the tenures of this European Commission and European Parliament are coming to an end, there is a greater risk that a deal is made at all costs. Add to that the EU’s love of “setting the standard” for the rest of the world (whatever that could possibly mean) and you can sense that the AI Act might be negotiated with a misplaced sense of urgency.
  • Two, the creative industries are voicing their concerns more and more about the possible threats AI could pose to their current business model, a situation which, especially in Europe, tends to generate knee-jerk reactions from policy makers at the expense of careful assessment.

The copyright creep into the AI Act discussion

The AI Act as initially proposed did not contain any copyright references. This was entirely justified, as the Directive on Copyright in the Digital Single Market had just been adopted and comprises two provisions covering text and data mining (TDM) – Article 3 covering research organisations and cultural heritage institutions, and Article 4 covering all other uses. TDM is defined in a sufficiently broad manner to cover many known machine learning processes (“any automated analytical technique aimed at analysing text and data in digital form in order to generate information which includes but is not limited to patterns, trends and correlations”).

This was confirmed in an answer by Commissioner Thierry Breton to a parliamentary question from MEP Emmanuel Maurel, where he stated that the Copyright Directive applied to AI and that hence “the creation of art works by AI does not deserve specific legislative intervention”.  

Nobody knows what is copyrighted and what isn’t: do not ask AI to deliver transparency on an opaque status quo

So what was added by the European Parliament?

Under Article 28b, the current Parliament position requires providers of generative AI models to document and share a “sufficiently detailed” summary of the use of training data protected under copyright law.

Simple, no? Actually, absolutely not. No one knows what is copyrighted or not. Copyright is not vested in a work through a deliberate act like registration: it is bestowed on any creation that meets the requirements of copyright laws, and those requirements may vary from one country to another. One of those requirements is originality, a threshold that has led to many lengthy court cases and that is in no shape or form something a web crawler or automated tool could identify.

Of course, when you feed the whole Harry Potter series into an AI training data set, you probably know it is copyrighted. But what about a drawing made by a child and posted proudly by one of their parents on social media, or a poem they wrote at school and got that perfect grade on? That is likely to be worthy of copyright too. Or not. But we just don’t know.

There is no register of copyrighted works and hence there is no way to list separately which of the elements in your data set are copyrighted. For this reason, any transparency obligation that creates a subset requirement for copyrighted works is a recipe for compliance failure through no fault of the entity trying to comply, and hence for legal uncertainty.

Or as aptly stated by the COMMUNIA Association: “AI developers also should not be expected to know which of their training materials are copyrightable. Introducing a specific requirement for this category of data adds legal complexity that is not needed nor advisable”.

And that is a very different compliance requirement to one that would try to enable the reservation right given to rightholders under the commercial TDM provision of Article 4 of the Copyright Directive, as explained below.

Generative AI does not necessarily use all the data in a data set to train a model or generate something

Machine learning is about collecting huge data sets, cleaning them up, chopping them into small parts referred to as tokens, splitting them into training and test data, and using them to train a model that responds to prompts. Depending on the prompt, different tokens can be relevant while others might be completely disregarded.
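
As a rough illustration of the pipeline just described, a minimal sketch of the “clean, tokenise, split” steps might look as follows. The document texts, the whitespace tokeniser and the 80/20 split below are our own simplifying assumptions for illustration only, not a description of how any real generative AI system is built:

```python
# Toy sketch of the pipeline described above: collect -> clean -> tokenise
# -> split into training and test data. All names and the whitespace
# tokeniser are simplifying assumptions for illustration only.
import random

documents = [
    "The quick brown fox jumps over the lazy dog.",
    "A drawing made by a child and posted proudly on social media.",
    "A poem written at school that got that perfect grade.",
]

def clean(text: str) -> str:
    # Crude clean-up: lowercase and drop punctuation.
    return "".join(ch for ch in text.lower() if ch.isalnum() or ch.isspace())

def tokenise(text: str) -> list[str]:
    # Chop the text into small parts ("tokens"); real systems use subword
    # tokenisers, but plain whitespace splitting is enough to make the point.
    return text.split()

# One big token sequence built from the whole data set.
tokens = [tok for doc in documents for tok in tokenise(clean(doc))]

# Split into training and test data (here 80% / 20%).
random.seed(0)
random.shuffle(tokens)
cut = int(0.8 * len(tokens))
train_data, test_data = tokens[:cut], tokens[cut:]

print(f"{len(train_data)} training tokens, {len(test_data)} test tokens")
```

Even in this toy version, what the training process ultimately sees is a shuffled stream of tokens rather than the original works as such, which is the practical backdrop to the legal point below.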

When it comes to generative AI, as explained by Dr Andres Guadamuz, Reader in Intellectual Property Law at the University of Sussex, “the most important takeaway from the perspective of a legal analysis is that a generative AI does not reproduce the inputs exactly, even if you ask for a specific one” and “style and a ‘look and feel’ are not copyrightable” as “copyright protects the expression of an idea, not the idea itself”.

For those who think they hold copyright, a mechanism exists on paper: maybe it should be made to function in reality

The Copyright Directive has created a reservation right (also referred to as opt-out) for rightholders wishing to express that their content should not be used in commercial TDM activities.

The conclusion to this debate has been perfectly summarised by Assistant Professor João Pedro Quintais from the Institute for Information Law (IViR): “the type of transparency that is useful is one that allows copyright holders to access datasets in order to exercise their opt-outs. It is unclear how the present text would enable that, since it imposes a requirement that cannot be met in practice”.

More practically: remove the copyright references from the AI Act, so that a horizontal measure does not get polluted by sector-specific measures, and start working on a practical implementation of the TDM provisions in the Copyright Directive, with input from all relevant stakeholders.


C4C’s perspective on Article 17 of the European Media Freedom Act (EMFA)

The Coalition for Creativity (C4C) is concerned about the developments around Article 17 in the European Media Freedom Act (EMFA). This provision, initially introduced as an unwarranted media privilege by the European Commission in the EMFA, is turning into a fully-fledged media exemption.

Our take on what policymakers should do: reject Article 17; this provision cannot be salvaged or improved and should hence be deleted

Hence, we are pleased to see that Renew Europe MEPs Monica Semedo (Luxembourg) and Anna Julia Donath (Hungary) tabled an amendment to delete Article 17 EMFA (see AM 967) to the draft EP Culture and Education (CULT) Committee EMFA report by Rapporteur MEP Sabine Verheyen. They consider that Article 17 EMFA should be deleted for numerous reasons, and highlight three of them:

  1. equality of freedom of speech
  2. harmonisation of horizontal rules
  3. the issue of media declaration

Check C4C’s video above and infographic below on what’s wrong with Article 17 EMFA.

C4C considers that the DSA and the P2B Regulation suffice and believes that Article 17 raises multiple questions to solve a non-existent problem:

There should be no privilege: (1) for certain actors, as making some actors more important means the speech of others is automatically of less value; (2) for media, as this creates an easily exploitable loophole in the fight against disinformation; (3) for media on VLOPs as this creates a provision resting on the wrong scope at both ends; and (4) as such a privilege is extremely difficult to combine with the multiple other obligations platforms have.

The slippery slope from media privilege to media exemption/must-carry is a path that has already been clearly condemned by EU legislators in the recent DSA debates.

This concern is confirmed by a recent FT op-ed from Anika Collier Navaroli, a practitioner fellow at Stanford University’s Digital Civil Society Lab and a former senior content moderator at Twitter, on how ‘disinformation dangers lurk in the EU’s media freedom act’. She cautions that “the DSA’s attempt to address systemic risks such as disinformation may be undermined before they have a chance to work.”

More resources:

What’s Wrong with Article 17 EMFA – Infographic



C4C’s Comments on the EC’s Exploratory Consultation on the Future of the Electronic Communications Sector and its Infrastructure

At the end of February 2023, the European Commission launched a highly anticipated exploratory public consultation on “the future of the electronic communications sector and its infrastructure” that notably looks into the fair share / network fees debate – deadline: 19 May, midnight Brussels time (CEST).

C4C’s response to this consultation can be found here and below. Our answers are limited to ‘Section 4. Fair contribution by all digital players’ (see p. 36ff). Similar views have been expressed by various other stakeholders, including civil society.

Why it matters: This discussion, which has mainly been framed as Big Tech vs Big Telco, has a broader impact, as it risks undermining net neutrality and turning Internet users and other stakeholders, such as startups, into collateral damage. For example, any cost increases for content providers are likely to be passed on to users, and any cost increases for major cloud providers will be passed on to their customers.


C4C Co-Signs Open Letter on the European Media Freedom Act (EMFA) to the European Parliament

Along with over 40 organisations, C4C co-signed an open letter on the European Media Freedom Act (EMFA) addressed to the European Parliament.

These signatories call on the Members of the European Parliament involved in the debate to ensure that the EMFA will fulfill its potential and become a strong regulation establishing harmonised legal safeguards across Europe.

Media freedom and pluralism are preconditions of a stable democracy and the rule of law.

What are some of the issues at stake?

  • Many journalists throughout the EU find themselves in increasingly captured media markets, where independent voices are sidelined due to economic pressure;
  • Journalists and media outlets struggle for financial viability;
  • Critical media outlets are often vilified by politicians;
  • Access to financial resources is limited; and,
  • Politically influenced media takeovers create an imbalanced media system.

The EMFA should address these issues and protect free media by outlining European standards on:

  • freedom from editorial interference and illegal surveillance;
  • guaranteeing media pluralism;
  • providing greater transparency on ownership and state financing;
  • ending the abuse of state funds to create clientelist media;
  • protecting the independence of public service media and the independence of national regulatory authorities; and,
  • ensuring there is an economic environment in which independent journalism can flourish.

C4C’s Comments on the European Media Freedom Act (EMFA)

In this position paper, C4C expresses its belief that Articles 17 and 18 of the European Media Freedom Act (EMFA) cannot be salvaged or improved and should hence be deleted for the following reasons:

  1. The DSA and the P2B Regulation suffice and there is no need for a specific media content moderation privilege: the ink on the DSA is barely dry, yet another unnecessary layer of complexity could be added with Article 17 of the EMFA, and creating a narrower right applicable to fewer platforms than the P2B Regulation makes no sense.
  2. Articles 17 and 18 of the EMFA raise multiple questions to solve a non-existent problem. There should be no privilege: (1) for certain actors, as making some actors more important means the speech of others is automatically of less value; (2) for media, as this creates an easily exploitable loophole in the fight against disinformation; (3) for media on VLOPs, as this creates a provision resting on the wrong scope at both ends; and (4) as such a privilege is extremely difficult to combine with the multiple other obligations platforms have.
  3. The introduction of these articles in the EMFA brings us back onto the slippery slope from media privilege to media exemption/must-carry, a path that has already been clearly condemned by EU legislators.

Similar views have been expressed by a group of civil society organisations.

C4C shared this position in its response to the European Commission’s feedback opportunity on the EMFA proposal.


C4C Statement on the European Media Freedom Act (EMFA)

Following the presentation of the European Media Freedom Act by the European Commission on 13 September, the Coalition for Creativity (C4C) would like to express its concerns about a ‘media exemption’ popping up again and the lack of proper transparency rules for the media industry.

Comeback of the ‘Media Exemption’ (Article 17)

We believe that the treatment of media content by online platforms is a complex and sensitive issue that merits a proper debate and a thorough evaluation. Despite the clear rejection of the strong calls for a media exemption during the DSA and DMA debates, this demand is making its comeback through the EMFA, an instrument that clearly identifies media capture as a problem the EU is not spared from. This risks having a detrimental impact on access to information and media pluralism, while possibly jeopardising the EU’s efforts against disinformation.

Ms Caroline De Cock, C4C Coordinator, said:

“Attributing new privileges to media organisations without awaiting the implementation and impact of the DSA provisions, would at the very best be premature and ill-thought, and at the very worst could open Pandora’s box of issues related to the identification of which media outlets would benefit from it as well as possible claims from other actors to benefit from similar privileges. One would hope all users can benefit from due process in content moderation.”

Missing Element: Strong Transparency

The proposal appears to be very light in terms of ensuring media ownership transparency, despite this being one of the crucial elements that civil society stakeholders have been calling for.

Ms Caroline De Cock, C4C Coordinator, said:

“Brussels has been big on transparency, so the failure to put forward a strong framework to unravel the financial streams in the media industry is disappointing. EU policymakers need to urgently facilitate public scrutiny of these money flows, given the media’s role as watchdog of democracy and in shaping public opinion.”


C4C Joins Opposition Against Revised US Journalism Competition and Preservation Act

On 2 September, C4C joined over 20 organisations in opposing the revised US Journalism Competition and Preservation Act (JCPA) and warning the US Congress against adopting this Act. The battles fought over issues such as the ‘link tax’ during the negotiations on the EU Directive on Copyright in the Digital Single Market (DCDSM) are popping up again in the context of the JCPA in the US. Just as during the DCDSM discussions, we strongly oppose any attempts to curtail freedom of expression and access to information, and continue to believe that these misguided approaches fail to benefit and support smaller media outlets. See EFF’s campaign against the US link tax.

The full text of the letter can be found here and below.

Other relevant resources:


CJEU Dismisses Polish Action Against Article 17 DCDSM But Stresses Importance of User Safeguards

Brussels, BELGIUM – Today, 26 April, the Court of Justice of the European Union (CJEU) issued its highly anticipated ruling (see the CJEU press release and judgement) in the case initiated in 2019 by the Polish government seeking the annulment of Article 17(4)(b) and (4)(c) of the Directive on Copyright in the Digital Single Market (DCDSM) (Case C-401/19).

The following statement can be attributed to Ms Caroline De Cock, Coalition for Creativity (C4C) coordinator:

“It is reassuring to see the attention the Court has given to the many safeguards put in place by the EU legislators in Article 17 in order to ensure the respect of the right to freedom of expression and information of users of platforms qualified as OCSSPs. This emphasis, as well as the reiteration of the general monitoring prohibition, must now be reflected in the national transpositions of Member States, which must deliver the fair balance between fundamental rights that the Court considers crucial for Article 17 to be deemed legal.”

C4C is a broad-based coalition that seeks an informed debate on how copyright can more effectively promote innovation, access, and creativity. C4C brings together libraries, scientific and research institutions, digital rights groups, technology businesses, and educational and cultural heritage institutions that share a common view on copyright.

For press inquiries on this please contact Ms De Cock at secretariat@coalition4creativity.org or +32 474 84 05 15.

[Note: We are still analysing the judgement and reserve the right to update.]


The CJEU ruling summarised in one image

Other relevant resources:


C4C Statement on the Political Agreement on the EU Digital Services Act (DSA)

Following the political agreement reached between the EU institutions during the 5th and final trilogue negotiations on the Digital Services Act (DSA) on 22-23 April, the Coalition for Creativity (C4C) would like to express its initial disappointment about potential pitfalls in this regulation.

Ms Caroline De Cock, C4C Coordinator, said:

“The DSA political agreement lacks user safeguards and continues to push vague concepts open to dangerous misinterpretations. The pursuit of a quick deal under the French Council Presidency, ahead of the 2nd round of the Presidential elections, at the expense of upholding fundamental rights is questionable at best and counterproductive at worst.

Last-minute attempts by certain stakeholders to transform search engines into dangerous censorship tools were thankfully deflected, even though one has to wonder how these repeated jabs at freedom of speech are even allowed to pop up.

Once again, users have been left out and sidelined in a debate on important concepts, such as preserving their freedom of expression and access to information online. One can only hope that the DSA will not end up empowering less democratic regimes to exercise control over the speech of citizens outside of their Member State borders.

The EU has once more failed in truly stepping up to the plate in its role as a global actor and a guardian of fundamental rights. One can only fear the ripple effects the DSA will have on the policy agendas of other geographies, as has been notably evidenced following the adoption of the Directive on Copyright in the Digital Single Market and national legislation, such as the German NetzDG.

This is notably the case with the introduction of a crisis mechanism that could potentially be easily misused.

The expected compliance burden will be so high that the DSA risks consolidating Big Tech’s position to the detriment of smaller players, annihilating any positive market effects stemming from the DMA.

It remains to be seen if the SME exemption recognizes the need to handle not-for-profit players, such as educational and scientific repositories, digital archives and libraries, differently, or merely creates a gaping loophole to the sole benefit of commercial entities.

Finally, C4C believes that in fine-tuning and applying the rules in practice, the European Commission and national regulators will need to ask and listen to the input of civil society, especially as it pertains to key fundamental rights such as freedom of speech and access to information.”


C4C-Liberties Joint Op-Ed ‘Ukraine: The DSA cannot let filters blind us to war crimes’

On 19 April, Eva Simon, senior advocacy officer at the Civil Liberties Union for Europe (Liberties), and Caroline De Cock, C4C Coordinator, co-wrote an op-ed, published on EURACTIV, on the problems surrounding filtering, ahead of the crucial 5th trilogue on the Digital Services Act (DSA) on 22 April 2022. The op-ed has also been translated into German by EURACTIV. Highlighted quotes below.

Liberties (@LibertiesEU) is a European network of independent non-governmental organisations which aims to safeguard the human rights of everyone in the European Union. 

The Coalition for Creativity (@_C4C_) brings together libraries, scientific and research institutions, digital rights groups, technology businesses, and educational and cultural heritage institutions that share a common view on copyright. 


Ukraine: The DSA cannot let filters blind us to war crimes

The Digital Services Act (DSA) must protect our rights by including meaningful safeguards to uphold the fundamental rights of internet users, especially for those engaged in the indispensable work of documenting war crimes. 

Caroline De Cock & Eva Simon – ‘Ukraine: The DSA cannot let filters blind us to war crimes’ (EURACTIV, 19 April 2022)

In the case of the Syrian Archive, inaccurate automated tools removed the videos because these lacked understanding of linguistic or cultural nuances, and could not differentiate between journalistic resources and war propaganda. That contextual blindness still exists in today’s automated content moderation tools, and they still lead to situations where completely legitimate content is wrongfully made inaccessible.

We must learn from these mistakes, and ensure that journalists, activists, or anyone else can share their opinions or (video) evidence without the threat of it disappearing, simply because online platforms are coerced into implementing poorly working automation tools.

Caroline De Cock & Eva Simon – ‘Ukraine: The DSA cannot let filters blind us to war crimes’ (EURACTIV, 19 April 2022)

The DSA requires Big Tech companies to introduce risk analysis to predict and mitigate the negative effects on the exercise of fundamental rights, such as privacy, free speech, the prohibition of discrimination or the rights of minors. This should not be interpreted as requiring the mandatory use of automation.

Similarly, under no circumstances should co-regulatory measures such as the Code of Practice on disinformation impose pressure on online platforms to remove content so swiftly that it would necessitate the intensified deployment of automated tools.  

Caroline De Cock & Eva Simon – ‘Ukraine: The DSA cannot let filters blind us to war crimes’ (EURACTIV, 19 April 2022)

The DSA must protect our rights by including meaningful safeguards to uphold the fundamental rights of internet users. It must also maintain the prohibition of general monitoring obligations, be it automated or non-automated. Also, it should certainly not impose, either directly or implicitly, the use of mandatory upload filters or other content moderation automation.

Caroline De Cock & Eva Simon – ‘Ukraine: The DSA cannot let filters blind us to war crimes’ (EURACTIV, 19 April 2022)

The DSA must preserve users’ privacy online and the continued prohibition against general monitoring by online platforms is an essential element thereof. In parallel, the ability to use the internet anonymously and through encrypted services offers crucial safeguards against monitoring. Such safeguards are in the text proposed by the European Parliament but now need to be embraced by the Council too. 

Caroline De Cock & Eva Simon – ‘Ukraine: The DSA cannot let filters blind us to war crimes’ (EURACTIV, 19 April 2022)

The DSA is going through the final stages of the trilogue negotiations. As the European Parliament rightfully recognizes in its mandate, it is crucial to prohibit the mandatory use of upload filters. Such prohibition is the only way to avoid disproportionate limitations on access to information, freedom of expression and personal data protection.

Caroline De Cock & Eva Simon – ‘Ukraine: The DSA cannot let filters blind us to war crimes’ (EURACTIV, 19 April 2022)