Welcome to the May edition of Schoenherr's to the point: technology & digitalisation newsletter!
We are excited to present a selection of legal developments in the area of technology & digitalisation in the wider CEE region.
Poland is one step closer to implementing the European Electronic Communications Code ("ECC"). On 7 May 2024, the Polish Council of Ministers adopted a draft Law on Electronic Communications ("LEC"), which is intended to align Polish law with EU regulations. Initial legislative work on this act began in 2020 but was ultimately unsuccessful.
It is worth highlighting that the deadline for implementing the ECC was 21 December 2020. Poland was not the only country to miss this deadline: Spain, Croatia, Latvia, Lithuania, Ireland, Portugal, Romania, Slovenia and Sweden also failed to meet the obligation. As a result, in 2022 the European Commission brought cases against these Member States before the Court of Justice of the European Union (CJEU). On 14 March 2024, the CJEU ordered Poland to pay a lump sum of EUR 4m plus a penalty of EUR 50,000 per day from the date of the judgment until the infringement is remedied (i.e. the ECC provisions are implemented).
As a reminder, the ECC aims to:
- promote connectivity and access to, and take-up of, very high capacity networks;
- promote competition in the provision of electronic communications networks and services;
- contribute to the development of the internal market; and
- promote the interests of EU citizens, including through enhanced consumer protection.
The final adoption of the LEC will strengthen the consumer's position vis-à-vis the service provider at the pre-contractual stage and at the service provision stage. Service providers will have to provide consumers with a summary of the terms and conditions of the contract following a uniform template across the European Union. In addition, consumers will have access to a tool allowing them to compare offers from service providers and choose the best option from their perspective. The tool is intended to be developed by independent entities. However, if such a tool is not developed, the proposed regulations mandate that the Polish Office of Electronic Communications create and launch the tool itself.
The Polish Financial Supervision Authority recently submitted a questionnaire to financial entities covered by the Digital Operational Resilience Act (DORA).
The authority requires financial entities to indicate the extent to which they fulfil the requirements under DORA. On this basis, the authority will assess how ready supervised entities are to manage the risks associated with the technologies they use.
The authority's questions address issues such as:
Financial entities must respond to 200 questions. For each question, they should indicate their level of compliance with the requirement, describe how the requirement has been implemented, or set out the activities planned to fulfil it.
The questionnaire can help financial entities assess the status of DORA implementation in their organisation and identify areas for improvement. DORA applies from 17 January 2025.
On 24 May 2024, the EDPB adopted an Opinion on the use of facial recognition technologies by airport operators and airlines to streamline passenger flow at airports.
"More and more airport operators and airline companies around the world are piloting facial recognition systems allowing passengers to go more easily through the various checkpoints," commented by EDPB Chair Anu Talus. "It is important to be aware that biometric data are particularly sensitive and that their processing can create significant risks for individuals. Facial recognition technology can lead to false negatives, bias and discrimination. Misuse of biometric data can also have grave consequences, such as identity fraud or impersonation. Therefore, we urge airline companies and airport operators to opt for less intrusive ways to streamline passenger flows, when possible. In the view of the EDPB, individuals should have maximum control over their own biometric data."
The Opinion evaluates the compatibility of processing with the storage limitation principle, the integrity and confidentiality principle, data protection by design and default, and security of processing.
The EDPB assessed the compliance of processing passengers' biometric data using four different storage solutions, ranging from those that store the data only with the individual to those relying on centralised storage architectures. In all cases, only the biometric data of passengers who actively enrol and consent should be processed.

Additionally, the EDPB adopted a report on the work of the ChatGPT taskforce. This taskforce, created by the EDPB, aims to promote cooperation among DPAs investigating the chatbot developed by OpenAI.
The report offers preliminary views on several aspects discussed among DPAs and does not prejudge each DPA's ongoing investigation. It examines various aspects concerning the common interpretation of GDPR provisions relevant to ongoing investigations, including:
The report underscores the importance of allowing data subjects to exercise their rights effectively. Taskforce members also developed a common questionnaire for exchanges with OpenAI, included as an annex to the report.
On 7 May 2024, the European Media Freedom Act (EMFA) entered into force. The EMFA introduces a new set of regulations to safeguard media pluralism and independence within the EU. These rules aim to facilitate the operation of both public and private media across borders within the EU internal market, free from undue pressure, while also considering the digital transformation of the media landscape. The EMFA aims to (i) protect editorial independence, (ii) protect journalistic sources, including against the use of spyware, (iii) ensure the independent functioning of public service media, (iv) enhance transparency of media ownership, (v) safeguard media against unjustified online content removal by very large online platforms, (vi) introduce a right of customisation of the media offer on devices and interfaces, (vii) guarantee transparency in state advertising for media service providers and online platforms, (viii) ensure Member States provide an assessment of the impact of key media market concentrations on media pluralism and editorial independence, and (ix) boost transparency in audience measurement for media service providers and advertisers.
The obligations established in the EMFA are addressed to (i) Member States, (ii) national authorities and regulators, (iii) media service providers, (iv) providers of very large online platforms (VLOPs), (v) manufacturers, developers and importers of devices or user interfaces that control access to media services, and (vi) providers of audience measurement systems.
As of February 2025, a newly established independent European Board for Media Services, consisting of representatives from national media authorities or bodies and supported by a Commission secretariat, will begin operating. The Board will promote the effective and consistent application of the EU media law framework, replacing the European Regulators Group for Audiovisual Media Services (ERGA) created under the Audiovisual Media Services Directive.
Generally, the newly introduced regime will apply as of 8 August 2025.
On 21 May 2024, the European Union finally adopted the Artificial Intelligence Act ("AI Act"). With the adoption of the AI Act, the European Union has become the first jurisdiction in the world to introduce comprehensive regulation of the use of artificial intelligence.
In a previous procedural step, the European Parliament adopted the AI Act by a majority in March 2024. When political agreement on the AI Act was reached in December 2023, only optimists expected a straightforward path to the adoption of the legislation. In the end, the long-awaited act was adopted within six months.
The AI Act is unique not only because no other jurisdiction in the world has introduced similar regulation, but also because it takes a risk-based approach, differentiating AI systems according to the potential risk they pose to users.
In addition, the provisions of the AI Act further stipulate the establishment of new bodies dealing with AI. These include (i) an AI Office within the Commission to enforce the common rules across the EU, (ii) a scientific panel of independent experts to support the enforcement activities, (iii) an AI Board with Member States' representatives to advise and assist the Commission and Member States on consistent and effective application of the AI Act, and (iv) an advisory forum for stakeholders to provide technical expertise to the AI Board and the Commission.
The fines for infringements of the AI Act are set as a percentage of the offending company's global annual turnover in the previous financial year or a predetermined amount, whichever is higher.
In the context of the above news, the release of the newest version of GPT, GPT-4o, is also noteworthy. GPT-4o offers features that were not available in GPT-3.5, such as data, file and image analysis, access to websites and the use of custom GPTs. The catch is that access to these features will be limited. Nevertheless, users will likely soon be able to enjoy the benefits of the next generation of ChatGPT on a permanent basis.
The AI Act will become applicable two years after it enters into force, with some exceptions. These include the prohibitions on AI practices posing an unacceptable risk (six months), the finalisation of the codes of practice (nine months) and the obligation to provide information on how the competent authorities and single points of contact can be contacted (12 months).
In 2022, the Austrian Constitutional Court (VfGH) annulled the comprehensive media privilege of Section 9 DSG, which exempted media companies from most data protection obligations under the GDPR. The court ruled that the media privilege was too broad and violated the fundamental right to data protection, and held that the legislator must strike a balance between the fundamental rights to freedom of expression and information and the right to data protection. A new draft law aims to achieve this balance by introducing a nuanced data protection media privilege and limiting data subject rights in the context of data processing for journalistic purposes.
The new draft law, which is expected to enter into force in July 2024, proposes a data protection "editorial secret" that goes beyond the normal editorial secret stemming from the Austrian Media Act. The draft law protects data processed for journalistic purposes ("editorial secret") from data requests by data subjects, third parties, and even the supervisory authority. The draft law also imposes some restrictions on data subject access rights, such as requiring individual justification, charging a processing fee and allowing for the rejection of access requests if the editorial secret opposes them. Additionally, the rights to rectification, erasure and restriction of processing are largely to be excluded. In return, the protection of personal data is to be ensured through the enforcement of general obligations, particularly the processing principles under data protection law, duties of responsibility prior to data processing and data security obligations.
This means that journalists will be largely independent of the GDPR in their work and that most data subject rights cannot be exercised if personal data is processed for journalistic purposes. The supreme courts (particularly the ECJ) may soon have to decide whether this legal privilege is again too broad or whether the legislator has found a suitable balance between the fundamental rights to data protection, freedom of expression and information.
Piotr Podsiedlik, Junior Associate, Poland