Welcome to the March edition of Schoenherr's to the point: technology & digitalisation newsletter!
We are excited to present a selection of legal developments in the area of technology & digitalisation in the wider CEE region.
13 March 2024 stands as a remarkable date in history, as the EU Parliament greenlit the Artificial Intelligence Act, marking a decisive stride towards ensuring safety, upholding fundamental rights and fostering innovation throughout Europe. This pivotal regulation, hammered out with Member States in December 2023 and backed by MEPs with resounding support, imposes explicit obligations to shield fundamental rights, democracy and environmental sustainability, while propelling Europe into a leadership role in the AI arena.
This groundbreaking legislation not only paves the way for technological progress but also reaffirms the steadfast commitment to safeguarding individual rights and societal principles. By outlawing AI applications that pose unacceptable risks to citizens' rights and establishing stringent mandates for high-risk systems, the AI Act positions human dignity and European values at the forefront of AI advancement. It serves as a testament to our collective resolve in combating discrimination, ensuring transparency and championing the rights of workers and citizens.
Another significant development that demands attention is the recent surge in Bitcoin prices and the reinvigoration of the blockchain industry, poised to reignite investment in crypto startups. As the crypto-asset market gains momentum, numerous nations are aligning their legislative frameworks with the EU's Markets in Crypto-Assets Regulation (MiCA), with Poland actively engaged in this endeavour. Given the substantial number of virtual asset service providers registered in Poland, their keen interest in the shape of Polish regulatory oversight is evident.
Furthermore, this edition delves into various privacy-related issues, such as employers' responsibilities regarding the deletion of candidates' CVs post-recruitment process and the jurisdictional scope for addressing complaints concerning the processing of personal data by parliamentary investigative committees.
Additionally, we revisit the Digital Services Act (DSA), which introduces fresh obligations for hosting service providers, underscoring the imperative of adherence to regulatory standards to uphold user safety and privacy. As key intermediaries in the digital landscape, hosting service providers play a pivotal role in content dissemination, making compliance with DSA guidelines indispensable for fostering a secure online environment.
After lengthy negotiations, the European Parliament finally approved the AI Act on 13 March 2024. The adoption of the regulation is a milestone event that all EU countries and their citizens have been waiting for: the EU will be the first to introduce a comprehensive regulation dealing with the use of artificial intelligence. The majority of Members of the European Parliament voted in favour of the AI Act, with 523 in favour, 46 against and 49 abstentions.
The regulation that was passed maintained the most important principles of the AI Act, i.e. the classification of systems using artificial intelligence on the basis of the level of risk generated. In this case, four levels of such risk are distinguished: 1) unacceptable risk; 2) high risk; 3) limited risk; 4) minimal or no risk.
One of the additional goals of the AI Act is to prevent disinformation and fake news. Accordingly, content created by so-called generative AI (e.g. ChatGPT) has to be clearly labelled as such. Furthermore, the AI Act will establish a new body – the European Artificial Intelligence Board. National AI authorities will also have to be established in EU Member States to monitor compliance with the new regulations.
It is also worth noting that while regulations are directly applicable under EU law, in this case the provisions will also have to be partially implemented in the Member States within two years of publication of the regulation in the Official Journal of the EU.
Please find out more about the AI Act here: https://www.schoenherr.eu/capabilities/industries/technology-digitalisation/artificial-intelligence/
Worldcoin is a crypto/blockchain project that aims to create a global identity verification and financial network using iris scans as a unique identifier for its users ("proof of personhood"). Users who scan their irises with a device called the "Orb" receive a worldwide unique and irrevocable ID and free Worldcoin tokens, a digital currency. The Orbs are devices located in scanning centres operated by independent local service providers. The project was launched in 2023 by Sam Altman, the CEO of OpenAI, and has over four million users worldwide. However, Worldcoin has faced regulatory challenges and privacy concerns in various countries due to its collection and processing of biometric data.
One of the countries that has taken action against Worldcoin is Spain, where the data protection authority, AEPD, ordered the project to suspend its activities in the country. The AEPD issued a temporary ban, valid for up to three months, to stop Worldcoin from collecting and using personal data related to the iris scans. The AEPD based its decision on multiple complaints about Worldcoin, including providing insufficient information to users, collecting data from minors, and lacking a mechanism for users to withdraw consent. The AEPD also invoked "exceptional circumstances" to protect personal data and prevent irreparable harm to privacy rights.
The AEPD's intervention is in line with the GDPR, which requires a valid legal basis for processing personal data, especially sensitive data such as biometric data. Other European countries, such as Portugal, France and Germany, are also investigating Worldcoin's GDPR compliance, as is the UK Information Commissioner's Office.
Worldcoin has denied any unlawful activity and expressed a willingness to engage with regulators and clarify its technology. The company's Data Protection Officer has claimed that the AEPD is circumventing EU law and spreading misleading claims. However, Worldcoin has also faced service suspensions and investigations in other countries, such as Kenya, India, Brazil, Argentina and Hong Kong, where the Privacy Commissioner for Personal Data raided six premises controlled by Worldcoin.
Worldcoin's mission and the novel method of identity verification have attracted public interest and curiosity, but also raised questions and doubts about the privacy and security of (sensitive) personal data.
Recently, a groundbreaking ruling was issued in Poland by the Supreme Administrative Court regarding the matter of whether companies must delete candidates' CVs immediately after a recruitment process.
The whole case started with the would-be employee asking the recruiting company to remove her data from its database. The company refused, arguing that it needed to keep the data in case one of the candidates accused the company of treating candidates unequally. In addition, the company indicated that the data would be kept for the period required by the GDPR and the company's internal acts, but was unable to provide the exact period.
As a result, the would-be employee filed a complaint with the Personal Data Protection Office (PDPA). In response to the complaint, the President of the PDPA ruled against the company and imposed a warning on it for unlawfully processing the personal data of the would-be employee after the recruitment process had been completed and for failing to properly comply with the information obligation in relation to the acquisition of personal data during the recruitment process.
After receiving this decision, the company filed a complaint with the Provincial Administrative Court (PAC). The PAC ruled in favour of the company and overturned the decision of the PDPA. In its justification, the court stated, among other things, that the company, both as a data controller and as an employer, must comply with the Labour Code's prohibition of discrimination, not only during employment but also during the recruitment of new employees. It also pointed out that, according to the regulations, there is a limitation period for claims related to, among other things, discrimination. This period is therefore clearly limited by law. Thus, the company had a legitimate interest in storing the data of the would-be employee for a certain period of time (amounting to three years) for potential use in a possible lawsuit.
Against the above ruling, the PDPA filed a cassation appeal to the Supreme Administrative Court (SAC). The SAC agreed with the judgment of the PAC and dismissed the complaint of the PDPA. In its reasoning, it indicated that the company had a legitimate interest in storing the would-be employee's data for a legally defined period of time, as the candidate could bring a discrimination claim against the would-be employer.
In the first half of March, the President of the Personal Data Protection Authority (PDPA) responded to the Minister for European Union Affairs on whether the judgment of the Court of Justice of the European Union (CJEU) of 16 January 2024 in Case C-33/22 Österreichische Datenschutzbehörde will require relevant amendments to Polish law.
The facts of the aforementioned judgment were as follows. In 2018, an investigative committee was set up in Austria to investigate the possibility of political pressure on the Bundesamt für Verfassungsschutz und Terrorismusbekämpfung, which was replaced on 1 December 2021 by the Direktion Staatsschutz und Nachrichtendienst (Directorate for State Security and Intelligence Services, Austria). In the course of its work, the investigation committee interviewed an officer of the Federal Police brigade for combating crime on public roads as a witness. Despite the witness's request to anonymise his data, the committee disclosed it. The witness therefore filed a complaint with the Austrian national data protection authority (the "Authority") regarding the disclosure of his data.
The Authority dismissed the complaint on the grounds that, according to the principle of the separation of powers, the executive (i.e. the Authority) cannot exercise control over the legislative branch, to which the investigation committee was subordinate.
The witness then challenged the Authority's decision before the Federal Administrative Court, which upheld the complaint and annulled the decision, indicating that the Authority was legally competent to hear the complaint under Article 77 of the GDPR. The Authority brought an action against this ruling before the Administrative Tribunal, which asked the CJEU, inter alia, what entity has the competence to supervise the application of the GDPR by an investigation committee set up by the parliament of a Member State. In its judgment, the CJEU held that where a Member State has chosen under the GDPR to set up a single supervisory authority, but without conferring on it the competence to supervise the application of the GDPR by an investigation committee set up by the parliament of that Member State in the exercise of its power of scrutiny of the executive, the provisions in question directly confer on that authority the competence to hear complaints about the processing of personal data by that investigation committee.
Therefore, the Polish PDPA stated that, in the light of the CJEU judgment in question, it is to be assumed that the GDPR explicitly grants the PDPA President the jurisdiction to hear complaints regarding the processing of personal data by the investigation committee.
In addition, the PDPA highlighted that there is no need to amend the law in this regard. However, it will be necessary to take the insights provided by the above judgment into account when interpreting the provisions of the GDPR.
On 22 February 2024, the Supreme Audit Office in Poland (the "NIK") published the results of inspections carried out in several local government bodies. The inspections were triggered by a growing number of media reports on government officials using e-mail addresses in commercial domains for official purposes and processing personal data through them – a basic element of security. The NIK checked how local government units and their subordinate organisational units ensure the protection and correct processing of data, including personal data collected electronically on websites, via e-mail and in connection with sessions of legislative bodies.

The results were not satisfactory: the audit found years of negligence related to personal data protection, unawareness of risks and a lack of clear guidelines. Certain elements of the personal data protection system in local government units were in poor condition. The NIK's further analysis shows a high probability of similar irregularities in several thousand public units across the country that exchange correspondence via e-mail inboxes on a daily basis while using hosting and commercial domains. The analysis shows that 43 % of educational institutions, 32 % of public health care institutions and 28 % of social welfare centres use major e-mail providers in commercial domains, e.g. wp.pl, poczta.onet.pl, gmail.com, on a daily basis. The NIK has diagnosed the systemic nature of irregularities in the field of data protection and processing, including personal data, in local government units. The audit will therefore be expanded to include all local government units in Poland.
In recent weeks, a new EU regulation, the so-called Digital Services Act, has come into force (the "DSA"). The DSA applies to intermediary services offered to service recipients who are based or located in the EU, regardless of the location of the providers of these intermediary services.
In this edition of the newsletter, we will focus on hosting services. According to the provisions of the DSA, hosting services involve the storage of information provided by and at the request of the recipient of the service.
Given the broad definition of hosting services, many entities can be regarded as providers of such a service. For example, they can be:
The DSA imposes several obligations on the hosting provider, including:
In relation to the above duties, it is necessary, among other things, to:
The proposed law aims to align national regulations with EU Regulation (EU) 2023/1114 on crypto-assets, ensuring uniformity across Member States while fostering innovation, fair competition and investor protection. It introduces a comprehensive framework for crypto-asset markets, including provisions for issuing permits to crypto-asset service providers and regulating various token categories, such as asset-referenced tokens and electronic money tokens. The law designates the Financial Supervision Commission (KNF) as the competent authority responsible for supervising and enforcing regulatory compliance. It outlines obligations for token issuers and service providers, including reporting requirements to the KNF and granting the authority additional powers to request necessary information for oversight purposes.
Moreover, the law introduces supervisory measures to prevent violations, including the ability to suspend public offerings of crypto-assets and impose sanctions on offenders. These sanctions may include fines, arrest or imprisonment, depending on the severity of the offence. Additionally, the law addresses confidentiality in crypto-asset services, defining situations where disclosing professional secrets to designated authorities is permissible.
To implement these measures effectively, the proposed law necessitates amendments to various existing laws governing financial and regulatory matters. These amendments aim to streamline the regulatory process and ensure consistency with the overarching goals of the EU Regulation.
Overall, the proposed law seeks to create a robust regulatory framework for crypto-assets, balancing the need for innovation and market development with investor protection and financial stability. By harmonising regulations and empowering supervisory authorities, it aims to enhance consumer confidence, facilitate cross-border activities, and contribute to the competitiveness of EU Member States in global financial markets.
OpenAI, revered for its cutting-edge technology and leadership in the field of AI, unveiled "Sora", a groundbreaking text-to-video generator, just a few weeks ago. This innovative tool is capable of crafting videos of up to 60 seconds in duration, driven by written prompts and powered by generative AI. Sora's debut marks a monumental leap forward in the landscape of AI-driven content creation, ushering us into a realm poised for the advent of Artificial General Intelligence (AGI). Merely a month and a half ago, Google's Lumiere unveiled a similar concept, setting the stage for OpenAI's foray into this exhilarating technological domain, igniting a fervent buzz within the industry.
However, alongside Sora's remarkable array of user benefits, it also brings forth a set of intellectual property, privacy and data protection concerns. This dynamic landscape is characterised by a rollercoaster of legal challenges, replete with lawsuits, uncertainties and nuanced legal implications, poised to unfold as never before.
Sora will enable the creation of videos at scale, which could lead to the mass production of synthetic content such as deepfakes. Even though Sora is still not publicly available, with OpenAI reiterating that it wants to put safety guardrails in place before releasing it, it raises difficult issues regarding rights in likeness and voice (voice being a capability Sora will not have initially, but which will probably become available to users at some point in the future).
The technology also raises ethical questions, particularly around the creation of deepfake videos or misleading content. To address this, Sora's users will not be able to generate videos showing extreme violence, sexual content, hateful imagery or celebrity likenesses. There are also plans to combat misinformation by including metadata in Sora videos indicating that they were generated by AI; however, it is still unclear how this should be regulated. Here, we expect that the EU AI Act will come into play with transparency obligations for the providers of generative AI systems and impose watermarking or other labelling obligations for systems which, like Sora, can create synthetic content.
Other concerns relating to mimicking likeness and voice may also risk reputational harm or legal actions, such as fraud or defamation. It is also questionable whether consent has been obtained from the people whose name, image, likeness or other personal data were used for training or whether there is any other legal ground stemming from the GDPR for processing personal data collected to train models like Sora. The use of text-to-video AI models like Sora and Lumiere will doubtless raise many privacy concerns and likely lead to a high number of privacy-related claims and court cases.
The resolution of IP issues surrounding Sora remains ambiguous, primarily due to the absence of established legal precedents in this industry. However, it is imperative to recognise that Sora is not exempt from the typical IP-related challenges encountered by all AI technologies.
Akin to its AI counterparts, Sora undergoes training on expansive datasets, often scraped from the internet at large. This practice introduces considerable legal uncertainty regarding whether the content employed in AI training, as well as the resulting outputs, result in the infringement of IP rights. It is conceivable that Sora has been trained using copyrighted materials owned by third parties. Given Sora's capability to generate lifelike video content and even simulate entire video games, there exists a tangible risk of inadvertently producing materials that infringe upon earlier copyrights. OpenAI is already facing several proceedings on account of IP infringement, including lawsuits alleging copyright infringement and other intellectual property issues, such as those initiated by the New York Times, the Authors Guild, Raw Story Media, Intercept Media and others.
Of fundamental concern is the liability of generative AI developers, service providers, customers and end-users for IP infringement. Courts have yet to rule on how the existing copyright rules should apply to AI training processes. Is it appropriate to hold such entities accountable for IP violations and, if so, which entity in particular? Does the use of copyrighted materials for AI training purposes fall under an exception? Is any compensation due for such training? Should infringing models or outputs be destroyed?
These questions are not much different from those we have already been asking over the past year and a half. However, with Sora we might see a more voluminous infringement of other IP materials that were not seriously utilised in earlier AI models, such as photographs, trademarks and designs. The bits and pieces of Sora-created materials made publicly available seem to indicate that Sora relies heavily on the use of trademarks and designs (for products, scenery, shops, etc.). Furthermore, with the integration of sound and music elements into Sora's capabilities, fresh concerns pertaining to copyright protection and sound trademarks are poised to emerge. Finally, photographers may contend that segments of Sora-generated videos encroach upon their rights. While OpenAI has not divulged the precise origins of the data used to train Sora, it has disclosed that it uses publicly available videos licensed from copyright holders and anticipates addressing these concerns through licensing agreements or alternative contractual arrangements with intellectual property rights holders.
The question of authorship is equally significant. Prevailing legal norms stipulate that only natural persons may be deemed authors, and in the current legal landscape it is highly unlikely that AI-generated output can attract copyright protection. In practice, OpenAI will most likely (if it takes the same approach as with ChatGPT) assign users all rights to the output created through the AI. But such outputs may not qualify for copyright protection, which raises questions about the possibility of enforcing any copyright in this context.
Sora is still not publicly available and is expected to undergo a few procedures before becoming available to the general public. As with other AI tools, not only is Sora not safe from legal issues, but it may be even more open to potential lawsuits, as it seems to rely even more heavily on IP-protected materials (not just copyright, but also trademarks and designs). As we await decisions on the pending litigations, AI is rapidly evolving, and legislators need to closely follow the developments and act decisively to safeguard the interests of both users and IP owners. Establishing guidelines and safeguards to prevent misuse will be essential for maintaining trust in the technology and ensuring it benefits society. The era of AI is indisputably upon us. Whether you like it or not, it will evolve over time and has already become part of our reality.
We, Andreas and Niklas, participated in this year's Skinnovation conference from 13 to 15 March. The conference combined skiing and entrepreneurship and took place in the unique and beautiful setting of Innsbruck and Axamer Lizum. We were delighted to be part of this conference and to share our expertise with the participants.
As lawyers specialising in start-ups, venture capital and technology matters, we hosted a walk-in law clinic during which we offered legal advice and consultation to start-ups and entrepreneurs. This year, participants were particularly interested in the following topics:
In addition, we hosted a lunch table at the Hoadlhaus at an altitude of 2,340 m above sea level. We invited participants to join for a delicious meal while having a conversation on legal matters and best practices. We enjoyed the lively and informal exchange with the attendees and took some knowledge back to Vienna.
Finally, we also had the honour of attending the speakers' dinner at the Hoadlhaus, where we met and networked with some of the inspiring speakers and organisers of the event. We all appreciated the opportunity to learn from each other's perspectives on the current and future trends in the start-up and innovation scene.
We would like to thank the Skinnovation team and the participants for making this event a success and a memorable experience. We hope to see you again at the next edition of Skinnovation in 2025!
Venture capital: Taxation of ESOP shares and options: One step forward, two steps ... sideways?
The Czech start-up/VC community has long been calling for a change in how employee shares/options and related plans (ESOPs) are taxed, citing various other European Union jurisdictions as being significantly more progressive and therefore conducive to the development of a start-up ecosystem.