Slovakia

The eKasa case

Link to the original judgment (in Slovak): Constitutional Court of the Slovak Republic, PL. ÚS 25/2019-117

Summary:

The eKasa case concerns proceedings initiated by a group of 33 deputies of the Slovak Parliament, who opposed certain features of the electronic cash register (eKasa) system mandated by recently introduced tax transparency rules.

The claimants argued that the eKasa system transmits data in real time to a central ‘clearing house’ and to AI risk-scoring systems, an aspect not provided for in the law mandating the eKasa system. Accordingly, the claimants held that the eKasa system is contrary to Art. 8 ECHR (right to private life), as well as Art. 7 of the EU Charter (right to private life), Art. 8 of the EU Charter (right to data protection) and Art. 52(1) of the EU Charter (principle of legality), and the corresponding rights in the Slovak Constitution.

Moreover, the claimants contended that the eKasa system obliges entrepreneurs to remit considerable amounts of data on both sellers and buyers, some of which appears irrelevant for tax compliance purposes – hence infringing the principle of data minimization.

In its judgment of 17 December 2021, the Court sided with the claimants and ruled that the legislature, when integrating machine-learning algorithms, must ensure that such a system has a formal legal basis with specific safeguards, in particular: (i) transparency, (ii) individual protection, and (iii) collective supervision.

The ruling in eKasa reinforces the conclusions of the SyRI case, in which the District Court of The Hague held that risk-scoring algorithms must be regulated by ad hoc norms meeting a high threshold of quality, the so-called ‘transparency in the interest of verifiability’.

Key takeaways:

The eKasa case summarizes the major grievances of taxpayers and tax professionals surrounding the use of AI systems by EU tax administrations, namely:

  1. The lack of transparency over the AI systems and their integration into tax compliance processes
  2. The proportionality or disproportionality of the data collected by the tax administration
  3. The lack of access to the output of fully or partially automated decision-making systems

The first issue in the case is that, while the eKasa system is regulated by law, the machine-learning algorithm which processes that data further down the chain is not provided for in the law, but was implemented ex officio by the Slovak tax administration. As shown in our country reports, Slovakia is far from being the only Member State in this situation. Many EU Member States either have no specific legislative norms regulating the use of their machine-learning risk-scoring algorithms, or have laws that are neither sufficiently precise nor contain sufficient safeguards to protect taxpayers against misuse of their data or abuse of power. In fact, while every EU Member State has integrated AI systems to perform fiscal prerogatives, fewer than a quarter have ad hoc norms regulating their use.

The Court sided with the claimants, highlighting that risk-scoring algorithms bear particular risks for taxpayers, which should be mitigated or negated through specific safeguards. The Court found that risk-scoring via machine learning should be accompanied by ex-ante and ex-post measures, in particular: (i) transparency, (ii) individual protection, and (iii) collective supervision (§132).

The second issue is that the eKasa system mandates the transfer of enormous amounts of data. As stated by the claimants, such data may, upon aggregation, reveal far-reaching insights into an individual’s private life. A specificity of the eKasa electronic cash register system is that it mandated the issuance of a unique identifier not only for the seller, but also for the buyer (cf. Law of 18 June 2008, Art. 2(q)). In other words, the tax administration could have access to the entire purchase history of a customer. The claimants entertained serious doubts about the relevance of such information in the fight against VAT fraud, and outlined how purchase histories may serve as proxies for legally protected characteristics, such as religion or ethnic origin. The Court sided with the claimants, ruling that the issuance of a unique identifier for the buyer was not compatible with the right to data protection. The Court echoed the claimants’ concerns regarding the risks of bias and discrimination when processing data on individual consumption habits.

Third, the Court expanded on the notion of ‘automated decision-making’ in Article 22 of the GDPR, which prescribes that data subjects have the right not to be subject to a decision ‘based solely on automated processing’. The term ‘solely’ sets an extraordinarily high threshold to meet, particularly in tax compliance. Tax administrations are among the largest organisations in every jurisdiction, so any administrative decision will be multi-factorial and will never be the sole product of one output, whether human or machine. The Court implicitly acknowledged that shortcoming of Art. 22 GDPR and extended its scope to include situations where the output of the machine-learning algorithm serves as a crucial input for the subsequent decision. Otherwise, Art. 22 would be completely moot in the context of administrative decision-making and would not enable taxpayers to meaningfully verify the integrity of the decision-making process. The Court summarized that grievance as follows: ‘the use of technology by public administrations cannot result in a State where decisions are inexplicable, unexplained and at the same time no one is responsible for them’ (cf. §127).
