In recent years, the convergence of artificial intelligence (“AI”) and blockchain technology has sparked a new wave of innovation. This article explores the intersection of these technologies, the accompanying hype, and the potential legal and regulatory issues that may arise.
1. Understanding AI and blockchain
It is important first to explain what we mean by AI and blockchain.
Artificial Intelligence
Defining AI, certainly in the context of its regulation, has proved a controversial task. The initial definition in the EU’s AI Act was, for example, heavily criticised for being too wide. The final definition in the recently passed Act defines an “AI system” as:
“a machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments”.1
More generally, AI has been described as a range of technologies – for example, the UK’s data regulator, the Information Commissioner’s Office (or “ICO”) describes it as “an umbrella term for a range of algorithm-based technologies that solve complex tasks by carrying out functions that previously required human thinking.”2
Blockchain
Blockchain is a subset of distributed ledger technology (“DLT”), where a distributed ledger exists across a network of computers or “nodes” rather than being stored in one central location. A distributed ledger may be shared between a private network of participants and only editable by a select group (permissioned) or open to, and editable by, the public (permissionless), or exist somewhere between these two poles. Distributed ledgers vary both in their degree of centralisation and the means by which information is recorded or “validated” on the ledger.3 Distributed ledgers can also host smart contracts,4 whose encoded obligations are performed automatically by all nodes,5 which is why these ledgers are sometimes referred to as “programmable”.
Blockchain is a particular type of distributed ledger which operates as a series of blocks of data linked together by a cryptographic hash. Because each block in the chain includes a hash of the previous block, it forms a continuous, unbroken chain of information which is (almost) impossible to change. Transactions on public blockchains such as Ethereum and Bitcoin are often said to be “pseudonymous”, linked to unique identifiers rather than individual identities.
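The chaining mechanism described above can be sketched in a few lines of Python. This is a simplified illustration for this article, not the data format of any real blockchain; the function names and block structure are our own:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents."""
    serialised = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(serialised).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    """A block records some data plus the hash of its predecessor."""
    return {"data": data, "prev_hash": prev_hash}

# A tiny three-block chain.
genesis = make_block("genesis", "0" * 64)
block1 = make_block("tx: A pays B", block_hash(genesis))
block2 = make_block("tx: B pays C", block_hash(block1))

# Editing an earlier block changes its hash, so every later block's
# stored prev_hash no longer matches - the tampering becomes evident.
genesis["data"] = "tampered"
assert block1["prev_hash"] != block_hash(genesis)
```

Because each stored `prev_hash` commits to the entire history before it, rewriting any one block requires rewriting every subsequent block, which is what makes the record (almost) impossible to change undetected.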
The advantages presented by a given blockchain will be determined by its features and purpose, such as how it is permissioned, whether it is public or private, and how decisions are made. In general terms, however, proponents of blockchain cite its core benefits as including immutability (the record cannot easily be tampered with); transparency, and the ability to trace data through the chain; operational efficiency, as there is no need to reconcile multiple ledgers; security, as there is no single point of failure; the ability to solve for trust where there is no central authority (it’s a “confidence machine”);6 and the ability to disperse control and facilitate community governance via autonomous smart contracts and tokens.7 It is worth observing that some of these proposed advantages are contested.
2. Possible Use Cases
Blockchain has sometimes been accused of being a solution in search of a problem, and its recent partnering with AI should be viewed against this backdrop – particularly as AI is in its own hype cycle. There are, however, a number of (very different) projects fusing blockchain and AI which merit attention, and which make use of the full spectrum of blockchain’s properties. We list some of these varied projects here, before raising potential legal and regulatory speedbumps.
- Supply chains and data analytics – blockchain technology is already presented as a means of helping organisations with supply chain management, increasing transparency and traceability to counter issues such as fraud, counterfeiting, and inaccuracy in record keeping. AI is enhancing this offering by performing large scale analytics of data to identify trends and improve processes.8
- Creation of marketplaces for hardware capacity – blockchain is being used to create marketplaces where individuals and organisations can rent hardware capacity for developing and running AI, operating like a decentralised Amazon Web Services.9 The claimed benefits of hosting this type of system on the blockchain include publicly verifiable sales, and fast and cheap international peer-to-peer transactions.
- Enhancing blockchain analytics – companies such as Chainalysis have long been mapping on-chain pseudonymous activity to real-world actors for the purposes of legal and regulatory enforcement, market insight and accurate tax reporting. These forensic techniques, which sift through vast amounts of data, are enhanced by the use of AI.10
- Tackling deepfakes – as the proliferation of AI-generated fake content (deepfakes) threatens individuals and societies, projects making use of blockchain’s immutability and traceability have been proposed as a means of authenticating digital media.11
- Decentralising the development of AI – some critics of blockchain have observed that it solves for a social problem, rather than a technological one, its origins grounded in the perceived need to bypass financial institutions.12 With the blossoming of AI, blockchain has been called on again to solve a (potentially) social problem. It has been mooted as a means of decentralising the development of the AI itself by, among other things, powering a decentralised system of governance and service exchange through tokens and smart contracts.13 In this way, alongside the use of open source, blockchain is presented as a means of avoiding concentration amongst a limited number of tech companies in this field.14 This use case is more speculative than the others we have described.
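To make the deepfake-authentication idea above concrete: in simplified form, a publisher hashes the original media at the point of publication and records that fingerprint on an immutable ledger; anyone can later recompute the hash and check for a match. A minimal Python sketch, in which a plain dictionary stands in for the blockchain and all names and values are illustrative:

```python
import hashlib
from typing import Optional

# A plain dictionary stands in here for an append-only blockchain ledger.
ledger = {}

def register(media: bytes, publisher: str) -> str:
    """Record the fingerprint of a piece of media at publication time."""
    digest = hashlib.sha256(media).hexdigest()
    ledger[digest] = publisher
    return digest

def verify(media: bytes) -> Optional[str]:
    """Return the registered publisher if this exact media is on the ledger."""
    return ledger.get(hashlib.sha256(media).hexdigest())

original = b"official press video bytes"
register(original, "newsroom@example.org")

assert verify(original) == "newsroom@example.org"
assert verify(b"subtly altered deepfake bytes") is None  # fails verification
```

Any alteration to the media, however small, produces a different hash, so only the exact published original verifies against the ledger entry.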
3. Legal and Regulatory Challenges
The regulatory challenges for both AI and blockchain are numerous and well documented. When considering any use cases which combine the two, it is therefore important to consider whether this combination creates unique issues in the context of the integration of AI and blockchain. While numerous areas of law and regulation may raise novel challenges at this intersection, including intellectual property rights and financial regulation, two areas in particular stand out: data protection, and the EU AI Act.
Data Protection15
AI projects may heighten the tension between the immutability of the blockchain and the rights and protections given to data subjects under the GDPR, including, for example, the right to be forgotten or to have personal data erased.16 Projects which aim to make blockchains more transparent through enhanced blockchain analytics are a good example of this tension. While such projects may frame their aims as balancing transparency and accountability on the blockchain with the right to privacy, they could expose personal data by revealing, for example, the identity of the individual behind a wallet address, raising broader data privacy concerns. With regard to this example, EU case law17 has found that IP addresses can constitute personal data where such data could be used to identify an individual – and “online identifiers” are explicitly included within the definition of personal data in the GDPR.18 As such, blockchain wallet addresses or cryptographic private keys are also likely to amount to personal data where they enable individuals to be identified.19
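To illustrate why wallet addresses are pseudonymous rather than anonymous: an address is typically derived deterministically from a public key, so the same identifier recurs across all of a user's transactions. The sketch below is deliberately simplified – real networks use different hash constructions and encodings (Ethereum, for instance, uses Keccak-256 and keeps the last 20 bytes), and `toy_address` is our own illustrative function:

```python
import hashlib

def toy_address(public_key: bytes) -> str:
    """Derive a pseudonymous identifier from a public key (simplified)."""
    return hashlib.sha256(public_key).hexdigest()[:40]

# The identifier is stable: every transaction attributed to this key
# links back to the same address, so de-anonymising the key's owner
# once exposes their entire on-chain history.
alice_key = b"alice-public-key-bytes"
assert toy_address(alice_key) == toy_address(alice_key)
assert toy_address(alice_key) != toy_address(b"someone-else")
```

It is this stable linkage, rather than any stored name, that can bring an address within the scope of an “online identifier” once it becomes linkable to a person.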
It is clear from the above that AI can amplify the data privacy issues currently associated with blockchain. It is also worth noting that the substantial fines for breaches of the GDPR20 pose a real risk for any organisation using AI for this purpose. While data protection regulators’ priorities and enforcement approaches undoubtedly vary across Europe, a number are focusing on the data privacy risks posed by innovative uses of data and AI. For example, the UK’s ICO21 has included AI as one of its three priority areas for the next year and has already taken action against Snap and Clearview AI in this space.
Ultimately, issues such as these raise the key question of whether incompatibility with existing legislation is inherent in blockchain technology (and exacerbated by integration with AI), given that the regulation may have been drafted with more centralised, controllable systems in mind. Issues may also arise, for example, in identifying the data controller where project governance is spread between a large number of persons or organisations, or in determining whether, in such circumstances, there are a large number of joint controllers.22 It can therefore be argued either way: that the technology needs to change as it evolves in order to achieve regulatory compliance, or that regulation needs to catch up with technological advancement rather than seek to stifle it.
EU AI Act23
Where AI is integrated into a blockchain project it is important to consider whether legislation such as the EU’s AI Act (“the Act”) applies. This is particularly important given that the penalties for non-compliance can be substantial – the higher of €15m or 3% of worldwide annual turnover for many violations of the Act, with even higher fines where AI is being used for a prohibited practice.24
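The “higher of” mechanic means the percentage cap only exceeds the fixed sum above a certain turnover (€500m at the 3% rate). A quick illustration in Python, covering only the tier described above for many violations (the higher tier for prohibited practices is not modelled, and the function name is ours):

```python
def aia_fine_cap(turnover_eur: float) -> float:
    """Maximum fine for many violations of the EU AI Act:
    the higher of EUR 15 million or 3% of worldwide annual turnover."""
    return max(15_000_000, turnover_eur * 3 / 100)

assert aia_fine_cap(100_000_000) == 15_000_000    # 3% = EUR 3m, floor applies
assert aia_fine_cap(2_000_000_000) == 60_000_000  # 3% = EUR 60m exceeds floor
```

For any organisation with turnover below €500m, the fixed €15m figure is therefore the operative cap.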
The Act has a wide scope. As described above, the definition of an AI System under the Act is broad, meaning many of the AI use cases mentioned above could technically fall within it. It also has wide extra-territorial reach, covering anyone who puts an AI System onto the market or into service in the EU, or anyone providing or using AI Systems where their output is used in the EU. The decentralised nature of many projects integrating AI and blockchain can make it difficult to limit the scope of their availability to certain jurisdictions – and to control a project in this way would, for some, conflict with the underlying philosophy of blockchain technology. As a result, many such projects could come into the scope of the Act.
However, the specific obligations which could apply to those involved in an AI project of this kind will differ depending on the risk category of the AI in question and the role they play in the AI supply chain. Not all AI use will trigger extensive obligations.
The risk categories in the Act tend to be split into four different types:
- Prohibited AI – which have an unacceptable level of risk and are therefore banned (e.g. social scoring).
- High risk AI – which are heavily regulated, triggering requirements in areas such as risk mitigation, data governance, human oversight and conformity testing. There are two types of high risk AI listed in the Act. The first covers products/safety components covered by the EU’s product safety legislation (e.g. toys, lifts, medical devices etc.) and the second covers designated high risk use cases/areas (e.g. credit scoring, CV filtering, use in migration).
- Limited/transparency risk AI – which effectively introduce additional transparency obligations when certain AI is used (e.g. it must be clear an individual is interacting with AI when using a chatbot, or that deepfakes are AI generated).
- Minimal risk AI – which is not expressly referenced in the Act but refers to AI Systems which do not fall into any of the above risk categories. These do not pose risks which require specific new obligations under the Act (e.g. spam filters).
There are also separate obligations for providers of general purpose AI (“GPAI”) models.
Those involved in AI and blockchain projects will therefore need to consider whether their AI use case triggers any high risk, transparency or GPAI obligations (and, if so, how in practice they will fulfil them), or involves a prohibited practice – many may not.
They will also need to understand their role. The Act covers a wide spectrum of actors in the AI supply chain, from those providing, distributing and importing AI Systems to those deploying (i.e. using) it. Specific obligations differ depending on role, with the majority of obligations falling on providers and (to a slightly lesser degree) deployers. However, there may be instances where the distributed nature of blockchain projects makes it hard to determine who assumes what role and who will take the necessary actions to comply.
In terms of timing, while the Act is now law, many of its provisions do not apply until August 2026, giving current and future projects time to prepare.
4. Conclusion
While these legal and regulatory challenges have the potential to be significant, perhaps the biggest hurdle that the integration of AI and blockchain faces is over-saturation of blockchain projects that use the AI “buzzword” to grab attention (and investment). Ultimately, these two technologies both come with a lot of hype. Those investing in an AI blockchain project, or making use of this technology (given that the EU AI Act includes obligations for users of AI), need to be able to cut through the noise.
Comprehensive due diligence on any potential product or service will be crucial, and signs that AI is being used as “window dressing” include:
- the project has existed for a long time and only recently pivoted to being an AI project without a clear justification;
- the whitepaper (a document summarising how a project works, its goals, and the issue it is aiming to resolve) does not give a detailed or clear enough description as to what the project gains from the integration with AI and instead describes it in general or convoluted terms; and
- the team behind the project (if public) does not include any individual with AI expertise.
We will be following these use cases closely, as both technologies – and the legal and regulatory ecosystems around them – mature.
Footnotes:
[1] Article 3(1) of Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act).
[2] ICO, The Alan Turing Institute, Explaining decisions made with AI: Guidance – Explaining decisions made with AI | ICO.
[3] See: The Lens, Merging Crypto and ESG (September 2022) – Merging crypto and ESG, Emily Bradley.
[4] The Law Commission has defined smart contracts as “computer code that, upon the occurrence of a specified condition or conditions, executes on a DLT system automatically and deterministically according to pre-specified functions.” See Law Commission, Decentralised autonomous organisations (DAOs): A scoping paper (July 2024), p.xi – Decentralised autonomous organisations (DAOs): A scoping paper.
[5] Ibid, p.21.
[6] Primavera De Filippi, Morshed Mannan, Wessel Reijers, Blockchain as a confidence machine: The problem of trust & challenges of governance, Technology in Society 62 (2020) – Blockchain as a confidence machine: The problem of trust & challenges of governance – ScienceDirect.
[7] For more on this last point, see the Law Commission scoping paper on DAOs, fn.4.
[8] See, for example, Unilever, SAP, Unilever pilot blockchain technology supporting deforestation-free palm oil (March 2022) – SAP, Unilever pilot blockchain technology supporting deforestation-free palm oil | Unilever.
[9] An example of this is a project called “Render” which facilitates the renting out of GPU processing power to and by users of the platform, which is hosted on blockchain – Render Network.
[10] See, for example, Arkham Intelligence – Arkham.
[11] Harvard Business Review, How Blockchain Can Help Combat Disinformation (19 July 2021) – How Blockchain Can Help Combat Disinformation.
[12] Satoshi Nakamoto, Bitcoin: A Peer-to-Peer Electronic Cash System (2008) – bitcoin.pdf.
[13] AI Business, The Future of Web3 and Decentralised AI (20 August 2024) – The Future of Web3 and Decentralized AI.
[14] See, for example, the “Vision Paper” for the project “Artificial Superintelligence Alliance” – ASI Alliance Vision Paper.
[15] For a more in-depth analysis of the interaction between the GDPR and blockchain, please see this article from our data privacy team – Blockchain and the GDPR: reconcilable differences?.
[16] E.g. Article 17(1) GDPR.
[17] Patrick Breyer v Bundesrepublik Deutschland.
[18] See Article 4(1) and Recital 30 GDPR with regards to IP addresses being considered “online identifiers”.
[19] This is further supported by the fact that ICO guidance refers to social media handles that uniquely identify an individual as being “online identifiers” – What are identifiers and related factors? | ICO.
[20] The higher of €20m or 4% of the annual worldwide turnover.
[21] As announced in John Edwards speech at the ICO’s October 2024 Data Protection Practitioners’ Conference.
[22] For more information, please see the ICO’s consultation response to the Treasury Select Committee inquiry into the crypto-asset industry – treasury-select-committee-inquiry-into-cyrpto-assets-ico-response.pdf.
[23] For a more detailed overview of the provisions of the EU AI Act, including an analysis of the key provisions, please see a recent article from our digital and technology team – EU AI Act to Enter into Force.
[24] Or the smaller of the two in the case of certain SMEs and start-ups – which may apply to some smaller projects in the space.