AI and data protection in FinTech: implications, challenges and concerns from a GDPR perspective
Artificial Intelligence has been crucial for FinTech companies to grow and gain competitive advantages. However, it also carries implications and risks for data protection regulations, especially the General Data Protection Regulation (GDPR). In this post, we analyze how AI and data protection regulations coexist and work together, particularly from the GDPR perspective.
Artificial Intelligence has changed how everything works. From manufacturing to FinTech, AI has proven to help companies gain advantages over competitors, as well as to innovate and improve customer service and experience. But AI is not only useful for improving customer support: it also helps with critical tasks such as fraud detection and decision-making, leading to better results overall.
For many reasons, AI has become a fundamental pillar of new corporate strategies, and FinTech companies all over the world are well aware of this, as they are implementing new AI-based technologies. For example, they can offer personalized services to customers, helping them make decisions or track their financial transactions.
However, we need to understand that AI and data protection coexist. AI usually involves processing personal data, so it is crucial for FinTech companies to ensure adequate protection against the associated risks, and to preserve and protect the rights to privacy and data protection in the process.
We are going to explain the implications and concerns around using AI in FinTech companies, and how the GDPR directly affects everything related to it.
AI and FinTech: implications and concerns
As we said, FinTech companies have been using AI to achieve their goals and objectives, driving very rapid development of the industry. Artificial Intelligence has proven genuinely useful for many tasks in financial companies, leading to new approaches and solutions. But to achieve this, FinTech companies and their AI systems need to process data, and this is where the relationship between AI and data protection becomes crucial.
For example, AI-based technologies in FinTech can be very useful for fraud detection and prevention. Fraud has become a huge problem for FinTech companies to overcome. By using AI, they can detect potentially fraudulent activity and react to it, avoiding not only financial but also reputational losses.
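To make the fraud-detection idea concrete, here is a minimal sketch of one of the simplest statistical approaches: flagging transactions whose amount deviates sharply from an account's historical pattern. The function name, threshold and sample data are illustrative assumptions, not a production method; real systems combine many such signals with far richer models.

```python
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=3.0):
    """Return indices of transactions whose amount deviates more than
    `threshold` standard deviations from the account's mean amount.
    (Illustrative z-score heuristic, not a production fraud model.)"""
    mu = mean(amounts)
    sigma = stdev(amounts)
    if sigma == 0:
        return []  # no variation in history, nothing stands out
    return [i for i, a in enumerate(amounts)
            if abs(a - mu) / sigma > threshold]

# Hypothetical transaction history in euros: the last payment is unusual.
history = [42.0, 51.0, 39.0, 47.0, 44.0, 980.0]
print(flag_anomalies(history, threshold=2.0))  # → [5]
```

Even a toy example like this processes transaction data tied to an identifiable customer, which is exactly why such systems fall under the GDPR.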
AI applications process personal data in many ways. Automated decision-making and predictions (for example, in the fraud scenarios described above) can be very useful and precise, but in other cases they can result in mistaken or discriminatory decisions. Artificial Intelligence offers great opportunities and benefits for most companies, but it can also cause really serious problems, such as unfairness and discrimination.
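One common safeguard against the risk of mistaken automated decisions is to keep a human in the loop: only let the model act on its own when it is highly confident, and escalate borderline cases to a reviewer. The sketch below is a hypothetical illustration of that routing logic (the function name and thresholds are assumptions), in the spirit of the GDPR's limits on solely automated decisions.

```python
def route_decision(score: float, auto_threshold: float = 0.9):
    """Route a model's approval score for, e.g., a credit application.
    Act automatically only at high confidence; otherwise escalate to
    human review so the decision is not solely automated.
    (Illustrative sketch; thresholds are arbitrary assumptions.)"""
    if score >= auto_threshold:
        return ("approve", "automated")
    if score <= 1 - auto_threshold:
        return ("decline", "automated")
    return ("pending", "human_review")

print(route_decision(0.95))  # → ('approve', 'automated')
print(route_decision(0.50))  # → ('pending', 'human_review')
```

The design point is that the escalation path, not the model itself, is what gives data subjects meaningful human involvement.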
For these reasons and many more, it is really important for FinTech companies (and for companies that use AI in general) to understand the implications and risks of using Artificial Intelligence, and to develop a deep understanding of, and strategy for, the GDPR's implications. So, what is the GDPR's perspective on the use of Artificial Intelligence, and what is the relationship between AI and data protection according to European regulations?
Data protection regulations approaches: GDPR and AI Act
Concerns about Artificial Intelligence have been a constant from a European Regulation perspective.
In 2018, the European Commission presented a series of measures regarding Artificial Intelligence in the “Communication on Artificial Intelligence for Europe”, proposing different approaches to invest in AI and ensure an appropriate ethical and legal framework.
In 2020, the European Parliament published “The Impact of the General Data Protection Regulation on Artificial Intelligence”, analyzing the implications of the General Data Protection Regulation for Artificial Intelligence.
On top of that, on 21 April 2021, the European Commission presented the Artificial Intelligence Act, which sets horizontal rules for the development and use of AI-driven products and declares that AI must be legally, ethically and technically robust, while respecting democratic values and human rights.
As we can see, AI and data protection are closely related, and their relationship has been studied and analyzed by EU regulators.
From a GDPR perspective, we have to note that the General Data Protection Regulation does not contain the term “Artificial Intelligence”. But that does not mean AI can be used without any legal concern: many GDPR provisions apply directly to AI, and some of them are really challenging for the new ways of processing data that AI enables.
As the European Parliament notes, there is a “tension” between AI and the traditional data protection principles: purpose limitation, data minimisation, the special treatment of sensitive data and, especially, the limitation on automated decisions. Even so, AI can be reconciled with the GDPR without major changes to the regulation, and we can conclude that any technology designed to process personal data must be in line with the GDPR. Regarding principles, it is important to consider the legal basis for using Artificial Intelligence: Article 6 GDPR requires a legal basis for all processing of personal data, which can be very problematic in AI contexts.
Artificial Intelligence can also be problematic when it comes to special categories of data and transparency, but among the most important activities regarding AI and data protection in FinTech are profiling and automated decision-making.
FinTech companies manage and process a huge amount of sensitive data, for example payment data. It is therefore very important for them to comply with the GDPR, which requires data to be processed lawfully and fairly, and with Article 9 GDPR on special categories of data.
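A common technical measure for reducing risk when analytics run over such data is pseudonymization, which the GDPR explicitly encourages as a safeguard. Below is a minimal sketch using a keyed hash to replace a direct identifier before analysis; the key name and identifier format are assumptions for illustration. Note that pseudonymized data is still personal data under the GDPR, since re-identification remains possible with the key.

```python
import hashlib
import hmac

# Assumption: in practice this key would live in a secrets manager,
# separate from the pseudonymized dataset, and be rotated.
SECRET_KEY = b"example-key-stored-in-a-vault"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier (e.g. an account number) with a
    keyed HMAC-SHA256 digest. Using a keyed hash rather than a plain
    hash prevents trivial re-identification by dictionary attack."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# The mapping is deterministic, so analytics can still join records
# belonging to the same customer without seeing the raw identifier.
token = pseudonymize("DE00123456789")
```

This kind of measure supports, but does not replace, the lawfulness and fairness requirements of Articles 6 and 9.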
AI has also been very relevant in creating new opportunities for profiling, and FinTech companies need to be aware of how profiling is regulated in the GDPR. AI increases the potential for profiling, and even though it can have benefits, as noted above, it can also lead to the serious risks addressed by Article 22 GDPR.

In conclusion, AI is a disruptive asset for FinTech companies, and it is critically important for them to recognize AI and data protection as a necessary relationship. FinTech companies that recognize the importance of the GDPR and privacy regulations when using AI are the ones that will develop better and more respectful strategies regarding data subjects, putting them in a better position to outperform competitors.