Sweeping new EU AI Act far from a cure-all as risk profiles change and use of technology evolves
Just 7pc of Irish businesses currently have AI and/or GenAI governance structures in place, yet an overwhelming majority (91pc) believe that GenAI will increase cybersecurity risks in the year ahead. This is according to PwC’s latest GenAI Business Leaders survey.
The EU AI Act, a sweeping new regulation, aims to change all of this to ensure businesses have appropriate governance and controls over AI to deliver safe and secure outcomes.
Indeed, a large majority (84pc) of participants in the same survey welcomed the introduction of the Act, saying regulation is necessary to prevent the potential negative impact of AI. But there are challenges with the new Act.
The Act aims to protect EU businesses, consumers and citizens from the risks of AI in terms of health, safety, fundamental rights, democracy, rule of law and the environment.
By introducing standards and providing legal certainty, it also seeks to foster innovation, growth and competitiveness in the EU’s internal market.
The regulation is the EU’s first comprehensive legal framework for artificial intelligence (AI).
AI can bring great opportunities to businesses, but it needs to be used safely and securely. The Act will level the playing field for businesses using AI.
The legislation adopts a risk-based approach with the biggest compliance requirements in the Act applying to ‘High Risk’ AI systems.
These requirements include addressing data governance concerns, mitigating bias, ensuring transparency and implementing a system of quality management.
The Act also requires that users be informed when interacting with chatbots, and any AI-generated content must be clearly identifiable.
The Act covers many key considerations; however, it is silent in some areas.
For example, it does not address broader risks such as the reputational damage caused by uses of AI that are legal but contrary to the values and expectations of an organisation's internal and external stakeholders. It remains to be seen how this plays out.
Specific risks include a drag on innovation.
There are a number of specific risks which are particular to the new EU AI Act, including failure to identify all uses of AI across the business as well as potentially inaccurate risk classification of AI uses.
The new Act obliges organisations to assess all of their AI use cases.
This may prove to be an onerous and time-consuming task for many companies given the dispersed nature of the use of AI.
The risk of misclassification is high, as risk classifications may change over time as an organisation’s use of AI evolves. This necessitates the implementation of appropriate ongoing governance and control procedures to maintain compliance over time, bringing its own challenges.
There is also a risk that a focus purely on compliance may lead to a drag on innovation. We have seen this occur in response to the introduction of GDPR and, in the absence of an appropriate approach, it may be a trap which organisations fall into again.
Language and risk classification used in the Act may be problematic.
The nuanced nature of some of the language used in the Act, coupled with risk classifications and role designations being subject to change, may prove problematic for some organisations. The use of AI systems by third parties acting on behalf of organisations may also add a degree of complexity.
Irish businesses have much to consider to ensure they are compliant with the new EU AI Act. It will bring competitive opportunities, but ensuring that the use of AI is safe and secure will be a complex process.
To secure the opportunities from AI, businesses need a clear strategy to ensure return on investment in a safe and secure way. Trustworthy algorithms with the right data to train them will be needed.
Going forward, businesses will need a better understanding of AI capabilities, stronger governance and a greater focus on measuring and achieving a strong return on investment.
Reporting on: independent.ie