Seclea – Building Trust in AI
Lead Participant:
SECLEA
Abstract
Artificial Intelligence has the potential to improve our lives with rapid, personalised and assistive services. At the same time, it presents risks of negative impacts on both society and individual citizens. Recent debacles have demonstrated this impact: the Apple Card was labelled a 'sexist card', and Forbes reported that 'AI bias could put women's lives at risk'. Racial bias has also been exhibited by AI applications, such as bail-recommendation systems in the USA that recommended release for white defendants more often than for black defendants. Such examples are just the tip of the iceberg.
There is a growing uneasiness around AI in the mind of the general public, but at the same time, AI is considered a global phenomenon that will transform our lives. The fourth industrial revolution is of such scale, speed and complexity that it requires the integration of autonomous agents powered by AI algorithms. Therefore, we want all the benefits of AI while limiting the risks associated with it -- especially the risks related to its black-box nature. Furthermore, we need mechanisms to build public trust and confidence in autonomous decision-making processes.
This can be achieved by making AI transparent, explainable, auditable and accountable. This will demystify AI applications and increase trust and confidence in AI for both the general public and businesses. Seclea is an innovative platform that achieves this goal. The Seclea Explainability Platform (SEP) is a cloud-based service that integrates with and supports an AI application through its entire lifecycle, ensuring that it is transparent, fair and accountable. During this project, a commercially viable and reliable SEP will be developed and released, so organisations can ensure their AI applications are fair and safeguard their investment in AI against future regulatory changes.
Furthermore, Seclea supports the AI Grand Challenge identified in the UK Industrial Strategy. The core business of Seclea is in line with the UK Industrial Strategy's goal of making the UK the global leader in the 'safe and ethical use of data and Artificial Intelligence giving confidence and clarity to citizens and businesses.'
| Lead Participant | Project Cost | Grant Offer |
|---|---|---|
| SECLEA | £299,371 | £209,560 |
| People | ORCID iD |
|---|---|
| Roger Milroy (Project Manager) | |