Enabling a Responsible AI Ecosystem
Lead Research Organisation:
University of Edinburgh
Department Name: School of Philosophy, Psychology & Language Sciences
Abstract
Problem Space: There is now a broad base of research in AI ethics, policy and law that can inform and guide efforts to construct a Responsible AI (R-AI) ecosystem, but three gaps must be bridged before this is achieved:
(a) Traditional disciplinary silos, boundaries, and incentives limiting successful generation and translation of R-AI knowledge must be addressed by new incentives and supportive infrastructure.
(b) The high barriers to adoption of existing research and creative work in R-AI must be removed so that public institutions, non-profits and industry (especially SMEs) can convert and embed R-AI into reliable, accessible and scalable practices and methods that can be broadly adopted.
(c) The primary actors empowered to research, articulate and adopt responsible AI standards must be more broadly representative of, and answerable to, the publics and communities most impacted by AI developments.
Approach: We will develop and deliver a UK-wide infrastructure that lays secure foundations for the bridges across these three gaps so that AI in the UK is responsible, ethical and accountable by default. The programme will be structured around three core pillars of work (Translation and Co-Construction; Embedding and Adoption; and Answerability and Accountability) and four strategic delivery themes that establish programme coherence: (1) AI for Humane Innovation: integrating within AI research the humanistic perspectives that enable the personal, cultural, and political flourishing of human beings, by weaving historical, philosophical, literary and other humane arts into dialogue with AI communities of research, policy and practice; (2) AI for Inspired Innovation: activities to infuse the AI ecosystem with more vibrant, imaginative and creative visions of R-AI futures; (3) AI for Equitable Innovation: activities directing research and policy attention to the need to ensure that broader UK publics, particularly those marginalised within the digital economy, can expect more sustainable and equitable futures from AI development; and (4) AI for Resilient Innovation: uplifting research, policy and practice that ensures AI ameliorates growing threats to global and national security, the rule of law, liberty, and social cohesion.
Team: Co-Directors Vallor and Luger will lead and deliver the programme alongside the Ada Lovelace Institute, with a cross-disciplinary team of Co-Is who will leverage their networks to broaden/diversify the R-AI community and enhance disciplinary engagement with R-AI, acting as translational interfaces to ensure AHRC programme communications speak effectively to underrepresented cohorts within their communities/disciplines. As partner, the BBC will support public engagement activities to ensure trust, breadth of reach and public legitimacy.
Activities: Through a comprehensive programme of translation, research and engagement activities we will:
(1) Support existing and foster new R-AI partnerships, connecting AI researchers, industry, policymakers and publics around the cross-cutting themes.
(2) Build broader responsible AI visions by developing infrastructure for translation of R-AI research, inviting ECRs and new voices from the arts, humanities and civil society, to co-shape, interrogate and enrich visions of flourishing with AI.
(3) Learning from early R-AI work, surface and map barriers, incentives and opportunities to make R-AI research responsive to the needs and challenges faced by policymakers, regulators, technologists, and wider publics.
(4) Embed R-AI in policy and practice by conducting research and building capacity for translation of R-AI research into accessible and usable guidance for policymakers, industry leaders, SMEs and publics.
(5) Build trust in AI by rethinking accountability through three applied lenses: accountability to wider publics, answerability of current systems, and public mechanisms for recourse through consultation, creative mechanisms and synthesis activities.
Organisations
- University of Edinburgh (Lead Research Organisation)
- Ada Lovelace Institute (Collaboration)
- FutureEverything (Collaboration)
- DACS (Collaboration)
- Microsoft Research (Collaboration)
- Runnymede Trust (Collaboration)
- Alan Turing Institute (Collaboration)
- National Galleries of Scotland (Collaboration)
- British Broadcasting Corporation (BBC) (Collaboration)
- Department for Digital, Culture, Media & Sport (Collaboration)
- King's College London (Collaboration)
- Ada Lovelace Institute (Project Partner)
- British Broadcasting Corporation - BBC (Project Partner)
Publications

Bird C
(2023)
Typology of Risks of Generative Text-to-Image Models

Catanzariti B
(2024)
Investigating the Impact of Facial Expression Recognition in Healthcare

Ganesh B
(2024)
Policy Approaches for Building a Responsible Ecosystem

Gregory K
(2024)
Mitigating Harms in On-Demand Delivery Platforms

Hatherall L
(2023)
Responsible Agency Through Answerability

Jones B
(2023)
Generative AI & journalism: A rapid risk-based review

Leslie D
(2024)
'Frontier AI,' Power, and the Public Interest: Who Benefits, Who Decides?
in Harvard Data Science Review
Description | Contribution to UK POST Note on Policy Implications of AI |
Geographic Reach | National |
Policy Influence Type | Contribution to a national consultation/review |
URL | https://post.parliament.uk/research-briefings/post-pn-0708/ |
Description | Evidence Submission to House of Lords AI in Weapons Systems Committee |
Geographic Reach | National |
Policy Influence Type | Implementation circular/rapid advice/letter to e.g. Ministry of Health |
URL | https://publications.parliament.uk/pa/ld5804/ldselect/ldaiwe/16/1602.htm |
Description | Hosted delegation from Parliament of Canada |
Geographic Reach | North America |
Policy Influence Type | Contribution to a national consultation/review |
Description | Input to European Commission SAM on 'Successful and timely uptake of Artificial Intelligence in science in the EU' |
Geographic Reach | Europe |
Policy Influence Type | Participation in a guidance/advisory committee |
Description | Invited Briefing to BBC Board |
Geographic Reach | Local/Municipal/Regional |
Policy Influence Type | Participation in a guidance/advisory committee |
Description | Oversight Board for Ada Lovelace Institute |
Geographic Reach | National |
Policy Influence Type | Participation in a guidance/advisory committee |
Impact | The Ada Lovelace Institute is among the most trusted independent voices in AI policy, with a strong programme of work in public education and engagement around AI and data, as well as considerable policy influence in the UK and abroad. |
URL | https://www.adalovelaceinstitute.org/about/our-people/ |
Description | Presentation to MP Ian Murray on Responsible AI Governance and AI Health Challenges |
Geographic Reach | National |
Policy Influence Type | Implementation circular/rapid advice/letter to e.g. Ministry of Health |
Impact | This response was received following the briefing: "A huge thanks for your excellent presentations and the stimulating discussion today. Ian was very impressed with the quality and relevance of the research we shared with him, and we're optimistic there will be plenty of further follow up, including with other members of Labour's Shadow Cabinet. We really appreciate you giving your time for this - it's so important for us to engage early on with what is likely to be the next UK government." |
Description | Testimony to U.S. Senate Committee on AI and Democracy |
Geographic Reach | North America |
Policy Influence Type | Implementation circular/rapid advice/letter to e.g. Ministry of Health |
URL | https://www.hsgac.senate.gov/hearings/the-philosophy-of-ai-learning-from-history-shaping-our-future/ |
Description | Ada Lovelace Institute |
Organisation | Ada Lovelace Institute |
Country | United Kingdom |
Sector | Charity/Non Profit |
PI Contribution | We have provided leadership on the BRAID community engagement and launch event, and on the scoping and fellowship calls which ALI helped us deliver; we lead on both DCMS programmes of work in which ALI researchers are key participants and beneficiaries; we have helped promote aligned ALI activities and events through our BRAID network, co-organised BRAID workshops and blog posts with ALI, worked to integrate and build upon existing and ongoing ALI research in our BRAID outputs, and coordinated with ALI on contributions to the RAI UK Strategy Group. |
Collaborator Contribution | ALI is an equal partner in the decision-making structure, including participation in all core programme research and management meetings and core ecosystem support. ALI contributes to the public engagement arm of the project, focusing on public attitudes and policy research. ALI has hosted BRAID meetings and co-organised BRAID workshops in their offices, and promotes BRAID activities. They also provide input on the BRAID scoping and fellowship calls, and have worked closely with us on the fellowships. They helped to organise, promote and deliver our community engagement/launch event on 15 September 2023 and contributed to the GAI Policy Summit in December 2023. They are a key partner also in the DCMS programmes of work, with their own policy fellows interacting with and supporting ours. |
Impact | BRAID community launch event on 15 Sept 2023, coordinated with ALI; workshop with DSTL on LLMs in Defence. DCMS1 supplement: a verbal account and short slide deck outlining key near-future policy challenges; a written report/literature review and a distilled policy briefing/note; verbal presentations in DCMS offices; completion of three papers exploring supply chain accountability in AI systems, methods for assessing risks of AI systems, and methods for government monitoring of AI developments. DCMS2 supplement (not yet delivered): two reports, one on comparative approaches to regulation and one surveying evaluations of foundation models. Main BRAID: to follow. |
Start Year | 2022 |
Description | Alan Turing Institute: AI and the Arts Working Group |
Organisation | Alan Turing Institute |
Country | United Kingdom |
Sector | Academic/University |
PI Contribution | Contributions made via Science Gallery London events https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-curatorial https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-artists |
Collaborator Contribution | Contributions made via Science Gallery London events https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-curatorial https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-artists |
Impact | Contributions made via Science Gallery London events https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-curatorial https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-artists |
Start Year | 2023 |
Description | BBC |
Organisation | British Broadcasting Corporation (BBC) |
Department | BBC Research & Development |
Country | United Kingdom |
Sector | Public |
PI Contribution | BRAID is led by the University of Edinburgh in partnership with the Ada Lovelace Institute and the BBC. BRAID covers 0.2FTE salary for Rhianne Jones, BBC Responsible Data Driven Innovation lead. The BRAID grant also covers the cost of a BBC Translational Fellow, Bronwyn Jones, hosted (employed) by UoE. Partners work in collaboration to further the goals of the BRAID programme. |
Collaborator Contribution | BRAID is led by the University of Edinburgh in partnership with the Ada Lovelace Institute and the BBC. The BBC is the impartial, independent and trusted public service broadcasting organisation in the UK and, with ALI, represents the public/media engagement arm of the project. Rhianne Jones, their Responsible Data Driven Innovation lead, commits 0.2 FTE to the project and works closely with Bronwyn Jones, the BRAID Translational Fellow. Through the BBC the BRAID project accesses networks, partnerships, and consortia (e.g., AI4ME, AI Media & Democracy Lab, Centre for Digital Citizens, TAS, Digital Horizon, Rights foundation Digital Futures Commission, Partnership on AI, Public Spaces) in order to connect with existing rights- and citizen-focused communities. They also support access to and use of channels for public engagement, for example the BBC Blue Room, which has a remit for public engagement around technology and organises public events, conferences, demonstrations and exhibitions, and other BBC media literacy and outreach initiatives. They also provide a second neutral platform for the project, alongside ALI, by offering BBC blogs and social media channels, and give the project team and ALI access to their buildings and facilities. The BBC hosted and co-organised BRAID's September 2023 community launch event in the BBC Radio Theatre at their Broadcasting House in London. They provided staffing support on the night and in the months prior to the launch, organising the majority of logistics and coordinating with suppliers. They also produced a blog post with highlights from the event. In addition, in August 2023 the BBC co-convened with BRAID researchers the public workshop Making and Deepfaking the News. |
Impact | Listed under engagements/publications: - BRAID community launch event - BBC blog post - Making and Deepfaking the News Workshop - research brief by Rhia Jones, Bronwyn Jones and Ewa Luger |
Start Year | 2022 |
Description | DCMS |
Organisation | Department for Digital, Culture, Media & Sport |
Country | United Kingdom |
Sector | Public |
PI Contribution | DCMS1: Policy Fellows - Six policy fellows completed individual reports for DCMS on various areas of Responsible AI - thematic deep dives aligned with areas that DCMS had identified as priorities leading into the AI Innovation white paper. In April 2023, the policy fellows travelled with the BRAID Co-Directors to present their research at the Home Office to a group of UK civil servants and policymakers from DCMS and DSIT, on the topics of Responsible AI governance, explainable AI, emotion detection/affective computing for mental health, Generative AI risks, and the rights of platform workers. DCMS2: BRAID hosted a Generative AI policy summit with DCMS staff on 4 Dec 2023 in Edinburgh that gathered researchers from across the UK. A UoE fellowship is also underway. |
Collaborator Contribution | DCMS funded £300,000 for the programme in stage one; £150,000 supported six policy fellows at the University of Edinburgh to complete reports on various areas of Responsible AI identified as priorities for the UK government and policymakers, between January 2023 and March 2023. The remaining £150,000 funded another set of policy reports from researchers at our BRAID delivery partner, the Ada Lovelace Institute. DCMS hosted the University of Edinburgh and ALI researchers at the Home Office on 19 April to meet with DCMS/DSIT staff and present their research. A second round of £267,327 in DCMS funding for a new set of policy fellowships was approved in September 2023 for work to conclude in March 2024. |
Impact | - Six research reports for DCMS produced in March 2023 by Responsible AI policy fellows at the University of Edinburgh; a further set of reports was delivered to DCMS by our partner the Ada Lovelace Institute from their portion of the funding. - A white paper is being produced from the findings of the 4 Dec policy summit. - The work by Round 1 Policy Fellow Dr Kasirzadeh led to the following publication: Bird C, Ungless E, Kasirzadeh A. (2023). Typology of Risks of Generative Text-to-Image Models. The fellows' work and wider collaboration was multidisciplinary, drawing from Philosophy; Law; Informatics; Science, Technology and Innovation Studies; and Design/HCI. The second round of activity is underway as of May 2024, with policy reports to follow. |
Start Year | 2022 |
Description | Design and Artists' Copyright Society (DACS) |
Organisation | DACS |
Country | United Kingdom |
Sector | Charity/Non Profit |
PI Contribution | BRAID invited Design and Artists' Copyright Society (DACS) to participate in a BRAID event under the AI for Inspired Innovation Theme, where they presented the results of their recent survey on AI and artists' work. |
Collaborator Contribution | The partner organisation spoke at Science Gallery London events, and launched the DACS AI and the Arts survey of 1,000 artists across the UK at the Science Gallery London event; the survey will feed back into government and policy conversations. |
Impact | Improved our overall profile and reach by situating BRAID events within a larger AI and the Arts body of artist-representative/policy work (the Co-I's work features in the report). |
Start Year | 2023 |
Description | Future Everything |
Organisation | FutureEverything |
Country | United Kingdom |
Sector | Charity/Non Profit |
PI Contribution | Future Everything partnered with BRAID in the programming of AI and the Arts events at Science Gallery London. |
Collaborator Contribution | Future Everything partnered with BRAID in the programming of AI and the Arts events at Science Gallery London. Irini Papadimitriou, Creative Director of FutureEverything spoke at the Curatorial event. |
Impact | Report; artists' depiction; consensus statement (NB: none of these yet published). The feedback questionnaire showed impact of the events: enhanced networks and increased confidence in speaking about R-AI. |
Start Year | 2023 |
Description | King's College London (Science Gallery London) |
Organisation | King's College London |
Department | Science Gallery London |
Country | United Kingdom |
Sector | Academic/University |
PI Contribution | BRAID worked together with Science Gallery London (SGL) to deliver two events as part of the AI for Inspiring Innovation theme. BRAID led on programming, provided leadership and budget. |
Collaborator Contribution | SGL provided local coordination, venue, and improved our overall profile and reach by allowing our events to be part of a larger AI season of events. |
Impact | https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-curatorial https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-artists Consensus statement; report; artists' depiction (NB: no outputs have yet been published). |
Start Year | 2023 |
Description | Microsoft Research |
Organisation | Microsoft Research |
Country | Global |
Sector | Private |
PI Contribution | The purpose is to deliver a successful 12-18 month collaborative research fellowship on embedding arts and humanities research in the field of responsible AI, with the aim of driving innovation. |
Collaborator Contribution | Partner organisation participated in the funding call for our first round of fellowships by setting a research challenge, meeting with candidates to discuss proposals, and contributing to the assessment process. |
Impact | Creation of a funding call with a wide variety of challenge-questions and stakeholders designed to foster successful fellowship projects embedding arts and humanities research in a variety of settings across the responsible AI ecosystem. |
Start Year | 2023 |
Description | National Galleries of Scotland |
Organisation | National Galleries of Scotland |
Country | United Kingdom |
Sector | Public |
PI Contribution | The purpose is to deliver a successful 12-18 month collaborative research fellowship on embedding arts and humanities research in the field of responsible AI, with the aim of driving innovation. |
Collaborator Contribution | Partner organisation participated in the funding call for our first round of fellowships by setting a research challenge, meeting with candidates to discuss proposals, and contributing to the assessment process. |
Impact | Creation of a funding call with a wide variety of challenge-questions and stakeholders designed to foster successful fellowship projects embedding arts and humanities research in a variety of settings across the responsible AI ecosystem. |
Start Year | 2023 |
Description | Runnymede Trust |
Organisation | Runnymede Trust |
Country | United Kingdom |
Sector | Charity/Non Profit |
PI Contribution | We are at the early stages of building this relationship, which we anticipate will inform the BRAID programme's work, in particular our AI for Equitable Innovation theme. We anticipate a partnership in which each feeds into the other's work. |
Collaborator Contribution | We are at the early stages of building this relationship, which we anticipate will inform the BRAID programme's work, in particular our AI for Equitable Innovation theme. We anticipate a partnership in which each feeds into the other's work. |
Impact | Too early. |
Start Year | 2023 |
Description | AI & Arts event at Science Gallery London |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Professional Practitioners |
Results and Impact | BRAID, represented by Beverley Hood, hosted two events on 18 and 19 January at Science Gallery London, drawing together artists from across the UK for discussions on the challenges and opportunities posed by AI. This event resulted in ecosystem building, critical debate within community, and networking. |
Year(s) Of Engagement Activity | 2024 |
Description | BBC R&D blog on BRAID Launch |
Form Of Engagement Activity | Engagement focused website, blog or social media channel |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Public/other audiences |
Results and Impact | A round-up of the September 2023 BRAID launch event by BRAID Translational Fellow Bronwyn Jones and BBC partner researcher Rhia Jones, summarising the most important takeaways from the event. It aimed to drive more interest towards the BRAID project, as well as encouraging readers to sign up for our newsletter. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.bbc.co.uk/rd/blog/2023-10-responsible-ai-trust-policy-ethics |
Description | BBC Science Focus article on AI Risk |
Form Of Engagement Activity | Engagement focused website, blog or social media channel |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | PI Vallor published 'AI becoming sentient is risky, but that's not the big threat. Here's what is' on 12 Aug 2023 in BBC Science Focus. This contribution addressed emerging concerns about AI safety from a humanities-driven perspective. It resulted in emails from the public noting the value of the insight, and was picked up and republished by Skyways flight magazine in South Africa. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.sciencefocus.com/future-technology/will-ai-make-humans-dumber |
Description | BRAID Blog Post: "How the Arts and Humanities are Crucial to Responsible AI" |
Form Of Engagement Activity | Engagement focused website, blog or social media channel |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Other audiences |
Results and Impact | On 15 June 23 BRAID Co-Directors Vallor and Luger published a blog post on the AHRC website announcing the BRAID programme and laying out its rationale and intended impact. It resulted in a surge of signups to the BRAID newsletter and interest in the programme's September launch event. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.ukri.org/blog/how-the-arts-and-humanities-are-crucial-to-responsible-ai/ |
Description | BRAID Community Launch Event |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | The BRAID community launch event at the BBC Radio Theatre on 15 Sept 2023 had 273 participants, 178 in person, and 95 online. A variety of high-profile panellists and speakers, including a keynote from Rumman Chowdhury, sparked discussions amongst participants and clearly defined BRAID's trajectory as a project. The event was followed by a networking reception featuring 6 artist performances and displays engaging Responsible AI themes. In a survey post-event, 94% of respondents said it met or exceeded their expectations and 85% got value from the panels and keynote speaker. This resulted in a surge of social media engagement and newsletter sign ups. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.bbc.co.uk/programmes/p0gq60y1 |
Description | BRAID Instagram page |
Form Of Engagement Activity | Engagement focused website, blog or social media channel |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | BRAID started an Instagram page in December 2023. Since then, we have grown to 105 followers with a 22.27% engagement rate as of 13/03/2024. The page is used to communicate our opportunities, events, news and website, and to engage with members of our community. |
Year(s) Of Engagement Activity | 2023,2024 |
URL | https://www.instagram.com/braid__uk/ |
Description | BRAID LinkedIn page |
Form Of Engagement Activity | Engagement focused website, blog or social media channel |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | Since being established in September 2023, the BRAID LinkedIn page has amassed 677 followers as of 13/03/2024 and has a 22.34% engagement rate in this quarter to date. The page is used to promote opportunities, events, news, research outputs, and network with our community. |
Year(s) Of Engagement Activity | 2023,2024 |
URL | https://www.linkedin.com/company/98665671/ |
Description | BRAID X page |
Form Of Engagement Activity | Engagement focused website, blog or social media channel |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | We have a BRAID X page which as of 13/03/2024 has 685 followers and a 4.67% engagement rate. Our social media hosting platform, Hootsuite, quotes a 1% to 5% engagement rate as excellent, so that is a very positive sign. We have reached a wide multi-stakeholder audience and use the channel to promote our website, events, news, and research outputs. |
Year(s) Of Engagement Activity | 2023,2024 |
URL | https://twitter.com/Braid_UK |
Description | BRAID blog on UK AI Safety Summit |
Form Of Engagement Activity | Engagement focused website, blog or social media channel |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | The 13 October 2023 blog post, 'A shrinking path to safety: how a narrowly technical approach to align AI with the public good could fail', authored by PI Vallor and Co-Director Luger, was a response to the announcement of the UK AI Safety Summit and its priorities. The blog is a call for arts and humanities expertise to be valued in such discussions just as much as technical expertise. The blog post sparked policymaker and public discussion of this point at a panel with PI Vallor, Octavia Reeve of the Ada Lovelace Institute (BRAID partner), and representatives of DSIT at the AI Fringe Summit on 31 October, leading up to the main Summit on 1-2 November. |
Year(s) Of Engagement Activity | 2023 |
URL | https://braiduk.org/a-shrinking-path-to-safety-how-a-narrowly-technical-approach-to-align-ai-with-th... |
Description | BRAID/ALI/DSTL Workshop on Foundation Models in Defence |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Other audiences |
Results and Impact | This meeting, hosted by PI Vallor and Co-I Nehal Bhuta and convened with key institutional partners, brought together national policymakers from the defence sector with academics and technologists to review and discuss two draft reports from the Ada Lovelace Institute concerning possible public sector uses of Generative/Foundation Models. The meeting was attended by high-level policymakers in CDEI and Defence, and also involved international participation to lend a comparative legal perspective. The participants drew lessons from the case studies presented by the Ada Lovelace Institute and reflected on their transposability to defence contexts. |
Year(s) Of Engagement Activity | 2023 |
Description | Baillie Gifford European Media Forum on AI in Edinburgh |
Form Of Engagement Activity | A formal working group, expert panel or dialogue |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | PI Vallor participated in an expert panel on AI for a group of European media leaders, hosted by Baillie Gifford. Topics discussed included work drawn from BRAID on AI in media, the impact on journalism, and wider public understanding of AI. |
Year(s) Of Engagement Activity | 2023 |
Description | British Science Festival Presidential address |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Public/other audiences |
Results and Impact | BRAID Co-I Vaishak Belle delivered the presidential address, "Risky Robots", on the theme of AI harms and risks, and discussed the impact on the arts. |
Year(s) Of Engagement Activity | 2023 |
URL | https://britishsciencefestival.org/event/risky-robots/ |
Description | Co-organised "No Harm Done" series event (an Ethics, AI & Design collaboration with an SME) |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Public/other audiences |
Results and Impact | Bronwyn Jones organised an event where three experts from the worlds of academia, design and business gave lightning talks examining the ethical and design underpinnings of AI, followed by an open discussion. The aim was to increase understanding of transparency and accountability in AI decision-making. |
Year(s) Of Engagement Activity | 2023 |
Description | DCMS Generative AI Policy Summit |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Policymakers/politicians |
Results and Impact | As part of our partnership with DCMS, BRAID organised a policy event with a variety of experts from many industries to draw together a white paper for DCMS. The following aim was given to participants: "The purpose of this event is to draw a small group of experts together, to inform DCMS thinking around Generative Artificial Intelligence (GAI). We want this to be a collegiate and enjoyable event with space for colleagues to interact and so, rather than the traditional speaker model, we will not ask you to prepare anything in advance. Instead, the event will be highly interactive and design-led, to facilitate discussions and draw us towards a joint set of policy recommendations." The white paper is drafted and is currently undergoing review before the final copy is submitted to DCMS and published on the BRAID website. |
Year(s) Of Engagement Activity | 2023 |
Description | DSIT Roundtable on Human-Machine Interaction |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Policymakers/politicians |
Results and Impact | On 7 Mar 2024 PI Vallor joined a DSIT-hosted roundtable on Human-Machine Interaction, in which DSIT sought expert advice from 15 academic, industry and third-sector experts across the disciplines on the UK government's response to new and emerging HMI risks, especially from generative AI. |
Year(s) Of Engagement Activity | 2024 |
Description | Digital Scotland panel on AI and Government Services |
Form Of Engagement Activity | A formal working group, expert panel or dialogue |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Professional Practitioners |
Results and Impact | In this 21 Nov 2023 Digital Scotland expert panel on 'The emerging role of AI in government services', in conversation with representatives from the Scottish government and Leidos, PI Vallor discussed the challenge of answerability to citizens and publics as part of the responsible use of AI in the public sector, drawing on her and her team's work within BRAID and the TAS Responsibility projects. This was followed by several requests (verbal and later by email) from audience members in the public sector and industry for future conversations and for joining the BRAID network. |
Year(s) Of Engagement Activity | 2023 |
URL | https://futurescot.com/futurescot-conferences/digitalscotland2023/ |
Description | ECA Y7 Film screening |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | Local |
Primary Audience | Other audiences |
Results and Impact | Premiere of an AI-generated feature film by a Salford-based artist duo. Beverley Hood chaired the panel. |
Year(s) Of Engagement Activity | 2023 |
Description | European Computer Science Summit |
Form Of Engagement Activity | A formal working group, expert panel or dialogue |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | Ewa Luger hosted the panel "How do we ensure that informatics shapes the future that we hope for?" and raised BRAID brand awareness. |
Year(s) Of Engagement Activity | 2023 |
Description | Expert Panel at NYU-KAIST AI Global Governance Summit in New York City |
Form Of Engagement Activity | A formal working group, expert panel or dialogue |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | Following the announcement of a partnership between NYU and the Republic of Korea, attended by the President of Korea, the NYU President, and NYU/Meta's Yann LeCun, the event hosted a wide-ranging panel discussion about AI and responsible digital governance by prominent international scholars in the field, including PI Vallor. The other panelists included: Professor Kyung-hyun Cho, Deputy Director for NYU Center for Data Science & Courant Institute; Professor Luciano Floridi, Founding Director of the Digital Ethics Center, Yale University; Professor Urs Gasser, Rector of the Hochschule für Politik, Technical University of Munich; Professor Stefaan Verhulst, Co-Founder & Director of GovLab's Data Program, NYU Tandon School of Engineering; and Professor Jong Chul Ye, Director of the Promotion Council for Digital Health, KAIST. The event led to conversations about future collaborations between BRAID and the new NYU-KAIST partnership. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.nyu.edu/about/news-publications/news/2023/september/nyu-and-kaist-launch-major-new-initi... |
Description | Expert Panel at TAS Symposium 2023, Edinburgh |
Form Of Engagement Activity | A formal working group, expert panel or dialogue |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Professional Practitioners |
Results and Impact | PI Vallor spoke on expert panels on Responsible Autonomous Systems and on AI Policy and Regulation, to an audience of industry, government and academic partners, drawing on both research within BRAID and TAS projects. |
Year(s) Of Engagement Activity | 2023 |
URL | https://tas.ac.uk/bigeventscpt/first-international-symposium-on-trustworthy-autonomous-systems/ |
Description | Fellowships Information Evening at Ada Lovelace London |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Other audiences |
Results and Impact | In-person fellowships information evening that also gave participants a chance for networking. Participants able to clarify their ideas about applying to BRAID fellowships programme and ask questions. |
Year(s) Of Engagement Activity | 2023 |
Description | Fellowships briefing webinar |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Other audiences |
Results and Impact | Briefing session for academics potentially interested in BRAID fellowships. Information given on projects available, BRAID mission, scope, funding, and more. |
Year(s) Of Engagement Activity | 2023 |
Description | Google's Mediterranean ML Summer School in Thessaloniki |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | PI Vallor gave a keynote on Responsible AI to machine learning students and early career researchers in the Mediterranean, drawing from research on AI and responsibility in both BRAID and TAS. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.m2lschool.org/past-editions/m2l-2023-greece |
Description | International Law Association, Paris: Convened panel on new means and methods of warfare with a focus on AI |
Form Of Engagement Activity | A formal working group, expert panel or dialogue |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Professional Practitioners |
Results and Impact | This panel was convened at the invitation of the International Law Association's organizing committee. The event brought together experts in law, military affairs, and AI to discuss the current state of AI ethics in the military domain and its relationship with international law. The panel was very well attended (over 60 people) by international lawyers and foreign ministry officials from around the world. The impact of the event was to raise understanding among this group of the current state of the debate about responsible AI in the military domain and to highlight the need for normative development and practical measures to regulate military uses. The event also generated awareness of BRAID and the programme's relevance. Co-I Nehal Bhuta attended. |
Year(s) Of Engagement Activity | 2023 |
Description | Keynote 'The AI Mirror' at Charlotte Ideas Festival |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | On 1 April, PI Vallor gave a keynote on AI at the Charlotte Ideas Festival in the USA, drawing on themes of responsibility and the importance of the arts and humanities in shaping our future with AI. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.charlotteshout.com/events/detail/shannon-vallor |
Description | Keynote at Turing Fest 2023, Edinburgh |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Public/other audiences |
Results and Impact | PI Vallor's keynote: 'Who is Responsible for Responsible AI? The Ecologies of a Responsible AI Ecosystem' drew from BRAID and TAS Responsibility research to help a broad audience understand the challenges and opportunities we face in building a Responsible AI Ecosystem in the UK. |
Year(s) Of Engagement Activity | 2023 |
URL | https://turingfest.com/speaker/shannon-vallor/ |
Description | Living with AI Podcast on AI and Taking Responsibility |
Form Of Engagement Activity | A broadcast e.g. TV/radio/film/podcast (other than news/press) |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | On 19 July 2023 PI Vallor joined two other PIs of TAS Responsibility projects to talk about the work, the meaning of responsible AI, and future plans on the Living with AI podcast. |
Year(s) Of Engagement Activity | 2023 |
URL | https://podcasts.apple.com/au/podcast/ai-taking-responsibility/id1538425599?i=1000621628112 |
Description | OxGen Generative AI Summit |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Other audiences |
Results and Impact | Bronwyn Jones represented BRAID at the OxGen Generative AI Summit at Oxford University, raising awareness amongst multi-stakeholders. |
Year(s) Of Engagement Activity | 2023 |
Description | Panel on Responsible AI at SAS Innovate |
Form Of Engagement Activity | A formal working group, expert panel or dialogue |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Industry/Business |
Results and Impact | On 8 June 2023 PI Vallor spoke on a panel on AI and Responsible Innovation with other Responsible AI experts at an industry event hosted at the Royal Institution by SAS, a partner on the TAS Responsibility project. One outcome of the event was the expression of interest from another panelist (Ray Eitel-Porter of Accenture) in getting involved with BRAID; he is now on the BRAID Advisory Board. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.sas.com/sas/events/innovate-on-tour/23/london.html |
Description | Panel on the future impact of AI on the Arts and Cultural sector |
Form Of Engagement Activity | A formal working group, expert panel or dialogue |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Policymakers/politicians |
Results and Impact | Ewa Luger met the new UK Minister for Arts and Heritage (Lord Parkinson), followed by a panel (held under the Chatham House Rule) on the future impact of AI on the arts and cultural sector, organised by 'What Next?'. Representatives of the Mayor of London's office, Jane Bonham Carter (Liberal Democrat DCMS Spokesperson), and the House of Lords were present. |
Year(s) Of Engagement Activity | 2023 |
Description | Participant in FacultyAI 'Intelligence Rising' wargames |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Industry/Business |
Results and Impact | The AI company FacultyAI sponsored a set of 'wargames' in early October 2023 on AI risk that allowed the expert participants to explore various scenarios of geopolitical and technological risk development, and strategies for managing those risks, over a 5-10 year horizon; PI Vallor was one of the participants (on the team of players representing corporate interests) and discussed several pieces of BRAID and TAS Responsibility research that informed the strategic thinking. |
Year(s) Of Engagement Activity | 2023 |
Description | Plenary address: International AI Cooperation and Governance Forum in Hong Kong |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Policymakers/politicians |
Results and Impact | On 8 Dec 2023 PI Vallor gave an invited plenary talk on AI Governance at the forum, hosted by Tsinghua University and Hong Kong University of Science and Technology, and attended by approx. 750 Hong Kong/Chinese government representatives, international policymakers, AI researchers, and students. The talk highlighted themes of care, responsibility and trust and the importance of integrating these into AI governance, and mentioned UK support for our research in this area through BRAID and TAS. Several plans for future conversation and collaboration resulted, including plans to coauthor work on AI safety and risk governance with a Cambridge AI researcher. |
Year(s) Of Engagement Activity | 2023 |
URL | https://aicg2023.hkust.edu.hk/ |
Description | Presentation on AI Safety at AI Fringe Summit |
Form Of Engagement Activity | A formal working group, expert panel or dialogue |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Policymakers/politicians |
Results and Impact | At this expert panel at the AI Fringe Summit, on 'Defining AI Safety' with panelists from Ofcom, Mozilla, DSIT, and Ada Lovelace Institute, PI Vallor discussed the role of the arts and humanities in a full and capable understanding of AI safety, drawing on the history of technical risk management and failures in other industries of overly narrow technical approaches that ignore human and cultural factors in safety. This was linked explicitly to the BRAID programme and the 13 Oct BRAID blog post on this topic. Further discussions afterward with fellow DSIT panelist Emran Mian indicated shaping of DSIT thinking on the importance of a sociotechnical and multidisciplinary approach to AI safety, which led to an invitation to a DSIT Roundtable on human-AI interaction hosted by Emran Mian on 7 March 2024. |
Year(s) Of Engagement Activity | 2023 |
URL | https://aifringe.org/events/expanding-the-conversation-defining-ai-safety-in-practice |
Description | Public workshop - Making and Deepfaking the News |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Professional Practitioners |
Results and Impact | A two-hour participatory workshop, open to the public and convened with a research partner (BBC), covering the challenges AI-generated content poses for journalism and our shared information ecosystem. The purpose was to a) educate and upskill the public and professionals from other industries on how AI (including generative AI and deepfakes) works and provide them with critical skills for assessing media content, and b) engage participants in deliberation of the ethical and normative questions around the use of AI in news production. Several educators contacted me afterwards saying the session was helpful for their teaching and asking us to share materials. There were many questions from attendees during and after the session, and some posted on social media about the event. |
Year(s) Of Engagement Activity | 2023 |
URL | https://inspace.ed.ac.uk/the-sounds-of-deep-fake/ |
Description | Royal Institution Youth Summit on AI in London |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Schools |
Results and Impact | On 18 Sept 2023 BRAID Co-Directors Shannon Vallor and Ewa Luger presented on BRAID and Responsible AI to the audience, made up largely of UK sixth form and college students brought together from around the country to hear from experts on AI and to discuss the topic of the upcoming Christmas Lectures. The event comprised a series of in-person talks by experts as well as guided and facilitated discussions by students, which Vallor and Luger also helped facilitate. The aim of the Youth Summit was to gather young people's opinions, concerns and interests about AI, and support the next generation of AI and STEM students. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.rigb.org/whats-on/royal-institution-youth-summit-2023 |
Description | Salzburg Conf for Young Analytic Philosophy: agency in artificial systems |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | Fabio Tollon represented BRAID at this conference, presenting a talk on his paper 'Artificial Systems: Entities or Agents?'. He also chaired other talks. |
Year(s) Of Engagement Activity | 2023 |
Description | Scotsman DDI Conference, Edinburgh, panel on AI Past, Present, Future |
Form Of Engagement Activity | A formal working group, expert panel or dialogue |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Industry/Business |
Results and Impact | PI Vallor spoke on an AI expert panel on AI Past, Present and Future at the 2023 Scotsman Data Conference in Edinburgh on 27 Sept 2023, at the Royal College of Physicians of Edinburgh. The panel was then reported on by the Scotsman (links below). The conference led to several requests to learn more about BRAID and join our network. https://www.scotsman.com/business/data-conference-time-for-us-to-rethink-the-thinking-machine-4361064 https://www.scotsman.com/business/data-conference-data-driven-innovations-jarmo-eskelinen-on-a-mission-to-tackle-major-challenges-4361050 |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.nationalworldevents.com/sdc-2023/agenda/ |
Description | Scottish AI Alliance blog post on BRAID Fellowships |
Form Of Engagement Activity | Engagement focused website, blog or social media channel |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Industry/Business |
Results and Impact | 1 Nov 2023 Blog on the Scottish AI Alliance website about the BRAID Fellowships, written by Gavin Leuzzi in order to raise more awareness of fellowships, and drive researchers to submit an application and industry partners to participate. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.scottishai.com/news/ai-arts-and-humanities-research-and-braid-fellowships |
Description | Scottish AI Summit workshop: AI For the Next Generation |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Professional Practitioners |
Results and Impact | PI Vallor helped organise and host this workshop, designed to engage with participants to think collaboratively about how to operationalise a high-level vision for AI that can truly serve a broad and diverse Scottish public. We built upon a series of prior workshops (funded by the Alan Turing Institute, and led by BRAID Policy Fellow Atoosa Kasirzadeh) that centered diverse and underrepresented perspectives, drawing also from BRAID's Equitable Innovation theme, and aimed at creating a shared vision for AI futures for the next generation. We solicited further input at the Summit that shaped the final report. |
Year(s) Of Engagement Activity | 2023 |
URL | https://www.scottishaisummit.com/ai-for-the-next-generation-realizing-an-inclusive-vision-for-scotti... |
Description | Somerset House Studios (SHS) & Serpentine events visit |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Third sector organisations |
Results and Impact | Beverley Hood represented BRAID at Serpentine's event. Hood has contributed to Serpentine's annual Future Art Ecosystem publication; interview materials will be incorporated and acknowledged in the credits. |
Year(s) Of Engagement Activity | 2024 |
Description | South Africa, SACAIR: interdisciplinary conference on AI. |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Other audiences |
Results and Impact | BRAID PDRA Tollon presented a talk on 'Does Blaming Robots Make Them Responsible?', drawing on elements of BRAID research. |
Year(s) Of Engagement Activity | 2023 |
URL | https://2023.sacair.org.za/wp-content/uploads/2023/11/SACAIR-2023_3Day-Website-26.11.23.pdf |
Description | Talk for the University of Copenhagen's 'Science of the Predicted Human' Series |
Form Of Engagement Activity | A talk or presentation |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Postgraduate students |
Results and Impact | On 17 April 2023 PI Vallor gave a lecture, 'The AI Mirror', to social data science, computer science and machine learning researchers and students at the University of Copenhagen's Copenhagen Center for Social Data Science, drawing from award research on Responsible AI as well as her forthcoming book. |
Year(s) Of Engagement Activity | 2023 |
URL | https://sodas.ku.dk/events/the-science-of-the-predicted-human-talk-series-professor-shannon-vallor/ |
Description | The Good Robot Podcast |
Form Of Engagement Activity | A broadcast e.g. TV/radio/film/podcast (other than news/press) |
Part Of Official Scheme? | No |
Geographic Reach | International |
Primary Audience | Public/other audiences |
Results and Impact | The Good Robot podcast is a popular venue for AI related conversations drawing from multiple disciplines; PI Vallor's episode was recorded in 2023 and published in March 2024, drawing on work in feminist ethics linked to the Equitable Innovation theme of BRAID. |
Year(s) Of Engagement Activity | 2024 |
URL | https://podcasts.apple.com/gb/podcast/shannon-vallor-on-feminist-care-ethics-techno-virtues/id157023... |
Description | The National Technology News Podcast on The Future of AI |
Form Of Engagement Activity | A broadcast e.g. TV/radio/film/podcast (other than news/press) |
Part Of Official Scheme? | No |
Geographic Reach | National |
Primary Audience | Public/other audiences |
Results and Impact | On 25 July 2023 this podcast interview of PI Vallor on The Future of AI was broadcast; here is the podcast description: "To discuss the future of AI and what steps can be taken to ensure it develops in a way that is responsible and supports human flourishing, National Technology News was joined by Shannon Vallor, co-director of the UKRI Arts and Humanities Research Council's BRAID (Bridging Responsible AI Divides) Programme and the Baillie Gifford Chair in the Ethics of Data and Artificial Intelligence at the Edinburgh Futures Institute (EFI) at the University of Edinburgh." |
Year(s) Of Engagement Activity | 2023 |
URL | https://nationaltechnology.co.uk/podcast-archives.php |
Description | The New Real - Generative AI workshop and panel |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Other audiences |
Results and Impact | An event commissioned by Creative Scotland: a closed workshop for artists, followed by a public event at which Beverley Hood chaired the panel. |
Year(s) Of Engagement Activity | 2023 |
Description | Trustworthy Assurance of Digital Health and Healthcare workshop |
Form Of Engagement Activity | Participation in an activity, workshop or similar |
Part Of Official Scheme? | No |
Geographic Reach | Regional |
Primary Audience | Other audiences |
Results and Impact | A workshop on developing an assurance platform, with the following aims: development of the existing methodology and platform; improved impact; and user experience enhancements and validation. Nayha Sethi represented BRAID to further grow our reach into the creation of Responsible AI platforms as part of the wider UK ecosystem. |
Year(s) Of Engagement Activity | 2023 |