Enabling a Responsible AI Ecosystem
Lead Research Organisation: University of Edinburgh
Department Name: School of Philosophy, Psychology & Language Sciences
Abstract
Problem Space: There is now a broad base of research in AI ethics, policy and law that can inform and guide efforts to construct a Responsible AI (R-AI) ecosystem, but three gaps must be bridged before this is achieved:
(a) Traditional disciplinary silos, boundaries, and incentives limiting successful generation and translation of R-AI knowledge must be addressed by new incentives and supportive infrastructure.
(b) The high barriers to adoption of existing research and creative work in R-AI must be removed so that public institutions, non-profits and industry (especially SMEs) can convert and embed R-AI into reliable, accessible and scalable practices and methods.
(c) The primary actors empowered to research, articulate and adopt responsible AI standards must be more broadly representative of, and answerable to, the publics and communities most impacted by AI developments.
Approach: We will develop and deliver a UK-wide infrastructure that lays secure foundations for the bridges across these three gaps so that AI in the UK is responsible, ethical and accountable by default. The programme will be structured around three core pillars of work (Translation and Co-Construction, Embedding and Adoption, and Answerability and Accountability) and four strategic delivery themes that establish programme coherence: (1) AI for Humane Innovation: integrating within AI research the humanistic perspectives that enable the personal, cultural, and political flourishing of human beings, by weaving historical, philosophical, literary and other humane arts into dialogue with AI communities of research, policy and practice; (2) AI for Inspired Innovation: activities to infuse the AI ecosystem with more vibrant, imaginative and creative visions of R-AI futures; (3) AI for Equitable Innovation: activities directing research and policy attention to the need to ensure that broader UK publics, particularly those marginalised within the digital economy, can expect more sustainable and equitable futures from AI development; and (4) AI for Resilient Innovation: uplifting research, policy and practice that ensures AI ameliorates growing threats to global and national security, rule of law, liberty, and social cohesion.
Team: Co-Directors Vallor and Luger will lead and deliver the programme alongside the Ada Lovelace Institute, with a cross-disciplinary team of Co-Is who will leverage their networks to broaden/diversify the R-AI community and enhance disciplinary engagement with R-AI, acting as translational interfaces to ensure AHRC programme communications speak effectively to underrepresented cohorts within their communities/disciplines. As partner, the BBC will support public engagement activities to ensure trust, breadth of reach and public legitimacy.
Activities: Through a comprehensive programme of translation, research and engagement activities we will:
(1) Support existing and foster new R-AI partnerships, connecting AI researchers, industry, policymakers and publics around the cross-cutting themes.
(2) Build broader responsible AI visions by developing infrastructure for translation of R-AI research, inviting early career researchers and new voices from the arts, humanities and civil society to co-shape, interrogate and enrich visions of flourishing with AI.
(3) Learn from early R-AI work by surfacing and mapping barriers, incentives and opportunities, making R-AI research responsive to the needs and challenges faced by policymakers, regulators, technologists, and wider publics.
(4) Embed R-AI in policy and practice by conducting research and building capacity for translation of R-AI research into accessible and usable guidance for policymakers, industry leaders, SMEs and publics.
(5) Build trust in AI by rethinking accountability through three applied lenses: accountability to wider publics, answerability of current systems, and public mechanisms for recourse, pursued through consultation, creative methods and synthesis activities.
Organisations
- University of Edinburgh (Lead Research Organisation)
- National Galleries of Scotland (Collaboration)
- FutureEverything (Collaboration)
- DACS (Collaboration)
- KING'S COLLEGE LONDON (Collaboration)
- Microsoft Research (Collaboration)
- Runnymede Trust (Collaboration)
- Amsterdam University of Applied Sciences (Collaboration)
- Alan Turing Institute (Collaboration)
- British Broadcasting Corporation (BBC) (Collaboration)
- Department for Digital, Culture, Media & Sport (Collaboration)
- Ada Lovelace Institute (Collaboration, Project Partner)
- Somerset House (Collaboration)
- Serpentine Gallery (Collaboration)
- British Broadcasting Corporation (BBC) (Project Partner)
Publications
Ada Lovelace Institute (2025) Copyright and AI consultation: response from BRAID and Ada Lovelace Institute
Bennett S (2025) "Everybody knows what a pothole is": representations of work and intelligence in AI practice and governance, in AI & SOCIETY
Bird C (2023) Typology of Risks of Generative Text-to-Image Models
Catanzariti B (2024) Investigating the Impact of Facial Expression Recognition in Healthcare
Gabriel I (2024) The Ethics of Advanced AI Assistants
Ganesh B (2024) Policy Approaches for Building a Responsible Ecosystem
Gregory K (2024) Mitigating Harms in On-Demand Delivery Platforms
Artistic and Creative Products
| Title | BRAID AI and the Arts: Who's Responsible (Artists' event) - Visual Minutes |
| Description | This image documents visual minutes created by illustrator Jonny Glover at the BRAID Programme event AI and Arts: Who's Responsible (Artists' Event), hosted by Science Gallery London, with FutureEverything on January 18th 2024. The visual minutes document the artists' attendees' shared perspectives reached through discussion, critical reflection, and debate. BRAID is a UK-wide programme dedicated to integrating Arts and Humanities research more fully into the Responsible AI ecosystem, as well as bridging the divides between academic, industry, policy and regulatory work on responsible AI. Funded by the Arts and Humanities Research Council (AHRC), BRAID represents AHRC's major investment in enabling responsible AI in the UK. The Programme runs from 2022 to 2028. Working in partnership with the Ada Lovelace Institute and BBC, BRAID supports a network of interdisciplinary researchers and partnering organisations through the delivery of funding calls, community building events, and a series of programmed activities. Funding reference: Arts and Humanities Research Council grant number AH/X007146/1. |
| Type Of Art | Image |
| Year Produced | 2025 |
| Impact | The visual minutes document the artist attendees' shared perspectives reached through discussion, critical reflection, and debate, and are freely available for other researchers and publics to access. Downloads as of March 2025 = 57. |
| URL | https://zenodo.org/doi/10.5281/zenodo.14710975 |
| Title | BRAID AI and the Arts: Who's Responsible (Curatorial event) - Visual Minutes |
| Description | This image documents visual minutes created by illustrator Jonny Glover at the BRAID Programme event AI and Arts: Who's Responsible (Curatorial Event), hosted by Science Gallery London, with FutureEverything on January 19th 2024. The visual minutes document the curatorial attendees' shared perspectives reached through discussion, critical reflection, and debate. BRAID is a UK-wide programme dedicated to integrating Arts and Humanities research more fully into the Responsible AI ecosystem, as well as bridging the divides between academic, industry, policy and regulatory work on responsible AI. Funded by the Arts and Humanities Research Council (AHRC), BRAID represents AHRC's major investment in enabling responsible AI in the UK. The Programme runs from 2022 to 2028. Working in partnership with the Ada Lovelace Institute and BBC, BRAID supports a network of interdisciplinary researchers and partnering organisations through the delivery of funding calls, community building events, and a series of programmed activities. Funding reference: Arts and Humanities Research Council grant number AH/X007146/1. |
| Type Of Art | Image |
| Year Produced | 2025 |
| Impact | The visual minutes document the curatorial attendees' shared perspectives reached through discussion, critical reflection, and debate, and are freely available for other researchers and publics to access. Downloads as of March 2025 = 51. |
| URL | https://zenodo.org/doi/10.5281/zenodo.14710565 |
Policy Influence
| Description | AHRC-BBC strategic planning meeting |
| Geographic Reach | National |
| Policy Influence Type | Contribution to new or improved professional practice |
| Description | Contribution to International AI Safety Report |
| Geographic Reach | Multiple continents/international |
| Policy Influence Type | Contribution to a national consultation/review |
| URL | https://www.gov.uk/government/publications/international-ai-safety-report-2025 |
| Description | Contribution to UK POST Note on Policy Implications of AI |
| Geographic Reach | National |
| Policy Influence Type | Contribution to a national consultation/review |
| URL | https://post.parliament.uk/research-briefings/post-pn-0708/ |
| Description | Evidence Submission to House of Lords AI in Weapons Systems Committee |
| Geographic Reach | National |
| Policy Influence Type | Implementation circular/rapid advice/letter to e.g. Ministry of Health |
| URL | https://publications.parliament.uk/pa/ld5804/ldselect/ldaiwe/16/1602.htm |
| Description | Hosted delegation from Parliament of Canada |
| Geographic Reach | North America |
| Policy Influence Type | Contribution to a national consultation/review |
| Description | Input on Judging for Futurescot AI Challenge |
| Geographic Reach | Local/Municipal/Regional |
| Policy Influence Type | Contribution to new or improved professional practice |
| URL | https://stormid.com/futurescot-ai-challenge/ |
| Description | Input to European Commission SAM on 'Successful and timely uptake of Artificial Intelligence in science in the EU' |
| Geographic Reach | Europe |
| Policy Influence Type | Participation in a guidance/advisory committee |
| Description | Invited Briefing to BBC Board |
| Geographic Reach | Local/Municipal/Regional |
| Policy Influence Type | Participation in a guidance/advisory committee |
| Description | Oversight Board for Ada Lovelace Institute |
| Geographic Reach | National |
| Policy Influence Type | Participation in a guidance/advisory committee |
| Impact | The Ada Lovelace Institute is among the most trusted independent voices in AI policy, with a strong programme of work in public education and engagement around AI and data, as well as considerable policy influence in the UK and abroad. |
| URL | https://www.adalovelaceinstitute.org/about/our-people/ |
| Description | Presentation to MP Ian Murray on Responsible AI Governance and AI Health Challenges |
| Geographic Reach | National |
| Policy Influence Type | Implementation circular/rapid advice/letter to e.g. Ministry of Health |
| Impact | This response was received following the briefing: "A huge thanks for your excellent presentations and the stimulating discussion today. Ian was very impressed with the quality and relevance of the research we shared with him, and we're optimistic there will be plenty of further follow up, including with other members of Labour's Shadow Cabinet. We really appreciate you giving your time for this - it's so important for us to engage early on with what is likely to be the next UK government." |
| Description | RAIUK Roundtable on Generative AI in Writing and Publishing (Jones B) |
| Geographic Reach | National |
| Policy Influence Type | Participation in a guidance/advisory committee |
| Description | Report of NEPC Working Group on AI Sustainability |
| Geographic Reach | National |
| Policy Influence Type | Participation in a guidance/advisory committee |
| URL | https://nepc.raeng.org.uk/media/2aggau2j/foundations-for-sustainable-ai-nepc-report.pdf |
| Description | Testimony to U.S. Senate Committee on AI and Democracy |
| Geographic Reach | North America |
| Policy Influence Type | Implementation circular/rapid advice/letter to e.g. Ministry of Health |
| URL | https://www.hsgac.senate.gov/hearings/the-philosophy-of-ai-learning-from-history-shaping-our-future/ |
Collaborations and Partnerships
| Description | AI Media and Democracy Lab, Amsterdam, Netherlands |
| Organisation | Amsterdam University of Applied Sciences |
| Department | AI Media and Democracy Lab |
| Country | Netherlands |
| Sector | Academic/University |
| PI Contribution | Bronwyn has been an invited visitor to the Lab in Amsterdam, where she presented at the early career scholars conference on infrastructures of AI, and met with the research team. She was also an invited speaker at the Lab's yearly community meeting 2024 to share BRAID work and plan for future collaborative activity. Bronwyn is also facilitating conversations between BRAID partner the BBC and the Lab and co-writing a research article with two scholars from the Lab. |
| Collaborator Contribution | The Lab have provided access to international researchers exploring responsible AI in the media industry and to international media companies with which the Lab partners. The AIMDL team have hosted Bronwyn at their offices in Amsterdam. |
| Impact | Multidisciplinary: Media and Communication, AI, Science and Technology Studies, Law. |
| Start Year | 2024 |
| Description | Ada Lovelace Institute |
| Organisation | Ada Lovelace Institute |
| Country | United Kingdom |
| Sector | Charity/Non Profit |
| PI Contribution | We have provided leadership on the BRAID community engagement and launch event, scoping calls and fellowship calls which ALI helped us deliver; led on both DCMS programmes of work in which ALI researchers are key participants and beneficiaries; helped promote aligned ALI activities and events through our BRAID network; co-organised BRAID workshops and blog posts with ALI; worked to integrate and build upon existing and ongoing ALI research in our BRAID outputs; and coordinated with ALI on contributions to the RAI UK Strategy Group. |
| Collaborator Contribution | ALI is an equal partner in the decision-making structure, including participation in all core programme research and management meetings and core ecosystem support. ALI contributes to the public engagement arm of the project, focusing on public attitudes and policy research. ALI has hosted BRAID meetings and co-organised BRAID workshops in their offices, and promotes BRAID activities. They also provide input on the BRAID scoping and fellowship calls, and have worked closely with us on the fellowships. They helped to organise, promote and deliver our community engagement/launch event on 15 September 2023 and contributed to the GAI Policy Summit in December 2023. They are a key partner also in the DCMS programmes of work, with their own policy fellows interacting with and supporting ours. |
| Impact | BRAID community launch event on 15 Sept 2023, coordinated with ALI. Workshop with DSTL on LLMs in Defence. DCMS1 supplement: a verbal account and short slide deck outlining key near-future policy challenges, a written report/literature review and a distilled policy briefing/note, verbal presentations in DCMS offices, and completion of three papers exploring supply chain accountability in AI systems, methods for assessing risks of AI systems, and methods for government monitoring of AI developments. DCMS2 supplement (not yet delivered): two reports, one on comparative approaches to regulation and one surveying evaluations of foundation models. Main BRAID: to follow. |
| Start Year | 2022 |
| Description | Alan Turing Institute: AI and the Arts Working Group |
| Organisation | Alan Turing Institute |
| Country | United Kingdom |
| Sector | Academic/University |
| PI Contribution | Contributions made via Science Gallery London events https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-curatorial https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-artists |
| Collaborator Contribution | Contributions made via Science Gallery London events https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-curatorial https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-artists |
| Impact | Contributions made via Science Gallery London events https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-curatorial https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-artists |
| Start Year | 2023 |
| Description | BBC |
| Organisation | British Broadcasting Corporation (BBC) |
| Department | BBC Research & Development |
| Country | United Kingdom |
| Sector | Public |
| PI Contribution | BRAID is led by the University of Edinburgh in partnership with the Ada Lovelace Institute and the BBC. BRAID covers 0.2FTE salary for Rhianne Jones, BBC Responsible Data Driven Innovation lead. The BRAID grant also covers the cost of a BBC Translational Fellow, Bronwyn Jones, hosted (employed) by UoE. Partners work in collaboration to further the goals of the BRAID programme. |
| Collaborator Contribution | BRAID is led by the University of Edinburgh in partnership with the Ada Lovelace Institute and the BBC. The BBC is the impartial, independent and trusted public service broadcasting organisation in the UK and, with ALI, represents the public/media engagement arm of the project. Rhianne Jones, their Responsible Data Driven Innovation lead, commits 0.2 FTE to the project and works closely with Bronwyn Jones, the BRAID Translational Fellow. Through the BBC, the BRAID project accesses networks, partnerships, and consortia (e.g., AI4ME, AI Media & Democracy Lab, Centre for Digital Citizens, TAS, Digital Horizon, Digital Futures Commission, Partnership on AI, Public Spaces) in order to connect with existing rights- and citizen-focused communities. They also support access to channels for public engagement, for example the BBC Blue Room, which has a remit for public engagement around technology and organises public events, conferences, demonstrations and exhibitions, and other BBC media literacy and outreach initiatives. They provide a second neutral platform, alongside ALI, by offering BBC blogs and social media channels as platforms for the project, and give the project team and ALI access to their buildings and facilities. The BBC hosted and co-organised BRAID's September 2023 community launch event in the BBC Radio Theatre at Broadcasting House in London; they provided staffing support on the night and in the months prior to the launch, organised the majority of logistics, coordinated with suppliers, and produced a blog post with highlights from the event. In addition, in August 2023 the BBC co-convened with BRAID researchers the public workshop Making and Deepfaking the News. |
| Impact | Listed under engagements/publications: - BRAID community launch event - BBC blog post - Making and Deepfaking the News Workshop - research brief by Rhia Jones, Bronwyn Jones and Ewa Luger |
| Start Year | 2022 |
| Description | DCMS |
| Organisation | Department for Digital, Culture, Media & Sport |
| Country | United Kingdom |
| Sector | Public |
| PI Contribution | DCMS1: Policy Fellows - Six policy fellows completed individual reports for DCMS on various areas of Responsible AI - thematic deep dives aligned with areas that DCMS had identified as priorities leading into the AI Innovation white paper. In April 2023, the policy fellows travelled with the BRAID co-Directors to the Home Office to present their research to a group of UK civil servants and policymakers from DCMS and DSIT, on the topics of Responsible AI governance, Explainable AI, emotion detection/affective computing for mental health, Generative AI risks, and the rights of platform workers. DCMS2: BRAID hosted a Generative AI policy summit with DCMS staff on 4 Dec 2023 in Edinburgh that gathered researchers from across the UK. A UoE fellowship is also underway. |
| Collaborator Contribution | DCMS funded £300,000 for the programme in stage one; £150,000 supported six policy fellows at the University of Edinburgh to complete reports on various areas of Responsible AI identified as priorities for the UK government and policymakers between January and March 2023. The remaining £150,000 funded another set of policy reports from researchers at our BRAID delivery partner, the Ada Lovelace Institute. DCMS hosted the University of Edinburgh and ALI researchers at the Home Office on 19 April to meet with DCMS/DSIT staff and present their research. A second round of £267,327 in DCMS funding for a new set of policy fellowships was approved in September 2023, for work to conclude in March 2024. |
| Impact | - Six research reports for DCMS produced in March 2023 from Responsible AI policy fellows at the University of Edinburgh; a further set of reports was delivered to DCMS by our partner the Ada Lovelace Institute from their portion of the funding. - A white paper is being produced from the findings of the 4 Dec policy summit. - The work by Round 1 Policy Fellow Dr. Kasirzadeh led to the following publication: Bird C, Ungless E, Kasirzadeh A. (2023). Typology of Risks of Generative Text-to-Image Models. The fellows' work and wider collaboration was multidisciplinary, drawing from Philosophy; Law; Informatics; Science, Technology and Innovation Studies; and Design/HCI. The second round of activity is underway as of May 2024, with policy reports to follow. |
| Start Year | 2022 |
| Description | Design and Artists' Copyright Society (DACS) |
| Organisation | DACS |
| Country | United Kingdom |
| Sector | Charity/Non Profit |
| PI Contribution | BRAID invited Design and Artists' Copyright Society (DACS) to participate in a BRAID event under the AI for Inspired Innovation Theme, where they presented the results of their recent survey on AI and artists' work. Co-I Beverley Hood was a speaker at DACS's 40th Anniversary event 'Art & Tech: Protecting artists and enabling creativity' 25th June 2024. |
| Collaborator Contribution | Partner organisation spoke at Science Gallery London events. Launched the DACS AI and the Arts survey of 1,000 artists across UK, at the Science Gallery London event, which will feed back to government and policy conversations. |
| Impact | Improved our overall profile and reach by making BRAID events part of a larger body of AI and the Arts artist-representation and policy work (the Co-I's work features in the report). |
| Start Year | 2023 |
| Description | Future Everything |
| Organisation | FutureEverything |
| Country | United Kingdom |
| Sector | Charity/Non Profit |
| PI Contribution | Future Everything partnered with BRAID in the programming of AI and the Arts events at Science Gallery London. |
| Collaborator Contribution | Future Everything partnered with BRAID in the programming of AI and the Arts events at Science Gallery London. Irini Papadimitriou, Creative Director of FutureEverything spoke at the Curatorial event. |
| Impact | https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-curatorial https://zenodo.org/records/14710922 https://zenodo.org/records/14711009 https://zenodo.org/records/14710976 https://zenodo.org/records/14710566 Feedback questionnaire showed impact of events: Enhanced networks, increased confidence in speaking about RAI |
| Start Year | 2023 |
| Description | King's College London (Science Gallery London) |
| Organisation | King's College London |
| Department | Science Gallery London |
| Country | United Kingdom |
| Sector | Academic/University |
| PI Contribution | BRAID worked together with Science Gallery London (SGL) to deliver two events as part of the AI for Inspiring Innovation theme. BRAID led on programming, provided leadership and budget. |
| Collaborator Contribution | SGL provided local coordination, venue, and improved our overall profile and reach by allowing our events to be part of a larger AI season of events. |
| Impact | https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-curatorial https://london.sciencegallery.com/sgl-events/ai-and-the-arts-whos-responsible-artists https://zenodo.org/records/14710922 https://zenodo.org/records/14711009 https://zenodo.org/records/14710976 https://zenodo.org/records/14710566 Feedback questionnaire showed impact of events: Enhanced networks, increased confidence in speaking about RAI |
| Start Year | 2023 |
| Description | Microsoft Research |
| Organisation | Microsoft Research |
| Country | Global |
| Sector | Private |
| PI Contribution | The purpose is to deliver a successful 12-18 month collaborative research fellowship embedding arts and humanities research in the field of responsible AI in order to drive innovation. |
| Collaborator Contribution | Partner organisation participated in the funding call for our first round of fellowships by setting a research challenge, meeting with candidates to discuss proposals, and contributing to the assessment process. |
| Impact | Creation of a funding call with a wide variety of challenge-questions and stakeholders designed to foster successful fellowship projects embedding arts and humanities research in a variety of settings across the responsible AI ecosystem. |
| Start Year | 2023 |
| Description | National Galleries of Scotland |
| Organisation | National Galleries of Scotland |
| Country | United Kingdom |
| Sector | Public |
| PI Contribution | The purpose is to deliver a successful 12-18 month collaborative research fellowship embedding arts and humanities research in the field of responsible AI in order to drive innovation. |
| Collaborator Contribution | Partner organisation participated in the funding call for our first round of fellowships by setting a research challenge, meeting with candidates to discuss proposals, and contributing to the assessment process. |
| Impact | Creation of a funding call with a wide variety of challenge-questions and stakeholders designed to foster successful fellowship projects embedding arts and humanities research in a variety of settings across the responsible AI ecosystem. |
| Start Year | 2023 |
| Description | Runnymede Trust |
| Organisation | Runnymede Trust |
| Country | United Kingdom |
| Sector | Charity/Non Profit |
| PI Contribution | This relationship is at an early stage; we anticipate it will inform the BRAID Programme's work, in particular our AI for Equitable Innovation theme, with each partner feeding into the other's work. |
| Collaborator Contribution | This relationship is at an early stage; we anticipate it will inform the BRAID Programme's work, in particular our AI for Equitable Innovation theme, with each partner feeding into the other's work. |
| Impact | Too early to report. |
| Start Year | 2023 |
| Description | Serpentine Gallery |
| Organisation | Serpentine Gallery |
| Country | United Kingdom |
| Sector | Charity/Non Profit |
| PI Contribution | BRAID invited Serpentine Art & Technology to participate in a BRAID event under the AI for Inspired Innovation Theme, where they presented the results of their recent activities around AI and the arts. Co-I Beverley Hood was a contributor to the Serpentine's Future Art Ecosystems Volume 4 Art x Public AI (2024) |
| Collaborator Contribution | Partner organisation spoke at Science Gallery London events. |
| Impact | https://reader.futureartecosystems.org/briefing/fae4 |
| Start Year | 2024 |
| Description | Somerset House Studios |
| Organisation | Somerset House |
| Country | United Kingdom |
| Sector | Charity/Non Profit |
| PI Contribution | BRAID invited Somerset House Studios to participate in a BRAID event under the AI for Inspired Innovation Theme, where they presented the results of their recent activities around AI and the arts. |
| Collaborator Contribution | Partner organisation spoke at Science Gallery London events. |
| Impact | https://zenodo.org/records/14710922 |
| Start Year | 2024 |
| Description | 13th June '23: Advances in Data Science and AI Conference (IDSAI'23) - Keynote 'Bridging Responsible AI Divides: Why the Arts, Humanities and Social Sciences are Critical to developing a Healthy AI Ecosystem' (Luger) |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | Invited talk. The Advances in Data Science and AI (ADSAI) Conference is Manchester's annual data science and AI conference; it brought together researchers from across the broad fields within data science and artificial intelligence to explore the latest advances and to discuss important topical questions being posed by society around the use of data science and AI. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.idsai.manchester.ac.uk/connect/events/conference/advances-in-data-science-and-ai-confere... |
| Description | AI & Arts event at Science Gallery London |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | BRAID, represented by Beverley Hood, hosted two events on 18 and 19 January at Science Gallery London, drawing together artists from across the UK for discussions on the challenges and opportunities posed by AI. This event resulted in ecosystem building, critical debate within community, and networking. |
| Year(s) Of Engagement Activity | 2024 |
| Description | AUSTE SIMKUTE - CHI24, Auste Simkute & Microsoft - Getting Back Together: HCI and Human Factors Joining Forces to Meet the AI Interaction Challenge |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | A hybrid workshop hosted at the 2024 ACM Conference on Human Factors in Computing Systems. Abstract: "The Human Factors Society and ACM SIGCHI jointly organized the first CHI conference in 1983, but during the remainder of the 1980s, Human-Computer Interaction (HCI) and Human Factors Engineering (HFE) increasingly diverged. The focus of HCI shifted from exploring systems for routinized activities of trained personnel, to a more general use of technology. HCI became predominantly design-oriented, concentrating on usability and user experience, moving further from HFE principles. However, the rapid growth of Artificial Intelligence (AI) applications posed unique and urgent challenges that call for a reestablishment of the connection between the two disciplines. We argue that by working as a team, HCI and HFE could more effectively address AI-posed challenges. We invite HCI and HFE researchers to take part in a full-day interactive hybrid workshop. With this workshop, we aim to initiate a collaboration between HCI and HFE and set a clear plan forward for a more united future." Ref: CHI EA '24: Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, Article No. 472, Pages 1-5, https://doi.org/10.1145/3613905.363628 |
| Year(s) Of Engagement Activity | 2024 |
| URL | https://hcihfetogether.wordpress.com/ |
| Description | Auste Simkute - CHI24, Special Interest Group accepted- Implications of Regulations on the Use of AI and Generative AI for Human-Centered Responsible Artificial Intelligence |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Professional Practitioners |
| Results and Impact | 26 Oct 23: TAS Workshop with Regulators on TAS Making Systems Answer project and regulation and AI, Edinburgh, 13 participants (Sethi) |
| Year(s) Of Engagement Activity | 2024 |
| Description | BBC R&D blog on BRAID Launch |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Public/other audiences |
| Results and Impact | A round-up of the September 2023 BRAID launch event by BRAID Translational Fellow Bronwyn Jones and BBC partner researcher Rhia Jones, summarising the most important takeaways from the event. It aimed to drive interest in the BRAID project and encourage readers to sign up for our newsletter. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.bbc.co.uk/rd/blog/2023-10-responsible-ai-trust-policy-ethics |
| Description | BBC Science Focus article on AI Risk |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Public/other audiences |
| Results and Impact | PI Vallor published 'AI becoming sentient is risky, but that's not the big threat. Here's what is' on 12 Aug 2023 in BBC Science Focus. This contribution addressed emerging concerns about AI safety from a humanities-driven perspective. It resulted in emails from the public noting the value of the insight, and was picked up and republished by Skyways flight magazine in South Africa. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.sciencefocus.com/future-technology/will-ai-make-humans-dumber |
| Description | BRAID Blog Post: "How the Arts and Humanities are Crucial to Responsible AI" |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Other audiences |
| Results and Impact | On 15 June 23 BRAID Co-Directors Vallor and Luger published a blog post on the AHRC website announcing the BRAID programme and laying out its rationale and intended impact. It resulted in a surge of signups to the BRAID newsletter and interest in the programme's September launch event. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.ukri.org/blog/how-the-arts-and-humanities-are-crucial-to-responsible-ai/ |
| Description | BRAID Community Launch Event |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Professional Practitioners |
| Results and Impact | The BRAID community launch event at the BBC Radio Theatre on 15 Sept 2023 had 273 participants, 178 in person, and 95 online. A variety of high-profile panellists and speakers, including a keynote from Rumman Chowdhury, sparked discussions amongst participants and clearly defined BRAID's trajectory as a project. The event was followed by a networking reception featuring 6 artist performances and displays engaging Responsible AI themes. In a survey post-event, 94% of respondents said it met or exceeded their expectations and 85% got value from the panels and keynote speaker. This resulted in a surge of social media engagement and newsletter sign ups. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.bbc.co.uk/programmes/p0gq60y1 |
| Description | BRAID Instagram page |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Public/other audiences |
| Results and Impact | BRAID started an Instagram page in December 2023. Since then, we have grown to 105 followers and a 22.27% engagement rate as of 13/03/2024. The page is used to communicate our opportunities, events, news, and website, and to engage with members of our community. |
| Year(s) Of Engagement Activity | 2023,2024 |
| URL | https://www.instagram.com/braid__uk/ |
| Description | BRAID LinkedIn page |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Public/other audiences |
| Results and Impact | Since being established in September 2023, the BRAID LinkedIn page has amassed 677 followers as of 13/03/2024 and has a 22.34% engagement rate in this quarter to date. The page is used to promote opportunities, events, news, research outputs, and network with our community. |
| Year(s) Of Engagement Activity | 2023,2024 |
| URL | https://www.linkedin.com/company/98665671/ |
| Description | BRAID Resilience Workshop |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Study participants or study members |
| Results and Impact | 17 Apr 2024: BRAID Resilience Workshop at the Ada Lovelace Institute; workshop report forthcoming; ~20 participants |
| Year(s) Of Engagement Activity | 2024 |
| Description | BRAID X page |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Public/other audiences |
| Results and Impact | We have a BRAID X page which as of 13/03/2024 has 685 followers and a 4.67% engagement rate. Our social media hosting platform, Hootsuite, quotes a 1% to 5% engagement rate as excellent, so this is a very positive sign. We have reached a wide, multi-stakeholder audience and use the channel to promote our website, events, news, and research outputs. |
| Year(s) Of Engagement Activity | 2023,2024 |
| URL | https://twitter.com/Braid_UK |
| Description | BRAID blog on UK AI Safety Summit |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Public/other audiences |
| Results and Impact | The 13 October 2023 blog post, 'A shrinking path to safety: how a narrowly technical approach to align AI with the public good could fail', authored by PI Vallor and co-Director Luger, was a response to the announcement of the UK AI Safety Summit and its priorities. The blog is a call for the arts and humanities to be valued in such discussions just as much as technical expertise. The blog post sparked policymaker and public discussion of this point at a panel with PI Vallor, Octavia Reeve of the Ada Lovelace Institute (BRAID partner), and representatives of DSIT at the AI Fringe Summit on 31 October, leading up to the main Summit on 1-2 November. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://braiduk.org/a-shrinking-path-to-safety-how-a-narrowly-technical-approach-to-align-ai-with-th... |
| Description | BRAID/ALI/DSTL Workshop on Foundation Models in Defence |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Other audiences |
| Results and Impact | This meeting, hosted by PI Vallor and Co-I Nehal Bhuta and convened with key institutional partners, brought together national policy makers from the defence sector with academics and technologists to review and discuss two draft reports from the Ada Lovelace Institute concerning possible public sector uses of generative/foundation models. The meeting was attended by high-level policy makers in CDEI and Defence, and also involved international participation to lend a comparative legal perspective. The participants drew lessons from the case studies presented by the Ada Lovelace Institute and reflected on their transposability to defence contexts. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Baillie Gifford European Media Forum on AI in Edinburgh |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Professional Practitioners |
| Results and Impact | PI Vallor participated in an expert panel on AI for a group of European media leaders, hosted by Baillie Gifford. Topics discussed included work drawn from BRAID on AI in media, the impact on journalism, and wider public understanding of AI. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Blog Post: Edinburgh Declaration on Responsibility for Responsible AI |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Public/other audiences |
| Results and Impact | Following a joint summer workshop hosted at the University of Edinburgh, bringing together the 4 TAS Responsibility projects, and joined also by members of the TAS Governance and Regulation node, PI Vallor led a joint publication of a Medium blog post on 14 July intended to translate the consensus outcomes of the workshop for a wider audience. The post was signed by 24 members of the TAS programme, widely shared on social media and referenced on stage at the TAS Symposium. It also led to plans for future work on these themes by members of the responsibility projects, and a forthcoming public panel event on 27 March at the University of Edinburgh focused on the Declaration. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://medium.com/@svallor_10030/edinburgh-declaration-on-responsibility-for-responsible-ai-1a98ed2... |
| Description | Blog engagement |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Public/other audiences |
| Results and Impact | BRAID blog engagement sessions from October to December 2023: 5515 page sessions |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://braiduk.org/blog |
| Description | Blog post 'Edinburgh Declaration on Responsibility for Responsible AI' |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Professional Practitioners |
| Results and Impact | This public-facing blog post was the output of a workshop that brought together interdisciplinary research teams from the 4 TAS Responsibility projects as well as BRAID and the TAS Governance project. It was intended as a provocation to the wider Responsible AI community, with specific recommendations for reform. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://medium.com/@svallor_10030/edinburgh-declaration-on-responsibility-for-responsible-ai-1a98ed2... |
| Description | British Science Festival Presidential address |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Public/other audiences |
| Results and Impact | BRAID Co-I Vaishak Belle delivered the presidential address, 'Risky Robots', on the theme of AI harms and risks, and discussed the impact on the arts. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://britishsciencefestival.org/event/risky-robots/ |
| Description | CitAI (University of London) How to fund AI Art |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | CitAI (University of London) organised an AI workshop on 3 October 2023, where scholars and members of the AI art world explored novel ideas for funding AI art in the UK. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://cit-ai.net/events.html |
| Description | Co-organised "No Harm Done" series event (an Ethics, AI & Design collaboration with an SME) |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Public/other audiences |
| Results and Impact | Bronwyn Jones organised an event where three experts from the worlds of academia, design and business gave lightning talks examining the ethical and design underpinnings of AI, followed by an open discussion. The aim was to increase understanding of transparency and accountability in AI decision-making. |
| Year(s) Of Engagement Activity | 2023 |
| Description | DCMS Generative AI Policy Summit |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Policymakers/politicians |
| Results and Impact | As part of our partnership with DCMS, BRAID organised a policy event with a variety of experts from many industries to draw together a white paper for DCMS. The following aim was given to participants: The purpose of this event is to draw a small group of experts together, to inform DCMS thinking around Generative Artificial Intelligence (GAI). We want this to be a collegiate and enjoyable event with space for colleagues to interact and so, rather than the traditional speaker model, we will not ask you to prepare anything in advance. Instead, the event will be highly interactive and design-led, to facilitate discussions and draw us towards a joint set of policy recommendations. The white paper is drafted and is currently undergoing review before the final copy is submitted to DCMS and published on the BRAID website. |
| Year(s) Of Engagement Activity | 2023 |
| Description | DSIT Roundtable on Human-Machine Interaction |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Policymakers/politicians |
| Results and Impact | On 7 Mar 2024 PI Vallor joined a roundtable on Human-Machine Interaction hosted at DSIT, in which DSIT sought expert advice from 15 academic, industry and third sector experts across the disciplines on the UK government response to new and emerging HMI risks, especially from generative AI. |
| Year(s) Of Engagement Activity | 2024 |
| Description | Delivered BBC Academy career training: AI for creative practitioners |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | Delivered a capacity building and training session with about 30 creative media practitioners from independent production companies across Wales. Attendees were highly engaged and gave verbal feedback indicating increased understanding of ethical and practical concerns related to AI and how to approach engaging responsibly with AI in their practice. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Digital Scotland panel on AI and Government Services |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | In this 21 Nov 2023 Digital Scotland expert panel on 'The emerging role of AI in government services', in conversation with representatives from Scottish government and Leidos, PI Vallor discussed the challenges of answerability to citizens and publics as part of the challenge of responsible use of AI in the public sector, drawing on her and her teams' work within BRAID and the TAS Responsibility projects. This was followed by several requests (verbal and later by email) by audience members in the public sector and industry, for future conversations and joining of the BRAID network. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://futurescot.com/futurescot-conferences/digitalscotland2023/ |
| Description | ECA Y7 Film screening |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | Local |
| Primary Audience | Other audiences |
| Results and Impact | Premiere of an AI-generated feature film by a Salford-based artist duo. Beverley Hood chaired the panel. |
| Year(s) Of Engagement Activity | 2023 |
| Description | European Computer Science Summit |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Professional Practitioners |
| Results and Impact | Ewa Luger hosted the panel 'How do we ensure that informatics shapes the future that we hope for?' and raised BRAID brand awareness. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Ewa Luger - 15th May '23: Women in Data Science (WiDS) Edinburgh, 'Establishing a Career in Human-Data Interaction' |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Postgraduate students |
| Results and Impact | Audience of women in data science, mostly early career researchers. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Ewa Luger - 24th April '23: Special Interest Group organised with Microsoft: 'Foundation Models in Healthcare: Opportunities, Risks & Strategies Forward' |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Professional Practitioners |
| Results and Impact | Publication abstract: Foundation models (FMs) are a new paradigm in AI. First pretrained on broad data at immense scale and subsequently adapted to more specific tasks, they achieve high performance and unlock powerful new capabilities to be leveraged in many domains, including healthcare. This SIG will bring together researchers and practitioners within the CHI community interested in such emerging technology and healthcare. Drawing attention to the rapid evolution of these models and proposals for their widespread adoption, we aim to demonstrate their strengths whilst simultaneously highlighting deficiencies and limitations that give rise to ethical and societal concerns. In particular, we will invite the community to actively debate how the field of HCI - with its research frameworks and methods - can help address some of these existing challenges and mitigate risks to ensure the safe and ethical use of the end-product; a requirement to realize many of the ambitious visions for how these models can positively transform healthcare delivery. This conversation will benefit from a diversity of voices, critical perspectives, and open debate, which are necessary to bring about the right norms and best practices, and to identify a path forward in devising responsible approaches to future FM design and use in healthcare. Approx. 50 participants (CHI attendees from academia and industry). |
| Year(s) Of Engagement Activity | 2023 |
| Description | Ewa Luger - 25th April '23: Special Interest Group organised with Nokia Bell Labs 'Human-Centered Responsible Artificial Intelligence: Current & Future Trends' |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Professional Practitioners |
| Results and Impact | Publication abstract: In recent years, the CHI community has seen significant growth in research on Human-Centered Responsible Artificial Intelligence. While different research communities may use different terminology to discuss similar topics, all of this work is ultimately aimed at developing AI that benefits humanity while being grounded in human rights and ethics, and reducing the potential harms of AI. In this special interest group, we aim to bring together researchers from academia and industry interested in these topics to map current and future research trends to advance this important area of research by fostering collaboration and sharing ideas. 50 participants approx (attendees of CHI, academics and industry) |
| Year(s) Of Engagement Activity | 2023 |
| Description | Ewa Luger - 31st May: Artificial Intelligence and Democracy, Scottish Parliament Podcast |
| Form Of Engagement Activity | A broadcast e.g. TV/radio/film/podcast (other than news/press) |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Policymakers/politicians |
| Results and Impact | With developments in artificial intelligence (AI) speeding up, what effect will this technology have on our democracy? What do we mean when we talk about artificial intelligence? What are the opportunities and what are the dangers? And how, as citizens and as a democracy, should we react? This Scotland's Futures Forum podcast, recorded in May 2023, explored the potential impact of AI on our lives and the implications we should consider. Featuring expert contributions from Professor Ewa Luger, Professor of Human-Data Interaction at the University of Edinburgh, and Richard Leonard MSP, convener of the Scottish Parliament's Public Audit Committee. https://futuresforum.scot/artificial-intelligence-and-democracy/ |
| Year(s) Of Engagement Activity | 2023 |
| Description | Ewa Luger - DCMS Policy Summit event |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | Hybrid event (London): an opportunity for DCMS phase 1 policy fellows to present their work to policy makers and receive feedback. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Ewa Luger/Gavin Leuzzi - 04 Oct 23: Fellowships briefing webinar |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | Online briefing about the BRAID fellowships including a talk and Q&A. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Ewa Luger/Gavin Leuzzi - 13 Oct 23: Fellowships Information Evening at Ada Lovelace Institute |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | In-person briefing about the BRAID fellowships including a talk and Q&A. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Expert Panel at NYU-KAIST AI Global Governance Summit in New York City |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Public/other audiences |
| Results and Impact | Following the announcement of a partnership between NYU and the Republic of Korea, attended by the President of Korea, the NYU President, and NYU/Meta's Yann LeCun, the event hosted a wide-ranging panel discussion about AI and responsible digital governance by prominent international scholars in the field including PI Vallor. The other panellists included: Professor Kyung-hyun Cho, Deputy Director for NYU Center for Data Science & Courant Institute; Professor Luciano Floridi, Founding Director of the Digital Ethics Center, Yale University; Professor Urs Gasser, Rector of the Hochschule für Politik, Technical University of Munich; Professor Stefaan Verhulst, Co-Founder & Director of GovLab's Data Program, NYU Tandon School of Engineering; and Professor Jong Chul Ye, Director of Promotion Council for Digital Health, KAIST. The event led to conversations about future collaborations between BRAID and the new NYU-KAIST partnership. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.nyu.edu/about/news-publications/news/2023/september/nyu-and-kaist-launch-major-new-initi... |
| Description | Expert Panel at TAS Symposium 2023, Edinburgh |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | PI Vallor spoke on expert panels on Responsible Autonomous Systems and on AI Policy and Regulation, to an audience of industry, government and academic partners, drawing on both research within BRAID and TAS projects. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://tas.ac.uk/bigeventscpt/first-international-symposium-on-trustworthy-autonomous-systems/ |
| Description | Fellowships Information Evening at Ada Lovelace London |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Other audiences |
| Results and Impact | In-person fellowships information evening that also gave participants a chance to network. Participants were able to clarify their ideas about applying to the BRAID fellowships programme and ask questions. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Fellowships briefing webinar |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Other audiences |
| Results and Impact | Briefing session for academics potentially interested in BRAID fellowships. Information given on projects available, BRAID mission, scope, funding, and more. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Google's Mediterranean ML Summer School in Thessaloniki |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Postgraduate students |
| Results and Impact | PI Vallor gave a keynote on Responsible AI to machine learning students and early career researchers in the Mediterranean, drawing from research on AI and responsibility in both BRAID and TAS. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.m2lschool.org/past-editions/m2l-2023-greece |
| Description | International Law Association, Paris: Convened panel on new means and methods of warfare with a focus on AI |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Professional Practitioners |
| Results and Impact | This panel was convened at the invitation of the International Law Association's organizing committee. The event brought together experts in law, military affairs, and AI to discuss the current state of AI ethics in the military domain and its relationship with international law. The panel was very well attended (over 60 people) by international lawyers and foreign ministry officials from around the world. The impact of the event was to raise understanding among this group of the current state of the debate about responsible AI in the military domain, and to highlight the need for normative development and practical measures to regulate military uses. The event also generated awareness of BRAID and the programme's relevance. Attended by Nehal Bhuta. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Interview with Techcrunch |
| Form Of Engagement Activity | A magazine, newsletter or online publication |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Public/other audiences |
| Results and Impact | 20 Apr 2024: Interview with TechCrunch as part of the Women in AI series: https://techcrunch.com/2024/04/20/women-in-ai-ewa-luger-studies-how-ai-impacts-culture-and-vice-versa/ |
| Year(s) Of Engagement Activity | 2024 |
| Description | Keynote 'The AI Mirror' at Charlotte Ideas Festival |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Public/other audiences |
| Results and Impact | On 1 April, PI Vallor gave a keynote on AI at the Charlotte Ideas Festival in the USA, drawing on themes of responsibility and the importance of the arts and humanities in shaping our future with AI. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.charlotteshout.com/events/detail/shannon-vallor |
| Description | Keynote at Turing Fest 2023, Edinburgh |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Public/other audiences |
| Results and Impact | PI Vallor's keynote: 'Who is Responsible for Responsible AI? The Ecologies of a Responsible AI Ecosystem' drew from BRAID and TAS Responsibility research to help a broad audience understand the challenges and opportunities we face in building a Responsible AI Ecosystem in the UK. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://turingfest.com/speaker/shannon-vallor/ |
| Description | Living with AI Podcast on AI and Taking Responsibility |
| Form Of Engagement Activity | A broadcast e.g. TV/radio/film/podcast (other than news/press) |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Public/other audiences |
| Results and Impact | On 19 July 2023 PI Vallor joined two other PIs of TAS Responsibility projects to talk about the work, the meaning of responsible AI, and future plans on the Living with AI podcast. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://podcasts.apple.com/au/podcast/ai-taking-responsibility/id1538425599?i=1000621628112 |
| Description | Met new UK Minister for Arts and Heritage followed by Panel |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Policymakers/politicians |
| Results and Impact | 01 Nov 23: Ewa Luger met the new UK Minister for Arts and Heritage (Lord Parkinson), followed by a panel (under the Chatham House Rule) on the future impact of AI on the arts and cultural sector, organised by 'What Next?'. Attendees included the Mayor of London's office and Jane Bonham Carter, Liberal Democrat DCMS Spokesperson in the House of Lords. |
| Year(s) Of Engagement Activity | 2023 |
| Description | OxGen Generative AI Summit |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Other audiences |
| Results and Impact | Bronwyn Jones represented BRAID at the OxGen Generative AI Summit at Oxford University, raising awareness of the programme among multiple stakeholder groups. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Panel on Responsible AI at SAS Innovate |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Industry/Business |
| Results and Impact | On 8 June 2023 PI Vallor spoke on a panel on AI and Responsible Innovation with other Responsible AI experts at an industry event hosted at the Royal Institution by SAS, a partner on the TAS Responsibility project. One outcome of the event was the expression of interest from another panelist (Ray Eitel-Porter of Accenture) in getting involved with BRAID; he is now on the BRAID Advisory Board. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.sas.com/sas/events/innovate-on-tour/23/london.html |
| Description | Panel on the future impact of AI on the Arts and Cultural sector |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Policymakers/politicians |
| Results and Impact | Ewa Luger met the new UK Minister for Arts and Heritage (Lord Parkinson), followed by a panel (under the Chatham House Rule) on the future impact of AI on the arts and cultural sector, organised by 'What Next?'. Attendees included the Mayor of London's office and Jane Bonham Carter, Liberal Democrat DCMS Spokesperson in the House of Lords. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Participant in FacultyAI 'Intelligence Rising' wargames |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Industry/Business |
| Results and Impact | The AI company FacultyAI sponsored a set of 'wargames' in early October 2023 on AI risk that allowed the expert participants to explore various scenarios of geopolitical and technological risk development, and strategies for managing those risks, over a 5-10 year horizon; PI Vallor was one of the participants (on the team of players representing corporate interests) and discussed several pieces of BRAID and TAS Responsibility research that informed the strategic thinking. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Plenary address: International AI Cooperation and Governance Forum in Hong Kong |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Policymakers/politicians |
| Results and Impact | On 8 Dec 2023 PI Vallor gave a plenary invited talk on AI Governance at the forum, hosted by Tsinghua University and Hong Kong University of Science and Technology, and attended by approx 750 Hong Kong/Chinese government representatives, international policymakers, AI researchers, and students. The talk highlighted themes of care, responsibility and trust and the importance of integrating these into AI governance, and mentioned UK support for our research in this area through BRAID and TAS. Several plans for future conversation and collaboration resulted, including plans to co-author work on AI safety and risk governance with a Cambridge AI researcher. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://aicg2023.hkust.edu.hk/ |
| Description | Podcast for ABC Radio National (Australia): 'What is AI Doing to Our Humanity?' |
| Form Of Engagement Activity | A broadcast e.g. TV/radio/film/podcast (other than news/press) |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Public/other audiences |
| Results and Impact | I spoke about my book The AI Mirror and our work in BRAID for this Australian national news podcast. (28 minutes) I later received emails from listeners sharing how the episode informed and shaped their understanding of AI. |
| Year(s) Of Engagement Activity | 2024 |
| URL | https://www.abc.net.au/listen/programs/philosopherszone/shannon-vallor-what-is-ai-doing-to-our-human... |
| Description | Presentation on 'What is AI?' to Welsh Government |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Policymakers/politicians |
| Results and Impact | I was invited to give an informational presentation on AI, BRAID and Responsible AI to the Welsh Government, alongside a presentation from the ICO. The Chief Social Research Officer at the Welsh Government noted that the employees attending would have limited knowledge of AI and would therefore need a non-technical session covering: the differences between AI, machine learning and large language models, what they do and what they can be used for; and the risks to be aware of when making use of such tools, with particular reference to GDPR/data protection issues and how AI can amplify existing discrimination. Responses to the talk from the Welsh Government: "We feel the session went very well and it was very positively received by staff, who found it both informative and insightful." The acting Head of Welsh Affairs responded: "I thought Shannon's presentation was excellent and really informative (and a few of my team who attended said so too!). Am looking forward to working with organisations over the next few years to see what initiatives Wales can come up with." |
| Year(s) Of Engagement Activity | 2024 |
| Description | Presentation on AI Safety at AI Fringe Summit |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Policymakers/politicians |
| Results and Impact | At this expert panel at the AI Fringe Summit, on 'Defining AI Safety' with panelists from Ofcom, Mozilla, DSIT, and Ada Lovelace Institute, PI Vallor discussed the role of the arts and humanities in a full and capable understanding of AI safety, drawing on the history of technical risk management and failures in other industries of overly narrow technical approaches that ignore human and cultural factors in safety. This was linked explicitly to the BRAID programme and the 13 Oct BRAID blog post on this topic. Further discussions afterward with fellow DSIT panelist Emran Mian indicated shaping of DSIT thinking on the importance of a sociotechnical and multidisciplinary approach to AI safety, which led to an invitation to a DSIT Roundtable on human-AI interaction hosted by Emran Mian on 7 March 2024. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://aifringe.org/events/expanding-the-conversation-defining-ai-safety-in-practice |
| Description | Presentation on Responsible AI to Regional Enterprise Council |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | Regional |
| Primary Audience | Industry/Business |
| Results and Impact | A presentation on AI to Edinburgh/Scottish regional industry leaders, which received the following response from Neil McLean of the Social Enterprise Council: "I just wanted to write to you to say thank you so much for a fascinating and inspiring talk at the EFI tour on Friday morning. I am part of the Regional Enterprise Council (City Region Deal group) and found the whole day fascinating. As someone who spent 15 years working in the Technology industry (some of it in the US), I found your talk and some of the issues you touched upon, absolutely fascinating." |
| Year(s) Of Engagement Activity | 2024 |
| Description | Presentation to UK-US Forum on Science in the Age of AI, Royal Society London |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Professional Practitioners |
| Results and Impact | A presentation to leading science funding bodies and policymakers in the UK and US on how AI is changing and advancing science while challenging the integrity of scientific practice. |
| Year(s) Of Engagement Activity | 2024 |
| URL | https://royalsociety.org/news-resources/projects/us-uk-forum-2024/ |
| Description | Presented and facilitated hands-on generative AI session for the "No Harm Done" series |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | About 40 people attended the session, which raised the ethical issues associated with generative AI in particular industries (e.g. media and news publishing) and engaged participants in hands-on use of generative AI tools, including ChatGPT, DALL-E and Midjourney, as a way to identify opportunities for use and limitations. This sparked discussion and further engagement from participants and businesses in later events (i.e. people signing up for the series, sharing their details etc.). |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://nilehq.com/ethics-design-ai |
| Description | Public Event 'Who is Responsible for Responsible AI?' |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Public/other audiences |
| Results and Impact | This was a large public event held at and broadcast online from Edinburgh's Playfair Library: a fireside chat among experts involved in the TAS Responsibility projects' July 2023 'Edinburgh Declaration on Responsibility for Responsible AI'. |
| Year(s) Of Engagement Activity | 2024 |
| URL | https://efi.ed.ac.uk/event/technomoral-conversations-who-is-responsible-for-responsible-ai/ |
| Description | Public workshop - Making and Deepfaking the News |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | A two-hour participatory workshop that was open to the public and convened with a research partner (BBC), covering the challenges AI-generated content poses for journalism and our shared information ecosystem. The purpose was to a) educate and upskill the public and professionals from other industries on how AI (including generative AI and deepfakes) works and provide them with critical skills for assessing media content, and b) engage participants in deliberation of the ethical and normative questions around the use of AI in news production. Several educators contacted me afterwards saying the session was helpful for their teaching and asking us to share materials. There were many questions from attendees during and after the session and some posted on social media about the event. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://inspace.ed.ac.uk/the-sounds-of-deep-fake/ |
| Description | Publishing Scotland, recorded panel |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | Presentation and panel discussion about AI in publishing, including Q&A with attendees, intended to inform and build capacity amongst professionals. The event was recorded and uploaded to YouTube and, as of 2025, had 200 views. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.youtube.com/watch?v=PqcjiziZlsA |
| Description | Royal Institution Youth Summit on AI in London |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Schools |
| Results and Impact | On 18 Sept 2023 BRAID Co-Directors Shannon Vallor and Ewa Luger presented on BRAID and Responsible AI to the audience, made up largely of UK sixth form and college students brought together from around the country to hear from experts on AI and to discuss the topic of the upcoming Christmas Lectures. The event comprised a series of in-person talks by experts as well as guided discussions among the students, which Vallor and Luger helped facilitate. The aim of the Youth Summit was to gather young people's opinions, concerns and interests about AI, and support the next generation of AI and STEM students. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.rigb.org/whats-on/royal-institution-youth-summit-2023 |
| Description | Russell Group University Communications Professionals Conference |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | Presentation to a group of approx 150 media and communications professionals working across UK universities. The intended purpose was to help them understand and prepare for the changes generative AI would bring to their profession and working lives, and to take that knowledge back to their teams and workplaces to enact changes that support responsible use and integration of AI. This led to many questions and discussions during and after the session and, later, to further invitations to speak to other cohorts of professionals from the same industry; the organisers reported good feedback and shared further questions for us to answer over email. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Salzburg Conf for Young Analytic Philosophy: agency in artificial systems |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Postgraduate students |
| Results and Impact | Fabio Tollon represented BRAID at this conference, presenting a talk on his paper 'Artificial Systems: Entities or Agents?'. He also chaired other talks. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Scotsman DDI Conference, Edinburgh, panel on AI Past, Present, Future |
| Form Of Engagement Activity | A formal working group, expert panel or dialogue |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Industry/Business |
| Results and Impact | PI Vallor spoke on an AI expert panel on AI Past, Present and Future at the 2023 Scotsman Data Conference in Edinburgh on 27 Sept 2023, at the Royal College of Physicians of Edinburgh. The panel was then reported on by the Scotsman (links below). The conference led to several requests to learn more about BRAID and join our network. https://www.scotsman.com/business/data-conference-time-for-us-to-rethink-the-thinking-machine-4361064 https://www.scotsman.com/business/data-conference-data-driven-innovations-jarmo-eskelinen-on-a-mission-to-tackle-major-challenges-4361050 |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.nationalworldevents.com/sdc-2023/agenda/ |
| Description | Scottish AI Alliance blog post on BRAID Fellowships |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Industry/Business |
| Results and Impact | 1 Nov 2023: Blog post on the Scottish AI Alliance website about the BRAID Fellowships, written by Gavin Leuzzi to raise awareness of the fellowships and to encourage researchers to apply and industry partners to participate. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.scottishai.com/news/ai-arts-and-humanities-research-and-braid-fellowships |
| Description | Scottish AI Summit workshop: AI For the Next Generation |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | PI Vallor helped organise and host this workshop, designed to engage with participants to think collaboratively about how to operationalise a high-level vision for AI that can truly serve a broad and diverse Scottish public. We built upon a series of prior workshops (funded by the Alan Turing Institute, and led by BRAID Policy Fellow Atoosa Kasirzadeh) that centered diverse and underrepresented perspectives, drawing also from BRAID's Equitable Innovation theme, and aimed at creating a shared vision for AI futures for the next generation. We solicited further input at the Summit that shaped the final report. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.scottishaisummit.com/ai-for-the-next-generation-realizing-an-inclusive-vision-for-scotti... |
| Description | Shannon Vallor - 9th March '23: Talk for Monash Prato Dialogues |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Professional Practitioners |
| Results and Impact | The Monash Data Futures Institute Prato Dialogue Distinguished Lecture Series aims to explore the evolving impact of data science and AI in society by fostering a global dialogue. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://www.youtube.com/watch?v=o66i5w1avcU |
| Description | Shannon Vallor - 14 Nov 23: Keynote at WASP-HS Conference on AI for Humanity |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Other audiences |
| Results and Impact | The Wallenberg AI, Autonomous Systems and Software Program - Humanities and Society invited me as keynote speaker at the WASP-HS annual conference AI for Humanity and Society 2023. WASP-HS welcomed over 200 researchers, representatives from industry, and policymakers to discuss these issues on 14-15 November at Malmö Live in Malmö, Sweden. Multiple further conversations and contacts have followed from this talk, including a potential collaboration with a Norwegian team on a proposed ERC grant. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://wasp-hs.org/shannon-vallor-is-keynote-speaker-at-ai-for-humanity-and-society-2023/ |
| Description | Shannon Vallor - 23rd May '23: Trustworthy Autonomous Systems podcast on Responsibility |
| Form Of Engagement Activity | A broadcast e.g. TV/radio/film/podcast (other than news/press) |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Professional Practitioners |
| Results and Impact | A Projects episode focusing on projects dedicated to researching the responsibility aspects of AI (Responsibility Projects, UKRI Trustworthy Autonomous Systems Hub, tas.ac.uk). The podcast was a conversation between Vallor, Lars Kunze and Ibrahim Habli, all PIs on the TAS programme. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://podcasts.apple.com/au/podcast/ai-taking-responsibility/id1538425599?i=1000621628112 |
| Description | Shannon Vallor - 30th May, 23: Podcast for National Technology News/Perspective Publishing, Ross Law |
| Form Of Engagement Activity | A broadcast e.g. TV/radio/film/podcast (other than news/press) |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Public/other audiences |
| Results and Impact | To discuss the future of AI and what steps can be taken to ensure it develops in a way that is responsible and supports human flourishing, National Technology News reporter Ross Law was joined by Shannon Vallor, co-director of the UKRI Arts and Humanities Research Council's BRAID (Bridging Responsible AI Divides) Programme and the Baillie Gifford Chair in the Ethics of Data and Artificial Intelligence at the Edinburgh Futures Institute (EFI) at the University of Edinburgh. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://nationaltechnology.co.uk/podcast-archives.php |
| Description | Shannon Vallor/Ewa Luger 19th April '23: Collaborative events with DCMS policymakers |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Policymakers/politicians |
| Results and Impact | BRAID researchers and policy fellows from BRAID and the Ada Lovelace Institute travelled to the Home Office to present their rapid policy reports on AI to policymakers at DCMS and the DSIT Office of AI; earlier versions had been shared ahead of the completion of the UK policy paper 'A pro-innovation approach to AI regulation'. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Somerset House Studios (SHS) & Serpentine events visit |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Third sector organisations |
| Results and Impact | Beverley Hood represented BRAID at Serpentine's event. Hood has contributed to Serpentine's annual Future Art Ecosystem publication; interview materials will be incorporated and acknowledged in the credits. |
| Year(s) Of Engagement Activity | 2024 |
| Description | Talk for the University of Copenhagen's 'Science of the Predicted Human' Series |
| Form Of Engagement Activity | A talk or presentation |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Postgraduate students |
| Results and Impact | On 17 April 2023 PI Vallor gave a lecture, 'The AI Mirror', to social data science, computer science and machine learning researchers and students at the University of Copenhagen's Copenhagen Center for Social Data Science, drawing from award research on Responsible AI as well as her forthcoming book. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://sodas.ku.dk/events/the-science-of-the-predicted-human-talk-series-professor-shannon-vallor/ |
| Description | The Good Robot Podcast |
| Form Of Engagement Activity | A broadcast e.g. TV/radio/film/podcast (other than news/press) |
| Part Of Official Scheme? | No |
| Geographic Reach | International |
| Primary Audience | Public/other audiences |
| Results and Impact | The Good Robot podcast is a popular venue for AI related conversations drawing from multiple disciplines; PI Vallor's episode was recorded in 2023 and published in March 2024, drawing on work in feminist ethics linked to the Equitable Innovation theme of BRAID. |
| Year(s) Of Engagement Activity | 2024 |
| URL | https://podcasts.apple.com/gb/podcast/shannon-vallor-on-feminist-care-ethics-techno-virtues/id157023... |
| Description | The National Technology News Podcast on The Future of AI |
| Form Of Engagement Activity | A broadcast e.g. TV/radio/film/podcast (other than news/press) |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Public/other audiences |
| Results and Impact | On 25 July 2023 this podcast interview of PI Vallor on The Future of AI was broadcast; here is the podcast description: "To discuss the future of AI and what steps can be taken to ensure it develops in a way that is responsible and supports human flourishing, National Technology News was joined by Shannon Vallor, co-director of the UKRI Arts and Humanities Research Council's BRAID (Bridging Responsible AI Divides) Programme and the Baillie Gifford Chair in the Ethics of Data and Artificial Intelligence at the Edinburgh Futures Institute (EFI) at the University of Edinburgh." |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://nationaltechnology.co.uk/podcast-archives.php |
| Description | The New Real - Generative AI workshop and panel |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | Regional |
| Primary Audience | Other audiences |
| Results and Impact | Event commissioned by Creative Scotland: a closed workshop for artists, followed by a public event at which Beverley Hood chaired a panel. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Trustworthy Assurance of Digital Health and Healthcare workshop |
| Form Of Engagement Activity | Participation in an activity, workshop or similar |
| Part Of Official Scheme? | No |
| Geographic Reach | Regional |
| Primary Audience | Other audiences |
| Results and Impact | Workshop on developing an assurance platform, with the following aims: development of the existing methodology and platform; improved impact; and user experience enhancements and validation. Nayha Sethi represented BRAID to further grow our reach into the creation of Responsible AI platforms as part of the greater UK ecosystem. |
| Year(s) Of Engagement Activity | 2023 |
| Description | Webstory: Edinburgh Innovations | How AI regulation could create innovation |
| Form Of Engagement Activity | Engagement focused website, blog or social media channel |
| Part Of Official Scheme? | No |
| Geographic Reach | National |
| Primary Audience | Professional Practitioners |
| Results and Impact | Publicity for the programme and fellowships funding call. |
| Year(s) Of Engagement Activity | 2023 |
| URL | https://edinburgh-innovations.ed.ac.uk/news/how-ai-regulation-could-create-innovation |
