DAY 1: Mon 27 Feb 2023 | DAY 2: Tue 28 Feb 2023 | DAY 3: Wed 1 Mar 2023 | DAY 4: Thu 2 Mar 2023 |
Public Opening Lectures 6pm - 8pm Great Hall, Wills Memorial Building Topic: AI support for conservation and natural history TV productions. |
Interactive AI 10am - 3pm QB 1.6, Queens Building + MS Teams Stream Topic: AI systems that capture biological knowledge to interpret the natural world. |
MyWorld 9:30am - 12:30pm Waterside 3 at Watershed Topic: AI & Nature - Image processing and movement for future immersive experiences. |
Visual AI for Life Sciences Talk 10am - 12noon Pugsley LT, Queens Building + MS Teams Stream Topic: Visual AI systems and animal biometrics. |
Wilddrone Talk 12:30pm - 2pm Pugsley LT, Queens Building + MS Teams Stream Topic: Drones for science and conservation. |
Online AI Tutorial 4pm - 5pm Public Online, MS Teams only Topic: Learn how to use and deploy online-accessible AI. |
AI & Farming - John Oldacre Centre Talks 2pm - 4pm Pugsley LT, Queens Building + MS Teams Stream Topic: AI for farming and animal welfare. |
AI meets Ecology Workshop 3pm - 4:30pm Public Online, MS Teams only Topic: AI support for ecology. |
Join us at the Great Hall! Come along to Wills Memorial Building on Monday 27 February 2023 between 6pm and 8pm for an evening of two fascinating lectures about how AI can assist wildlife conservation and wildlife documentary production. Robert Dawes from the BBC Research and Development Team and internationally renowned AI scientist and ecologist Prof Tanya Berger-Wolf from Ohio State University will take you on a journey to see how AI for Good can be a reality today. Reserve your free spot at these public lectures by BOOKING HERE. Full information about the talks and speakers can be found in the speaker section.
06:00pm-06:15pm | Arrival and Welcome | |
06:15pm-06:50pm | Robert Dawes (BBC): Keeping Watch: Using AI to assist Natural History Television Production Robert Dawes from BBC Research and Development will talk about his team's fascinating collaborations with the BBC's Natural History Unit - using computer vision and machine learning to help with the production of shows such as Springwatch. | |
06:50pm-07:50pm | Prof Tanya Berger-Wolf (Ohio): Partnership for Conservation: AI and Humans Combatting Extinction Together We are losing the planet's biodiversity at an unprecedented rate and in many cases, we do not even have the basic numbers. Photographs, taken by field scientists, tourists, citizen scientists, automated cameras, and incidental photographers, and collected from social media, are the most abundant source of data on wildlife today. Data science and machine learning can turn massive collections of images into a high-resolution information database about wildlife, enabling scientific inquiry, conservation, and policy decisions. Machine learning and artificial intelligence have advanced significantly over the past decade. Nonetheless, to successfully address the biodiversity crisis and other societal challenges, we need the complementary capabilities of both humans and machines, in partnership. I will show an example of how a data-driven, AI-enabled decision process becomes trustworthy by opening a wide diversity of opportunities for participation, supporting community-building, addressing the inherent data and computational biases, and providing transparent measures of performance. The community becomes the decision-maker, and AI scales the community, as well as the puzzle of data and solutions, to the planetary scale. | |
07:50pm-08:00pm | Closing Remarks |
10:00am-10:10am | Welcome and Scene Setting | |
10:10am-10:35am | Otto Brookes (UoB): Triple-stream Deep Metric Learning of Great Ape Behavioural Actions | |
10:35am-11:00am | Rachael Laidlaw (UoB): Reviewing MegaDetector: Applications and Future Prospects | |
11:00am-11:25am | Tayfun Karaderi (UoB): Visual Microfossil Identification via Deep Metric Learning (TBC) | |
11:25am-11:50am | Mowen Xue (UoB): Eagle Eyes: Detecting Tiny Animals from Aerial Images | |
11:50am-12:00pm | Chat and Reflect |
04:00pm-05:00pm | Dr Sasha Luccioni, Nima Boscarino (HuggingFace): Tutorial: Deploying AI for Nature Solutions Online |
09:30am-09:50am | Arrival and Refreshments | |
09:50am-10:00am | Welcome and Scene Setting, Prof David Bull | |
10:00am-10:30am | Dr Pui Anantrasirichai (UoB): AI-based Image Processing on Challenging Data of Nature Challenging acquisition environments cause poor visibility and exhibit distortions that deteriorate the performance of the tools we normally use for editing and for automation. This talk will present AI-based image processing techniques developed to overcome these problems. Application examples will include acquisition in low-light environments, mitigating the effects of atmospheric turbulence, and the use of satellite imaging to predict natural events. | |
10:30am-11:00am | Dr Martin Parsons (Bath): AI and Motion Capturing Animals | |
11:15am-12:00pm | Prof David Bull (UoB): OurWorld - A Vision for Future Interactive Zoo Experiences Driven by a passion for the natural world and the growing threat to, and importance of, biodiversity, this talk will explore ways we might exploit immersive experiences and AI tools to raise awareness of the impact of humans on the world around us. Bristol has a long-established place in the world as the ‘Hollywood’ of natural history film-making. This, coupled with our region’s capacity for digital innovation, its leadership in creativity and story-telling, and its appetite for radical social change, provides a foundation for the creation of new immersive experiences. The OurWorld concept emerged from a desire to regenerate the site of the first provincial zoological garden in the world, based on emerging technologies that enable visitors to travel in time and space and to interact in undreamt-of ways with the wildest and most secret aspects of the natural world; to understand for the first time where humankind really sits within the complex web of life on earth. | |
12:00pm-12:30pm | Chat and Reflect |
02:00pm-02:20pm | Syed Yunas (UWE): Early Detection of Stress in Pig Faces using Machine Vision to Reduce Anti-Microbial Use for Diseases | |
02:20pm-02:40pm | Jing Gao (UoB): Reducing the Individual Labelling Effort for Holstein-Friesian Cattle using Deep Learning | |
02:40pm-03:00pm | Andrew Dowsey (UoB): Artificial Intelligence Platforms for Monitoring Small Ruminant Health in Resource-Poor Settings | |
03:00pm-03:20pm | Asheesh Sharma (UoB): Fully Automated Welfare Assessments, a Deep Learning Approach | |
03:20pm-03:40pm | Kane Colston (UoB): Using Computer Vision Technology in the Remote Detection of Animal Heart Rates | |
03:40pm-04:00pm | Mubeen Tayyab (UWE): Machine Vision for Mealworm Farming |
10:00am-10:15am | Welcome and Scene Setting | |
10:15am-11:00am | Dr Tilo Burghardt (UoB): Vision for the Life Sciences 3.0 In 1952, Alan Turing published his last paper, on morphogenesis, attempting to explain aspects of life's emerging visual complexity by objective mathematical modelling. A first generation of 'mathematical biologists' then followed in his footsteps, casting many of the visual and non-visual patterns of life into constructive and even predictive equations. The discovery of the genetic origins of life's information apparatus then sparked a second revolution, founding disciplines such as 'bioinformatics' and 'DNA forensics'. Seventy years after Turing's paper, new tools have now reinvigorated this field again. This time, deep learning in particular is changing the domain by starting to replace manually constructed models with data-driven AI solutions. One may call this field 'computational biometrics' or, for the related domain of computer vision modelling the phenotype as discussed in the opening lectures by Tanya Berger-Wolf, 'imageomics'. This third revolution is still in progress and the limits of its reach are still to be determined. This talk will trace key directions of this third revolution in visual computational aspects of biology and ecology, viewed through the lens of computer vision. Various lines of work at the University of Bristol will serve as examples to show the diversity of the emerging interdisciplinary science, from computational taxonomics and automatic calorific expenditure estimation from video, to species and individual identification via free-flying autonomous drones. The talk will highlight key challenges today and call for a renewed focus on cross-disciplinary science. | |
11:00am-11:30am | Q&A, Chat and Reflect |
12:30pm-12:45pm | Welcome and Scene Setting, Dr Tilo Burghardt | |
12:45pm-01:30pm | Prof Tom Richardson (UoB): Drones for Conservation and Science Tom will be talking about his work with drones ranging from science applications through to conservation. From flight tests in Bristol through to deployments in Central America, Papua New Guinea and even underground, he will talk about recent projects and future research directions. | |
01:30pm-02:00pm | Q&A, Chat and Reflect |
03:00pm-03:15pm | Welcome, Dr Tilo Burghardt | |
03:15pm-03:45pm | Prof Benjamin Risse (Muenster): Is Science Mostly Driven by Ideas or by Tools: In-field insect monitoring, tracking and environmental mapping systems Starting from the perhaps provocative question of whether 'science is mostly driven by ideas or by tools' (Dyson 2012), I will present one of the most difficult computer vision and machine learning scenarios, namely quantifying insect behaviour in natural environments: tiny, low-contrast and fast-moving objects have to be localised in cluttered and dynamic scenes, resulting in visual ambiguities, frequent occlusions and the need to process huge video files with inappropriate foreground-background ratios and sparse visitation events. I will summarize some of the most prominent challenges in detecting insects and quantifying their behaviour in unconstrained settings, and will introduce an appearance-agnostic in-field animal detection and tracking algorithm to extract accurate measurements from camera footage. I will close by presenting complementary environmental mapping approaches that yield rich contextual information and, moreover, can be combined with behavioural data to achieve the urgently needed lab-level quantifications in wildlife conditions. | |
03:45pm-04:15pm | Prof Frank van Langevelde (WUR): LEARNING FROM ANIMAL MOVEMENT Understanding the movement of animals is key in ecology. Tracking animals using GPS and accelerometers yields understanding not only about the whereabouts of wildlife, but also about their conservation. I illustrate this by using understanding of predator-prey interactions to develop a system to prevent poaching. I will further explore possible alternatives to GPS for following wildlife. These examples show that AI is instrumental to developments in ecology. | |
04:15pm-04:30pm | Open Discussion |
Tanya Berger-Wolf | Robert Dawes | Benjamin Risse |
Ohio (USA) | BBC (UK) | Muenster (Germany) |
Frank van Langevelde | Tom Richardson | David Bull |
WUR (Netherlands) | Bristol (UK) | Bristol (UK) |
Pui Anantrasirichai | Martin Parsons | |
Bristol (UK) | Bath (UK) |
Public In-person Lecture | 27 Feb | 6pm | Great Hall Prof Tanya Berger-Wolf (Ohio) PARTNERSHIP FOR CONSERVATION: AI and Humans Combatting Extinction Together
Brief Bio of Prof Tanya Berger-Wolf. As one of the founders of the field of computational ecology, and recently imageomics, Tanya is leading research at the unique intersection of computer science, wildlife biology, and social sciences. She creates computational solutions to generate, collect, analyze and derive insight from data to answer questions and make decisions in ecology, environment and wildlife conservation. Tanya is a Professor of Computer Science and Engineering; Electrical and Computer Engineering; and Evolution, Ecology, and Organismal Biology at the Ohio State University, where she is also the Director of the Translational Data Analytics Institute. She is also a director and co-founder of the AI-for-conservation software non-profit Wild Me, home of the Wildbook project, which brings together computer vision, crowdsourcing, and conservation. It was recently chosen by UNESCO as one of the top 100 AI projects worldwide supporting the UN Sustainable Development Goals. Tanya is a co-founder of several big projects and fields of study. She works with many teams of brilliant people who do cool, amazing, and impactful things.
Public In-person Lecture | 27 Feb | 6pm | Great Hall Robert Dawes (BBC) KEEPING WATCH: Using AI to assist Natural History Television Production
Brief Bio of Robert Dawes (MEng). Robert is a Lead Research Engineer at BBC Research & Development. He studied computer science at the University of Bristol before joining BBC R&D in 2006. He worked on set-top box technologies and in the BBC iPlayer team before specialising in the application of image analysis, computer vision and tracking technologies to broadcasting. He currently leads the Visual Computing team, part of the Production Applied Research Area, which investigates and develops tools to process, analyse and understand video, normally in real time. The team works with broadcast companies and production teams to create practical tools to help them with their work. This has included several years of work on sports analysis and graphics systems. In recent years, Robert has worked in collaboration with the Natural History Unit to investigate new ways to monitor and capture wildlife footage. Much of this work takes advantage of the rapidly developing fields of artificial intelligence and machine learning.
Public Online Lecture | 2 Mar | 3pm | MS Teams Prof Benjamin Risse (Muenster) IS SCIENCE MOSTLY DRIVEN BY IDEAS OR BY TOOLS: In-field insect monitoring, tracking and environmental mapping systems Starting from the perhaps provocative question of whether 'science is mostly driven by ideas or by tools' (Dyson 2012), I will present one of the most difficult computer vision and machine learning scenarios, namely quantifying insect behaviour in natural environments: tiny, low-contrast and fast-moving objects have to be localised in cluttered and dynamic scenes, resulting in visual ambiguities, frequent occlusions and the need to process huge video files with inappropriate foreground-background ratios and sparse visitation events. I will summarize some of the most prominent challenges in detecting insects and quantifying their behaviour in unconstrained settings, and will introduce an appearance-agnostic in-field animal detection and tracking algorithm to extract accurate measurements from camera footage. I will close by presenting complementary environmental mapping approaches that yield rich contextual information and, moreover, can be combined with behavioural data to achieve the urgently needed lab-level quantifications in wildlife conditions.
Brief Bio of Prof Benjamin Risse. Benjamin Risse is the head of the Computer Vision & Machine Learning Systems group at the University of Münster. He is interested in interdisciplinary research questions involving the development of novel image analysis techniques, with a particular focus on imaging hardware and algorithms for quantitative animal behaviour and sustainable development. After receiving his PhD in 2015, which was at the intersection of computer vision, machine learning and neuroscience, he was a postdoc at the University of Edinburgh, where his research focussed on artificial intelligence, including insect tracking, habitat reconstruction, robotics and neurobiologically inspired algorithms. Since 2022 he has been a Professor for Sustainable Development at the Institute for Geoinformatics in Muenster, Germany.
In-person Lecture | 2 Mar | 12:30pm | Pugsley LT Prof Tom Richardson (University of Bristol) DRONES FOR CONSERVATION AND SCIENCE Tom will be talking about his work with drones, ranging from science applications through to conservation. From flight tests in Bristol through to deployments in Central America, Papua New Guinea and even underground, he will talk about recent projects and future research directions.
Brief Bio of Prof Tom Richardson. Tom Richardson is Professor of Aerial Robotics at the University of Bristol in the UK. Passionate about aerospace engineering, he specializes in the application of modern control theory and novel sensors to UAS/drones. He has been granted permission for BVLOS (Beyond Visual Line of Sight) operations in multiple countries and was the International Drone Safety Lead for the Multinational Deep Carbon Observatory (DCO) funded ABOVE field campaign to Papua New Guinea in 2019. Tom is a founding partner of Perceptual Robotics, has held an NPPL (pilot's licence) for over 15 years, and has worked with a range of industrial partners including DSTL, BAE Systems and Thales.
Public Online Lecture | 2 Mar | 3pm | MS Teams Prof Frank van Langevelde (WUR) LEARNING FROM ANIMAL MOVEMENT Understanding the movement of animals is key in ecology. Tracking animals using GPS and accelerometers yields understanding not only about the whereabouts of wildlife, but also about their conservation. I illustrate this by using understanding of predator-prey interactions to develop a system to prevent poaching. I will further explore possible alternatives to GPS for following wildlife. These examples show that AI is instrumental to developments in ecology.
Brief Bio of Prof Frank van Langevelde. Frank van Langevelde is professor and chair of the Wildlife Ecology and Conservation group at Wageningen University, The Netherlands. His research focuses broadly on the ecological and evolutionary adaptations of animals to stress. Environmental stresses can constrain movement and the search for resources. As a consequence, constrained movement can result in suboptimal choices in foraging and habitat selection. This is studied in a variety of systems, ranging from mammalian herbivores to insects.