MyWorld set to make South West a Digital Media Leader on Global Stage

Image credit: Nick Smith Photography

We are pleased to announce that a University of Bristol initiative, led by VI Lab’s Professor David Bull, has been awarded £30 million by the UK Research and Innovation (UKRI) Strength in Places Fund.

The South West is on track to become an international trailblazer in screen-based media thanks to £30 million in funding from UKRI, with a further £16 million coming from an alliance of more than 30 industry and academic partners joining forces in the five-year scheme, due to start by the end of the year. This will launch a creative media powerhouse called MyWorld and supercharge economic growth, generating more than 700 jobs.

It will forge dynamic collaborations between world-leading academic institutions and creative industries to progress research and innovation, creative excellence, inclusive cultures, and knowledge sharing.

Professor David Bull commented, “The South West is already a creative capital in the UK and MyWorld aims to position the region amongst the best in the world, driving inward investment, increasing productivity and delivering important employment and training opportunities.

“This is the beginning of an exciting journey, which will align research and development endeavours across technology and the creative arts, to help businesses realise their innovation potential, raise their international profile, and maximise the advantages of new technologies.”

The MyWorld Bristol team of investigators has representation from across Engineering, Psychology, Arts and Management, and includes: Professor Andrew Calway, Professor Dimitra Simeonidou, Professor Mary Luckhurst, Professor Iain Gilchrist, Professor Martin Parker and Professor Kirsten Cater.

The full press release is available on the University of Bristol news page.

For an insight into MyWorld and the Strength in Places Fund, watch this video.


The MultiDrone Consortium organized the first Drone Cinematography Workshop at the University of Bristol on 6 December 2017. The aim of the workshop was to bring together experts in drone cinematography – users, producers, and technologists – to explore the future potential for this exciting and growing area. The workshop included invited talks from specialist operators and producers, alongside research presentations on the rules, requirements, tools, potential and constraints of shooting with drones.

The workshop commenced with an overview of the EU MultiDrone project by Professor David Bull of the University of Bristol and Professor Ioannis Pitas of the University of Thessaloniki, who presented the aims, challenges and preliminary results of the project, focusing on cinematography aspects. This was followed by an illuminating talk, ‘Natural Highs – the use of drones in wildlife filmmaking’, by Colin Jackson from the BBC Natural History Unit in Bristol. Colin explained how drones have transformed wildlife filmmaking in terms of their ease of deployment, flexibility and potential to achieve shots that were previously not possible. He used examples from the BBC’s recent Blue Planet 2 series and outlined a number of operational constraints as well as opportunities for future use. The third talk, entitled ‘UAV shot type taxonomy’, was given by Ioannis Mademlis and Ioannis Pitas of the University of Thessaloniki. They explained the importance of a taxonomy for shot types and of appropriate mathematical representations of these. The morning session ended with a lively discussion, chaired by Nicolaus Heise of Deutsche Welle, which further explored the potential of drone and multi-drone cinematography.


After lunch, Dr Andrew Calway of the University of Bristol introduced the audience to his work on ‘Model based drone odometry for object centric filming’. He drew parallels between drones used for visual inspection of wind turbines and drone cinematography, particularly in connection with pose estimation and target localization. This was followed by a talk by Ben Keene, Chief Development Officer of Consortiq, who spoke on ‘Visual Innovation in the air: from Kitkat to Transformers 5’. Ben shared his experiences of shooting for major commercial partners, highlighting the potential of drones for creating high-impact shots, alongside the importance of regulatory compliance and of managing client expectations in terms of deployment timescales. The third talk of the afternoon was given by Aaron Zhang, Stephen Boyle and David Bull of the University of Bristol, on ‘Simulation engines and subjective quality testing in MultiDrone’. They described the need to define an operating envelope for drone shots and explained how simulation tools such as Unreal Engine can provide a powerful platform for rapid simulation and testing of shot types. They presented results from a pilot study showing viewer preferences across six shot types.

The final talk in the afternoon was given by Professor Iain Gilchrist and Dr Stephen Hinde of the University of Bristol, who described their methodology for ‘Measuring visual immersion’ using a dual task approach. They presented results of recent work with BBC R&D comparing viewer immersion for Planet Earth 2 in High and Standard Dynamic Range Formats – demonstrating the discriminative power of their approach. The workshop closed with a second extended discussion, chaired by David Bull exploring how we can assess the quality and immersive properties of drone video content.

This highly successful workshop was attended by 46 people, many of whom said they had found the event interesting, insightful and informative. It was agreed that it would form the basis for a series of similar future events, culminating in an open-call workshop in 2019.

For further information on the MultiDrone project, please visit



Best poster award for SEMBED at BMVA summer school

The SEMBED paper, Semantic Embedding of Egocentric Action Videos (authors: Michael Wray, Davide Moltisanti, Walterio Mayol-Cuevas and Dima Damen), has won the second best paper award at the British Machine Vision Association (BMVA) summer school held in Swansea in July 2016. The paper has also been accepted at the EPIC (Egocentric Perception, Interaction and Computing) workshop at ECCV 2016.

The arXiv submission of the paper is available at the following link:



Deep Driving: Learning Representations for Intelligent Vehicles

Together with the University of California at Berkeley, the University of Jena, NICTA and Daimler, we are organising a workshop on representation learning at the IEEE Intelligent Vehicles Symposium in Gothenburg, Sweden.

Vision is a rich and non-intrusive sensor modality that can be captured by cheap and flexible sensors. However, its strengths are also its downside, as it is challenging to extract what is relevant from the high-dimensional signal. Recently, computer vision has experienced what can only be described as a revolution. The second coming of neural networks, known as deep learning, has led to a significant increase in performance on tasks across the field. It is now possible to learn image representations directly from data rather than relying on ad hoc handcrafted features.

The aim of this workshop is to bring together researchers to discuss both theoretical and practical issues related to the application of deep vision techniques to intelligent vehicles. We want to demonstrate what computer vision is currently capable of and to identify important directions for future work. The workshop will be centred on a set of invited talks from prominent researchers, together with a poster session of submitted extended abstracts and a tutorial in the afternoon. The tutorial will focus on the software library Caffe, one of the most popular convolutional neural network toolboxes for vision tasks.
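For readers unfamiliar with the core operation such networks are built on, the sketch below implements a discrete 2-D convolution in plain Python. In a trained network the kernel weights are learned from data; here a handcrafted vertical-edge kernel (an illustrative assumption, not material from the workshop) stands in for a learned filter.

```python
# Minimal 2-D "valid" convolution, the basic building block of a
# convolutional neural network. In a real CNN the kernel weights are
# learned from data; the fixed edge kernel below is purely illustrative.

def conv2d(image, kernel):
    """Slide kernel over image and return the 'valid' response map."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            s = 0.0
            for dy in range(kh):
                for dx in range(kw):
                    s += image[y + dy][x + dx] * kernel[dy][dx]
            row.append(s)
        out.append(row)
    return out

# A 4x4 image with a vertical edge between columns 1 and 2.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
# Handcrafted vertical-edge kernel (stand-in for a learned filter).
kernel = [
    [-1, 0, 1],
    [-1, 0, 1],
    [-1, 0, 1],
]
response = conv2d(image, kernel)  # strong response where the edge lies
```

Sliding the same small kernel over every image position is what lets a convolutional layer detect a pattern wherever it appears; deep learning replaces the handcrafted kernel above with many layers of filters whose weights are fitted to data.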

BVI on The One Show


In an episode of The One Show which aired on 15th April, BVI contributed to an item demonstrating the dangers of texting while driving. Recent studies have indicated that reading, and even writing, text messages while behind the wheel is on the rise, especially amongst younger adults.

After approaching the BVI for advice on the best way to communicate this to its audience, The One Show production team invited University of Bristol BVI researcher Dr Felix Mercer Moss onto an airfield just outside Bristol to help perform an experiment.

A BVI mobile eye tracker was used to discover that drivers navigating the makeshift course spent as much as two seconds looking away from the road while replying to a text. On subsequent laps, the same drivers wore a customised pair of ‘black-out’ goggles that simulated the inattention recorded on their earlier laps.
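Two seconds with eyes off the road translates into a surprising distance travelled blind. The short sketch below makes the arithmetic explicit; the 30 mph speed is an illustrative assumption, not a figure measured in the experiment.

```python
# Distance covered while the driver's eyes are off the road.
# The 30 mph speed is an illustrative assumption, not a value
# measured in the experiment.

MPH_TO_MPS = 0.44704  # metres per second in one mile per hour

def blind_distance_m(speed_mph, seconds_off_road):
    """Metres travelled at speed_mph during seconds_off_road."""
    return speed_mph * MPH_TO_MPS * seconds_off_road

distance = blind_distance_m(30, 2.0)
```

Even at this modest speed, two seconds of inattention covers roughly 27 metres, several car lengths, with the driver effectively seeing nothing.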

The `black-out’ goggles caused carnage on the airfield, and in doing so, provided viewers with a stark reminder to keep their phonCapturees in their pockets when on the road.

To watch before Sunday 16th May, go to minute 13:44 via the link: