BRISTOL DRONE CINEMATOGRAPHY WORKSHOP

The MultiDrone Consortium organized the first Drone Cinematography Workshop at the University of Bristol on 6 December 2017. The aim of the workshop was to bring together experts in drone cinematography – users, producers, and technologists – to explore the future potential for this exciting and growing area. The workshop included invited talks from specialist operators and producers, alongside research presentations on the rules, requirements, tools, potential and constraints of shooting with drones.

The workshop commenced with an overview of the EU MultiDrone project by Professor David Bull of the University of Bristol and Professor Ioannis Pitas of the University of Thessaloniki, who presented the aims, challenges and preliminary results of the project, focusing on cinematography aspects. This was followed by an illuminating talk, ‘Natural Highs – the use of drones in wildlife filmmaking’, by Colin Jackson from the BBC Natural History Unit in Bristol. Colin explained how drones have transformed wildlife filmmaking in terms of their ease of deployment, flexibility and potential to achieve shots previously not possible. He used examples from the BBC’s recent Blue Planet 2 series and outlined a number of operational constraints as well as opportunities for future use. The third talk, entitled ‘UAV shot type taxonomy’, was given by Ioannis Mademlis and Ioannis Pitas from the University of Thessaloniki. They explained the importance of a taxonomy of shot types and of appropriate mathematical representations of these. The morning session ended with a lively discussion, chaired by Nicolaus Heise of Deutsche Welle, which further explored the potential of drone and multi-drone-based cinematography.


After lunch, Dr Andrew Calway of the University of Bristol introduced the audience to his work on ‘Model based drone odometry for object centric filming’. He drew parallels between drones being used for visual inspection of wind turbines and drone cinematography, particularly in connection with pose estimation and target localization. This was followed by a talk by Ben Keene, Chief Development Officer of Consortiq, who spoke on ‘Visual Innovation in the air: from Kitkat to Transformers 5’. Ben shared his experiences of shooting for major commercial partners, highlighting the potential of drones for creating high-impact shots, alongside the importance of regulatory compliance and of managing client expectations in terms of deployment timescales. The third talk in the afternoon was given by Aaron Zhang, Stephen Boyle and David Bull of the University of Bristol, on ‘Simulation engines and subjective quality testing in MultiDrone’. They explained the need to define an operating envelope for drone shots and showed how simulation tools such as Unreal Engine can provide a powerful platform for rapid simulation and testing of shot types. They presented results from a pilot study showing viewer preferences for six shot types.

The final talk in the afternoon was given by Professor Iain Gilchrist and Dr Stephen Hinde of the University of Bristol, who described their methodology for ‘Measuring visual immersion’ using a dual-task approach. They presented results of recent work with BBC R&D comparing viewer immersion for Planet Earth 2 in High and Standard Dynamic Range formats – demonstrating the discriminative power of their approach. The workshop closed with a second extended discussion, chaired by David Bull, exploring how we can assess the quality and immersive properties of drone video content.

This highly successful workshop was attended by 46 people, many of whom expressed their appreciation for an event they found interesting, insightful and informative. It was agreed that this would form the basis for a series of similar future events, culminating in an open-call workshop in 2019.

For further information on the MultiDrone project, please visit https://multidrone.eu/


Best poster award for SEMBED at BMVA summer school

The SEMBED paper, Semantic Embedding of Egocentric Action Videos (authors: Michael Wray, Davide Moltisanti, Walterio Mayol-Cuevas and Dima Damen), has won the second best poster award at the British Machine Vision Association (BMVA) summer school held in Swansea in July 2016. The paper has also been accepted at the EPIC (Egocentric Perception, Interaction and Computing) ECCV 2016 workshop.

The arXiv submission of the paper is available at the following link: http://arxiv.org/abs/1607.08414


Deep Driving: Learning Representations for Intelligent Vehicles

Together with the University of California at Berkeley, the University of Jena, NICTA and Daimler, we are organising a workshop on representation learning at the IEEE Intelligent Vehicles Symposium in Gothenburg, Sweden.

Vision is a rich and unintrusive sensor modality that can be captured by cheap and flexible sensors. However, these strengths are also its downside: it is challenging to extract what is relevant from the high-dimensional signal. Recently, computer vision has experienced what can only be referred to as a revolution. The second coming of neural networks, known as deep learning, has led to a significant increase in performance in tasks across the field. It is now possible to learn image representations directly from data rather than relying on ad hoc handcrafted features.

The aim of this workshop is to bring together researchers to discuss both theoretical and practical issues related to the application of deep vision techniques to intelligent vehicles. We want to demonstrate what computer vision is currently capable of and identify important directions for future work. The workshop will be centered around a set of invited talks from prominent researchers, together with a poster session of submitted extended abstracts, as well as a tutorial in the afternoon. The tutorial will focus on the software library Caffe, one of the most popular convolutional neural network toolboxes for vision tasks.

BVI on The One Show


In an episode of The One Show which aired on 15th April, BVI contributed to an item demonstrating the dangers of texting while driving. Recent studies have indicated that reading and even writing text messages while behind the wheel are on the rise, especially amongst younger adults.

After approaching the BVI for advice on the best way to communicate this to its audience, The One Show production team invited University of Bristol BVI researcher Dr Felix Mercer Moss onto an airfield just outside Bristol to help perform an experiment.

A BVI mobile eye tracker was used to discover that drivers navigating the makeshift course spent as much as two seconds looking away from the road while replying to a text. The same drivers then learnt that, on subsequent laps, they would wear a customised pair of ‘black-out’ goggles that simulated their inattention on the previous laps.

The ‘black-out’ goggles caused carnage on the airfield and, in doing so, provided viewers with a stark reminder to keep their phones in their pockets when on the road.

To watch before Sunday 16th May, go to minute 13:44 at the following link: http://www.bbc.co.uk/iplayer/episode/b078wvcs/the-one-show-15042016

How do humans interact with a changing visual world?

Press release issued: 23 March 2015

A new £1.4 million research project led by the University of Bristol will use engineering and science in the design of radically new approaches and solutions to vision-based technology.

Researchers from the University’s Bristol Vision Institute (BVI) have been awarded an Engineering and Physical Sciences Research Council (EPSRC) Platform Grant for their project ‘Vision for the future’.

The grant will focus on developing a better understanding of the visual mechanisms and processes evolved in humans and other animals, and relating these to similar technology challenges.

Vision is central to the way animals interact with the world. A deeper understanding of the fundamental aspects of perception and visual processing in humans and animals, across the areas of immersion, movement and visual search, together with innovation in engineering solutions, is essential in delivering future consumer technology for internet, robotic and environmental monitoring applications.

The project will carry out research across three themes:

  • Visual immersion – investigating, developing and characterising future consumer and professional video formats that will enable increased viewer engagement with the content;
  • Finding and hiding things – understanding visual search processes and translating these into machine solutions for detecting, hiding, classifying, describing and making decisions about complex visual information;
  • Vision in motion – a better understanding of how motion facilitates scene understanding will inform the design of autonomous systems, provide new healthcare solutions and underpin new camera technologies.

Professor David Bull, Director of Bristol Vision Institute, who is leading the project, said: “This is a major award for the University and BVI. It will enable us to take a strategic and long-term view of our research, addressing grand challenges in vision science and its applications. The interdisciplinary landscape of BVI will allow us to meet these challenges and to translate biological and psychological discoveries into technology solutions for human and machine applications.”

The BVI researchers on the project include: Professor David Bull, J F Burn, Professor Nishan Canagarajah, Professor Innes Cuthill, Professor Iain Gilchrist, Dr Casimir Ludwig and Dr Nicholas Roberts.

Alongside support from the University of Bristol, the project has international collaboration with world-leading organisations including: the Academy of Motion Picture Arts and Sciences, the Max Planck Institute (Tuebingen), camera manufacturer ARRI, BBC R&D, Aardman Animations, QinetiQ and Thales, together with the universities of Rochester, Lund, Texas at Austin, UWA, Purdue and UWE.



Further information

EPSRC’s Platform Grant Scheme is focused on “providing flexible support that underpins adventurous research in world-leading research groups”.

About the EPSRC
The Engineering and Physical Sciences Research Council (EPSRC) is the UK’s main agency for funding research in engineering and physical sciences. EPSRC invests around £800m a year in research and postgraduate training to help the nation handle the next generation of technological change. The areas covered range from information technology to structural engineering, and mathematics to materials science. This research forms the basis for future economic development in the UK and improvements for everyone’s health, lifestyle and culture. EPSRC works alongside other Research Councils with responsibility for other areas of research. The Research Councils work collectively on issues of common concern via Research Councils UK.

About Bristol Vision Institute
The University of Bristol is a world leader in vision research, spanning human and animal vision, artificial vision, visual information processing and the creative arts. Bristol Vision Institute (BVI), formed in 2008 with some 120 researchers, has successfully created an intellectual landscape alongside practical facilities for vision research, and has facilitated engineers and scientists working together with experts in medicine and the arts. It is the largest interdisciplinary grouping of its type in Europe, and is unique worldwide. BVI has been highly successful in attracting research income and in stimulating new relationships with an impressive list of collaborators.