PaNanoMRI

Precision imaging of premalignant cancer

Source: mrimaster

Current imaging tests lack the sensitivity and specificity needed to detect pre-malignant pancreatic lesions, which are often too small to resolve (e.g. on MRI/CT) or isointense/isodense to normal tissue, potentially leading to missed diagnoses in at-risk patients. This project brings together a team of clinical and non-clinical scientists to address this challenge by combining two novel non-invasive approaches to increase the diagnostic yield of MRI in patients with early malignant disease: (1) development of new multifunctional targeted nanoparticles for detection of pre-malignant pancreatic lesions, and (2) improvement of MRI diagnostic yield through the application of super-resolution reconstruction (SRR) and quantitative MRI from MR Fingerprinting (MRF) for reproducible and rapid scanning.
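As a rough illustration of the super-resolution reconstruction (SRR) idea mentioned above (a toy sketch only, not the project's actual pipeline), the snippet below recovers a higher-resolution 1D signal from several shifted, averaged low-resolution acquisitions by stacking their forward models and solving a regularised least-squares problem; the signal model, shifts and regularisation weight are illustrative assumptions.

```python
# Minimal, illustrative sketch of super-resolution reconstruction (SRR):
# recover a high-resolution signal from several low-resolution, shifted
# acquisitions by stacking their forward models and solving least squares.
# Toy 1D example; not the project's actual reconstruction code.
import numpy as np

def downsample_operator(n_hi, factor, shift):
    """Average `factor` consecutive high-res samples, starting at `shift`."""
    n_lo = (n_hi - shift) // factor
    A = np.zeros((n_lo, n_hi))
    for i in range(n_lo):
        A[i, shift + i * factor : shift + (i + 1) * factor] = 1.0 / factor
    return A

rng = np.random.default_rng(0)
n_hi, factor = 64, 4
x_true = np.convolve(rng.standard_normal(n_hi), np.ones(5) / 5, mode="same")

# Simulate several low-resolution acquisitions with different sub-sample shifts.
ops = [downsample_operator(n_hi, factor, s) for s in range(factor)]
ys = [A @ x_true + 0.01 * rng.standard_normal(A.shape[0]) for A in ops]

# Stack all acquisitions and solve a Tikhonov-regularised least-squares problem.
A_all = np.vstack(ops)
y_all = np.concatenate(ys)
lam = 1e-2
x_hat = np.linalg.solve(A_all.T @ A_all + lam * np.eye(n_hi), A_all.T @ y_all)

print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

In MRI-style SRR the same idea is applied to stacks of thick, overlapping or rotated 2D slices, with the forward operator modelling the slice profile and the reconstruction regularised in 3D.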

Pancreatic cancer has the lowest survival of all common cancers, with five-year survival below 7%, and is the fifth biggest cancer killer in the UK [PCUK]. Early diagnosis is crucial to improving survival outcomes: one-year survival in those diagnosed at an early stage is six times higher than in those diagnosed at stage four. However, around 80% of patients are not diagnosed until the cancer is at an advanced stage, when surgery or curative treatment is usually no longer possible. Not only do we need the tools and knowledge to diagnose people at an earlier stage, we also need to make the diagnostic process faster, so that no precious time is wasted in moving people onto potentially life-saving surgery or other treatments.

Collaborators

UCL Hospitals NHS Trust, the universities of Strathclyde, Glasgow and Liverpool, and Imperial College London.

Contact

Mohammad Golbabaee

Deep Compressive Quantitative MRI

Making MRI fast and quantitative


This interdisciplinary project focuses on the development of next-generation fast quantitative MRI precision imaging solutions based on AI and compressed sensing.

Magnetic resonance imaging (MRI) has transformed the way we look inside the human body non-invasively, making it the gold-standard imaging technique for the diagnosis and monitoring of many diseases. However, conventional MRI scans do not produce "quantitative", i.e. standardised, measurements, so it is difficult to compare MRI images acquired at different hospitals or at different points in time, limiting the potential of this imaging technology for precise diagnosis and monitoring.

Quantitative MRI (qMRI) aims to overcome this problem by providing reproducible measurements that quantify tissue bio-properties, independent of the scanner and scanning time. This can transform existing scanners from picture-taking machines into scientific measuring instruments, enabling objective comparisons across clinical sites, individuals and time-points. Despite established benefits in the precise evaluation of diseases (e.g. cancer and cardiac, liver and brain disorders), qMRI suffers from excessively long scan times that currently obstruct its wide adoption in clinical routines. This project works with some of the world's top organisations in healthcare research towards solving this challenge and making qMRI scans substantially faster, more patient-friendly, more affordable and more accessible.
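As a hedged illustration of the quantitative-MRI idea behind MR Fingerprinting (a sketch only, not this project's method), the snippet below matches a measured signal evolution against a precomputed dictionary of simulated "fingerprints" and reads off the tissue parameters (here a T1/T2 pair) of the best match; the exponential signal model, parameter grids and noise level are simplifying assumptions for illustration.

```python
# Illustrative MR Fingerprinting (MRF) style dictionary matching.
# A toy exponential signal model stands in for a full Bloch simulation;
# parameters, grids and noise level are assumptions for illustration only.
import numpy as np

def fingerprint(t1, t2, times):
    """Toy signal evolution parameterised by (T1, T2); not a real Bloch simulation."""
    return (1 - np.exp(-times / t1)) * np.exp(-times / t2)

times = np.linspace(0.01, 3.0, 200)      # acquisition time points (s)
t1_grid = np.linspace(0.3, 2.0, 60)      # candidate T1 values (s)
t2_grid = np.linspace(0.02, 0.3, 60)     # candidate T2 values (s)

# Precompute the dictionary of normalised fingerprints over the parameter grid.
params = [(t1, t2) for t1 in t1_grid for t2 in t2_grid]
D = np.stack([fingerprint(t1, t2, times) for t1, t2 in params])
D /= np.linalg.norm(D, axis=1, keepdims=True)

# Simulate a noisy measured fingerprint for an unknown tissue.
rng = np.random.default_rng(1)
y = fingerprint(1.2, 0.1, times) + 0.02 * rng.standard_normal(times.size)
y /= np.linalg.norm(y)

# Match: the dictionary atom with the largest inner product gives the estimate.
best = int(np.argmax(D @ y))
print("estimated (T1, T2):", params[best])
```

In practice the dictionary is generated by Bloch or EPG simulation of the actual pulse sequence, and the matching is combined with compressed-sensing reconstruction of heavily undersampled acquisitions, which is where most of the acceleration comes from.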

Collaborators

GE Healthcare, UCL’s Centre for Medical Imaging, IRCCS Stella Maris-IMAGO7, University of Zurich.

Funding

£340k, EPSRC

Contact

Mohammad Golbabaee

IBC 2022 Innovation Award win for UK 5G Edge-XR project

The University of Bristol's Visual Information Laboratory, part of the Bristol Vision Institute, is a member of a consortium led by BT Sport that has won the IBC 2022 Innovation Award in the Content Everywhere category. The consortium beat outstanding shortlisted entries from the PGA TOUR and Sky Sports. The work resulted from the Innovate UK 5G Edge-XR project, which can be viewed on YouTube.

In collaboration with local start-up Condense Reality, which has developed a volumetric video capture rig that captures the 3D essence of people in real time and enables their digital transportation to a different location, the project lets users watch a holographic view of the live action, replaying the footage via a game engine from any angle in the comfort of their home.

The experience relies on the speed and low latency of 5G connectivity combined with edge-of-network computing to move some of the processing load associated with volumetric rendering away from the devices, enabling experiences that have never been seen before.

The University of Bristol's work in the project, on advanced video compression and rendering of volumetric content, was led by Andrew Calway, David Bull, Aaron Zhang and Angeliki Katsenou, and is now being further developed through the MyWorld programme in collaboration with BT and Condense Reality.

Further information

Condense, Salsa Sound, The GRID Factory, University of Bristol, Dance East

VI Lab researchers working with BT to enhance the experience of live events

5G Edge XR AR Dance

Press release, 8 October 2020

Computer vision experts from the University of Bristol are part of a new consortium, led by BT, that is driving the technology that will revolutionise the way we consume live events, from sports such as MotoGP and boxing to dance classes.

The 5G Edge-XR project, one of seven projects funded by the Department for Digital, Culture, Media & Sport (DCMS) as part of its 5G Create programme, aims to demonstrate exciting new ways that live sport and the arts can be delivered remotely using immersive virtual and augmented reality (VR/AR) technology combined with the new 5G network and advanced edge computing.

The 5G Edge-XR consortium, which is led by BT, also includes The GRID Factory, Condense Reality, Salsa Sound and Dance East. The project started in September 2020 and will run until March 2022, with a budget of over £4M, of which £1.5M comes from DCMS.

The University of Bristol team is based in the Visual Information Lab (VI Lab) and will be working primarily with Condense Reality (CR). The Bristol-based SME, whose CTO and CSO are both Bristol graduates, has developed a state-of-the-art volumetric capture system capable of generating live 3D models for AR applications. This brings the prospect of viewing live sports and dance classes in 3D in your own home, as though you were there in person.

The Bristol team is led by Professors Andrew Calway and David Bull, who will bring their expertise in computer vision and video coding to enhance the system developed by CR. They will be working with researchers from the BT Labs in Adastral Park, Suffolk, which is recognised for its global leadership in 5G research and standards development.

“This is a very exciting opportunity for the lab and our students, enabling us to engage in important research and knowledge and skills transfer to a local company, whilst at the same time being part of a national programme to showcase what is possible using AR and the new 5G technology,” said Professor Calway.

Professor Tim Whitley, BT’s MD of Applied Research, said: “The approaches we’re exploring with these teams in Bristol can transform how we experience sport, music, drama and education. With access to live cultural events and sport being limited by the ongoing pandemic, this project seems more relevant and urgent than ever.”

Nick Fellingham, CEO and Co-Founder of CR, added: “The 5G Edge-XR project is a real boost for us and working with the University will help us to push our technology to new levels, giving us that important edge in the market.”

End

MyWorld set to make South West a Digital Media Leader on Global Stage

Image credit: Nick Smith Photography

We are pleased to announce that a University of Bristol initiative, led by VI Lab's Professor David Bull, has been awarded £30 million by the UK Research and Innovation (UKRI) Strength in Places Fund.

The South West is on track to become an international trailblazer in screen-based media thanks to £30 million in funding from UKRI, with a further £16m coming from an alliance of more than 30 industry and academic partners joining forces in the five-year scheme, due to start by the end of the year. This will launch a creative media powerhouse called MyWorld, supercharging economic growth and generating more than 700 jobs.

It will forge dynamic collaborations between world-leading academic institutions and creative industries to progress research and innovation, creative excellence, inclusive cultures, and knowledge sharing.

Professor David Bull commented, “The South West is already a creative capital in the UK and MyWorld aims to position the region amongst the best in the world, driving inward investment, increasing productivity and delivering important employment and training opportunities.

“This is the beginning of an exciting journey, which will align research and development endeavours across technology and the creative arts, to help businesses realise their innovation potential, raise their international profile, and maximise the advantages of new technologies.”

The MyWorld Bristol team of investigators has representation from across Engineering, Psychology, Arts and Management and includes: Professor Andrew Calway, Professor Dimitra Simeonidou, Professor Mary Luckhurst, Professor Iain Gilchrist, Professor Martin Parker and Professor Kirsten Cater.

The full press release is available on the University of Bristol news page.

For an insight into MyWorld and the Strength in Places fund, watch this video.

How do humans interact with a changing visual world?

Press release issued: 23 March 2015

A new £1.4 million research project led by the University of Bristol will use engineering and science in the design of radically new approaches and solutions to vision-based technology.

Researchers from the University’s Bristol Vision Institute (BVI) have been awarded an Engineering and Physical Sciences Research Council (EPSRC) Platform Grant for their project ‘Vision for the future’.

The grant will focus on developing a better understanding of the visual mechanisms and processes evolved in humans and other animals, and relating these to similar technology challenges.

Vision is central to the way animals interact with the world. A deeper understanding of the fundamental aspects of perception and visual processing in humans and animals, across the areas of immersion, movement and visual search, together with innovation in engineering solutions, is essential in delivering future consumer technology for internet, robotic and environmental monitoring applications.

The project will carry out research across three themes:

  • Visual immersion – investigating, developing and characterising future consumer and professional video formats that will enable increased viewer engagement with the content;
  • Finding and hiding things – understanding visual search processes and translating these into machine solutions for detecting, hiding, classifying, describing and making decisions about complex visual information;
  • Vision in motion – a better understanding of how motion facilitates scene understanding will inform the design of autonomous systems, provide new healthcare solutions and underpin new camera technologies.

Professor David Bull, Director of Bristol Vision Institute, who is leading the project, said: “This is a major award for the University and BVI. It will enable us to take a strategic and long-term view of our research, addressing grand challenges in vision science and its applications. The interdisciplinary landscape of BVI will allow us to meet these challenges and to translate biological and psychological discoveries into technology solutions for human and machine applications.”

The BVI researchers on the project include: Professor David Bull, J F Burn, Professor Nishan Canagarajah, Professor Innes Cuthill, Professor Iain Gilchrist, Dr Casimir Ludwig and Dr Nicholas Roberts.

Alongside support from the University of Bristol, the project involves international collaboration with world-leading organisations including the Academy of Motion Picture Arts and Sciences, the Max Planck Institute (Tübingen), camera manufacturer ARRI, BBC R&D, Aardman Animations, QinetiQ and Thales, together with the universities of Rochester, Lund, Texas at Austin, UWA, Purdue and UWE.

Further information

EPSRC’s Platform Grant Scheme is focused on “providing flexible support that underpins adventurous research in world-leading research groups”.

About the EPSRC
The Engineering and Physical Sciences Research Council (EPSRC) is the UK’s main agency for funding research in engineering and physical sciences. EPSRC invests around £800m a year in research and postgraduate training to help the nation handle the next generation of technological change. The areas covered range from information technology to structural engineering, and mathematics to materials science. This research forms the basis for future economic development in the UK and improvements for everyone’s health, lifestyle and culture. EPSRC works alongside other Research Councils with responsibility for other areas of research. The Research Councils work collectively on issues of common concern via Research Councils UK.

About Bristol Vision Institute
The University of Bristol is a world leader in vision research, spanning human and animal vision, artificial vision, visual information processing and the creative arts. Bristol Vision Institute (BVI), formed in 2008 with some 120 researchers, has successfully created an intellectual landscape alongside practical facilities for vision research, bringing engineers and scientists together with experts in medicine and the arts. It is the largest interdisciplinary grouping of its type in Europe and is unique worldwide. BVI has been highly successful in attracting research income and stimulating new relationships with an impressive list of collaborators.