Research Projects

This page contains all completed and current research projects of the VAR group since April 2014.

Virtual Reality Supported Emergency Simulation for Medical Education and Training

PI Prof. C. Hansen, B. Ruzic
Project term 3/2021 - 2/2023
Funder
Partner 2tainment GmbH, Magdeburg (B. Ruzic)

Description

In this project, a novel VR simulator is being developed, with the primary aim of achieving a software-supported simulation of diagnostic and treatment procedures. The project implements VR technology and simulation algorithms for selected emergency medicine training scenarios, including the necessary medical equipment. By using the VR simulator as a training tool for future physicians and paramedics, the project aims to significantly improve the quality of emergency medical training in Germany.

Advanced Tracking and Interaction Techniques for Medical Augmented Reality Projection Techniques

PI Prof. C. Hansen, C. Steinmann
Project term 2/2021 - 1/2023
Funder
Partner domeprojection.com, Magdeburg (C. Steinmann)
Description

This project aims to investigate and develop new interactive, stereoscopic AR display techniques for medical applications. For example, minimally invasive interventions will be supported by projector-based AR: navigation cues for surgical instruments or virtual anatomical objects are superimposed on the patient in three dimensions, with motion compensation, and displayed as partial projections for multiple users. The problem of shadowing by objects in the projection path is addressed by splitting the usually large-area projection into several partial projections. These partial projections are optimally aligned based on automatically detected viewer positions, so that perspective-correct visualization and multi-user interaction become possible.
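
As a toy illustration of the underlying assignment problem, the following Python sketch (hypothetical geometry and function names, not the project's actual algorithm) picks, for each tracked viewer, the projector whose beam toward the projection surface is least shadowed by the other people in the room.

```python
import numpy as np

def occlusion_score(projector, surface_point, bodies, radius=0.3):
    """Count how many tracked bodies (approximated as spheres of the given
    radius, in meters) intersect the ray from a projector to a point on the
    projection surface."""
    ray = surface_point - projector
    length = np.linalg.norm(ray)
    direction = ray / length
    hits = 0
    for body in bodies:
        # closest point on the projector ray to the body center
        t = np.clip(np.dot(body - projector, direction), 0.0, length)
        closest = projector + t * direction
        if np.linalg.norm(body - closest) < radius:
            hits += 1
    return hits

def assign_partial_projections(projectors, viewers, surface_point):
    """For each viewer, choose the projector whose beam toward the surface
    is least shadowed by the other viewers' bodies."""
    assignment = {}
    for i, viewer in enumerate(viewers):
        others = [v for j, v in enumerate(viewers) if j != i]
        scores = [occlusion_score(p, surface_point, others) for p in projectors]
        assignment[i] = int(np.argmin(scores))
    return assignment
```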

Research Campus STIMULATE / Research Group Human-Machine Interaction

PI Prof. C. Hansen, Prof. G. Rose, Prof. O. Speck, Prof. F. Wacker
Project term 10/2020 - 09/2025
Funder
Partner

Hannover Medical School (Prof. F. Wacker, Dr. B. Hensen)
University Hospital Magdeburg (Prof. M. Pech, PD Dr. Jazan Omari)
CAScination AG (Christoph Thiede)
Fraunhofer IFF, Magdeburg (Prof. Dr. N. Elkmann)
Siemens Healthineers, Erlangen

Description

In this research group, which was founded in the second phase of the STIMULATE research campus in 2020, advanced methods of human-machine interaction are being investigated. One goal is to enable the direct operation of an MR scanner during a procedure without having the scanner controlled (indirectly) by a second person in the control room. Our research builds on the results of the first STIMULATE phase and complements them with new interaction techniques and a VR-based training solution. In addition, studies will be conducted to investigate the impact of a primary task (performing the intervention) on the performance of the interaction techniques under laboratory conditions and in the clinical setting.

In addition, we plan to optimize the workflow of CT-guided interventions from planning to follow-up. In this context, our research group is investigating a novel instrument tracking system and interaction techniques for a lightweight robot that guides an ultrasound (US) probe.

Planning, Navigation and Monitoring Device for CT-guided Interventions

PI Prof. C. Hansen
Project term 06/2020 - 05/2024
Funder
Partner

Research Campus STIMULATE, Magdeburg, Germany (Prof. G. Rose, Prof. B. Preim)

Description

In this project, funded within the DFG Major Research Instrumentation Programme, a planning and navigation device is to be interfaced with a CT scanner so that it can act as a central information system. In addition, algorithms are to be developed in cooperation with several research groups on the STIMULATE research campus to facilitate CT-guided interventions. These include, for example, new deep-learning-based segmentation methods, path optimization algorithms to support multi-applicator planning, and new CT image reconstruction methods that reduce artifacts while saving radiation dose.
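
As a simplified illustration of multi-applicator path planning, the following Python sketch (hypothetical cost terms, not the method under development) scores straight insertion paths by their length and their closest approach to segmented risk structures, then selects the best skin entry point.

```python
import numpy as np

def path_cost(entry, target, risk_points, alpha=1.0, beta=2.0, samples=50):
    """Cost of a straight applicator path: shorter paths are preferred
    (alpha term) and the closest approach to any segmented risk structure
    should be as large as possible (beta term).
    risk_points: (N, 3) array of points on risk structures (e.g. vessels)."""
    pts = np.linspace(entry, target, samples)              # sample the path
    # smallest distance between any path sample and any risk point
    d_min = np.min(np.linalg.norm(risk_points[None, :, :] - pts[:, None, :],
                                  axis=2))
    length = np.linalg.norm(target - entry)
    return alpha * length + beta / (d_min + 1e-6)          # penalize near misses

def best_entry_point(candidates, target, risk_points):
    """Pick the skin entry point with the lowest path cost."""
    costs = [path_cost(np.asarray(c, float), np.asarray(target, float),
                       risk_points) for c in candidates]
    return candidates[int(np.argmin(costs))]
```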

A VR-UI for Virtual Planning and Training Applications over Large Distances

PI Jun.-Prof. C. Hansen, N. Kempe
Project term 11/2019 - 10/2021
Funder
Partner

UCDplus GmbH, Magdeburg, Germany (N. Kempe)
Luxsonic Technologies Inc., Saskatoon, Saskatchewan, Canada (Dr. M. Wesolowski)
University of Waterloo, Ontario, Canada (Prof. L. Nacke)

Description

In this international ZIM project, the consortium concentrates on the research and development of virtual reality user interfaces (VR-UIs). The application focus is on virtual planning and training applications in medicine. With the solution envisaged in this project, physicians will be able to communicate over long distances (intercontinentally, between Germany and Canada), working in distributed groups of up to five users, and exchange medical expertise. From a technical point of view, the focus is on the VR exploration of medical case data (text, image, and video data), the annotation of the data in VR, and the VR selection and manipulation of the data. Successful implementation requires an interdisciplinary consortium of UI experts (UCDplus GmbH, University of Waterloo) and medical VR software developers (Luxsonic Technologies Inc., Otto-von-Guericke University Magdeburg).

Biofeedback-based AR system for Medical Balance Training

PI Prof. C. Hansen, R. Warnke
Project term 11/2019 - 10/2021
Funder
Partner MediTECH Electronic GmbH, Wedemark, Germany (R. Warnke)
Thought Technology Ltd., Montreal, Quebec, Canada (M. Cardichon)
University of Waterloo, Ontario, Canada (Prof. L. Nacke)
Research Campus STIMULATE, Magdeburg, Germany (Prof. G. Rose)

Description

The therapy of impaired balance is usually based on medication in combination with physiotherapeutic training. The MediBalance Pro medical device from MediTECH Electronic GmbH has successfully established itself on the market. However, it is currently used only in specialized therapy centers for dizziness treatment and is limited there to training control of the body's center of balance. In this international ZIM project, the existing hardware is to be equipped with an advanced AR-based operating and game interface. In addition, the system is to be expanded with a multi-physiological sensor system. Within the scope of the project, a prototype for a new medical device will be developed.
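
A central quantity in such balance training is the center of pressure measured by the platform. The sketch below (assuming a hypothetical four-load-cell layout, not the actual device firmware) shows how it can be computed and fed to an AR game as a cursor position.

```python
def center_of_pressure(f_fl, f_fr, f_bl, f_br, width=0.4, depth=0.4):
    """Center of pressure of a balance platform from four corner load cells
    (front-left, front-right, back-left, back-right forces in newtons).
    Returns (x, y) in meters relative to the plate center; an AR game can
    map this point directly onto a cursor or avatar."""
    total = f_fl + f_fr + f_bl + f_br
    if total <= 0.0:
        return 0.0, 0.0                      # nobody on the plate
    x = (width / 2.0) * ((f_fr + f_br) - (f_fl + f_bl)) / total
    y = (depth / 2.0) * ((f_fl + f_fr) - (f_bl + f_br)) / total
    return x, y
```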

Next Generation of Surgical Simulators for Surgical Planning, Training and Education

PI Prof. C. Hansen, Prof. B. Preim
Project term 09/2019 - 08/2020
Funder

Partner

MIMESIS Group, Inria Strasbourg, France (Prof. S. Cotin)
Center for Medical Image Science and Visualization, Linköping University, Sweden (Prof. C. Lundström)
University of Waterloo (Prof. L. Nacke)
Harvard Medical School, Boston, USA (Prof. R. Kikinis, Dr. T. Kapur)
Research Campus STIMULATE, Magdeburg, Germany (Prof. G. Rose)

Description

The aim of the project "Next Generation of Surgical Simulators for Surgical Planning, Training and Education" is to prepare an EU proposal in the field of "Health, demographic change and well-being". Specifically, the consortium plans to apply for a Marie Skłodowska-Curie Action, more precisely an Innovative Training Network (ITN). The applicants agree that improving surgical training is becoming increasingly important: as patients get older, surgical procedures often become more complex and risky. Surgical simulators currently on the market neither reflect the reality and complexity of surgery nor are offered at an acceptable price. The planned EU project addresses precisely this problem: an open-source framework for the simulation of surgical interventions is to be developed that research institutions and companies can extend and use both scientifically and commercially.

Improving Spatial Perception for Medical Augmented Reality with Interactable Depth Layers

PI Prof. C. Hansen, Jun.-Prof. Kai Lawonn
Project term 08/2019 - 07/2022
Funder
Partner

University Koblenz-Landau (Jun.-Prof. K. Lawonn)
Hannover Medical School (Prof. F. Wacker)
University Hospital Mainz (Prof. W. Kneist)

Description

Incorrect spatial interpretation is still one of the most common perceptual problems in medical augmented reality (AR). To further investigate this challenge, our project will develop new methods that improve spatial perception in medical AR. Existing approaches are often not sufficient to explore medical 3D data in projected or optical see-through AR. While aiming to provide additional depth information for the whole dataset, many current approaches clutter the scene with too much information, thus binding valuable mental resources and potentially amplifying inattentional blindness.

Therefore, we will develop and evaluate new visualization and interaction techniques for multilayer AR. Our objective is to determine whether depth layer decompositions help to better understand the spatial relations of medical 3D data, and whether transparency can facilitate depth perception in multilayer visualizations. In addition, we will investigate whether methods for multimodal and collaborative interaction can help to reduce the amount of AR information displayed at any time. The results of this project should yield new insights into the representation of multilayer information in medical AR. These insights could be used to enhance established AR visualization techniques, to increase their usability, and thus to reduce risks during AR-guided medical interventions.
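
As a minimal illustration of a depth layer decomposition, the following Python sketch (hypothetical, reducing each structure to its centroid) partitions anatomical structures into layers by viewer distance and assigns more transparency to nearer layers.

```python
import numpy as np

def assign_depth_layers(structures, camera_pos, n_layers=3):
    """Partition anatomical structures into depth layers by their distance
    to the viewer and assign an opacity per layer, so that nearer layers
    are rendered more transparent and deeper layers remain visible.
    structures: list of (name, centroid) pairs."""
    camera_pos = np.asarray(camera_pos, float)
    dists = np.array([np.linalg.norm(np.asarray(c, float) - camera_pos)
                      for _, c in structures])
    edges = np.quantile(dists, np.linspace(0.0, 1.0, n_layers + 1))
    layers = np.clip(np.searchsorted(edges, dists, side="right") - 1,
                     0, n_layers - 1)
    # nearest layer gets opacity 0.25, the deepest layer is fully opaque
    opacity = {k: 0.25 + 0.75 * k / max(n_layers - 1, 1)
               for k in range(n_layers)}
    return [(name, int(k), opacity[int(k)])
            for (name, _), k in zip(structures, layers)]
```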

VR/AR-based Explorer for Medical Education

PI Prof. C. Hansen, D. Anderson
Project term 06/2019 - 12/2021
Funder
Partner 3DQR GmbH (D. Kasper, D. Anderson)
Description

With the establishment of smartphones and tablet computers in large parts of our society, new possibilities are emerging for conveying knowledge in a vivid way. Many newer devices make it possible to create immersive virtual reality (VR) or to enrich reality with virtual elements in the form of augmented reality (AR). Such VR/AR-based environments are already used in a variety of training scenarios, especially in pilot training, but they rely on stationary, high-priced components, e.g., VR caves, and require special VR/AR hardware.

This project investigates VR/AR solutions for basic medical education based on affordable mobile devices, giving learners access to this new form of digital knowledge transfer. The virtual contents are to be linked directly with existing textbooks in order to enrich them didactically and to supplement them meaningfully with digital media. Within the scope of this project, the partners concentrate on basic medical training, in particular on conveying medical-technical knowledge in anatomy and surgery. In addition, an authoring tool will be developed that enables teachers to create new learning scenarios themselves.

Development of Augmented and Virtual Multi-User Applications for Medical-Technical Exchange in Immersive Rooms (AVATAR)

PI Prof. C. Hansen, Prof. B. Preim
Project term 09/2018 - 08/2021
Funder

Partner

University Hospital Mainz (Dr. T. Huber, Prof. W. Kneist, PD Dr. M. Paschold, Prof. Hauke Lang)
Research Campus STIMULATE, Otto-von-Guericke University Magdeburg (Dr. Mandy Kaiser, Prof. G. Rose)
Harvard Medical School, Boston, USA (Prof. Jayender Jagadeesan, Prof. Ron Kikinis)
metratec GmbH, Magdeburg (K. Dannen)
2tainment GmbH, Magdeburg (B. Ruzik)

Description

Today, the exchange of surgical experience and competence mainly takes place at conferences, through the presentation of surgical videos, and through mutual visits. Complex manual skills and surgical techniques have to be newly developed, trained, and passed on to younger surgeons and colleagues. With the methods currently used, this exchange is very costly and time-consuming.

In this project, VR interaction and visualization techniques will be developed to improve the exchange of experience and competence between medical professionals. In a shared virtual reality, several users will train collaboratively, simultaneously and in real time. The positions of locally distributed persons will be determined using hybrid tracking systems based on ultra-wideband technology and inertial sensors. On this basis, VR training scenarios will be designed, implemented in a multi-user communication system, and clinically evaluated over distance.
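
As a minimal illustration of such hybrid tracking, the following Python sketch fuses a smooth but drifting inertial estimate with absolute but noisy ultra-wideband position fixes using a simple 1-D complementary filter; the project's actual fusion method is not specified here.

```python
class ComplementaryFusion:
    """1-D sketch of hybrid tracking: dead-reckon with the inertial sensor,
    then pull the estimate toward the absolute (but noisy) UWB position.
    A real system would run a Kalman filter over full 6-DOF poses."""

    def __init__(self, gain=0.05):
        self.gain = gain            # how strongly UWB fixes correct the IMU
        self.position = 0.0
        self.velocity = 0.0

    def update(self, accel, uwb_position, dt):
        # inertial integration: smooth, high-rate, but drifts over time
        self.velocity += accel * dt
        self.position += self.velocity * dt
        # UWB correction: absolute, low-rate, noisy, but drift-free
        self.position += self.gain * (uwb_position - self.position)
        return self.position
```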

The innovation of this project is the combination of collaborative interaction and visualization techniques with hybrid tracking technologies in an advanced multi-user communication system. The project results should form a basis for the development of future VR-based communication and simulation systems in medicine.

Intelligent Insole for Interaction Applications

PI Prof. C. Hansen, Dr. T. Szczepanski
Project term 10/2017 - 12/2020
Funder
Partner

Thorsis Technologies GmbH (Dr. T. Szczepanski)
University Hospital Magdeburg (Prof. M. Skalej)
Research Campus STIMULATE (Prof. G. Rose)

Description

In this project, a novel interaction approach is investigated that enables the operation of software via simple foot-based gestures. Users can thus operate software with their feet while their hands remain fully focused on the actual work process. In surgical applications in particular, this reduces risk to the patient, as the surgeon does not have to touch potentially unsterile input devices.
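
As a toy illustration, the following Python sketch (hypothetical pressure-frame format and thresholds) detects a simple gesture, a toe tap while the heel stays down, from a stream of normalized insole pressure readings.

```python
def detect_foot_gesture(frames, tap_threshold=0.6, window=5):
    """Toy detector over a stream of normalized insole pressure frames,
    each of the form {'heel': float, 'toe': float} with values in [0, 1].
    Reports a 'toe_tap' when the toe is pressed briefly while the heel
    stays down, which could e.g. be mapped to 'next image slice'."""
    recent = frames[-window:]
    if len(recent) < window:
        return None
    heel_down = all(f['heel'] > tap_threshold for f in recent)
    toe_tap = (recent[0]['toe'] < 0.2 and
               max(f['toe'] for f in recent) > tap_threshold and
               recent[-1]['toe'] < 0.2)
    return 'toe_tap' if heel_down and toe_tap else None
```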

The project is a joint undertaking of Thorsis Technologies and the STIMULATE research campus of the Otto-von-Guericke University. The primary objective is to develop the necessary hardware and software components and to provide functional verification in the context of surgical applications. A basic prerequisite for the acceptance of the insole as an interaction medium for a wide range of applications is its uncomplicated use and compatibility with standard footwear.

Augmented Reality Supported 3D Laparoscopy

PI Prof. C. Hansen, K. Dannen
Project term 07/2017 - 06/2020
Funder
Project number 1704/00027
Partner University Hospital Magdeburg (Prof. M. Schostak)
Research Campus STIMULATE (Prof. G. Rose)
metratec GmbH, Magdeburg (K. Dannen)
2tainment GmbH, Magdeburg (B. Ruzik)
Description

The introduction of 3D technology has considerably improved orientation, precision, and speed in laparoscopic surgery. It facilitates laparoscopic partial nephrectomy even for renal tumors in complicated positions. However, not every renal tumor is easily identifiable by its topography, for several reasons: for one thing, a renal tumor may not protrude beyond the parenchymal border; for another, the kidney is enclosed in a connective tissue capsule that is sometimes very difficult to dissect from the parenchyma.

At the same time, the main goal of tumor surgery is to completely remove the carcinomatous focus. Open surgery is therefore regularly performed for tumors that either do not protrude substantially from the parenchyma or show strong intraoperative adhesions to the renal capsule, as described above. With regard to treatment safety for the kidney, this technique yields essentially similar results; however, the larger incision entails significant disadvantages for the patients' quality of life.

In this project, we aim to develop an augmented reality approach in which cross-sectional images (MRI or CT) are fused with real-time 3D laparoscopic images. The research project aims to establish the insertion and identification of imaging-compatible markers as the basis for image-guided therapy.
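
The core of such marker-based fusion is a rigid registration between marker positions identified in the cross-sectional images and the same markers localized in laparoscope coordinates. The following Python sketch implements the standard least-squares solution (Kabsch algorithm); the marker setup itself is hypothetical.

```python
import numpy as np

def rigid_registration(source, target):
    """Least-squares rigid transform (Kabsch algorithm, no scaling) mapping
    marker positions from image space (source) onto the same markers
    localized in laparoscope space (target). Both: (N, 3) arrays of N >= 3
    corresponding points."""
    src_mean, tgt_mean = source.mean(axis=0), target.mean(axis=0)
    H = (source - src_mean).T @ (target - tgt_mean)
    U, _, Vt = np.linalg.svd(H)
    D = np.eye(3)
    D[2, 2] = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ D @ U.T
    t = tgt_mean - R @ src_mean
    return R, t        # a point p maps as R @ p + t
```

In practice, the residual error of the registered markers would be checked before the resulting overlay is trusted for guidance.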

Foot-Eye Interaction to Control Medical Software under Sterile Conditions

PI Prof. C. Hansen, Prof. L. Nacke
Project term 05/2017 - 04/2019
Funder International Research Program Grants (IRPG) of University of Waterloo, European Union
Partner University of Waterloo (Prof. L. Nacke)
Research Campus STIMULATE (Prof. G. Rose)
Description

The use of medical image data for interventional navigation support requires an increasing degree of interaction between surgeon and computer. At the same time, the sterile, narrow working space restricts the available input modalities. Delegating such tasks to assistants, as is common in everyday clinical practice, is error-prone and varies in effectiveness with the qualification and experience of the staff. Touchless interaction devices do give the surgeon the required direct interface, but operating software with them causes time-consuming interruptions to the main task.

This project investigates touchless input devices and human-machine interfaces, with a particular focus on improving the user experience (UX) of such interfaces. The goal is to develop an input system that draws on several modalities compatible with the requirements of the operating room.
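
As a minimal illustration of one such multimodal combination, the following Python sketch (hypothetical UI layout and thresholds) lets the gaze point hover over on-screen buttons while a foot switch confirms the selection, keeping the surgeon's hands in the sterile field.

```python
def gaze_foot_step(gaze_xy, foot_pressed, buttons, dwell, dwell_frames=15):
    """One frame of a hypothetical foot-eye interaction loop: the gaze
    point hovers over on-screen buttons, and a foot switch triggers the
    hovered one, so the surgeon's hands never leave the sterile field.
    buttons: {name: (x, y, w, h)}; dwell: mutable {name: frames hovered}."""
    hovered = None
    for name, (x, y, w, h) in buttons.items():
        if x <= gaze_xy[0] <= x + w and y <= gaze_xy[1] <= y + h:
            hovered = name
    for name in buttons:
        dwell[name] = dwell.get(name, 0) + 1 if name == hovered else 0
    # trigger only if the gaze rested on the button and the foot confirms
    if hovered and foot_pressed and dwell[hovered] >= dwell_frames:
        return hovered
    return None
```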

To fully examine the topic, a close collaboration with Prof. Dr. Lennart Nacke of the University of Waterloo (Ontario, Canada) is planned. Prof. Nacke is an expert in human-computer interaction and user experience and studies different input systems, specializing in physiological sensors and eye trackers.

3D Projections to Support Medical Training and Interventions

PI Prof. C. Hansen, C. Steinmann
Project term 04/2017 - 04/2020
Funder
Project number 1704/00038
Partner domeprojection.com, Magdeburg (C. Steinmann)
Hannover Medical School (Prof. F. Wacker)
Research Campus STIMULATE (Prof. G. Rose)
Description

Projection technology has developed rapidly over the last decade, driven by the advancing digitization of all areas of life and work. The ability to generate bright, large projections is already used in many areas, e.g., for simulation and training applications in the automotive and aircraft industries. High-quality multi-channel projections make it possible to augment the real environment with virtual objects without additional hardware (augmented reality), or even to replace it entirely (virtual reality).

In a joint project, which involves the company domeprojection.com® GmbH and the Research Campus STIMULATE of the Otto-von-Guericke University, we seek to investigate 3D projection views for the training and support of medical interventions, and to prepare their clinical application.

Based on a camera-supported 3D multi-projector system, new medical 3D visualization and interaction techniques will be investigated at Otto-von-Guericke University Magdeburg. This includes the development of new algorithms for rendering and visualization of virtual 3D objects, the evaluation and development of appropriate 3D interaction techniques, and the systematic evaluation of the developed techniques in medical application scenarios.

Home Training for the Treatment of Cognitive Disorders

PI Prof. C. Hansen, Prof. B. Preim
Project term 03/2017 - 02/2020
Funder
Project number 1704/00016
Partner Hasomed GmbH, Magdeburg (Dr. P. Weber)
University Hospital Leipzig (Dr. A. Thoene-Otto)
Research Campus STIMULATE (Prof. G. Rose)
Description

The cost pressure on rehabilitation hospitals results in stroke patients being discharged after 3-4 weeks and continuing therapy with occupational therapists and neuropsychologists in private practice. However, under current conditions, the treatment intensity necessary for efficient follow-up therapy is not ensured after discharge from the rehabilitation hospital. To achieve therapeutic effects, the initiated therapy must be continued with intensive and preferably daily training.

This research project aims at the development of a system for the home-based therapy of cognitive disorders in stroke patients. For this purpose, user interfaces with new interaction and visualization techniques shall be developed. Furthermore, studies shall validate whether reward and motivation techniques from computer games can be transferred to the new therapy software. One element of the motivation and reward strategy, for example, is a suitable visualization of the patient's performance data.

2D Map Displays to Support Neurosurgical Interventions

PI Prof. C. Hansen
Project term 03/2017 - 02/2018
Funder
Partner Surgical Planning Laboratory, Department of Radiology, Brigham and Women’s Hospital, Harvard Medical School, Boston (Prof. R. Kikinis)
Description

For the planning of complex surgical interventions, 3D models of the relevant anatomical and pathological structures are used. These models were primarily developed for preoperative planning. Due to their often very high geometric complexity, and the interpretation and interaction effort this imposes on the viewer, the potential of 3D models during surgical interventions can only be exploited to a limited extent.

During a 12-month research stay at the Surgical Planning Laboratory, Department of Radiology, Brigham and Women’s Hospital, Harvard Medical School, Boston, USA, this problem shall be analyzed in more detail for neurosurgical interventions. To this end, a new method for 2D map displays for navigation support during neurosurgical interventions shall be designed, developed, and evaluated. Algorithms that provide classified, weighted neurosurgical data for a 2D map display shall be explored. Based on these algorithms, a prototype for the visualization of relevant neurosurgical data in the form of a 2D map display shall be created.
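
As a simplified illustration of the underlying geometry (not the method developed in the project), the following Python sketch projects 3D structure positions into a 2D map plane perpendicular to the instrument axis, keeping the axial component as a depth value that a map display can encode by color or label.

```python
import numpy as np

def map_coordinates(points, tool_tip, tool_axis):
    """Project 3D structure positions into a 2D map plane perpendicular to
    the instrument axis. Returns (map_x, map_y, depth) per point, where
    depth is the distance along the axis."""
    z = np.asarray(tool_axis, float)
    z = z / np.linalg.norm(z)
    # build an orthonormal basis (u, v) spanning the map plane
    helper = (np.array([1.0, 0.0, 0.0]) if abs(z[0]) < 0.9
              else np.array([0.0, 1.0, 0.0]))
    u = np.cross(z, helper)
    u = u / np.linalg.norm(u)
    v = np.cross(z, u)
    rel = np.asarray(points, float) - np.asarray(tool_tip, float)
    return np.stack([rel @ u, rel @ v, rel @ z], axis=1)
```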

Automated Online Service for the Preparation of Patient-individual 3D Models to Support Therapy Decisions

PI Prof. C. Hansen, L. Dornheim
Project term 11/2016 - 01/2020
Funder
Project number 1604/00095
Partner Dornheim Medical Images GmbH, Magdeburg (L. Dornheim)
University Hospital Magdeburg (Prof. M. Schostak)
Research Campus STIMULATE (Prof. G. Rose)
Description

To provide hospitals with tools for the preparation of patient-individual 3D models of organs and pathological structures, an automated online service shall be developed in this research project in cooperation with the company Dornheim Medical Images. To this end, a clinical solution will be investigated using the example of oncological therapy of the prostate. In this context, the Computer-Assisted Surgery group is developing techniques for improved image segmentation and human-computer interaction.
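
The service pattern itself can be sketched as a small web API; the endpoints and the use of Flask below are illustrative assumptions, not the project's actual interface. A clinic uploads image data, receives a job ID, and polls until the 3D model is ready.

```python
import uuid
from flask import Flask, jsonify, request

app = Flask(__name__)
jobs = {}   # job_id -> status; a production service would use a task queue

@app.route("/segment", methods=["POST"])
def submit_case():
    """Accept uploaded image data and return a job id for polling."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"status": "queued", "data": request.get_data()}
    # enqueue_prostate_segmentation(job_id)  # hypothetical async worker
    return jsonify({"job_id": job_id}), 202

@app.route("/segment/<job_id>", methods=["GET"])
def job_status(job_id):
    """Report whether the 3D model for this case is ready."""
    job = jobs.get(job_id)
    if job is None:
        return jsonify({"error": "unknown job"}), 404
    return jsonify({"status": job["status"]})
```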

Improving Spatial Perception for Medical Augmented Reality using Illustrative Rendering and Auditory Display

PI Prof. C. Hansen, Jun.-Prof. K. Lawonn
Project term 02/2016 - 01/2019
Funder
Partner

University Koblenz-Landau (Jun.-Prof. K. Lawonn)
Hannover Medical School (Prof. F. Wacker)
Technical University of Berlin (Prof. D. Manzey)

Description

One of the most common perceptual problems in medical AR is incorrect spatial interpretation. To address this challenge, we propose to investigate methods that encode spatial information using illustrative visualization techniques and auditory display. We plan to focus our research on projector-based and multilayer AR representations that do not demand special displays such as monitors, head-mounted displays, or hand-held devices. Our scenario assumes no stereoscopic view; hence, we investigate monoscopic and static representations of 3D medical image data.

We plan to develop new methods for AR distance encoding using illustrative shadows and glyphs, techniques for contextual adaptive shape illustration, and an auditory display for spatial encoding. The methods will be evaluated under lab conditions in clinically oriented user studies. Our aim is to determine whether the proposed methods provide sufficient information for instrument guidance and whether they provide additional benefits during image-guided interventions compared to existing approaches. We want to investigate how auditory display could facilitate image-guided interventions (compared to visual-only navigation), and whether a combined modality would be beneficial for clinical users.
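
As a minimal illustration of an auditory distance encoding, the following Python sketch (hypothetical parameter ranges) maps the instrument-to-target distance to beep pitch and rate, similar in spirit to a parking aid.

```python
def distance_to_audio(distance_mm, d_max=50.0):
    """Map the instrument-to-target distance to beep parameters: closer
    means higher-pitched, faster beeps.
    Returns (frequency in Hz, inter-beep interval in seconds)."""
    x = max(0.0, min(distance_mm / d_max, 1.0))    # normalize into [0, 1]
    frequency = 440.0 + (1.0 - x) * 440.0          # 440 Hz far, 880 Hz on target
    interval = 0.1 + 0.9 * x                       # 1.0 s far, 0.1 s on target
    return frequency, interval
```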

The results of our project should provide new insights into encoding spatial information in medical AR. These insights could be used to reduce incorrect spatial interpretation in medical AR, to enhance established AR visualization methods, and to help medical experts reduce risks during image-guided interventions.

Navigated Thermoablation of Liver Metastases in the MR

PI Prof. C. Hansen, Prof. G. Rose
Project term 02/2015 - 12/2019
Funder

Partner

Hannover Medical School (Prof. F. Wacker)
Fraunhofer MEVIS, Bremen (Dr. C. Rieder)
Siemens Healthineers, Erlangen (Dr. J. Reiß)
Research Campus STIMULATE (Prof. G. Rose)

Description

This project of the STIMULATE research campus investigates an MR-compatible navigation system for MR image-guided thermoablation of liver metastases. Central contributions are methods for improved navigation under MR imaging, especially for the intra-interventional adjustment of prospectively acquired planning data. The navigation system shall be operable via a projector-camera system that is to be developed in this project.

Navigated Thermoablation of Spine Metastases

PI Prof. C. Hansen, Prof. G. Rose
Project term 01/2015 - 12/2019
Funder

Partner University Hospital Magdeburg (Prof. M. Skalej)
Fraunhofer MEVIS, Bremen (Dr. C. Rieder)
Fraunhofer IFF, Magdeburg (Prof. Dr. N. Elkmann)
metratec GmbH, Magdeburg (Klaas Dannen)
CAScination AG, Bern (Dr. M. Peterhans)
Siemens Healthineers, Erlangen (Dr. J. Reiß)
Research Campus STIMULATE (Prof. G. Rose)
Description

At the center of this STIMULATE research campus project is the investigation of a radio-based navigation system to support percutaneous thermoablation. The navigation system shall be used and evaluated in the context of navigated spine interventions, especially the treatment of spine metastases, with the aid of the Artis zeego angiography system.

AngioNav: Planning of Vascular Interventions

PI Prof. C. Hansen
Project term 04/2014 - 08/2016
Funder ARTORG Center for Biomedical Engineering Research, Bern
Description

Interventions in radiology are often conducted via the patient's vascular system, e.g., to treat vascular diseases or to place a therapeutic agent at a specific site in the body. This project aimed at the development of a software assistant for the planning of vascular interventions. To this end, methods for the interactive segmentation of complex vessel systems were investigated.
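
A classic building block for interactive vessel segmentation is region growing from a user-placed seed point. The following Python sketch (hypothetical intensity tolerance, 6-connectivity) illustrates the idea on a 3D intensity volume.

```python
import numpy as np
from collections import deque

def region_grow(volume, seed, tolerance=100.0):
    """Flood-fill all voxels 6-connected to a user-placed seed whose
    intensity stays within `tolerance` of the seed intensity (e.g. the
    contrast-enhanced vessel lumen in a CT angiography volume)."""
    seed = tuple(seed)
    reference = float(volume[seed])
    mask = np.zeros(volume.shape, dtype=bool)
    queue = deque([seed])
    neighbors = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                 (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        p = queue.popleft()
        if mask[p] or abs(float(volume[p]) - reference) > tolerance:
            continue
        mask[p] = True
        for d in neighbors:
            q = tuple(p[i] + d[i] for i in range(3))
            if all(0 <= q[i] < volume.shape[i] for i in range(3)) and not mask[q]:
                queue.append(q)
    return mask
```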