Movement Analysis

Contents:

Movement is often analysed by human observation, for example by medical doctors examining their patients. Alternatively, optical motion capture systems can track the movements of the body very accurately, but they require a complicated setup in a clinical or laboratory setting. However, patients move differently when they know a medical doctor is watching or when their movements are tracked in the lab. Therefore, we want to analyse pathological movement patterns in the natural environment of the patient.

In recent years there has been increasing interest in using wearable sensors, such as inertial measurement units (IMUs), to analyse movement patterns. IMUs measure acceleration and angular velocity in three dimensions and can collect data continuously in an unobtrusive and objective manner: the patient only has to wear one or more IMUs on the body throughout the day.
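
As a rough illustration of what such a continuous recording looks like in practice, the following sketch represents a 6-DOF IMU signal as a NumPy array with one row per sample. The column order, the 100 Hz sample rate, and the simulated data are assumptions for illustration only and do not describe the specific sensors or data formats used in this research.

```python
import numpy as np

# Hypothetical layout of a continuous 6-DOF IMU recording: one row per
# sample, columns = [ax, ay, az, gx, gy, gz] (acceleration in m/s^2,
# angular velocity in deg/s). The 100 Hz sample rate is assumed for
# illustration only.
FS_HZ = 100.0

# Simulated one-minute recording standing in for real sensor data.
rng = np.random.default_rng(seed=0)
imu = rng.normal(0.0, 0.1, size=(int(60 * FS_HZ), 6))
imu[:, 2] += 9.81  # gravity dominates the vertical acceleration axis at rest

# The acceleration magnitude is an orientation-independent signal that is
# often used as a starting point for gait and activity analysis.
acc_magnitude = np.linalg.norm(imu[:, :3], axis=1)
print(acc_magnitude.shape, round(acc_magnitude.mean(), 2))
```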

IMU data is, however, quite sensitive to noise, so several processing steps have to be applied before the signals can be analysed. The preprocessed data can then be used, for example, for activity recognition, for classification of different diseases, or for estimation of disease severity. This helps medical staff to understand and quantify the pathological movement patterns of patients, and the resulting information can support diagnosis, monitoring of disease progression, and evaluation of treatment effects.
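
As a hedged example of one such processing step, the sketch below applies a zero-phase low-pass Butterworth filter (SciPy) to suppress high-frequency noise in IMU channels. The 3 Hz cutoff, fourth-order design, and simulated input are illustrative assumptions; the actual preprocessing pipeline used in this research may differ.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass(signal, fs_hz, cutoff_hz=3.0, order=4):
    """Zero-phase low-pass filter applied column-wise to IMU channels.

    The 3 Hz cutoff and fourth-order Butterworth design are illustrative
    assumptions, not the preprocessing actually used in this research.
    """
    b, a = butter(order, cutoff_hz, btype="low", fs=fs_hz)
    return filtfilt(b, a, signal, axis=0)

# Example: smooth a simulated, noisy 100 Hz accelerometer channel.
fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
noise = 0.3 * np.random.default_rng(1).normal(size=t.size)
noisy = np.sin(2 * np.pi * 1.0 * t) + noise
clean = lowpass(noisy[:, np.newaxis], fs)
print(noisy.std(), clean.std())
```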

This research is performed in close cooperation with the Neurogeriatrics group of Prof. Dr. med. Walter Maetzler.

Corresponding Publications:

E. Warmerdam, M. H. Pham, C. Hansen, W. Maetzler: P 130—The Influence of Different Settings on Accuracies of Gait Algorithms, Poster, Gait and Posture, 2018

M. H. Pham, E. Warmerdam, M. Elshehabi, C. Schlenstedt, L. Bergeest, M. Heller, L. Haertner, J. J. Ferreira, D. Berg, G. Schmidt, C. Hansen, W. Maetzler: Validation of a Lower Back “Wearable”-Based Sit-to-Stand and Stand-to-Sit Algorithm for Patients With Parkinson’s Disease and Older Adults in a Home-Like Environment, Front. Neurol., 2018 (doi: 10.3389/fneur.2018.00652)

L. Haertner, M. Elshehabi, L. Zaunbrecher, M. H. Pham, C. Maetzler, J. M. T. van Uem, M. A. Hobert, S. Hucker, S. Nussbaum, D. Berg, I. Liepelt-Scarfone, W. Maetzler: Effect of Fear of Falling on Turning Performance in Parkinson’s Disease in the Lab and at Home, Front. Aging Neurosci. 10:78, 2018 (doi: 10.3389/fnagi.2018.00078)

M. H. Pham, M. Elshehabi, L. Haertner, S. D. Din, K. Srulijes, T. Heger, M. Synofzik, M. A. Hobert, G. S. Faber, C. Hansen, D. Salkovic, J. J. Ferreira, D. Berg, A. Sanchez-Ferro, J. H. van Dieën, C. Becker, L. Rochester, G. Schmidt, and W. Maetzler: Validation of a Step Detection Algorithm during Straight Walking and Turning in Patients with Parkinson’s Disease and Older Adults Using an Inertial Measurement Unit at the Lower Back, Front. Neurol. 8:457, 2017 (doi: 10.3389/fneur.2017.00457)

M. H. Pham, M. Elshehabi, W. Maetzler: Validation of Gait Detection and Analysis, Proc. EMBC, Jeju, Korea, 2017

M. H. Pham, M. Elshehabi, L. Haertner, T. Heger, M. A. Hobert, G. S. Faber, D. Salkovic, J. J. Ferreira, D. Berg, A. Sanchez-Ferro, J. H. van Dieën, W. Maetzler: Algorithm for Turning Detection and Analysis Validated under Home-Like Conditions in Patients with Parkinson’s Disease and Older Adults using a 6 Degree-of-Freedom Inertial Measurement Unit at the Lower Back, Front. Neurol. 8:135, 2017 (doi: 10.3389/fneur.2017.00135)