Hi, I am Miguel. My research interests lie in developing Signal and Image Processing techniques in the context of Human-Robot Interaction, Wearables in Medicine, Ultrasound-Guided Interventions, Medical Robotics, and Research Software Engineering.
I am currently a Research Associate at King's College London in the School of Biomedical Engineering and Imaging Sciences, where, in the group of Tom Vercauteren, Professor of Interventional Image Computing, I am pushing forward the state of the art in ultrasound-guided procedures and making scientific contributions to new algorithms, software and hardware. Prior to that, I was awarded a Ph.D. in Computer Engineering from the University of Birmingham in July 2019, for which I investigated nonlinear analysis to quantify movement variability in human-humanoid interaction under the supervision of Chris Baber, Professor of Pervasive and Ubiquitous Computing, and Martin Russell, Professor of Information Engineering.
I have 20 years' experience in human-robot interaction, electronics, mechatronics and signal processing, along with 12 years' experience as a teaching associate in Mechatronic and Computer Engineering. I am passionate about Open Science, which is why I open-source all my contributions to knowledge on GitHub at @mxochicale. I also tweet and retweet about Robotics, Chaos, AI and Brains at @_mxochicale, I write in my blog, and in my free time I am trying to improve my juggling skills.
- 2021-03-08: Presented "AIR4Children: Artificial Intelligence and Robotics for Children" at HRI2021
- 2021-01-07: King's Public Engagement Award for FETUS: Finding a fETus with an Ultrasound Simulator
- 2020-10-15: Alexander Mitton, my first supervised Master's student at BMEIS, won the Outstanding Individual Project award
- 2020-09-11: Presented "open-corTeX: A continuous integration framework for open scientific communication" at RRTS20
- 2020-04-14: King's Health Partners grant for the project "Sensory system abnormalities in childhood dystonia", led by Verity McClelland and in collaboration with Carlos Seneci
- Montenegro, R., Corona, E., Perez-Badillo, D., Cruz, D. & Xochicale, M. (2021). AIR4Children: Artificial Intelligence and Robotics for Children. HRI2021 Proceedings of the 16th Annual Conference on Human-Robot Interaction. doi-to-be-published.
- Xochicale, M. & Baber, C. (2020). Nonlinear Methods to Quantify Movement Variability in Human-Humanoid Interaction Activities. arXiv. 1810.09249.
- Xochicale, M. & Baber, C. (2017). Towards the Analysis of Movement Variability in Human-Humanoid Imitation Activities. HAI17 Proceedings of the 5th International Conference on Human Agent Interaction. 10.1145/3125739.3132595.
- Xochicale, M., Baber, C. & Oussalah, M. (2017). Towards the Quantification of Human-Robot Imitation Using Wearable Inertial Sensors. HRI2017 Proceedings of the 12th Annual Conference on Human-Robot Interaction. 10.1145/3029798.3038320.
- Xochicale, M., Baber, C. & Oussalah, M. (2016). Analysis of the Movement Variability in Dance Activities using Wearable Sensors. WeRob2016 Proceedings of the International Symposium on Wearable Robotics. 10.1007/978-3-319-46532-6_25.
- Xochicale, M., Baber, C. & Oussalah, M. (2016). Understanding Movement Variability of Simplistic Gestures Using an Inertial Sensor. PerDis '16 Proceedings of the 5th ACM International Symposium on Pervasive Displays. 10.1145/2914920.2940337.
This section is always under construction, but these are a few of the active projects I am working on: (a) a simulator of ultrasound-guided needle insertion for public engagement events [ ], (b) a low-power vibrational stimulator to study evoked cortical responses, and (c) in my free time, "AIR4Children: Artificial Intelligence and Robotics for Children" [ ]. You might also like to have a look at some of my previous projects: demos of human-robot interaction and others.