The 3rd Workshop on

NeuroDesign in Human-Robot Interaction

The making of engaging HRI technology your brain can’t resist.

Full-day HYBRID workshop as part of the IEEE International Conference on Robotics and Automation (ICRA 2025)

May 19, 2025

08:30 - 17:30





Our Mission

Bring Human-Robot Interaction Research
to the Real World with
Brain-Centered Experience

Statement

Problem

  • Current robots interacting with humans do not understand personalized intent, attention, specific needs, or emotions well enough to serve people appropriately in different contexts.
  • The design of interactive robot behaviors does not match human intuition, user expectations, or social norms, so people have a hard time interpreting the robots' objectives.
  • Humans and robots lack the mutual understanding needed to perform coordinated, co-adaptive joint actions in close-contact, close-proximity, or teleoperated settings.
  • Human-machine interfaces are not ergonomically designed, and the AI algorithms and embodied intelligence behind them lack the cognitive smoothness people need. As a result, humans and machines cannot communicate naturally and effortlessly.
  • Academic HRI research has difficulty entering consumer markets and making immediate practical impact.

Solution

  • Reinventing human-robot interaction by working with scientists, entrepreneurs, and end-users to define, prove, innovate, and scale need-driven, value-based HRI innovations.
  • Placing brain-centered experience at the center of design to compensate, augment, and support people, our societies, and our harmony with nature, and to improve quality of life everywhere.
  • Investigating the characteristics of human behavioral psychology in user experiments, and building on that foundation to apply fundamental neuroscience principles of sensing, cognition, planning/decision making, and motor control when tuning HRI interaction dynamics and designing bio-compatible human-robot interaction devices, algorithms, theories, and strategies.
  • Combining multidisciplinary domain expertise, diverse mindsets, and market-driven research funding to shorten the time-to-market of traditional lab research across the entire innovation process.

Approach

  • Optimize Cognitive Load of HRI
  • Emotion-Aware Interactions
  • Intuitive Human-Machine Channels
  • AI-Enhanced Brain-Centered Experience
  • Design the Innovation Pipeline through the Lens of Entrepreneurship

Invited Speakers

Jeff Krichmar

Professor
Department of Cognitive Sciences and Computer Science
University of California, Irvine (UCI)

Alessandra Sciutti

Tenure Track Researcher, Head
COgNiTive Architecture for Collaborative Technologies
Italian Institute of Technology (IIT)

Kyle Yoshida

Postdoctoral Fellow
MME, Washington State University
Incoming Assistant Professor, MAE, UCLA

Minoru "Shino" Shinohara

Associate Professor, Director
Human Neuromuscular Physiology Lab, Biological Sciences
Georgia Institute of Technology (Georgia Tech)

Kinsey Herrin

Senior Research Scientist
School of Mechanical Engineering
Georgia Institute of Technology (Georgia Tech)

Sean Montgomery

Founder, EmotiBit
Founder and Director of Engineering
Connected Future Labs

Maegan Tucker

Assistant Professor, PI
Dynamic Mobility (Dynamo) Lab, ECE, ME
Georgia Institute of Technology (Georgia Tech)

Stefano Caggiano

Product Design Program Lead
School of Design
Istituto Marangoni

Philipp Beckerle

Professor, Head of Chair
Chair of Autonomous Systems and Mechatronics, EE
FAU Erlangen-Nürnberg

Matteo Bianchi

Associate Professor
Information Engineering, Research Centre “E. Piaggio”
University of Pisa (UniPi)

Kasahara Shunichi

Project Leader, Researcher, Engineer
Sony Computer Science Laboratories, Inc. (Sony CSL)
Okinawa Institute of Science and Technology (OIST)

All Stakeholders

Involved for Discussion

Event Schedule

08:30
Set Up
Welcome and greetings for all participants

08:45
Opening Remarks
Organizers

09:00
Neurorobotic Design Principles and How It Relates to HRI
Jeff Krichmar, University of California, Irvine, USA

09:25
Neuroscience and Cognitive Robotics for Better HRI
Alessandra Sciutti, Italian Institute of Technology, Italy

09:50
Human-Inspired Computational and Embodied Intelligence to Enrich Touch-Mediated HRI
Matteo Bianchi, University of Pisa, Italy

10:05
Coffee Break (Poster and NeuroDesign Showcase Demo)

10:20
Evaluating the Effect of Transcutaneous Vagus Nerve Stimulation During Voluntary Movement
Minoru "Shino" Shinohara, Georgia Institute of Technology, USA

10:45
Neuroergonomics in Human-Centered Robot Design and Evaluation
Philipp Beckerle, FAU Erlangen-Nürnberg, Germany

11:10
Collaborative Approaches to Haptics and Soft Robotics Research
Kyle Yoshida, Washington State University, USA

11:35
Cybernetic Humanity: Exploring the New Humanity Emerging from the Integration of Humans and Computers
Kasahara Shunichi, Sony Computer Science Laboratories, Japan

12:00
Lunch Break (Poster and NeuroDesign Showcase Demo)

13:00
From Lab to Life: Lessons Learned toward Embodied Robotic Exoskeletons and Prostheses through Patient-Centered Experimentation
Kinsey Herrin, Georgia Institute of Technology, USA

13:25
Designing Adaptive and User-Centered Bipedal Assistive Devices: Leveraging Hybrid Systems and Preference-Based Learning in Human-Robot Interaction
Maegan Tucker, Georgia Institute of Technology, USA

13:50
Product Design in Human-Robot Interaction
Stefano Caggiano, Istituto Marangoni, Italy

14:15
Coffee Break (Poster and NeuroDesign Showcase Demo)

14:30
Student On-site Innovation
Student "break-out" discussion session (all participants mixed with invited speakers)

15:00
NeuroDesign Showcase Competition
Competition teams

16:00
Panel Discussion
All experts

17:00
Award Ceremony and Networking
All participants

NeuroDesign EXPO & Competition

NeuroDesign in HRI Student Showcase Competition

Call for Submission

NeuroDesign EXPO in HRI Student Showcase Competition (USD 1,800 in prizes)
2nd Workshop on NeuroDesign in Human-Robot Interaction: The making of engaging HRI technology your brain can’t resist

IEEE International Conference on Robotics and Automation (ICRA 2024)
Yokohama, Japan
May 17, 2024
https://neurodesign-in-hri.webflow.io/


Dear Colleagues,

Are you working on a human-interactive robot project that already has prototypes or some initial research findings? Perhaps you're pondering how to evolve these into more human-centric, real-world applications with a seamless, intuitive "brain"-centered experience, aiming to connect robots more deeply with our bodies, minds, and souls.

Worry not! We're excited to announce the NeuroDesign EXPO in HRI Showcase Competition at ICRA 2024, in conjunction with the NeuroDesign in HRI workshop (https://neurodesign-in-hri.webflow.io/). This event will feature a panel of world-renowned experts from diverse fields such as Human-Robot Interaction (HRI), Artificial Intelligence (AI), Cognitive Neuroscience, Social and Behavioral Psychology, Art & Design, RoboEthics, and the Startup community. These professionals will offer live, integrated-perspective feedback and recommendations to help refine your projects into more impactful research and commercial products.
We invite students at the BSc, MSc, and PhD levels to submit their projects. Submissions can fall into, but are not limited to, the following categories:

Affective Computing
Social and Service Robot
Industrial Collaborative Robot
Wearable Robot/Device
Brain-Machine/Computer Interface
Haptics and Tele-operation
Soft Robotics
VR/AR & Metaverse
Cyborg and Bionic System
Healthcare Robotics
Exoskeleton and Rehabilitation Robot
LLMs and Foundation Models for Human-Robot Interaction
Brain Dynamics and Psychology for Cognitive and Physical HRI (cHRI/pHRI)
Human-Drone/AutoVehicle Interaction
Assistive Technology
Intelligence Augmentation and Human 2.0 Technologies
Supernumerary Limbs
Biometric Information Processing
Pervasive-Ubiquitous/Spatial Computing
Smart Home/Internet of Things (IoT)
Edge-Fog-Cloud Computing
Speech/Gesture Recognition or Image/Audio Processing
Big Data, AI & Machine Learning for Multimodal Interaction
Smart Tutoring/Chatbot System
RoboEthics
RoboFashion, Clothing and Robot Skin
Diversity, Equity & Inclusion (DEI) for HRI Technologies
 
 
Participation Procedure:
Our selection process is on a rolling basis, and we aim to choose 10 projects for the final on-stage pitch presentation. We especially encourage those who have already submitted their work as posters or papers to ICRA 2024 or have publications elsewhere to participate in our event. This competition offers a fantastic opportunity to increase the visibility of your research globally.

Submission Options:

Submissions can be made in any of the following formats:
1. A concise 100-word abstract and a 1-2 minute video, offering a brief yet engaging overview.
2. A 100-word abstract accompanied by 5 detailed slides for a short but thorough presentation.
3. A 2-page extended abstract (please follow the IEEE ICRA format: https://ras.papercept.net/conferences/support/word.php) for a more in-depth submission.
4. Any combination of the above formats is also welcome.


Final Round and Presentation:

The selected 10 projects will each have a 5-minute pitch presentation on stage during the final round. Alternatively, you may submit a polished, pre-recorded 5-minute video presentation, which we will play on stage.

Exhibition and Virtual Participation:

All submitted projects will receive a dedicated booth for poster and prototype demonstrations.
The event is designed to be "Hybrid" to ensure that everyone has the opportunity to participate, regardless of their ability to travel to Japan.

Awards:

We are thrilled to present two distinguished award categories at the competition. The "Best Innovation in HRI NeuroDesign Award" will be given to three outstanding projects that exemplify groundbreaking innovation within Human-Robot Interaction NeuroDesign; the First Prize is USD 1,000 and the Second Prize is USD 800 toward prototyping. Additionally, the "Most Popular Project in HRI NeuroDesign Award" will be given to two projects that capture the hearts of our workshop attendees and audience, as determined by popular vote; its First Prize is a full fee waiver for a submission to the journal Frontiers in Robotics and AI. Winners in both categories will receive certificates acknowledging their achievements.

Timeline:

- Submission Deadline: Entries must be submitted by May 1, 2024. Please note that our selection process is rolling, so early submissions are encouraged.
- Announcement of Final Project Teams: The teams selected for the final round will be announced on May 3, 2024.
- Competition Date: The competition will take place on May 17, 2024, where finalists will present their projects to the panel and attendees.

Submission Website:
https://forms.gle/fQMxJtkXb8JEU2WR6

If you have any questions, please don't hesitate to reach out. We're looking forward to seeing you at ICRA 2024!

Best regards,

Organizing Committee
2nd Workshop on NeuroDesign in Human-Robot Interaction: The making of engaging HRI technology your brain can’t resist
https://neurodesign-in-hri.webflow.io/

Register or submit a contribution!
Pre-Register
Free

In recent years, there has been a rapid rise in robotics innovation around the globe. It has been driven largely by labor shortages in low-level, dirty, dull, dangerous, and repetitive or tiring jobs, such as manufacturing, agriculture, the food industry, infrastructure construction, and autonomous vehicles, where robots can provide faster, more precise, safer, and more reliable task performance, working long hours without breaks, compared to their human counterparts. Over the past few years, the worldwide pandemic pushed demand even further, with robots replacing frontline healthcare workers, nurses, and physicians to avoid bodily contact and mitigate dangerous virus infection and transmission. All of these examples contributed to vast innovation in robot automation, which excels when the robot can work alone without human intervention, separated from our living environment, without much worry about harming people nearby. However, when robots come into our homes and hospitals, or into environments that require tight human-robot interaction (HRI), safety issues and uncertain human factors make the presumed technical assumptions falter, and the development processes and business models fail. Representative examples can be found in the recent shutdowns of several well-known startup companies, including Rethink Robotics, Jibo, and Anki, all of which were developing forefront human-robot interaction solutions.

We surmise that HRI innovation and the commercialization of HRI products present unique challenges that are typically not encountered in other industries. With the constantly increasing demand for HRI technologies that compensate, augment, and empower human capabilities, we need to seriously address the fundamental flaws in how HRI technologies are developed, and translate traditional "ivory tower" lab research into real-world applications and consumer products more fluently. In this competition, we intend to find the best projects that exemplify the NeuroDesign innovation principles used to identify, invent, and implement HRI technologies, providing practical guidance for quickly bringing human-robot interaction lab research to bear on real-world problems.


Team 1: Observational Error Related Negativity for Trust Evaluation in Human Swarm Interaction
Joseph P. Distefano (University at Buffalo)

This groundbreaking study marks the first exploration of observational error-related negativity (oERN) as a pivotal indicator of human trust within the paradigm of human-swarm teaming, while also examining the nuanced impact of individual differences between experts and novices. In this Institutional Review Board (IRB)-approved experiment, human operators' physiological information is recorded while they take a supervisory control role to interact with multiple swarms of robotic agents that are either compliant or non-compliant. The analysis of event-related potentials during non-compliant actions revealed distinct oERN and error positivity (Pe) components localized within the frontal cortex.

Extended Abstract | Short Video | Poster

Team 2: Novel Intuitive BCI Paradigms for Decoding Manipulation Intention - Imagined Interaction with Robots
Matthys Du Toit (University of Bath)

Human-robot interfaces lack intuitive designs, especially BCIs that rely on single-body-part activation for motor imagery. This research proposes a novel approach: decoding manipulation intent directly from imagined interaction with robotic arms. EEG signals were recorded from 10 subjects performing motor execution, visual perception, motor imagery, and imagery during perception while interacting with a 6-DoF robotic arm. State-of-the-art classification models achieved average accuracies of 89% (motor execution), 94.9% (visual perception), and 73.2% (motor imagery), with a highest motor-imagery classification accuracy of 83.2%, demonstrating the feasibility of decoding manipulation intent from imagined interaction. The research invites more intuitive BCI designs through improved human-robot interface paradigms.

Short Video | Poster
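
As a hedged illustration of the kind of pipeline such decoding work typically builds on (not Team 2's actual models), the sketch below runs a standard CSP-plus-LDA motor-imagery baseline from MNE-Python and scikit-learn on synthetic placeholder data; the channel count, window length, and labels are assumptions.

```python
# Illustrative baseline only -- not Team 2's actual models. A common EEG
# decoding pipeline: band-power features via Common Spatial Patterns (CSP)
# followed by a linear classifier, evaluated with cross-validation.
import numpy as np
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)

# Placeholder data standing in for epoched EEG: (trials, channels, samples),
# e.g. 2-second windows at 250 Hz; labels = imagined "reach" vs "rest".
X = rng.standard_normal((120, 32, 500))
y = rng.integers(0, 2, size=120)

clf = Pipeline([
    ("csp", CSP(n_components=4, log=True)),   # spatial filters -> log-variance features
    ("lda", LinearDiscriminantAnalysis()),    # linear decision boundary
])

scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f} (chance ~0.50 on random data)")
```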

Team 3: Multimodal Emotion Recognition for Human-Robot Interaction
Farshad Safavi (University of Maryland, Baltimore County)

Our project is a multimodal emotion recognition system that enhances human-robot interaction by controlling a robotic arm through detected emotions, using facial expressions and EEG signals. Our prototype adjusts the robotic arm's speed based on the emotions detected from these signals. Two experiments demonstrate our approach: one shows the arm's response to facial cues, speeding up when it detects happiness and slowing down for negative emotions such as anger. The other video illustrates control via EEG, adjusting speed based on the user's relaxation level. Our goal is to integrate emotion recognition into robotic applications, developing emotionally aware robots.

Intro Slides | Short Video | Poster
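
For readers curious how a detected emotion might modulate arm speed, here is a minimal, hypothetical sketch (not Team 3's implementation): a lookup from emotion labels to a speed scaling factor applied through a placeholder RobotArm interface; all names and numbers are illustrative assumptions.

```python
# Illustrative sketch only -- not Team 3's implementation. Maps a detected
# emotion label (e.g. from a facial-expression or EEG classifier) to a speed
# scaling factor for a robot arm. `RobotArm` is a hypothetical placeholder
# for whatever arm/driver API is actually used.
from dataclasses import dataclass

# Assumed valence-based scaling: positive emotions speed the arm up,
# negative ones slow it down, neutral leaves it unchanged.
EMOTION_TO_SPEED_SCALE = {
    "happy": 1.25,
    "neutral": 1.00,
    "sad": 0.75,
    "angry": 0.50,
}

@dataclass
class RobotArm:                      # hypothetical stand-in for a real arm interface
    base_speed: float = 0.20         # m/s, assumed nominal end-effector speed

    def set_speed(self, speed: float) -> None:
        print(f"[arm] commanded speed: {speed:.3f} m/s")

def update_arm_speed(arm: RobotArm, emotion: str) -> None:
    """Clamp and apply an emotion-dependent speed command."""
    scale = EMOTION_TO_SPEED_SCALE.get(emotion, 1.0)
    arm.set_speed(max(0.05, min(arm.base_speed * scale, 0.5)))

if __name__ == "__main__":
    arm = RobotArm()
    for detected in ["happy", "angry", "neutral"]:
        update_arm_speed(arm, detected)
```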

Team 4: Learning Hand Gestures using Synergies in a Humanoid Robot
Parthan Olikkal (University of Maryland, Baltimore County)

Hand gestures, integral to human communication, hold potential for optimizing human-robot collaboration. Researchers have explored replicating human hand control through synergies. This work proposes a novel method: extracting kinematic synergies from hand gestures via a single RGB camera. Real-time gestures are captured through MediaPipe and converted to joint velocities. Applying dimensionality reduction yields kinematic synergies, which can be used to reconstruct gestures. Applied to the humanoid robot Mitra, results demonstrate efficient gesture control with minimal synergies. This approach surpasses contemporary methods, offering promise for near-natural human-robot collaboration. Its implications extend to robotics and prosthetics, enhancing interaction and functionality.

Intro Slides | Short Video | Poster
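
As an illustrative sketch only (not Team 4's pipeline), the code below captures hand landmarks with MediaPipe Hands, differences them into rough joint velocities, and applies PCA as a stand-in for kinematic synergy extraction; the camera index, frame count, and number of synergies are assumptions.

```python
# Illustrative sketch only -- not Team 4's pipeline. Captures hand landmarks
# with MediaPipe Hands from a webcam, stacks them over time, and applies PCA
# as a simple stand-in for kinematic synergy extraction.
import cv2
import mediapipe as mp
import numpy as np
from sklearn.decomposition import PCA

def record_landmarks(n_frames: int = 200, camera_index: int = 0) -> np.ndarray:
    """Return an (n_frames, 63) array of flattened 21 x (x, y, z) landmarks."""
    cap = cv2.VideoCapture(camera_index)
    hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.5)
    rows = []
    while len(rows) < n_frames:
        ok, frame = cap.read()
        if not ok:
            break
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            lm = result.multi_hand_landmarks[0].landmark
            rows.append([c for p in lm for c in (p.x, p.y, p.z)])
    cap.release()
    hands.close()
    return np.asarray(rows)

if __name__ == "__main__":
    poses = record_landmarks()
    if len(poses) < 10:
        raise SystemExit("Not enough frames with a detected hand; check the camera.")
    # Finite-difference "joint velocities", then a low-dimensional synergy basis.
    velocities = np.diff(poses, axis=0)
    pca = PCA(n_components=3)                 # assume 3 synergies for illustration
    weights = pca.fit_transform(velocities)   # per-frame synergy activations
    print("explained variance:", np.round(pca.explained_variance_ratio_, 3))
    # Gestures can be approximated by re-projecting: pca.inverse_transform(weights)
```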

Team 5: A Wearable, Multi-Channel, Parameter-Adjustable Functional Electrical Stimulation System for Controlling Individual Finger Movements
Zeyu Cai (University of Bath)

As the survival rate of patients with stroke and spinal cord injury rises, post-operative movement dysfunction has become a growing concern. Hand dysfunction in particular seriously impairs patients' quality of life and ability to care for themselves. Recently, many studies have demonstrated the effectiveness of functional electrical stimulation (FES) in the rehabilitation of upper-limb motor function, and FES is more effective than traditional treatment methods. However, existing FES studies for the hand place the electrodes on the forearm, which does not allow full control of individual movements of single fingers. In this study, an electrode glove was designed to place the electrodes on the hand itself, which achieves this goal. Furthermore, whereas existing FES systems are large, the system developed in this study is lightweight and can be made wearable. In summary, this study aimed to develop a novel, wearable functional electrical stimulation system for the hand that can adjust stimulation parameters and, with an electrode glove, control the individual movements of single fingers, providing a personalized rehabilitation approach.

Extended Abstract | Intro Slides | Poster

Team 6: Enhancing MI-BCI Training with Human-Robot Interaction Through Competitive Music-Based Games
Alessio Palatella (University of Padova)

Motor Imagery Brain-Machine Interfaces (MI-BMIs) interpret users' motor imagination to control devices, bypassing traditional output channels such as muscles. However, achieving proficiency with MI-BMIs demands significant time and effort, especially for novices. To address this, we propose a novel MI-BMI training method using human-robot interaction via rhythmic, music-based video games and NAO robots. Our experimental setup involves a rhythm game connected to a real NAO robot via a BMI. EEG signals are processed using a CNN-based decoder. Despite data limitations, our approach demonstrates promising control capabilities, highlighting the potential of combining MI-BMIs with robotics for intuitive human-robot interaction and an enhanced user experience.

Extended Abstract | Intro Slides | Short Video | Poster

Team 7: Enhancing Synergy - The Transformative Power of AR Feedback in Human-Robot Collaboration
Akhil Ajikumar (Northeastern University)

Through this paper, we introduce a novel Augmented Reality (AR) system to enable intuitive and effective communication between humans and collaborative robots in a shared workspace. Using multimodal interaction data such as gaze, speech, and hand gestures, captured through a head-mounted AR device, we explore the system's impact on task efficiency, communication clarity, and user trust in the robot. We validated this in an experiment based on a gearbox assembly task, which showed a significant preference among users for the gaze and speech modalities and further revealed a notable improvement in task completion time, reduced errors, and increased trust among users. These findings show the potential of AR systems to enhance the experience of human-robot teamwork by providing immersive, real-time feedback and intuitive communication interfaces.

Extended Abstract | Poster

Team 8: EEG Movement Detection for Robotic Arm Control
Daniele Lozzi (University of L'Aquila)

This research introduces a novel approach to the construction of an online BCI dedicated to the classification of motor execution, which importantly considers both active movements and essential resting phases to determine when a person is inactive. It then explores the deep learning architecture best suited to motor-execution classification of EEG signals. This architecture will be useful for controlling an external robotic arm for people with severe motor disabilities.

Extended Abstract | Poster
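
As a hedged sketch of the general approach (not Team 8's architecture), the snippet below defines a minimal 1-D CNN in PyTorch that classifies EEG windows into movement classes plus an explicit rest class, exercised here on synthetic placeholder data; the channel count, window length, and class layout are assumptions.

```python
# Illustrative sketch only -- not Team 8's architecture. A minimal 1-D CNN for
# window-wise EEG classification with an explicit "rest" class, run on
# synthetic placeholder data.
import torch
import torch.nn as nn

N_CHANNELS, N_SAMPLES, N_CLASSES = 16, 250, 3   # e.g. left, right, rest (assumed)

class TinyEEGNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=15, padding=7),  # temporal filtering
            nn.BatchNorm1d(32),
            nn.ELU(),
            nn.AvgPool1d(4),
            nn.Conv1d(32, 64, kernel_size=7, padding=3),
            nn.BatchNorm1d(64),
            nn.ELU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, N_CLASSES)

    def forward(self, x):                        # x: (batch, channels, samples)
        return self.classifier(self.features(x).squeeze(-1))

if __name__ == "__main__":
    model = TinyEEGNet()
    x = torch.randn(8, N_CHANNELS, N_SAMPLES)    # fake EEG windows
    y = torch.randint(0, N_CLASSES, (8,))
    loss = nn.CrossEntropyLoss()(model(x), y)
    loss.backward()                              # one illustrative training step
    print("logits shape:", model(x).shape, "| loss:", float(loss))
```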

Team 9: EEG and HRV Based Emotion Estimation Robot for Elderly Interaction
Yuri Nakagawa (Shibaura Institute of Technology)

The increasing demand for emotional care robots in nursing homes aims to enhance quality of life for the elderly by estimating their emotions and providing mental support. Because of the limited physical state of the elderly, traditional methods of emotion estimation pose challenges; thus, we explore physiological signals as a viable alternative. This study introduces an innovative emotion estimation method based on electroencephalography (EEG) and heart rate variability (HRV), implemented in a care robot. We detail an experiment in which this robot interacted with three elderly individuals in a nursing home setting. The observed physiological changes during these interactions suggest that the elderly participants experienced positive emotions.

Intro Slides | Poster

Organizers


Dr. Ker-Jiun Wang

Bioengineering
University of Pittsburgh

Dr. Zhi-Hong Mao

ECE & Bioengineering
University of Pittsburgh

Dr. Midori Sugaya

CSE
Shibaura Institute of Technology

Dr. Maryam Alimardani

CS and Artificial Intelligence
Tilburg University

Dr. Ramana Vinjamuri

CSEE
University of Maryland, Baltimore County

Dr. Jun Ueda

Mechanical Engineering
Georgia Institute of Technology

Local Arrangement Chairs


Dr. Feng Chen

Postdoc, Dolylab
Shibaura Institute of Technology

Yuri Nakagawa

PhD Student, Dolylab
Shibaura Institute of Technology

HRI and Neuroscience at Scale

Innovation is hard. Core innovation is rooted in scientific discovery that requires additional technical de-risking to find profitable, sustainable solutions that meet target needs. NeuroDesign in HRI for a better "brain-centered experience" provides the glue that connects technologies to end-users, changing people's behavior so they accept new technology and finding the real use cases in which to apply it. By hosting this workshop, we hope HRI researchers and neuroscientists can bring their groundbreaking research to the real world with greater impact. The world needs science at scale.
Keep up with the latest news: join our email list.
Come participate online in our creative workshop, with inspiring talks, thought-provoking discussions, and lots of fun interactions and networking.


May 19, 2025
08:30 - 17:30

2025 IEEE International Conference on Robotics and Automation (ICRA 2025)