Project Introduction

Message

This research grew out of a collaborative research project, “Development of supporting technology for safe and comfortable support for people with disabilities,” conducted from 2004 to 2007 between the National Institute of Advanced Industrial Science and Technology and the National Rehabilitation Centre for Persons with Disabilities (as it was then). As part of that project, we researched and developed, as a one-off product, a gesture interface that enabled people with disabilities who cannot use an electric wheelchair because of cerebral palsy to drive one autonomously, indoors and outdoors, by head movement. More than ten years after the end of that project, a truly useful product has now become possible thanks to the falling prices and improving technology of image range sensors and personal computers.

Research objective and approach

The research objective is to provide a low-priced, flexible, and adaptable switch interface, based on a commercially available image range sensor, for people with disabilities who cannot use ordinary interfaces. We aim to make the system truly useful for every user by keeping its price, including the personal computer, under 100,000 yen. Our approach is to collect and classify actual movement data from people with disabilities according to body part, and to use the data to develop a basic modular gesture recognition engine that can be applied to more and more kinds of disabilities. A major feature of the system is that it can recognize multiple gestures at the same time.

Research Team

    • National Institute of Advanced Industrial Science and Technology (AIST)
 
    • National Rehabilitation Centre for Persons with Disabilities (NRC)
 
    • National Center of Neurology and Psychiatry (NCNP)

Hot news!

    • Sept. 8-10, 2024

We will be exhibiting AAGI at Communication Matters 2024, held at Leeds University, UK.

    • June 4th, 2024

A new version of AAGI is now available for distribution. The external output feature has been greatly enhanced. Furthermore, the “Slight” module is now compatible with UVC cameras.

 

History

  • Oct. 2023 – Sept. 2028 Cross-ministerial Strategic Innovation Promotion Program (SIP) Phase 3, Task: “Building a platform for learning and working in the post-Corona era,” “Building a regional education and work model through interface innovation for people with severe disabilities”

Representative: Ikushi YODA (AIST), Main members: Daisuke NISHIDA (NCNP), Katsuhiro MIZUNO (Tokai Univ.), Michie KAWASHIMA (Kyosan Univ.)

Summary: The project has been selected for research and development under the third phase of JST’s Strategic Innovation Promotion Program (SIP), “Building a platform for learning and working in the post-Corona era.” In SIP, the Council for Science, Technology and Innovation (CSTI) identifies issues that are essential to society and important for Japan’s economic and industrial competitiveness, selects a program director (PD) for each issue, and implements research from basic research through to exit (practical application and commercialization). The program transcends the boundaries of ministries and fields, and promotes the use of intellectual property as well as regulatory and institutional reform and the special zone system. In Subproject B of SIP Phase 3, “Building a platform for learning and working in the post-Corona era,” we focus on the connection between “new learning” and work styles. The goal is to realize a flat society in which people with disabilities and able-bodied people learn together and help each other, by deploying nationwide a technological innovation, the gesture interface, developed for people with severe motor function disabilities who have difficulty operating various information devices (PCs, home appliances, etc.) in the ordinary way.
 
  • April 2020 – March 2024 JST/RISTEX, Strategic Creative Research Promotion Program (Social Technology Research and Development), “Co-creative research and development program for the achievement of the SDGs” [Solution creation phase], “Construction of an employment and education support model and human resource development for people with motor dysfunction using gesture interfaces”

Representative: Ikushi YODA (AIST), Collaborating implementer: Katsuhiro MIZUNO (NCNP)

Summary: By utilizing gesture interface technology, which uses an inexpensive, commercially available distance camera to recognize the gestures of persons with disabilities and link them to the operation of ICT equipment in a non-contact, non-constraining manner, we will reduce the difficulties persons with motor function disabilities face in operating ICT equipment, while compensating for the shortage of resources to support persons with disabilities. Through collaboration with the persons concerned, occupational therapist associations, local hospitals, employment support companies, special needs schools, and the local NPOs in charge of support, we will conduct regional demonstrations and effectiveness measurements of the gesture interface technology, so that patients and supporters can confirm its value and effectiveness. We will also create educational manuals for supporters, develop a regional network of support personnel, and establish a regional support system. Through these efforts, a regional support model will be established to comprehensively improve employment opportunities and the quality of education for people with motor function disabilities.
 
  • April 2020 – March 2023 Tateishi Research Foundation S “Fusion of man and machine” “Research and development on the advancement of basic technology for gesture interfaces and clinical evaluation”

Representative: Ikushi YODA (AIST), Main partner: Katsuhiro MIZUNO (NCNP)

Summary: The first target is people with severe motor function disabilities who have difficulty operating various information devices (PCs, home appliances, etc.). This research and development aims to promote the integration of information devices with people with disabilities by using their remaining voluntary body movements as switches. This research on extending human function and information function is envisioned to enhance the social participation of people with disabilities. By promoting social implementation in cooperation with people with disabilities themselves and their families, and by conducting research and development targeting users who truly need gesture interfaces, we will study next-generation interface technologies that can eventually be used by the elderly and by able-bodied people.
 
  • April 2019 – March 2022 AMED “Research on the construction of an ICT equipment operation environment for people with severe motor dysfunction by applying multiple recognition technologies”

Representative: Kazuyuki Itoh (NRC), Co-investigator: Ikushi Yoda (AIST)

Summary: (1) We refine the recognition engine by collecting movement data from various people with severe motor function disabilities and feeding back the results of evaluating the engine. (2) We select interface functions according to user needs, such as key input and mouse operation for PCs and remote control of various home appliances, and develop specific interfaces connecting the recognition system to various devices. (3) We analyze the collected operation data, refine the recognition engine, integrate the individual basic recognition modules under development, and complete the system so that it can adapt to each target user by automatically adjusting various parameters.
 
  • April 2018 – March 2020 Ministry of Internal Affairs and Communications, Strategic Information and Communications Research and Development Promotion Project (SCOPE) Priority Area Research and Development “Research and development of adaptive gesture interface for improving accessibility”

Representative: Ikushi Yoda (AIST), Co-investigators: Yoko Kobayashi, Hiroyuki Awazawa (NCNP)

Summary: We research and develop a gesture interface for people with disabilities who cannot use ordinary devices to operate PCs. We develop unconstrained, non-contact interfaces using a commercially available distance image sensor to keep the price low. Our first target is people with disabilities, and we develop technology that adapts to their movements, with the aim of establishing a standard technology in the future.

Acknowledgment: This research and development work was supported by the MIC/SCOPE #181503007 in Japan.

 
  • April 2016 – March 2019 Ministry of Education, Culture, Sports, Science and Technology, Grants-in-Aid for Scientific Research (B), “Research on an adaptable gesture interface for people with disabilities to support communication”

Representative: Ikushi Yoda (AIST), Co-investigators: Tsuyoshi Nakayama, Kazuyuki Itoh (NRC), Yoko Kobayashi (NCNP)

Research result: We have mainly researched a basic learning system using gestures collected from more than 100 body parts, and the gesture recognition module group at the core of our proposal is nearly complete. The basic part of the recognition engine will be completed by the end of this project.
 
  • April 2015 – March 2018 AMED (Comprehensive research and development project for persons with disabilities), “Development of a non-contact gesture recognition interface for cerebral palsy and stroke patients to support communication”

Representative: Kazuyuki Itoh (NRC), Co-investigator: Ikushi Yoda (AIST)

Summary: Continuing our research mainly on cerebral palsy patients, we conducted long-term adjustment experiments with actual users, continuously collected gesture data from people with disabilities, and developed for the first time an operation menu, mainly for operating personal computers, home electronics, and a call bell.
Research result: We collected data from a cumulative total of 51 people, covering a cumulative total of 181 body parts. We also carried out, for the first time, a long-term adjustment evaluation using a state-transition type operation menu. In addition, we built a website for later release of the software, and the foot gesture module will be released as the first prototype version.
 
  • April 2014 – March 2015 Ministry of Health, Labour and Welfare, Health, Labour and Welfare Sciences Research Grant (Comprehensive research project for persons with disabilities), “Research and development of a non-contact, unconstrained gesture interface module to support persons with disabilities”

Representative: Ikushi Yoda (AIST), Co-investigators: Tsuyoshi Nakayama, Kazuyuki Itoh (NRC)

Summary: Continuing the research described above, we researched and developed gesture interfaces that make it easier to operate a personal computer. In particular, we conducted long-term adjustment experiments (3 to 4 months or more) with 3 participants in order to develop a system that can be customized for each user more easily and at lower cost. We also developed recognition engine modules by collecting and classifying various kinds of movements from people with disabilities, as before.
Research result: Building on the previous project, and drawing on the ideas of users with disabilities themselves and their caregivers, we collected gestures from a cumulative total of 36 people, covering a cumulative total of 125 body-part movements that users want to use as switches to operate equipment. We also newly developed learning-based motion recognition modules that recognize the movement of a body part without restricting which part is used, in addition to the existing 5 body-part-specific recognition modules. Using the new module together with the head recognition module and the finger recognition module, we conducted additional long-term experiments with 3 participants over more than 3 months.
 
  • April 2013 – March 2014 Ministry of Internal Affairs and Communications, SCOPE Priority area type (ICT innovation creation type) research and development, “Research and development of a non-contact, unconstrained gesture interface module to support the information illiterate”

Representative: Ikushi Yoda (AIST), Co-investigators: Tsuyoshi Nakayama, Kazuyuki Itoh (NRC)

Summary: We researched and developed an interface that allows persons with disabilities who cannot use ordinary switches or gaze input devices, because of convulsions or involuntary movements, to operate information-processing equipment with easy gestures. It was the world’s first system developed by collecting a large number of gestures from people with disabilities.
Research result: From 22 participants (all different persons, not a cumulative total), we collected 36 body-part gestures that users wanted to use and could perform voluntarily. We classified these gestures together with the persons concerned, including rehabilitation doctors, rehabilitation engineers, and occupational therapists. At the same time, we developed a prototype and built a total of 5 kinds of recognition modules (2 types for hands and arms, 2 types for the head, and 1 type for the legs).
 
  • April 2012 – March 2013 Tateishi Research Foundation A “The promotion of Human and machine harmony,” “Development of an interface for cerebral palsy using an image range sensor”

Representative Ikushi Yoda (AIST)、sharing member Tsuyoshi Nakayama, Kazuyuki Ito (NRC)

Summary: With ordinary equipment, there are cases where gestures that familiar caregivers understand easily are difficult to recognize automatically, because the recognition area moves or because of spasticity. We therefore researched and developed a system for one person with cerebral palsy. As a result, we succeeded in developing a system specialized for this one user, using gestures based mainly on finger movement, neck shaking, and opening and closing the mouth.

 
  • April 2004 – March 2006 Science and Technology Promotion Fund, “Development of supporting technology for safe and comfortable support for people with disabilities”

We developed a stereo camera system that recognizes head motion in real time, and people with disabilities succeeded in driving an electric wheelchair by themselves, both indoors and outdoors.

 

Main papers

  1. Ikushi Yoda, Kazuyuki Itoh, and Tsuyoshi Nakayama, “Modular Gesture Interface for People with Severe Motor Dysfunction: Foot Recognition,” Proceedings of AAATE 2017, (Harnessing the Power of Technology to Improve Lives), IOS Press, pp.725-732 (2017)
  2. Ikushi Yoda, Kazuyuki Itoh, and Tsuyoshi Nakayama, “Long-Term Evaluation of a Modular Gesture Interface at Home for Persons with Severe Motor Dysfunction,” Proceedings of Universal Access in Human-Computer Interaction 2016, (Springer LNCS 9738), pp.102-116 (2016)
  3. Ikushi Yoda, Kazuyuki Itoh, and Tsuyoshi Nakayama; “Basic Long-term Experimental Evaluation on Modular Gesture Interface by People with Severe Motor Dysfunction,” IEICE Technical Report, WIT2014-93, pp.45-50 (2015) (in Japanese)
  4. Ikushi Yoda, Kazuyuki Itoh, and Tsuyoshi Nakayama: “Collection and Classification of Gestures from People with Severe Motor Dysfunction for Developing Modular Gesture Interface,” Springer Lecture Notes in Computer Science (LNCS), HCI International 2015 (2015)
  5. Ikushi Yoda, Kazuyuki Itoh, and Tsuyoshi Nakayama: “Collection and Classification of Gestures from People with Severe Motor Dysfunction for Developing Modular Gesture Interface,” Correspondences on Human Interface, Vol.16, No.2, SIG-ACI-12, pp.23-28 (2014) (in Japanese)
  6. I. Yoda, T. Nakayama, and K. Itoh: “Development of Interface for Cerebral Palsy Patient by Image Range Sensor,” Tateishi Science and Technology Foundation, Grant Research Results Vol.22, pp.122-125 (2013) (in Japanese)
  7. N. Sato, I. Yoda and T. Inoue: “Shoulder Gesture Interface for Operating Electric Wheelchair,” IEEE International Workshop on Human-Computer Interaction in conjunction with ICCV2009, pp.2048-2055 (2009)
  8. I. Yoda, J. Tanaka, Y. Kimura, B. Raytchev, K. Sakaue, and T. Inoue: “Non-contact Non-constraining Head Gesture Interface System for Electric Wheelchair,” IEICE Trans. Inf. & Syst. (Japanese Edition), Vol.J91-D, No.9 (2008) (in Japanese)