
Assistive device


An assistive robot is a device that supports safe mobility for people with sensory, motor, or cognitive impairments. To develop such robots, our research team studies algorithms for gesture-based interfaces, automatic navigation, and situation recognition, and implements assistive devices such as the Intelligent Wheelchair (IW), the EYECane, and a wayfinding system.

Intelligent Wheelchair interface using Face and Mouth recognition


Developed by Jin Sun Ju, Yunhee Shin and Eun Yi Kim

We develop a novel intelligent wheelchair control using face and mouth recognition for the severely disabled. The main goal of this study is to provide a more convenient and effective access method for people with various disabilities. For accurate recognition of the user’s intention, the direction of the IW is determined by the inclination of the user’s face, while going forward and stopping are determined by the shape of the user’s mouth.
The user’s gestures are recognized in three steps, performed by a Detector, a Recognizer, and a Converter, as sketched below.

<The outline of the proposed control system>
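Below is a minimal sketch of the three-step Detector, Recognizer, and Converter pipeline. The class names, tilt thresholds, and wheel-velocity mapping are illustrative assumptions rather than the authors' implementation; face detection is stubbed with an OpenCV Haar cascade, and the tilt and mouth measurements consumed by the Recognizer are assumed to come from the detection stage.

```python
# A hedged sketch of the Detector -> Recognizer -> Converter pipeline.
# Thresholds and commands are illustrative assumptions, not the paper's values.
import cv2

class Detector:
    """Locates the user's face in each video frame (Haar-cascade stub)."""
    def __init__(self):
        self.face_cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def detect(self, frame):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = self.face_cascade.detectMultiScale(gray, 1.3, 5)
        return faces[0] if len(faces) else None  # (x, y, w, h)

class Recognizer:
    """Maps facial measurements to an intention: turn, go, or stop."""
    def recognize(self, face_tilt_deg, mouth_open):
        if face_tilt_deg < -15:   # head tilted left -> steer left
            return "LEFT"
        if face_tilt_deg > 15:    # head tilted right -> steer right
            return "RIGHT"
        return "GO" if mouth_open else "STOP"

class Converter:
    """Translates an intention into (left, right) wheel velocities."""
    COMMANDS = {"LEFT": (-0.5, 0.5), "RIGHT": (0.5, -0.5),
                "GO": (1.0, 1.0), "STOP": (0.0, 0.0)}

    def convert(self, intention):
        return self.COMMANDS[intention]
```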

Reference: Jin Sun Ju, Yunhee Shin and Eun Yi Kim, Intelligent Wheelchair (IW) Interface using Face and Mouth recognition, IUI ‘09

EYE MOUSE: Interface using Eye Tracking


Developed by Yunhee Shin, Jin Sun Ju and Eun Yi Kim

We developed a welfare interface using multiple facial feature tracking, which can efficiently implement various mouse operations. The proposed system consists of five modules:

  • face detection
  • eye detection
  • mouth detection
  • facial features tracking
  • mouse control

It receives and displays live video of the user sitting in front of the computer. The user moves the mouse pointer by moving his or her eyes, and clicks icons and menus by opening and closing his or her mouth, as sketched below.



<The overall configuration of the system>
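A minimal sketch of the mouse-control module follows, assuming the tracking modules supply per-frame eye displacement and mouth state. The `on_frame` signature and the gain value are hypothetical, and `pyautogui` stands in for system-level mouse control; this is not the authors' implementation.

```python
# A hedged sketch of the mouse-control module: eye displacement drives the
# pointer, and a mouth open-then-close transition triggers a click. The
# feature tracker is assumed to supply (eye_dx, eye_dy, mouth_open) per frame.
import pyautogui

GAIN = 8.0              # assumed pixels of pointer motion per unit eye offset
was_mouth_open = False

def on_frame(eye_dx, eye_dy, mouth_open):
    """Consume one frame's tracked features and update the mouse."""
    global was_mouth_open
    # Move the pointer proportionally to the eyes' displacement.
    pyautogui.moveRel(int(GAIN * eye_dx), int(GAIN * eye_dy))
    # Interpret an open-then-close mouth gesture as a click.
    if was_mouth_open and not mouth_open:
        pyautogui.click()
    was_mouth_open = mouth_open
```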

Reference: Yunhee Shin and Eun Yi Kim, Welfare Interface Using Multiple Facial Features Tracking, AI ‘06

EYECane: Navigating with camera embedded white cane for visually impaired person


Developed by Jin Sun Ju and Eun Yi Kim

We demonstrate a novel assistive device, called the “EYECane”, which helps visually impaired or blind people achieve safer mobility. The EYECane is a white cane with an embedded camera and computer. It automatically detects obstacles and recommends avoidable paths to the user through an acoustic interface. This is performed in three steps: first, it extracts obstacles from the image stream using online background estimation; it then generates an occupancy grid map, which is given to a neural network; finally, the system notifies the user of the paths recommended by the network. To assess its effectiveness, the EYECane was tested with 5 users, and the results show that it supports safer navigation and reduces the practice and effort needed to become adept at using a white cane.

<The EYECANE navigation system>
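Below is a minimal sketch of the two later pipeline stages, assuming a binary obstacle mask has already been produced by the background-estimation step. The 10x10 grid size, the network shape, and the three path classes are illustrative assumptions; in the actual system the network weights would come from training.

```python
# A hedged sketch: a binary obstacle mask is reduced to an occupancy grid,
# and a tiny MLP forward pass maps the grid to a recommended path.
import numpy as np

def occupancy_grid(obstacle_mask, rows=10, cols=10):
    """Downsample a binary HxW obstacle mask into a rows x cols grid,
    where each cell holds the fraction of obstacle pixels it covers."""
    h, w = obstacle_mask.shape
    grid = obstacle_mask[: h // rows * rows, : w // cols * cols]
    return grid.reshape(rows, h // rows, cols, w // cols).mean(axis=(1, 3))

def recommend_path(grid, W1, b1, W2, b2):
    """One forward pass of a small neural network: grid -> path class.
    The weights (W1, b1, W2, b2) are assumed to be learned offline."""
    x = grid.ravel()
    hidden = np.tanh(W1 @ x + b1)
    scores = W2 @ hidden + b2
    return ["LEFT", "CENTER", "RIGHT"][int(np.argmax(scores))]
```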


Reference: Jin Sun Ju, Eunjeong Ko and Eun Yi Kim, EYECane: Navigating with camera embedded white cane for visually impaired person, ASSETS ‘09

Situation-based indoor wayfinding system for the visually and cognitively impaired


Developed by Eunjeong Ko, Jin Sun Ju and Eun Yi Kim

In this work, we develop an indoor wayfinding system to help the visually impaired find their way to a given destination in an unfamiliar environment. The main novelty is the use of the user’s situation as the basis both for designing color codes that convey environmental information and for developing the wayfinding system that detects and recognizes such codes. People require different information depending on their situation, so situation-based codes are designed, including location-specific codes and guide codes. These color codes are affixed at certain locations to provide information to the visually impaired, and their location and meaning are then recognized by the proposed wayfinding system. The system consists of three steps: it first recognizes the current situation using a vocabulary tree built on the shape properties of images taken in various situations; next, it detects and recognizes the necessary codes for the current situation, based on color and edge information; finally, it provides the user with environmental information and their path through an auditory interface. A sketch of the code-detection step follows the figure.

<Overall architecture of proposed wayfinding system>
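The sketch below illustrates the code-detection step using color and edge cues, as the paper describes. The HSV color range, minimum blob size, and edge-density threshold are illustrative assumptions, not the paper's parameters.

```python
# A hedged sketch of color-code detection: candidate regions are found by
# thresholding in HSV space, then accepted only if they contain enough edge
# structure to plausibly be a printed code.
import cv2
import numpy as np

def detect_color_code(frame_bgr, lo=(100, 120, 80), hi=(130, 255, 255)):
    """Return the bounding box (x, y, w, h) of a candidate code, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in sorted(contours, key=cv2.contourArea, reverse=True):
        x, y, w, h = cv2.boundingRect(c)
        if w * h < 400:                      # ignore tiny color blobs
            continue
        patch = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(patch, 50, 150)
        if edges.mean() > 10:                # enough edge structure
            return (x, y, w, h)
    return None
```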

Reference: Eunjeong Ko, Jin Sun Ju and Eun Yi Kim, Situation-based Indoor Wayfinding System for the Visually Impaired, ASSETS ‘11

Cross intersection


Developed by Jihye Hwang, Jin Sun Ju and Eun Yi Kim
