The main objective of this paper is to introduce a MEMS-based device for handwritten digit and hand gesture recognition. The device consists of a tri-axial accelerometer, an ATmega8 microcontroller and a ZigBee wireless transmission module. The accelerometer senses hand movements and converts the acceleration into analog voltage signals, which are digitized by the ATmega8 microcontroller. The ZigBee module transmits the resulting data to a personal computer, where they are processed and the recognized digits are displayed on the monitor. Human computer interaction thus becomes as easy as writing in a notebook with a pen.
Keywords
Accelerometer, ATmega8 Microcontroller, ZigBee, Human Computer Interaction (HCI)
INTRODUCTION
The rapid advance of the computing environment increasingly requires a new human–machine symbiosis. Our primary physical connection to the world is made through our hands, and we perform most of our everyday tasks with them. When we work with a computer or a computing system, however, we are constrained by clumsy intermediary devices such as keyboards, mice and joysticks. Among these, the keyboard is the most familiar and widely used input device. Although QWERTY-type keyboards remain in widespread use, they are too bulky and inconvenient for portable computing systems, including wearable and mobile computing systems.
In addition, a large number of input elements is required to enter even a very small word/phrase vocabulary, because these keyboards incorporate only one kind of input element: a switch. Since the early 1980s, much attention has been paid to new portable input devices matching the physical form of a portable computer [1], [2]. In the past, data input required at least two stages: the data were first collected with pen and paper, and then entered into a computer in a second stage. Wearable input devices allow a human operator to remove the first, conventional stage and reduce data input to a single stage. Studies have shown that the introduction of wearable input devices can save task time and reduce error rates by compressing the conventional two-stage task into a single stage [3].
In this paper a new approach to human computer interaction is introduced. Based on an inertial sensor, the device can be used like an ordinary pen, and digits written with it are displayed on the monitor. The accelerometer in the device recognizes the hand movements while the user writes digits with the digital pen. The signals from the accelerometer are transmitted to the PC over ZigBee, processed there, and the results are displayed on the monitor. Hence there is no need for keyboard typing, which is considerably harder than simply writing with a pen in a notebook.
RELATED WORK
Much recent research has focused on human computer interaction. For example, Yoon Sang Kim, Byung Seok Soh, and Sang-Goog Lee proposed a new wearable input device named SCURRY, which is based on inertial sensors. It allows a human operator to select a specified character, event, or operation spatially through both hand motion and finger clicking. It is a glove-like device worn on the hand: two gyroscopes embedded in the base module detect the direction (up, down, right, and left) of the hand motion, while accelerometers detect the finger motion generated by finger clicking [4].
Likewise, Xiang Chen, Xu Zhang, Zhang-Yan Zhao, and Ji-Hai Yang investigated hand gesture recognition based on surface EMG sensors and 2-D accelerometers. To realize multi-DOF interfaces in a wearable system, accelerometers and surface EMG sensors are used to detect hand movement information for recognizing multiple hand gestures. Experiments were designed to collect gesture data with both sensing techniques and to compare their performance in recognizing various wrist and finger gestures. Recognition tests were run using different subsets of information: accelerometer and sEMG data separately, and the combined sensor data [5].
Similarly, Eri Sato, Toru Yamaguchi, and Fumio Harashima proposed a natural interface using pointing behaviour for human robot gestural interaction. Their method focuses on pointing behaviour as a natural interface, since a gestural interface is important for working with such robots and gesture recognition has already been studied for human–machine interfaces. The investigation sought a system that recognizes users' intentions from their gestural information in particular situations; accordingly, a system based on interpersonal communication that uses pointing gestures as information was constructed [6].
HARDWARE DESCRIPTION
This MEMS-based device consists of a 3-axis accelerometer, an ATmega8 microcontroller, an RS-232 interface and a ZigBee module.
A. Accelerometer
The device uses an ADXL335 accelerometer for the HCI interface. The sensor is very small, so it fits easily in a pen-shaped device, and it operates from a low supply voltage of approximately 3 V with very little power consumption. It measures acceleration along three axes with a minimum full-scale range of ±3 g, sensing both static acceleration (gravity) and dynamic acceleration. The bandwidth of the accelerometer can be selected by the user with capacitors connected to its output pins. The tri-axial accelerometer therefore measures the acceleration of the hand motion while the digits are being written, converting the acceleration into analog voltage signals that depend on the hand motion [7].
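To make the relationship between the analog outputs and the sensed acceleration concrete, the small host-side sketch below converts raw 10-bit ADC codes into units of g. The constants are assumptions based on the nominal, ratiometric behaviour of the ADXL335 (zero-g output near mid-supply and roughly 0.33 V/g at a 3.3 V supply), not calibrated values from this design.

```c
/* Illustrative sketch: convert 10-bit ADC readings of the ADXL335 X/Y/Z
 * outputs into acceleration in g. All constants are assumed nominal values,
 * not calibration data from the actual device.
 */
#include <stdio.h>

#define VREF      3.3f    /* assumed ADC reference voltage                 */
#define V_ZERO_G  1.65f   /* assumed zero-g output (about VREF / 2)        */
#define SENS      0.330f  /* assumed sensitivity in volts per g            */
#define ADC_MAX   1023.0f /* 10-bit ADC full-scale code                    */

static float adc_to_g(unsigned int raw)
{
    float volts = (raw / ADC_MAX) * VREF;   /* ADC code -> volts */
    return (volts - V_ZERO_G) / SENS;       /* volts -> g        */
}

int main(void)
{
    unsigned int sample[3] = {512, 420, 760};   /* example raw X, Y, Z codes */
    printf("ax=%.2f g  ay=%.2f g  az=%.2f g\n",
           adc_to_g(sample[0]), adc_to_g(sample[1]), adc_to_g(sample[2]));
    return 0;
}
```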
B. ATmega8 Microcontroller
The ATmega8 is an 8-bit microcontroller with a RISC architecture. It operates from a supply of approximately 5 V and draws little power. Most instructions execute in a single clock cycle, giving a throughput approaching 1 MIPS per MHz and allowing the designer to balance power consumption against processing speed. In this device a microcontroller is used in both the transmitter and the receiver sections.
C. Transmitter Section
In the transmitter section, the analog voltage signals generated by the accelerometer are converted into digital values by the ADC built into the ATmega8, and the microcontroller controls and monitors the sampled data before passing it to the ZigBee module (a firmware sketch of this path is given after the receiver description below).
D. Receiver Section
In the receiver section, the signal received by the ZigBee module is transferred to the PC through the microcontroller and an RS-232 level converter.
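A minimal sketch of the transmitter firmware described above is given here. It assumes avr-gcc/avr-libc, an 8 MHz clock, 9600 baud, AVcc as the ADC reference, and the accelerometer wired to ADC0–ADC2; these choices are illustrative assumptions, not specifications taken from the actual design.

```c
/* Transmitter-side firmware sketch for the ATmega8 (avr-gcc / avr-libc).
 * Reads the three accelerometer channels on ADC0..ADC2 and streams the raw
 * 10-bit values over the UART, which feeds the ZigBee module.
 */
#define F_CPU 8000000UL            /* assumed 8 MHz clock */
#include <avr/io.h>
#include <util/delay.h>
#include <stdio.h>

static void uart_init(void)
{
    uint16_t ubrr = F_CPU / 16 / 9600 - 1;              /* 9600 baud          */
    UBRRH = (uint8_t)(ubrr >> 8);
    UBRRL = (uint8_t)ubrr;
    UCSRB = (1 << TXEN);                                /* enable transmitter */
    UCSRC = (1 << URSEL) | (1 << UCSZ1) | (1 << UCSZ0); /* 8 data, 1 stop     */
}

static void uart_putc(char c)
{
    while (!(UCSRA & (1 << UDRE)))      /* wait until the data register is free */
        ;
    UDR = c;
}

static void uart_puts(const char *s)
{
    while (*s)
        uart_putc(*s++);
}

static uint16_t adc_read(uint8_t channel)
{
    ADMUX  = (1 << REFS0) | (channel & 0x07);              /* AVcc reference  */
    ADCSRA = (1 << ADEN) | (1 << ADSC) | (7 << ADPS0);     /* start, clk/128  */
    while (ADCSRA & (1 << ADSC))                           /* wait for result */
        ;
    return ADC;                                            /* 10-bit value    */
}

int main(void)
{
    char line[32];
    uart_init();
    for (;;) {
        uint16_t x = adc_read(0), y = adc_read(1), z = adc_read(2);
        snprintf(line, sizeof line, "%u,%u,%u\r\n", x, y, z);
        uart_puts(line);            /* the ZigBee module forwards this stream */
        _delay_ms(20);              /* roughly 50 samples per second          */
    }
}
```

Formatting the samples as comma-separated ASCII lines keeps the ZigBee link transparent and makes the stream easy to inspect in the Proteus virtual terminal or any PC serial monitor.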
E. ZigBee Module
ZigBee is used to transfer data between the two ends of the link. Because data can be relayed through intermediate nodes, it can cover relatively long distances. The ZigBee specification defines the high-level communication protocols used to create personal area networks and is built on the IEEE 802.15.4 standard. It is a low-power technology, and the transceiver modules do not require a centralized controller. In this device a ZigBee module is used in both the transmitting and the receiving sections: the acceleration data from the microcontroller are transmitted to the receiving section over the ZigBee link, and on the receiving side the module passes the received signal to the PC for further processing. Finally, the digits are displayed on the PC monitor.
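On the receiving side, a ZigBee module operating in transparent mode simply presents the forwarded bytes to the PC as a serial stream. The paper's front end is written in Visual Basic; purely as an illustration, the following C sketch shows how such a stream could be read on a POSIX system, assuming the port enumerates as /dev/ttyUSB0 at 9600 baud (both are assumptions, not details from the paper).

```c
/* Host-side sketch: read the comma-separated samples forwarded by the
 * receiving ZigBee module over a serial port. Illustrative only; the
 * port name and baud rate are assumptions.
 */
#include <stdio.h>
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/ttyUSB0", O_RDONLY | O_NOCTTY);  /* assumed port */
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                  /* raw 8-bit bytes, no line editing */
    cfsetispeed(&tio, B9600);
    cfsetospeed(&tio, B9600);
    tcsetattr(fd, TCSANOW, &tio);

    FILE *in = fdopen(fd, "r");
    char line[64];
    while (fgets(line, sizeof line, in)) {
        unsigned x, y, z;
        if (sscanf(line, "%u,%u,%u", &x, &y, &z) == 3)
            printf("x=%u y=%u z=%u\n", x, y, z);  /* hand off to the recognizer */
    }
    fclose(in);
    return 0;
}
```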
F. Schematic Diagram
EXPERIMENTAL RESULTS
A. Simulation Output
The device is designed around the ATmega8 microcontroller and is intended for human–machine interaction. The microcontroller interfaces with peripherals such as the accelerometer; for this purpose the ATmega8 is also interfaced to a ZigBee transceiver. The hardware interfaced to the microcontroller comprises an LCD display, the ZigBee transceiver and the accelerometer. A serial driver IC converts the TTL voltage levels to RS-232 levels. The simulation is carried out using the Proteus design tool. The current design is an embedded system platform, which requires ignition (start-up) code.
Figure 4 shows the simulation result in the Proteus software, demonstrating how the digits are transmitted and displayed. The virtual terminal displays the information transferred from the external hardware, mimicking the data transmission between the external hardware and the PC monitor. The LCD displays the digits corresponding to the signals generated by the accelerometer movement, and the values are continuously shown in the virtual terminal window.
B. Hardware Result Displayed on the Monitor
The digits or alphabets are written with the accelerometer-equipped pen. The analog signals from the accelerometer are transferred to the ATmega8 microcontroller, where they are converted into digital values, and the converted data are transmitted through the ZigBee module. The received signals are passed to the PC, where they are processed and displayed on the monitor using a Visual Basic front end. The digits are displayed as shown in Figures 5 and 6. Similar experiments were carried out for the numerals 0 to 9, and the system can also be used to display characters.
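The paper does not spell out how the PC turns the received acceleration samples into displayed digits. One approach used in the related literature on inertial pens [9] is to reconstruct the pen trajectory by double-integrating the acceleration and then classify the resulting stroke. The sketch below illustrates only the double-integration step, under the strong assumptions that gravity has already been removed and that drift correction is handled elsewhere; it is not the method implemented in this work.

```c
/* Illustrative only: reconstruct a 2-D pen trajectory from acceleration
 * samples by double integration (trapezoidal rule). A practical system must
 * also remove gravity, compensate for tilt and correct integration drift,
 * none of which is shown here.
 */
#include <stdio.h>

#define N  5            /* number of samples in this toy example        */
#define DT 0.02f        /* assumed sample period in seconds (50 Hz)     */

int main(void)
{
    /* example accelerations in m/s^2, assumed already gravity-compensated */
    float ax[N] = {0.0f, 0.5f, 0.5f, -0.5f, -0.5f};
    float ay[N] = {0.0f, 0.2f, 0.4f,  0.4f,  0.2f};

    float vx = 0, vy = 0, x = 0, y = 0;
    for (int i = 1; i < N; i++) {
        vx += 0.5f * (ax[i - 1] + ax[i]) * DT;  /* acceleration -> velocity */
        vy += 0.5f * (ay[i - 1] + ay[i]) * DT;
        x  += vx * DT;                          /* velocity -> position     */
        y  += vy * DT;
        printf("t=%.2fs  x=%.4f m  y=%.4f m\n", i * DT, x, y);
    }
    return 0;
}
```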
CONCLUSION
In this paper we have described a device for human computer interaction. Our aim was to achieve an easy way of interacting with a computer. The acceleration is measured by a tri-axial accelerometer, so hand gestures can be captured in a simple manner. By means of this technology the user can simply put pen to paper and have the characters displayed, without needing a keyboard to interact with the computer.
Figures at a glance
Figure 1, Figure 2, Figure 3, Figure 4, Figure 5, Figure 6
References
[1] L. J. Bass, "Is there a wearable computer in your future?" in Engineering Human-Computer Interaction (EHCI), Yellowstone Park, WY, 1995, pp. 3–16.
[2] S. Mann, "Smart clothing: The shift to wearable computing," Commun. ACM, vol. 39, no. 8, pp. 23–24, Aug. 1996.
[3] L. J. Bass et al., "The design of a wearable computer," in Computer Human Interaction (CHI), Atlanta, GA, 1997, pp. 139–146.
[4] Y. S. Kim, B. S. Soh, and S.-G. Lee, "A new wearable input device: SCURRY," IEEE Trans. Ind. Electron., vol. 52, no. 6, pp. 1490–1499, Dec. 2005.
[5] N. C. Krishnan, C. Juillard, D. Colbry, and S. Panchanathan, "Recognition of hand movements using wearable accelerometers," J. Ambient Intell. Smart Environ., vol. 1, no. 2, pp. 143–155, Apr. 2009.
[6] E. Sato, T. Yamaguchi, and F. Harashima, "Natural interface using pointing behavior for human robot gestural interaction," IEEE Trans. Ind. Electron., vol. 54, no. 2, pp. 1105–1112, Apr. 2007.
[7] Y. S. Suh, "Attitude estimation by multiple-mode Kalman filters," IEEE Trans. Ind. Electron., vol. 53, no. 4, pp. 1386–1389, Jun. 2006.
[8] Z. Dong, U. C. Wejinya, and W. J. Li, "An optical-tracking calibration method for MEMS-based digital writing instrument," IEEE Sens. J., vol. 10, no. 10, pp. 1543–1551, Oct. 2010.
[9] J. S. Wang, Y. L. Hsu, and J. N. Liu, "An inertial-measurement-unit-based pen with a trajectory reconstruction algorithm and its applications," IEEE Trans. Ind. Electron., vol. 57, no. 10, pp. 3508–3521, Oct. 2010.