
Matlab-based GUI for online gesture recognition with hidden Markov models

June 8, 2010

Thomas Holleczek, Daniel Roggen
Electronics Laboratory, ETH Zurich

Contents

1 Introduction
  1.1 Acceleration Sensor
  1.2 Segmentation
  1.3 Discretization
    1.3.1 Magnitude
    1.3.2 Directional Acceleration
  1.4 Class Representation
  1.5 Training
  1.6 Classification
    1.6.1 Magnitude
    1.6.2 Directional Acceleration

2 Using the GUI
  2.1 Starting Up
  2.2 Settings
  2.3 Sensor Activation
  2.4 Segmentation
  2.5 Gesture Operations
    2.5.1 Adding
    2.5.2 Recording
    2.5.3 Individual Training
    2.5.4 Removing
    2.5.5 Saving
    2.5.6 Loading
    2.5.7 Retraining
  2.6 Hidden Markov Models
    2.6.1 Modifying HMMs
    2.6.2 Confusion Matrix
  2.7 Classification

3 Known Bugs and Problems

Figure 1: Measuring the acceleration of objects with sensors. (a) USB acceleration sensor. (b) Measured acceleration in the rest position shown in Figure 1(a).

1 Introduction
This document describes the Matlab-based online activity recognition software that is part of the Education Kit for activity recognition using wearable sensors described in [1].

1.1 Acceleration Sensor
Let X, Y and Z denote the (infinite) data streams of measured acceleration values along the three spatial dimensions:

    X = (x_1, x_2, ...)    (1)
    Y = (y_1, y_2, ...)    (2)
    Z = (z_1, z_2, ...)    (3)

The corresponding data stream M of the magnitude is then defined as

    M = (m_1, m_2, ...)    (4)

where

    m_i = sqrt(x_i^2 + y_i^2 + z_i^2).    (5)
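As an illustration, the magnitude stream can be computed element-wise from the three directional streams. The following is a minimal Matlab sketch; the variable names and the random placeholder data are illustrative and not taken from the tool's source code.

    % Minimal sketch: magnitude stream M from directional streams X, Y, Z.
    % The signals here are random placeholders standing in for sensor data.
    X = randn(1, 1000);                 % acceleration along the x-axis
    Y = randn(1, 1000);                 % acceleration along the y-axis
    Z = randn(1, 1000);                 % acceleration along the z-axis
    M = sqrt(X.^2 + Y.^2 + Z.^2);       % m_i = sqrt(x_i^2 + y_i^2 + z_i^2)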

1.2 Segmentation
The process of determining and extracting relevant parts of the input signal, i.e. parts that can be associated with an occurred gesture, is referred to as segmentation. In principle, the sensor can be viewed as a finite state machine (FSM) that is either at rest or active. The task of the segmentation is then to detect state changes of the sensor, which correspond to the start and the end of a segment (or gesture). The basic mechanism used here is as follows: a sliding window of size n is pushed through the values of the incoming data stream with a specified offset. Suppose that X = (x_1, ..., x_n) refers to the acceleration signals in the direction of the x-axis observed in the sliding window at a certain time; Y, Z and M are defined analogously. A metric δ is applied to the data, which is then used to decide whether a state change occurred:


1. Start of segment. If the current state is rest and δ is greater than a threshold value ϑ, there is a state transition to active.

2. End of segment. If the current state is active and δ < ϑ, i.e. the metric falls below the threshold value, the end of a segment has been detected.

Now the position of the new segment is known and the corresponding values in the data streams X, Y, Z and M can be extracted and prepared for classification. This Matlab tool provides two rather unsophisticated segmentation approaches: energy-based segmentation and distance-based segmentation. The two approaches differ only in the type of metric that is applied:

1. Energy-based. The energy-based approach uses as metric the variance of the magnitude values in the current sliding window, which can be determined as follows:

       δ_e(X, Y, Z, M) = V(M) = (1/n) * sum_{i=1}^{n} (m_i − m̄)^2    (6)

   where m̄ denotes the mean of the magnitude values in the window. The threshold value for the energy-based segmentation is referred to as ϑ_e.

2. Distance-based. The distance-based approach uses as metric the distance of the pattern observed in the sliding window to a rest position. The rest position has to be recorded beforehand and is represented by the vector (r_x, r_y, r_z), where r_x is the mean value of the acceleration on the x-axis, and so on. The distance metric is then computed as

       δ_d(X, Y, Z, M) = |(x_1 − r_x, ..., x_n − r_x)| + |(y_1 − r_y, ..., y_n − r_y)| + |(z_1 − r_z, ..., z_n − r_z)|    (7)

   The threshold value for the distance-based segmentation is referred to as ϑ_d.
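The Matlab sketch below illustrates the energy-based variant of this mechanism. It is a simplified offline version that walks over a prerecorded magnitude stream rather than the tool's actual online implementation, and all variable names and the placeholder data are illustrative.

    % Energy-based segmentation sketch: slide a window of size n with offset o
    % over the magnitude stream and track rest/active state transitions.
    M = sqrt(randn(1,4096).^2 + randn(1,4096).^2 + randn(1,4096).^2);  % placeholder magnitude stream
    n = 64; o = 32; theta_e = 10000;      % window size, offset, energy threshold (defaults of Table 1)
    state = 'rest'; segStart = 1; segments = {};
    for t = 1:o:(numel(M) - n + 1)
        W = M(t:t+n-1);                   % magnitude values in the current window
        delta = var(W, 1);                % metric delta_e: variance normalized by n, see Eq. (6)
        if strcmp(state, 'rest') && delta > theta_e
            state = 'active'; segStart = t;            % start of a segment detected
        elseif strcmp(state, 'active') && delta < theta_e
            state = 'rest';
            segments{end+1} = M(segStart:t+n-1);       % end of segment: extract it for classification
        end
    end

With real sensor data the default threshold of Table 1 applies; with the random placeholder stream above no segment will be detected, which is expected.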

1.3 Discretization
Before the activity of an identified segment can be used for training or classification, features have to be extracted from the acceleration signals. Here, each observed data point d_i carries a feature which is determined by discretizing d_i.

1.3.1 Magnitude

The discretization of the magnitude is shaped by two parameters:

1. Interval width.
2. Number of intervals.

1.3.2 Directional Acceleration

Basically, three parameters shape the behavior of the discretization process:

1. Baseline.
2. Interval width.
3. Number of intervals.
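One plausible way to discretize a directional signal is sketched below in Matlab, under the assumption that intervals of the given width are counted upward from the baseline and clipped to the available range; the tool's exact mapping may differ, and all names are illustrative.

    % Discretization sketch: map continuous acceleration values to one of k
    % symbols, given a baseline and an interval width (Table 1 defaults).
    baseline = 0; width = 500; k = 5;
    x = 1000 * randn(1, 200);                       % placeholder directional signal of a segment
    sym = floor((x - baseline) / width) + 1;        % 1-based interval index relative to the baseline
    sym = min(max(sym, 1), k);                      % clip to the k available symbols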


1.4 Class Representation
Before any kind of activity may be recognized in the data of identified segments, a set of gesture classes C = (C_1, ..., C_K) needs to be defined and trained. Here, each gesture class C_i is represented as a quadruple of HMMs (H), one for each acceleration signal:

    C_i = (H_{i,x}, H_{i,y}, H_{i,z}, H_{i,m})    (8)
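In code, such a quadruple can be held in a simple struct. The sketch below is a hypothetical Matlab representation (the field names are not taken from the tool), with each HMM given by a transition matrix A and an observation matrix B.

    % Gesture class as a quadruple of HMMs, one per signal (x, y, z, magnitude).
    Q = 5;                                   % number of hidden states (Table 1 default)
    k = 5;                                   % number of observation symbols
    H0 = struct('A', ones(Q, Q) / Q, ...     % uniform transition matrix as an initial guess
                'B', ones(Q, k) / k);        % uniform observation (emission) matrix
    gesture = struct('name', 'circle', ...
                     'Hx', H0, 'Hy', H0, 'Hz', H0, 'Hm', H0);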

1.5 Training
To train a particular gesture class C_i, a set of training units needs to be recorded. Once they are available, the four HMMs of the gesture class may be trained, one per signal, using the training parameters listed in Table 1 (type of HMM, number of states, number of random starting points, and number of Baum-Welch iterations).
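A rough idea of such a training loop, using hmmtrain and hmmdecode from the Matlab Statistics Toolbox with random restarts and Baum-Welch refinement as configured in Table 1, is sketched below. This is an illustration under stated assumptions, not the tool's actual training code, and the training sequences are random placeholders.

    % Training sketch for one HMM of a gesture class (e.g. the x-axis model):
    % several random starting points, Baum-Welch refinement, keep the best.
    Q = 5; k = 5;                                              % states and symbols (Table 1 defaults)
    seqs = {randi(k,1,80), randi(k,1,95), randi(k,1,72)};      % placeholder discretized training units
    bestLL = -Inf;
    for r = 1:4                                                % starting point iterations (Table 1)
        A0 = rand(Q, Q); A0 = bsxfun(@rdivide, A0, sum(A0, 2));   % random row-stochastic transitions
        B0 = rand(Q, k); B0 = bsxfun(@rdivide, B0, sum(B0, 2));   % random emission probabilities
        [A, B] = hmmtrain(seqs, A0, B0, 'Maxiterations', 10);     % Baum-Welch iterations (Table 1)
        LL = 0;
        for s = 1:numel(seqs)
            [pstates, logp] = hmmdecode(seqs{s}, A, B);           % log-likelihood of a training unit
            LL = LL + logp;
        end
        if LL > bestLL, bestLL = LL; bestA = A; bestB = B; end    % keep the best restart
    end

Note that hmmtrain assumes the model starts in state 1 and does not estimate a separate initial state distribution, so this sketch is only an approximation of a full HMM training procedure.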

1.6 Classification
Classification refers to assigning an observed gesture to the gesture class that most likely generated it. As seen before, each gesture class is represented by four (independent) HMMs, one for each signal. There are two major approaches for the classification of the gesture observed in a segment: either the magnitude or the three directional acceleration signals are used for gesture recognition.

1.6.1 Magnitude

The magnitude approach determines the log-likelihood of M under all H_{i,m} and selects the gesture class with the maximum log-likelihood as the most likely generator. If the log-likelihood is −∞ for all classes, the observed activity cannot be classified with this approach.

1.6.2 Directional Acceleration

The directional acceleration approach determines the winning models for the three directional acceleration signals X, Y and Z using a majority vote: each signal is assigned to the gesture class it was most likely generated by, i.e. each signal votes for a class. Based on these results, the overall winner is the class which received at least one vote more than any other class. In case of a draw, a winner may be selected randomly. If none of the signals could be classified (the likelihood being −∞ for all classes and signals) and thus no single vote was placed, the observed activity cannot be classified with this approach.
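As an illustration of the magnitude approach, the Matlab sketch below scores a discretized magnitude segment against the magnitude HMM of every gesture class with hmmdecode and picks the class with the maximum log-likelihood. The class structure mirrors the hypothetical one sketched in Section 1.4, not the tool's internal representation, and the "trained" models are placeholders.

    % Classification sketch (magnitude approach): maximum log-likelihood over classes.
    Q = 5; k = 5;
    H0 = struct('A', ones(Q,Q)/Q, 'B', ones(Q,k)/k);               % placeholder "trained" magnitude HMM
    classes = struct('name', {'circle', 'shake'}, 'Hm', {H0, H0}); % two dummy gesture classes
    segM = randi(k, 1, 60);                                        % discretized magnitude of a segment
    LL = -Inf(1, numel(classes));
    for c = 1:numel(classes)
        [pstates, LL(c)] = hmmdecode(segM, classes(c).Hm.A, classes(c).Hm.B);
    end
    [bestLL, winner] = max(LL);
    if isinf(bestLL)
        disp('Activity cannot be classified: log-likelihood is -Inf for all classes.');
    else
        fprintf('Classified as gesture class "%s"\n', classes(winner).name);
    end

The directional acceleration approach would repeat this scoring per axis with H_{i,x}, H_{i,y} and H_{i,z} and then apply the majority vote described above.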

Figure 2: Main window of the GUI. The left diagram visualizes the acceleration values of X, Y and Z, whereas the right diagram shows the corresponding magnitude M (in 9.81 · 10^-3 m/s^2).

2 Using the GUI
The following sections give an overview of the functionality of the GUI and describe how to best use it.

2.1 Starting Up
First, connect the USB acceleration sensor to the computer and determine its mount point (most likely /dev/ttyUSB0). Make sure the sensor is working by typing cat /dev/ttyUSB0. Next, download the Matlab source code, which is available at www. To start up, open Matlab 2008 (using the command matlab-7.6r2008a on the Solaris machines). Change into the directory where the source code of <Name> resides, then type <Name>. This will bring up the main window of the software shown in Figure 2, which allows you to control the sensor, record and train gestures, and finally classify sensor movements.

2.2 Settings
Clicking on Settings allows you to initially set the system parameters shown in Table 1.

Note: The settings should not be modified once the sensor has been activated.

2.3 Sensor Activation
Once the initial system parameters have been set, the acceleration sensor is ready for activation. Initiate the data stream of the sensor by clicking on Activate. It may take some time while the system configures the sensor. Once this is finished, you will see the real-time visualization of the incoming data stream in the two diagrams in the main window of the GUI.

Parameter                   Short Description                                                          Default

Sensor
  sensor source device      device file of the sensor                                                  /dev/ttyUSB0
  sensor baud rate          baud rate of the sensor                                                    19 200

Segmentation
  window size               n, sliding window size for segmentation                                    64
  window offset             sliding window offset for segmentation                                     32
  energy threshold          ϑ_e, threshold value for energy-based segmentation                         10 000
  distance threshold        ϑ_d, threshold value for distance-based segmentation                       6 400

Discretization
  feature baseline          R, baseline for the discretization of X, Y, Z (M)                          0
  number of features        k, number of discretization intervals for X, Y, Z (M)                      5 (10)
  interval width            dR, width of the discretization intervals for X, Y, Z (M)                  500 (500)

Classification
  type of HMM               either fully-connected or left-right model                                 fully-connected
  number of states          number of states of the HMM to be recovered                                5
  starting point iterations number of iterations carried out to find the best random starting point    4
  Baum-Welch iterations     number of BW iterations carried out in the training process                10

Table 1: System parameters

2.4 Segmentation
By default, the system makes use of the energy-based segmentation approach with the specified parameters. To use the distance-based approach, click on RestPos. A new window will appear that allows you to record a rest position. Move the sensor into a comfortable rest position and click on Start. Remain in this position for some time and finally click on End. The recorded rest position will be displayed in a diagram. If you think the recorded rest position is not adequate, you can remove it by pushing the Clear button and start over again. Clicking on OK will confirm your choice.

Note: Make sure that you use the same segmentation approach for all gestures that you record.

2.5 Gesture Operations


2.5.1 Adding

To add a new gesture to the current set of gestures, specify an appropriate name in the main window and the desired number of training instances. Then click on Add gesture. A new window will pop up which allows you to record the corresponding training units of the new gesture.

2.5.2 Recording

Clicking on Add gesture will bring up a new window to record the specified number of training instances of the gesture. The recording of a particular training instance may be prepared by clicking on the corresponding Start button. The recording of the gesture starts as soon as the metric δ applied to the contents of the sliding window exceeds ϑ for the first time. The recording is terminated once the segmentation process recognizes the end of the current segment, i.e. once δ falls below the respective threshold value ϑ. Any training instance may be removed by clicking on Clear. As soon as you are satisfied with the recorded training units, have the system train the HMMs with the global parameters by pushing the Train button. If you prefer to modify the global parameters for the current gesture class, click on Train individually.

Notes

1. Make sure that you are in the rest position you have recorded previously if you are using distance-based segmentation.
2. The system will let you open only one recording window at a time.
3. All recorded training instances of a gesture, and the gesture itself, will be removed if the recording window is closed before the training process has been completed.
4. Clicking on Train will have no effect if not all training instances have been recorded.
5. While the HMMs are being trained, the processing of the sensor data is suspended for performance reasons.

2.5.3 Individual Training

Once all training instances have been recorded, the HMMs for all signals of the gesture may be trained with parameters other than the global ones by clicking on Train individually. A new window will open which asks you to specify the parameters for the training process.

2.5.4 Removing

Recorded gesture classes may be removed by clicking on Remove gesture in the main window of the GUI. You are asked to specify the gesture you want to remove and have to confirm your choice.


2.5.5 Saving

The current set of available trained gestures may be stored in a local data file. To do this, provide a file name in the text box right next to the Save gestures button in the main window. The trained set of gestures is then saved in the specified file, which resides in the subdirectory data/ of the present working directory.

Note: Saving the gesture classes in a file will save the corresponding system settings as metadata as well.

2.5.6 Loading

By specifying the name of a file in which recorded and trained gestures have been stored and clicking on Load gestures in the main window of the GUI, the respective gestures will be loaded.

Notes

1. Loading stored gestures will overwrite any gestures that have not yet been saved in a file.
2. The system settings the saved gestures were recorded with (i.e. the metadata) are restored as well.

2.5.7 Retraining

The HMMs of the currently loaded set of gestures may be retrained with new parameters. To do this, select Retrain in the main window of the GUI. Adjust the current set of parameters as desired and confirm your choice.

Notes

1. Be patient. The retraining process may take a little while.
2. While the HMMs are being trained, the processing of the sensor data is suspended for performance reasons.

2.6 Hidden Markov Models
2.6.1 Modifying HMMs

To modify the parameters of recovered HMMs, click on Show HMMs in the main window. Select the gesture whose HMMs you want to investigate. You will be presented with the initial state, transition and observation matrices of the HMMs for all four signals of the gesture (X, Y, Z and M). You may now edit the contents of any cell of the matrices. Finalize your modifications by pushing the OK button.


Notes

1. Make sure that, once you have modified the content of a certain cell, you click into any other cell. This forces the Matlab interrupt handler to keep the new cell content.
2. After confirming your choice, all probability vectors that do not sum up to one are normalized automatically.

2.6.2 Confusion Matrix

Confusion matrices of the current training set may be obtained by clicking on Confusion matrix.

2.7 Classification
Now that the system has been equipped with a couple of trained gestures, it is ready for classifying your gestures. Click on Classify to commence the classification process. The new window that opens up contains several fields displaying the latest classification results:

1. Classification decision based on the data stream of the magnitude M.
2. Outcome of the majority vote based on the classification of X, Y and Z.
3. Log-likelihood matrix containing the log-likelihood values for all gestures and HMMs.

3 Known Bugs and Problems
None

References
[1] Roggen, D., Bächlin, M., Schumm, J., Holleczek, T., Lombriser, C., Tröster, G., Widmer, L., Majoe, D., Gutknecht, J.: An educational and research kit for activity and context recognition from on-body sensors. In: Proc. IEEE Int. Conf. on Body Sensor Networks (BSN). (2010) 277–282
