Please use this identifier to cite or link to this item: https://hdl.handle.net/2440/121245
Type: Journal article
Title: Efficient dense labelling of human activity sequences from wearables using fully convolutional networks
Author: Yao, R.
Lin, G.
Shi, Q.
Ranasinghe, D.C.
Citation: Pattern Recognition, 2018; 78:252-266
Publisher: Elsevier
Issue Date: 2018
ISSN: 0031-3203; 1873-5142
Statement of Responsibility: Rui Yao, Guosheng Lin, Qinfeng Shi, Damith C. Ranasinghe
Abstract: Recognising human activities in sequential data from sensors is a challenging research area. A significant problem arises from the need to determine fixed sequence partitions (windows) to overcome the inability of a single sample to provide adequate information about an activity; this is commonly addressed by sliding a fixed-size window over consecutive samples, extracting information (either handcrafted or learned features) from the window, and predicting a single label for all the samples it contains. Two key issues arise from this approach: (i) the samples in one window may not always share the same label, a problem that is more significant for short-duration activities such as gestures; we refer to this as the multi-class windows problem. (ii) The inference phase is constrained by the window size selected during training, while the best window size is difficult to tune in practice. We propose an efficient method for predicting the label of each sample, which we call dense labelling, in a sequence of activity data of arbitrary length, based on a fully convolutional network (FCN) design. In particular, our approach overcomes the problems posed by multi-class windows and by the fixed-size sequence partitions imposed during training. Further, our network learns both the features and the classifier automatically. We conduct extensive experiments and demonstrate that our proposed approach outperforms the state of the art in terms of sample-based classification and activity-based label misalignment measures on three challenging datasets: Opportunity, Hand Gesture, and our new dataset, an activity dataset we release based on a wearable sensor worn by hospitalised patients.
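To make the dense-labelling idea concrete, below is a minimal sketch of a 1-D fully convolutional network that emits one class prediction per time step, so a sequence of arbitrary length can be labelled without a fixed-size sliding window. This is not the authors' code: the framework (PyTorch), the sensor-channel count (113), the class count (18), and the layer widths are illustrative assumptions, not values taken from the paper.

# Hypothetical sketch of per-sample (dense) labelling with a 1-D FCN.
import torch
import torch.nn as nn

class DenseLabellingFCN(nn.Module):
    def __init__(self, in_channels: int = 113, num_classes: int = 18):
        super().__init__()
        # Stacked 1-D convolutions with matching padding keep the temporal
        # length unchanged, so the output aligns sample-for-sample with the input.
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # A 1x1 convolution acts as a per-time-step classifier.
        self.classifier = nn.Conv1d(128, num_classes, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time), with arbitrary temporal length.
        return self.classifier(self.features(x))  # (batch, classes, time)

# Usage: a batch of 4 sequences, 113 sensor channels, 500 samples each.
model = DenseLabellingFCN()
logits = model(torch.randn(4, 113, 500))   # shape (4, 18, 500)
labels = logits.argmax(dim=1)              # one predicted label per sample

Because the network contains only convolutional layers, the same trained model can be applied at inference time to sequences of any length, which is the property the abstract contrasts with fixed-size window training.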
Keywords: Human activity recognition; time series sequence classification; fully convolutional networks
Rights: © 2018 Elsevier Ltd. All rights reserved.
DOI: 10.1016/j.patcog.2017.12.024
Published version: http://dx.doi.org/10.1016/j.patcog.2017.12.024
Appears in Collections: Aurora harvest 8
Computer Science publications

Files in This Item:
There are no files associated with this item.

