Robust Sensor-Orientation-Independent Feature Selection for Animal Activity Recognition on Collar Tags

    Research output: Contribution to journal › Article › Academic › peer-review


    Abstract

    Fundamental challenges faced by real-time animal activity recognition include variation in motion data due to changing sensor orientations, numerous features, and energy and processing constraints of animal tags. This paper aims at finding small optimal feature sets that are lightweight and robust to the sensor's orientation. Our approach comprises four main steps. First, 3D feature vectors are selected since they are theoretically independent of orientation. Second, the least interesting features are suppressed to speed up computation and increase robustness against overfitting. Third, the features are further selected through an embedded method, which selects features through simultaneous feature selection and classification. Finally, feature sets are optimized through 10-fold cross-validation. We collected real-world data through multiple sensors around the neck of five goats. The results show that activities can be accurately recognized using only accelerometer data and a few lightweight features. Additionally, we show that the performance is robust to sensor orientation and position. A simple Naive Bayes classifier using only a single feature achieved an accuracy of 94 % with our empirical dataset. Moreover, our optimal feature set yielded an average of 94 % accuracy when applied with six other classifiers. This work supports embedded, real-time, energy-efficient, and robust activity recognition for animals.
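    The pipeline the abstract describes — a theoretically orientation-independent 3D feature, a single lightweight statistic, a simple Naive Bayes classifier, and 10-fold cross-validation — can be sketched in plain Python. Everything below is illustrative only: the synthetic "resting"/"grazing" windows, their noise amplitudes, and the choice of standard-deviation-of-magnitude as the single feature are assumptions for the sketch, not the authors' actual data, feature set, or implementation.

    ```python
    # Sketch: orientation-independent feature + single-feature Gaussian Naive Bayes,
    # evaluated with 10-fold cross-validation. Synthetic data; illustrative only.
    import math
    import random
    import statistics

    random.seed(0)

    def magnitude(ax, ay, az):
        # L2 norm of the 3D acceleration vector: unchanged under any rotation
        # of the sensor, hence orientation-independent.
        return math.sqrt(ax * ax + ay * ay + az * az)

    def make_window(activity, n=32):
        # Hypothetical generator: "grazing" windows have more motion energy
        # than "resting" windows (amplitudes are made up for this sketch).
        amp = 0.8 if activity == "grazing" else 0.1
        return [magnitude(random.gauss(0, amp), random.gauss(0, amp),
                          random.gauss(9.81, amp)) for _ in range(n)]

    def feature(window):
        # One lightweight feature per window: std. dev. of the magnitude signal.
        return statistics.pstdev(window)

    # Labelled dataset of feature values (one scalar feature per window).
    data = [(feature(make_window(a)), a)
            for a in ("resting", "grazing") for _ in range(50)]
    random.shuffle(data)

    def gaussian_pdf(x, mu, sigma):
        return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

    def train(rows):
        # Gaussian Naive Bayes on a single feature: per-class mean and std. dev.
        model = {}
        for label in {lbl for _, lbl in rows}:
            xs = [x for x, lbl in rows if lbl == label]
            model[label] = (statistics.mean(xs), statistics.pstdev(xs) or 1e-9)
        return model

    def predict(model, x):
        # Uniform priors assumed; pick the class with the highest likelihood.
        return max(model, key=lambda lbl: gaussian_pdf(x, *model[lbl]))

    # 10-fold cross-validation, mirroring the final step of the approach.
    k = 10
    folds = [data[i::k] for i in range(k)]
    accs = []
    for i in range(k):
        test_fold = folds[i]
        train_rows = [r for j, f in enumerate(folds) if j != i for r in f]
        model = train(train_rows)
        correct = sum(predict(model, x) == y for x, y in test_fold)
        accs.append(correct / len(test_fold))

    print(f"mean CV accuracy: {sum(accs) / len(accs):.2f}")
    ```

    Because the feature is the norm of the 3D vector, rotating the simulated sensor would leave every feature value (and thus the classifier) unchanged, which is the property the paper exploits.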
    Original language: English
    Article number: 15
    Number of pages: 27
    Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
    Volume: 2
    Issue number: 1
    DOI: 10.1145/3191747
    Publication status: Published - 25 Mar 2018

    Keywords

    • Animal Activity Recognition, Decision Tree, Embedded Systems, Machine Learning, Naive Bayes, Sensor Orientation

    Cite this

    @article{c4b6557309c849c4b0617f689a6ee434,
    title = "Robust Sensor-Orientation-Independent Feature Selection for Animal Activity Recognition on Collar Tags",
    abstract = "Fundamental challenges faced by real-time animal activity recognition include variation in motion data due to changing sensor orientations, numerous features, and energy and processing constraints of animal tags. This paper aims at finding small optimal feature sets that are lightweight and robust to the sensor's orientation. Our approach comprises four main steps. First, 3D feature vectors are selected since they are theoretically independent of orientation. Second, the least interesting features are suppressed to speed up computation and increase robustness against overfitting. Third, the features are further selected through an embedded method, which selects features through simultaneous feature selection and classification. Finally, feature sets are optimized through 10-fold cross-validation. We collected real-world data through multiple sensors around the neck of five goats. The results show that activities can be accurately recognized using only accelerometer data and a few lightweight features. Additionally, we show that the performance is robust to sensor orientation and position. A simple Naive Bayes classifier using only a single feature achieved an accuracy of 94 {\%} with our empirical dataset. Moreover, our optimal feature set yielded an average of 94 {\%} accuracy when applied with six other classifiers. This work supports embedded, real-time, energy-efficient, and robust activity recognition for animals.",
    keywords = "Animal Activity Recognition, Decision Tree, Embedded Systems, Machine Learning, Naive Bayes, Sensor Orientation",
    author = "Kamminga, {Jacob Wilhelm} and {Le Viet Duc}, {Duc Viet} and Meijers, {Jan Pieter} and Bisby, {Helena C.} and Nirvana Meratnia and Havinga, {Paul J.M.}",
    year = "2018",
    month = "3",
    day = "25",
    doi = "10.1145/3191747",
    language = "English",
    volume = "2",
    journal = "Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies",
    issn = "2474-9567",
    publisher = "Association for Computing Machinery (ACM)",
    number = "1",

    }
