Improving Error Resilience Analysis Methodology of Iterative Workloads for Approximate Computing

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

5 Citations (Scopus)
4 Downloads (Pure)

Abstract

Assessing the error resilience inherent in digital processing workloads provides application-specific insights for approximate computing strategies that improve power efficiency and/or performance. Using radio astronomy calibration as a case study, we focus our contributions to error resilience analysis on iterative methods that use a convergence criterion as a quality metric to terminate the iterative computations. We propose an adaptive statistical approximation model for high-level resilience analysis that makes it possible to divide a workload into exact and approximate iterations. This improves the existing error resilience analysis methodology by quantifying the number of approximate iterations (23% of the total iterations in our case study) in addition to the parameters used in state-of-the-art techniques. In this way, heterogeneous architectures composed of exact and inexact computing cores, as well as adaptive-accuracy architectures, can be exploited efficiently. Moreover, we demonstrate the importance of reconsidering the quality function for convergence-based iterative processes, as the original quality function (the convergence criterion) is not necessarily sufficient in the resilience analysis phase. If that is the case, an additional quality function has to be defined to assess the viability of the approximation techniques.
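The split between approximate and exact iterations can be illustrated with a small sketch. All names here are hypothetical, and Gaussian noise injection merely stands in for the paper's statistical approximation model; the sketch also checks the convergence criterion only in the exact phase, reflecting the observation that the original quality function can be unreliable while approximation errors are present:

```python
import random

def solve_iterative(update, x0, tol=1e-6, max_iter=1000,
                    approx_frac=0.23, noise_std=1e-3):
    """Hypothetical sketch: run the first `approx_frac` share of the
    iteration budget on an "inexact core" (modelled by Gaussian noise
    injection), then switch to exact iterations and apply the
    convergence criterion (the quality metric) only in the exact phase."""
    x = x0
    n_approx = int(approx_frac * max_iter)  # e.g. 23% approximate iterations
    for i in range(max_iter):
        x_new = update(x)
        if i < n_approx:
            # statistical error model emulating an inexact computing core
            x_new += random.gauss(0.0, noise_std)
        elif abs(x_new - x) < tol:  # convergence criterion (exact phase only)
            return x_new, i + 1
        x = x_new
    return x, max_iter

# Example: Newton iteration converging to sqrt(2)
root, iters = solve_iterative(lambda x: 0.5 * (x + 2.0 / x), x0=1.0)
```

On hardware, the noisy phase would instead map iterations onto reduced-precision or inexact cores of a heterogeneous architecture, with the final iterations restoring full accuracy.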
Original language: English
Title of host publication: CF'17
Subtitle of host publication: Proceedings of the Computing Frontiers Conference
Publisher: Association for Computing Machinery (ACM)
Pages: 374-379
Number of pages: 6
ISBN (Print): 978-1-4503-4487-6
DOIs: https://doi.org/10.1145/3075564.3078891
Publication status: Published - 15 May 2017
Event: ACM International Conference on Computing Frontiers 2017 - University of Siena, Palazzo del Rettorato, Siena, Italy
Duration: 15 May 2017 – 17 May 2017
http://www.computingfrontiers.org/2017/

Conference

Conference: ACM International Conference on Computing Frontiers 2017
Country: Italy
City: Siena
Period: 15/05/17 – 17/05/17
Internet address: http://www.computingfrontiers.org/2017/

Keywords

  • Error resilience analysis
  • iterative workloads
  • quality function
  • approximate computing
  • heterogeneous architectures

Cite this

Gillani, G. A., & Kokkeler, A. B. J. (2017). Improving Error Resilience Analysis Methodology of Iterative Workloads for Approximate Computing. In CF'17: Proceedings of the Computing Frontiers Conference (pp. 374-379). Association for Computing Machinery (ACM). https://doi.org/10.1145/3075564.3078891