Abstract
The rising popularity of resource-efficient machine learning has made random forests and decision trees prominent models in recent years. Naturally, these models are tuned, optimized, and transformed to consume as few resources as possible. One subset of these strategies targets the model structure and model logic, and therefore induces a trade-off between resource efficiency and prediction performance. An orthogonal set of approaches targets hardware-specific optimizations, which can improve performance without changing the behavior of the model. Since such optimizations are usually tied to a particular hardware platform and inflexible in their realization, this paper envisions a more general application of these optimization strategies at the level of programming languages. We therefore first discuss a set of suitable optimization strategies in general, and then envision their application in LLVM IR, i.e., a flexible and hardware-independent ecosystem.
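To make the kind of optimization concrete, below is a minimal, hypothetical sketch, not taken from the paper: a single decision tree compiled to a native if-else cascade in C. The feature indices, thresholds, class labels, and the `predict` function are invented for illustration. The `__builtin_expect` hints show one behavior-preserving, hardware-independent optimization of the sort the abstract alludes to: Clang lowers them to branch-weight metadata in LLVM IR, so the hot paths through the tree can be laid out contiguously without changing the model's predictions.

```c
/* Hypothetical sketch: one decision tree as a native if-else cascade.
 * All splits, thresholds, and class labels are invented. */
#include <stdio.h>

/* Hints the compiler can lower to LLVM IR branch-weight metadata. */
#define LIKELY(x)   __builtin_expect(!!(x), 1)
#define UNLIKELY(x) __builtin_expect(!!(x), 0)

/* Predict a class label for one sample with 3 features. */
static int predict(const float x[3]) {
    if (LIKELY(x[0] <= 0.5f)) {   /* root split, assumed hot path */
        if (x[1] <= 1.2f)
            return 0;             /* leaf: class 0 */
        return 1;                 /* leaf: class 1 */
    }
    if (UNLIKELY(x[2] <= -0.3f))
        return 2;                 /* rarely reached leaf */
    return 1;
}

int main(void) {
    const float sample[3] = {0.3f, 0.9f, 0.1f};
    printf("predicted class: %d\n", predict(sample));
    return 0;
}
```

Compiled with, e.g., `clang -O2`, the hints influence basic-block placement in the generated code while the predicted labels remain identical, which is exactly the trade-off-free class of optimizations the abstract distinguishes from structural model transformations.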
Original language | English |
---|---|
Title of host publication | LCTES '24 |
Subtitle of host publication | 25th ACM SIGPLAN/SIGBED International Conference on Languages, Compilers, and Tools for Embedded Systems, Copenhagen, Denmark, 24 June 2024 |
Editors | Aviral Shrivastava, Yulei Sui |
Place of Publication | New York, NY |
Publisher | Association for Computing Machinery (ACM)
Pages | 58-61 |
Number of pages | 4 |
ISBN (Electronic) | 979-8-4007-0616-5
ISBN (Print) | 979-8-4007-0616-5 |
DOIs | |
Publication status | Published - 20 Jun 2024 |
Event | 25th ACM SIGPLAN/SIGBED International Conference on Languages, Compilers, and Tools for Embedded Systems, LCTES 2024 - Copenhagen, Denmark
Duration | 24 Jun 2024 → 24 Jun 2024
Conference number | 25
Conference
Conference | 25th ACM SIGPLAN/SIGBED International Conference on Languages, Compilers, and Tools for Embedded Systems, LCTES 2024 |
---|---|
Abbreviated title | LCTES 2024 |
Country/Territory | Denmark |
City | Copenhagen |
Period | 24/06/24 → 24/06/24 |