Efficient use of resources when implementing machine learning in an embedded environment
Umeå University, Faculty of Science and Technology, Department of Physics.
2023 (English). Independent thesis, Advanced level (professional degree), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

Machine learning, and in particular deep-learning models, have been in the spotlight for the last year; the release of ChatGPT especially caught the attention of the public. But many of the most popular models are large, with millions or billions of parameters. In parallel, the number of smart products constituting the Internet of Things is rapidly increasing. The need for small, resource-efficient machine-learning models can therefore be expected to grow in the coming years. This work investigates the implementation of two different models in embedded environments: random forests, which are straightforward and relatively easy to implement, and transformer models, which are more complex and challenging to implement. The process of training the models in a high-level language and implementing and running inference in a low-level language has been studied. It is shown that it is possible to train a transformer in Python and export it by hand to C, but that this comes with several challenges that should be considered before the approach is chosen. It is also shown that a transformer model can be successfully used for signal extraction, a new area of application. Different ways of optimizing the model, such as pruning and quantization, have been studied. Finally, it is shown that a transformer model with an initial noise filter performs better than the existing hand-written code on self-generated messages, but worse on real-world data, indicating that the training data should be improved.
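The hand export from Python to C and the quantization that the abstract mentions can be illustrated with a minimal sketch. This is not the thesis author's actual pipeline; the weight values and the names `quantize_int8`, `export_to_c`, and `attn_w` are illustrative placeholders. The idea is to quantize a trained weight tensor to int8 and emit it as C source that the embedded inference code can include directly.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization: w is approximated by scale * q."""
    scale = float(np.abs(w).max()) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def export_to_c(name, q, scale):
    """Render quantized weights as a C array initializer plus its scale factor."""
    body = ", ".join(str(int(v)) for v in q.ravel())
    return (
        f"static const float {name}_scale = {scale:.8f}f;\n"
        f"static const signed char {name}[{q.size}] = {{ {body} }};"
    )

# Placeholder weights standing in for a trained transformer layer.
weights = np.array([0.5, -1.0, 0.25, 0.125], dtype=np.float32)
q, scale = quantize_int8(weights)
print(export_to_c("attn_w", q, scale))
```

On the C side, inference would dequantize on the fly as `attn_w[i] * attn_w_scale`, trading a small accuracy loss (bounded by half a quantization step) for a 4x reduction in weight storage, which is the kind of resource saving the thesis targets.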

Place, publisher, year, edition, pages
2023, p. 36
National Category
Other Computer and Information Science
Identifiers
URN: urn:nbn:se:umu:diva-211379
OAI: oai:DiVA.org:umu-211379
DiVA id: diva2:1780077
External cooperation
Saab AB, Training and Simulation
Subject / course
Examensarbete i teknisk fysik
Educational program
Master of Science Programme in Engineering Physics
Presentation
2023-06-08, NAT.D.410, Naturvetarhuset D, 901 87, Umeå, 11:00 (Swedish)
Supervisors
Examiners
Available from: 2023-08-03. Created: 2023-07-05. Last updated: 2023-08-03. Bibliographically approved.

Open Access in DiVA

Slutrapport ("Final report", 1449 kB), 195 downloads
File information
File name: FULLTEXT01.pdf
File size: 1449 kB
Checksum: SHA-512
93f2e377d9e1b76fa520a3b8c7b33faeff251892e39195b464cf2a18142f581840d801d4ad70c71702596a02ef8ff2088ccda4e8fd1aac44e77185f4459e262c
Type: fulltext. Mimetype: application/pdf

Search in DiVA

By author/editor
Eklöf, Johannes
By organisation
Department of Physics
Other Computer and Information Science

Total: 195 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 647 hits