Exploring numerical calculations with CalcNet
Främling, Kary
Umeå University, Faculty of Science and Technology, Department of Computing Science (XAI). ORCID iD: 0000-0002-8078-5172
2019 (English). Conference paper, Oral presentation with published abstract (Refereed)
Abstract [en]

Neural networks are poor generalizers outside their training range: they are good at capturing the biases of the training data but may miss the underlying concept. An important issue with neural networks is that when test data falls outside the training range, they fail to predict accurate results and thus lose the ability to generalize a concept. For systematic numerical exploration, neural accumulators (NAC) and the neural arithmetic logic unit (NALU) have been proposed, and they perform excellently on simple arithmetic operations. Their major limitation, however, is that they cannot handle complex mathematical operations and equations. For example, a NALU can predict accurate results for the multiplication operation but not for the factorial function, even though the factorial is essentially a composition of multiplications; it is unable to comprehend the pattern behind an expression when compositions of operations are involved. Hence, we propose a new neural network structure that takes in complex compositional mathematical expressions and produces the best possible results, using small NALU-based neural networks as pluggable modules that evaluate these expressions at the unitary level in a bottom-up manner. We call this effective neural network CalcNet, as it helps predict accurate calculations for complex numerical expressions, even for values outside the training range. As part of our study, we applied this network to numerically approximating complex equations, evaluating biquadratic equations, and testing the reusability of its modules. We arrived at far better generalization on complex arithmetic extrapolation tasks compared to both purely NALU-layer-based neural networks and simple feed-forward neural networks. We also achieved even better results with our golden-ratio-based modified NAC and NALU structures, on both interpolation and extrapolation tasks, in all evaluation experiments. Finally, from a reusability standpoint, the model demonstrates strong invariance when making predictions on different tasks.
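
The abstract builds on two published components, the neural accumulator (NAC) and the neural arithmetic logic unit (NALU) of Trask et al. (2018). As a rough illustration of the kind of pluggable unit CalcNet composes bottom-up, here is a minimal PyTorch sketch of those base cells following the published formulation; the CalcNet composition logic and the golden-ratio-modified variants mentioned in the abstract are specific to the paper and are not reproduced here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NAC(nn.Module):
    """Neural accumulator (Trask et al., 2018): a linear layer whose
    effective weights are biased towards {-1, 0, 1}, so it can learn
    exact addition/subtraction that extrapolates beyond the training range."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W_hat = nn.Parameter(torch.empty(out_dim, in_dim))
        self.M_hat = nn.Parameter(torch.empty(out_dim, in_dim))
        nn.init.xavier_uniform_(self.W_hat)
        nn.init.xavier_uniform_(self.M_hat)

    def weight(self) -> torch.Tensor:
        # tanh(.) * sigmoid(.) pushes each entry towards -1, 0, or 1
        return torch.tanh(self.W_hat) * torch.sigmoid(self.M_hat)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.linear(x, self.weight())

class NALU(nn.Module):
    """Neural arithmetic logic unit: a learned gate g blends an additive
    NAC path with a multiplicative path that applies the same NAC weights
    in log-space, since exp(W log|x|) realises multiplication/division."""
    def __init__(self, in_dim: int, out_dim: int, eps: float = 1e-7):
        super().__init__()
        self.nac = NAC(in_dim, out_dim)
        self.G = nn.Parameter(torch.empty(out_dim, in_dim))
        nn.init.xavier_uniform_(self.G)
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a = self.nac(x)                                   # add / subtract path
        m = torch.exp(F.linear(torch.log(torch.abs(x) + self.eps),
                               self.nac.weight()))        # multiply / divide path
        g = torch.sigmoid(F.linear(x, self.G))            # per-output gate
        return g * a + (1.0 - g) * m
```

A unit like this can be trained on, for example, pairs ((x1, x2), x1 * x2) and, unlike a plain dense layer, tends to keep predicting correctly for inputs well outside the training range; CalcNet, per the abstract, plugs several such trained units together to evaluate a whole compositional expression bottom-up.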

Place, publisher, year, edition, pages
2019.
Keywords [en]
Neural networks, Neural Arithmetic Logic Unit, Neural Accumulators, CalcNet
National Category
Computer Sciences
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:umu:diva-165293
DOI: 10.1109/ICTAI.2019.00-73
OAI: oai:DiVA.org:umu-165293
DiVA, id: diva2:1371398
Conference
IEEE 31st International Conference on Tools with Artificial Intelligence (ICTAI 2019), Portland, Oregon, November 4-6, 2019
Funder
Wallenberg AI, Autonomous Systems and Software Program (WASP), 570011220
Available from: 2019-11-19 Created: 2019-11-19 Last updated: 2019-11-20

Open Access in DiVA

No full text in DiVA
