
Stochastic thermodynamics of Machine Learning: the inevitable cost of neural networks

(2021)

Files

Beuseling_07901400_2021.pdf
  • Open access
  • Adobe PDF
  • 3.97 MB

Details

Abstract
Using the formalism of stochastic thermodynamics to describe how entropy is produced and flows through a system, this work describes how a trained neural network unavoidably emits heat when it is run. The two central concepts are the Landauer cost, which originates from the Landauer principle and does not depend on the wiring of the circuit, and the mismatch cost, which corresponds to the energy lost when the input distribution is not the one best suited to the circuit. It is proven that highly correlated input variables result in a higher energetic cost, while uncorrelated variables minimize it. While these quantities constitute a lower bound on the entropy flow, an upper bound on the mismatch cost is also proposed. The results obtained on the Landauer and mismatch costs are then tested on a trained model fed with both correlated and uncorrelated data. This offers insight into how to treat data when minimizing the energetic expense of running a trained neural network.
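The two quantities named in the abstract have standard textbook forms that can be sketched numerically. This is not the thesis's own derivation, only an illustration under common conventions: the Landauer principle states that erasing one bit dissipates at least k_B·T·ln 2 of heat, and the mismatch cost is conventionally expressed through the Kullback–Leibler divergence between the actual input distribution and the one the device is optimal for. The two-state distributions below are hypothetical examples, not data from the thesis.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_cost(bits_erased, temperature=300.0):
    """Minimum heat (in joules) dissipated when erasing `bits_erased`
    bits at the given temperature, per the Landauer principle."""
    return bits_erased * K_B * temperature * math.log(2)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats.
    Assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical input distributions over a two-state input:
# q is the distribution the circuit is best suited to,
# p is the distribution actually fed to it.
p = [0.9, 0.1]  # skewed (e.g. strongly correlated) inputs
q = [0.5, 0.5]  # distribution the device was designed for

# Landauer bound for erasing one bit at room temperature (~2.87e-21 J)
print(landauer_cost(1))
# D(p || q) in nats: the larger this divergence, the larger the
# excess (mismatch) dissipation on top of the Landauer bound
print(kl_divergence(p, q))
```

Note how the mismatch contribution vanishes when p equals q, matching the abstract's point that cost is minimized when the input distribution fits the circuit.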