Neural Turing machine

A neural Turing machine (NTM) is a recurrent neural network model of a Turing machine. The approach was published by Alex Graves et al. in 2014.[1] NTMs combine the fuzzy pattern matching capabilities of neural networks with the algorithmic power of programmable computers.

An NTM has a neural network controller coupled to external memory resources, which it interacts with through attentional mechanisms. The memory interactions are differentiable end-to-end, making it possible to optimize them using gradient descent.[2] An NTM with a long short-term memory (LSTM) network controller can infer simple algorithms such as copying, sorting, and associative recall from examples alone.[1]
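
The memory access just described can be illustrated with a short, self-contained sketch. The following NumPy code is an illustrative simplification, not the authors' implementation: it assumes a single head, arbitrary memory sizes, and omits the location-based addressing steps of the full model. It shows content-based addressing by cosine similarity, a soft read as a weighted sum of memory rows, and an erase-then-add write; because each step is a smooth function of its inputs, gradients can flow through the memory interaction.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def content_addressing(memory, key, beta):
    # Cosine similarity between the key and every memory row,
    # sharpened by the key strength beta and normalised to weights.
    dot = memory @ key
    norms = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    return softmax(beta * dot / norms)

def read(memory, weights):
    # Differentiable read: a convex combination of memory rows.
    return weights @ memory

def write(memory, weights, erase, add):
    # Differentiable write: each row is partially erased and then
    # added to, in proportion to its attention weight.
    memory = memory * (1.0 - np.outer(weights, erase))
    return memory + np.outer(weights, add)

# Toy example: 4 memory slots of width 3 (sizes chosen arbitrarily).
mem = np.random.randn(4, 3)
key, beta = np.random.randn(3), 5.0   # in an NTM these come from the controller
w = content_addressing(mem, key, beta)
r = read(mem, w)                      # read vector fed back to the controller
mem = write(mem, w, erase=np.full(3, 0.5), add=np.random.randn(3))
```

In the full model, the controller network emits the key, key strength, erase vector and add vector at every time step, and further addressing operations (interpolation with the previous weights, shifting, and sharpening) refine the attention weights.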

The authors of the original NTM paper did not publish their source code.[1] The first stable open-source implementation was published in 2018 at the 27th International Conference on Artificial Neural Networks, where it received a best-paper award.[3][4][5] Other open-source implementations of NTMs exist, but as of 2018 they are not sufficiently stable for production use.[6][7][8][9][10][11][12] Their developers either report that the gradients of their implementation sometimes become NaN during training for unknown reasons, causing training to fail;[10][11][9] report slow convergence;[7][6] or do not report the speed of learning of their implementation.[12][8]
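
The NaN-gradient failures reported above are a symptom of exploding gradients, a general problem when training recurrent models. A common, generic mitigation is gradient-norm clipping; the sketch below is a minimal illustration of that technique, not taken from any of the implementations cited, and the names grad and max_norm are assumptions for the example.

```python
import numpy as np

def clip_by_norm(grad, max_norm):
    # Rescale the gradient when its L2 norm exceeds max_norm, so that a
    # single oversized gradient cannot destabilise the parameters.
    norm = np.linalg.norm(grad)
    if norm > max_norm:
        grad = grad * (max_norm / norm)
    return grad

# Hypothetical use inside a training step:
grad = np.random.randn(1000) * 50.0   # stand-in for a large, flattened gradient
clipped = clip_by_norm(grad, max_norm=10.0)
```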

Differentiable neural computers are an outgrowth of neural Turing machines, with attention mechanisms that control where the memory is active, improving performance.[13]

References

  1. Graves, Alex; Wayne, Greg; Danihelka, Ivo (2014). "Neural Turing Machines". arXiv:1410.5401 [cs.NE].
  2. "Deep Minds: An Interview with Google's Alex Graves & Koray Kavukcuoglu". Retrieved May 17, 2016.
  3. Collier, Mark; Beel, Joeran (2018). "Implementing Neural Turing Machines". Artificial Neural Networks and Machine Learning – ICANN 2018. Springer International Publishing. pp. 94–104. arXiv:1807.08518. Bibcode:2018arXiv180708518C. doi:10.1007/978-3-030-01424-7_10. ISBN 9783030014230. S2CID 49908746.
  4. "MarkPKCollier/NeuralTuringMachine". GitHub. Retrieved 2018-10-20.
  5. Beel, Joeran (2018-10-20). "Best-Paper Award for our Publication "Implementing Neural Turing Machines" at the 27th International Conference on Artificial Neural Networks | Prof. Joeran Beel (TCD Dublin)". Trinity College Dublin, School of Computer Science and Statistics Blog. Retrieved 2018-10-20.
  6. "snowkylin/ntm". GitHub. Retrieved 2018-10-20.
  7. "chiggum/Neural-Turing-Machines". GitHub. Retrieved 2018-10-20.
  8. "yeoedward/Neural-Turing-Machine". GitHub. 2017-09-13. Retrieved 2018-10-20.
  9. "camigord/Neural-Turing-Machine". GitHub. Retrieved 2018-10-20.
  10. "carpedm20/NTM-tensorflow". GitHub. Retrieved 2018-10-20.
  11. "snipsco/ntm-lasagne". GitHub. Retrieved 2018-10-20.
  12. "loudinthecloud/pytorch-ntm". GitHub. Retrieved 2018-10-20.
  13. Administrator. "DeepMind's Differentiable Neural Network Thinks Deeply". www.i-programmer.info. Retrieved 2016-10-20.