Cybenko, George V. 1989.
“Approximation by Superpositions of a Sigmoidal Function.” Mathematics of Control, Signals and Systems 2: 303–14.
https://api.semanticscholar.org/CorpusID:3958369.
Géron, Aurélien. 2022. Hands-on Machine Learning with Scikit-Learn, Keras, and TensorFlow. 3rd ed. O’Reilly Media, Inc.
Goodfellow, Ian, Yoshua Bengio, and Aaron Courville. 2016.
Deep Learning. Adaptive Computation and Machine Learning. MIT Press.
https://dblp.org/rec/books/daglib/0040158.
Hornik, Kurt, Maxwell Stinchcombe, and Halbert White. 1989.
“Multilayer Feedforward Networks Are Universal Approximators.” Neural Networks 2 (5): 359–66.
https://doi.org/10.1016/0893-6080(89)90020-8.
LeCun, Yann, Yoshua Bengio, and Geoffrey Hinton. 2015.
“Deep Learning.” Nature 521 (7553): 436–44.
https://doi.org/10.1038/nature14539.
LeNail, Alexander. 2019.
“NN-SVG: Publication-Ready Neural Network Architecture Schematics.” Journal of Open Source Software 4 (33): 747.
https://doi.org/10.21105/joss.00747.
McCulloch, Warren S., and Walter Pitts. 1943.
“A Logical Calculus of the Ideas Immanent in Nervous Activity.” The Bulletin of Mathematical Biophysics 5 (4): 115–33.
https://doi.org/10.1007/bf02478259.
Minsky, Marvin, and Seymour Papert. 1969. Perceptrons: An Introduction to Computational Geometry. Cambridge, MA, USA: MIT Press.
Rosenblatt, Frank. 1958.
“The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain.” Psychological Review 65 (6): 386–408.
https://doi.org/10.1037/h0042519.
Russell, Stuart, and Peter Norvig. 2020.
Artificial Intelligence: A Modern Approach. 4th ed. Pearson.
http://aima.cs.berkeley.edu/.