From Gauss's least squares to Rosenblatt's perceptron — mathematics lays the foundation for machine intelligence.
Legendre & Gauss's method of least squares — fit a straight line to scattered data by minimizing the sum of squared errors.
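The fit above can be sketched in a few lines of NumPy; the data points here are hypothetical, scattered roughly around y = 2x + 1:

```python
import numpy as np

# Hypothetical scattered data, roughly on the line y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix [x, 1]; minimize ||A w - y||^2 (sum of squared errors)
A = np.vstack([x, np.ones_like(x)]).T
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]
print(slope, intercept)  # ≈ 1.99, 1.04
```

`lstsq` solves the same normal equations Gauss wrote down by hand; for a single straight line that is all "fitting" amounts to.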
Bayes' theorem — update your belief based on new evidence: posterior ∝ prior × likelihood. The core framework of probabilistic inference.
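A minimal Bayesian update with made-up numbers for a diagnostic test, showing prior × likelihood followed by normalization:

```python
# Hypothetical numbers: a rare condition and an imperfect test
prior_disease = 0.01          # P(disease) before any evidence
p_pos_given_disease = 0.95    # likelihood of a positive test if diseased
p_pos_given_healthy = 0.05    # false-positive rate

# Unnormalized posteriors for a positive test: prior × likelihood
post_disease = prior_disease * p_pos_given_disease
post_healthy = (1 - prior_disease) * p_pos_given_healthy

# Normalize so the two hypotheses sum to 1
posterior = post_disease / (post_disease + post_healthy)
print(posterior)  # ≈ 0.16
```

Even a 95%-accurate test only lifts a 1% prior to roughly a 16% posterior; the prior matters as much as the evidence.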
Markov chains — memoryless state transitions: the next state depends only on the current state. Cornerstone of HMMs, MCMC, and PageRank.
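A two-state sketch of the memoryless property: repeatedly applying a (hypothetical) transition matrix forgets the starting state and converges to the stationary distribution, the same iteration that powers PageRank:

```python
import numpy as np

# Hypothetical weather chain: rows = current state, cols = next state
# states: 0 = sunny, 1 = rainy
P = np.array([[0.9, 0.1],    # sunny -> sunny 0.9, sunny -> rainy 0.1
              [0.5, 0.5]])   # rainy -> sunny 0.5, rainy -> rainy 0.5

dist = np.array([1.0, 0.0])  # start: certainly sunny
for _ in range(100):         # each step depends only on the current dist
    dist = dist @ P

print(dist)  # converges to the stationary distribution [5/6, 1/6]
```

The limit satisfies πP = π regardless of where the chain started; that fixed point is exactly what PageRank computes over the web's link graph.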
Rosenblatt's Perceptron — the first artificial neuron that could learn from data. Computes a weighted sum of inputs; outputs 1 if above threshold, 0 otherwise.
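A minimal sketch of the perceptron rule, learning the AND function (a linearly separable toy task, chosen here for illustration):

```python
import numpy as np

# Training data: the AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])

w = np.zeros(2)
b = 0.0
lr = 0.1

for _ in range(20):                       # epochs
    for xi, target in zip(X, y):
        out = 1 if w @ xi + b > 0 else 0  # threshold on the weighted sum
        # Perceptron rule: nudge weights only when the prediction is wrong
        w += lr * (target - out) * xi
        b += lr * (target - out)

preds = [1 if w @ xi + b > 0 else 0 for xi in X]
print(preds)  # [0, 0, 0, 1]
```

Note the update uses the thresholded output, not the raw sum; the weights jump in discrete corrections until the data is separated, which is exactly the contrast Adaline addresses next.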
Widrow & Hoff's adaptive linear neuron — unlike the Perceptron, Adaline computes error before the activation function, enabling true gradient descent (LMS rule). The decision boundary glides smoothly into place!
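The same toy AND task under the LMS rule makes the contrast concrete: the error is taken on the linear output w·x + b, before any thresholding, so each update is a true gradient step on the squared error (bipolar ±1 targets, as Widrow & Hoff used):

```python
import numpy as np

# AND with bipolar targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1], dtype=float)

w = np.zeros(2)
b = 0.0
lr = 0.1

for _ in range(100):
    for xi, target in zip(X, y):
        linear = w @ xi + b        # raw weighted sum, no activation yet
        error = target - linear    # LMS: error measured BEFORE thresholding
        w += lr * error * xi       # gradient step on (target - w·x - b)^2
        b += lr * error

# Threshold only at prediction time
preds = [1 if w @ xi + b > 0 else -1 for xi in X]
print(preds)  # [-1, -1, -1, 1]
```

Because the error is continuous, the weights drift smoothly toward the least-squares solution (here w ≈ [1, 1], b ≈ -1.5) instead of jumping in discrete perceptron corrections.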