Backpropagation revives neural networks. Decision trees, RNNs, and Boltzmann machines emerge.
Fukushima's Neocognitron — a hierarchical pattern recognizer inspired by the visual cortex. Simple cells detect local features; complex cells pool them for translation invariance.
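The simple/complex cell pairing can be sketched in a few lines of NumPy: a sliding feature detector (simple cells) followed by max-pooling (complex cells). The kernel and pool size here are illustrative choices, not Fukushima's original parameters.

```python
import numpy as np

def simple_cells(image, kernel):
    """'Simple cells': detect a local feature by sliding a small kernel over the image."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def complex_cells(feature_map, pool=2):
    """'Complex cells': max-pool responses, gaining tolerance to small shifts."""
    h, w = feature_map.shape
    out = np.zeros((h // pool, w // pool))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = feature_map[i * pool:(i + 1) * pool,
                                    j * pool:(j + 1) * pool].max()
    return out
```

Because the pooled output keeps only the strongest response in each region, a feature that shifts by a pixel within a pooling window produces the same pooled value — the invariance the complex cells provide.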
Networks with loops — the hidden state acts as memory, carrying information from previous time steps. Essential for sequences like text, speech, and time series.
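A minimal sketch of that loop, assuming a plain tanh recurrence (the simplest RNN form, not any specific published architecture): each step mixes the current input with the previous hidden state, so information from earlier time steps persists.

```python
import numpy as np

def rnn_step(x, h, W_xh, W_hh, b):
    """One recurrent step: the new hidden state depends on both the
    current input x and the previous hidden state h (the 'memory')."""
    return np.tanh(x @ W_xh + h @ W_hh + b)

def run_rnn(xs, W_xh, W_hh, b):
    """Process a sequence, carrying the hidden state forward through time."""
    h = np.zeros(W_hh.shape[0])
    states = []
    for x in xs:
        h = rnn_step(x, h, W_xh, W_hh, b)
        states.append(h)
    return states
```

Changing the input at the first time step changes the hidden state at the last one — the loop is what lets the network remember.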
Hinton & Sejnowski's stochastic network — each unit turns on with a probability set by the energy change it would cause, so over many updates the network spends more time in lower-energy configurations.
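A toy sketch of that sampling rule, under the standard Boltzmann machine energy (symmetric weights, binary units); the specific weights below are illustrative, chosen only to make one configuration clearly lowest-energy.

```python
import numpy as np

def energy(s, W, b):
    """Boltzmann machine energy: E(s) = -1/2 s^T W s - b^T s."""
    return -0.5 * s @ W @ s - b @ s

def gibbs_update(s, W, b, rng):
    """Stochastic update: each unit flips on with probability
    sigmoid(local field), which favors lower-energy states."""
    for i in range(len(s)):
        field = W[i] @ s + b[i]  # energy gap for turning unit i on
        p_on = 1.0 / (1.0 + np.exp(-field))
        s[i] = 1.0 if rng.random() < p_on else 0.0
    return s
```

With a strong positive coupling between two units, the all-on state has the lowest energy, and repeated sampling visits it most often — the "lower energy states are more likely" behavior in miniature.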
Rumelhart, Hinton & Williams made multi-layer neural networks trainable. Compute the error at the output, then propagate gradients backward through each layer via the chain rule.
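The backward pass can be written out explicitly for a tiny two-layer network (tanh hidden layer, linear output, squared error) — a minimal sketch of the chain-rule mechanics, not the paper's exact setup.

```python
import numpy as np

def forward_backward(x, y, W1, W2):
    """Forward pass, then backprop: error at the output,
    gradients propagated backward layer by layer."""
    h = np.tanh(W1 @ x)                 # hidden activations
    y_hat = W2 @ h                      # network output
    loss = 0.5 * np.sum((y_hat - y) ** 2)
    d_y = y_hat - y                     # dL/dy_hat: error at the output
    dW2 = np.outer(d_y, h)              # gradient for the output weights
    d_h = W2.T @ d_y                    # propagate the error to the hidden layer
    d_pre = d_h * (1 - h ** 2)          # chain rule through tanh
    dW1 = np.outer(d_pre, x)            # gradient for the input weights
    return loss, dW1, dW2
```

A finite-difference check (nudge one weight, re-run the forward pass) confirms the analytic gradients match numerical ones.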
Quinlan's ID3 algorithm — recursively split data on the feature that gives the most information gain. Simple, fast, and explainable.
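The splitting criterion is compact enough to show directly: information gain is the entropy of the labels minus the weighted entropy after the split, and ID3 greedily picks the feature that maximizes it. The dict-of-features row format here is an illustrative choice.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    """Entropy reduction from splitting the data on `feature`."""
    by_value = {}
    for row, label in zip(rows, labels):
        by_value.setdefault(row[feature], []).append(label)
    remainder = sum(len(sub) / len(labels) * entropy(sub)
                    for sub in by_value.values())
    return entropy(labels) - remainder

def best_feature(rows, labels, features):
    """ID3's greedy choice: split on the highest-gain feature."""
    return max(features, key=lambda f: information_gain(rows, labels, f))
```

Full ID3 applies `best_feature` recursively, growing one subtree per feature value until the labels in a branch are pure.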