Some traditional neural network architectures

Towards foundation models

Neural networks keep growing in size and complexity, and the term 'foundation model' has recently been introduced to describe large architectures with broad capabilities and applications. These models often come pre-trained on huge datasets, which makes it easier to adapt them to specific problems.

Convolutional Neural Networks

CNNs have evolved towards more specific tasks, such as object or instance recognition.
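To make the core idea concrete, here is a minimal sketch (not from the original text) of the convolution operation at the heart of a CNN, written with NumPy rather than a deep learning framework:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation: slide the kernel over the image
    and take a dot product at each position. This is the basic
    operation a convolutional layer applies (before adding bias,
    nonlinearity, and multiple channels)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A tiny vertical-edge detector applied to a 4x4 image:
# the response is strongest where the dark/bright boundary lies.
image = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]], dtype=float)
kernel = np.array([[1, -1],
                   [1, -1]], dtype=float)
print(conv2d(image, kernel))
```

In a real CNN, many such kernels are learned from data, and stacking convolutional layers lets the network build up from edges to object parts to whole objects.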

Recurrent Neural Networks (RNN)

RNNs are suited to sequential data, where memory of previous values is important.
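As an illustration (a simplified sketch, not from the original text), a vanilla RNN keeps a hidden state that mixes its previous value with each new input, giving the network a form of memory:

```python
import numpy as np

def rnn_step(h_prev, x, W_h, W_x, b):
    """One recurrent step: the new hidden state combines the
    previous state (memory) with the current input."""
    return np.tanh(W_h @ h_prev + W_x @ x + b)

rng = np.random.default_rng(0)
hidden, features = 4, 3

# Randomly initialized weights; in practice these are learned.
W_h = rng.normal(scale=0.1, size=(hidden, hidden))
W_x = rng.normal(scale=0.1, size=(hidden, features))
b = np.zeros(hidden)

# Process a sequence of 5 time steps, carrying the state forward.
h = np.zeros(hidden)
sequence = rng.normal(size=(5, features))
for x in sequence:
    h = rnn_step(h, x, W_h, W_x, b)

print(h)  # final hidden state summarizes the whole sequence
```

The same weights are reused at every time step, which is what lets an RNN handle sequences of arbitrary length.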
