Few Shot Object Recognition

This lecture overviews Few-Shot Object Recognition, which has many applications in image classification when little training data is available. It covers the following topics in detail: Few-shot image learning definitions and methods. Applications in athlete recognition in sports videos.
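For illustration, a minimal NumPy sketch of prototype-based few-shot classification (one common family of methods, in the spirit of prototypical networks). The 2-way, 3-shot episode and the 2-D "embeddings" below are toy assumptions, not part of the lecture material:

```python
import numpy as np

def prototypes(support, labels):
    """Mean embedding per class, computed from a few labelled support examples."""
    classes = np.unique(labels)
    return classes, np.stack([support[labels == c].mean(axis=0) for c in classes])

def classify(query, classes, protos):
    """Assign each query to the class of its nearest prototype (Euclidean distance)."""
    d = np.linalg.norm(query[:, None, :] - protos[None, :, :], axis=-1)
    return classes[d.argmin(axis=1)]

# Toy 2-way, 3-shot episode with 2-D embeddings (illustrative values).
support = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
                    [1.0, 1.0], [0.9, 1.0], [1.0, 0.9]])
labels = np.array([0, 0, 0, 1, 1, 1])
classes, protos = prototypes(support, labels)
pred = classify(np.array([[0.05, 0.05], [0.95, 0.95]]), classes, protos)
```

In practice the embeddings would come from a learned feature extractor; only the nearest-prototype decision rule is shown here.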

Special topics in Object Detection

This lecture overviews special topics in Object Detection, which have many applications in embedded computing and drone vision. It covers the following topics in detail: Embedded object detection, Small object detection, Person detection from aerial views.
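A core quantity in evaluating detectors, including small-object and aerial person detectors, is the Intersection-over-Union (IoU) between a predicted and a ground-truth bounding box. A minimal sketch, assuming boxes in `(x1, y1, x2, y2)` corner format:

```python
def iou(a, b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

score = iou((0, 0, 10, 10), (5, 5, 15, 15))  # 25 / 175, roughly 0.143
```

Small objects are particularly sensitive to this metric: a localization error of a few pixels changes the IoU of a tiny box far more than that of a large one.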

Deep Object Detection

Recently, Convolutional Neural Networks (CNNs) have been used for object/target (e.g., face, person, car, pedestrian, road sign) detection with great results. However, running such CNN models on embedded processors for real-time processing is limited by hardware constraints. In that sense, various architectures and settings will be examined in order to facilitate and accelerate the use…
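One widely used architectural change for embedded deployment (popularized by MobileNet-style networks; named here as an example, not as the lecture's specific method) is replacing a standard convolution with a depthwise-separable one. The parameter-count comparison can be sketched directly:

```python
def conv_params(k, c_in, c_out):
    """Parameters of a standard k x k convolution (bias terms omitted)."""
    return k * k * c_in * c_out

def separable_params(k, c_in, c_out):
    """Depthwise k x k filter per input channel, then a 1x1 pointwise mixing layer."""
    return k * k * c_in + c_in * c_out

std = conv_params(3, 256, 256)       # 589824 parameters
sep = separable_params(3, 256, 256)  # 67840 parameters, roughly 8.7x fewer
```

Fewer parameters mean fewer multiply-accumulates per pixel, which is exactly what constrained embedded processors need for real-time detection.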

Attention and Transformer Networks

In this lecture, the limitations of Recurrent Neural Networks (RNNs) and Convolutional Neural Networks (CNNs) in processing sequences are emphasized. Transformers are then introduced as a solution that addresses these limitations. The Transformer architecture is described in detail, with particular emphasis on its fundamental building blocks. These include positional encoding,…
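Two of those building blocks can be sketched compactly in NumPy: the sinusoidal positional encoding and (single-head, unbatched) scaled dot-product attention. This is an illustrative sketch of the standard formulations, not the lecture's own code:

```python
import numpy as np

def positional_encoding(length, d_model):
    """Sinusoidal position encodings: sin on even dims, cos on odd dims."""
    pos = np.arange(length)[:, None]
    i = np.arange(d_model // 2)[None, :]
    angles = pos / (10000 ** (2 * i / d_model))
    pe = np.zeros((length, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))  # stable softmax
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V
```

Because attention itself is permutation-invariant, the positional encoding is what lets the model distinguish token order.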

Convolutional Neural Networks Lecture

Convolutional Neural Networks form the backbone of the current AI revolution and are used in a multitude of classification and regression problems. This lecture overviews the transition from multilayer perceptrons to deep architectures. The following topics are presented in detail: Tensors and mathematical formulations. Convolutional layers. Fully connected layers. Pooling. Neural Image Features and their relation…
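The core operation of a convolutional layer can be sketched directly. Note that, as in most CNN frameworks, the sketch below actually computes cross-correlation (the kernel is not flipped); the toy image and edge filter are illustrative assumptions:

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D convolution (cross-correlation, as used in CNN layers)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

img = np.arange(16.0).reshape(4, 4)   # intensities increasing left to right
edge = np.array([[1.0, -1.0]])        # horizontal gradient filter
out = conv2d(img, edge)               # constant response: uniform gradient
```

A real convolutional layer repeats this over many input channels and learned kernels, and production implementations replace the loops with optimized matrix routines.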

Multilayer Perceptron. Backpropagation

This lecture covers the basic concepts and architectures of the Multi-Layer Perceptron (MLP), activation functions, and the Universal Approximation Theorem. Training MLP neural networks is presented in detail: Loss types, Gradient descent, Error Backpropagation. Training problems are overviewed, together with their solutions, e.g., Stochastic Gradient Descent, Adaptive Learning Rate Algorithms, Regularization, Evaluation, Generalization methods.
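The forward pass, error backpropagation, and gradient-descent update can be sketched end-to-end on the classic XOR problem. The network size, learning rate, and iteration count below are illustrative choices, not values from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

# One hidden layer of 8 tanh units, one sigmoid output unit.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of the cross-entropy loss, layer by layer.
    dz2 = (p - y) / len(X)            # output-layer error signal
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dz1 = (dz2 @ W2.T) * (1 - h**2)   # backpropagated through tanh
    dW1 = X.T @ dz1; db1 = dz1.sum(0)
    # Gradient-descent parameter update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

After training, thresholding `p` at 0.5 recovers the XOR truth table; swapping the full-batch update for mini-batches gives the Stochastic Gradient Descent variant discussed in the lecture.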