In this notebook, we will implement an **Autoencoder** with Convolutional Attention Blocks (CABs) to encode and decode MNIST digits, aiming to learn efficient latent representations. To get started, ...
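As a rough companion to that notebook (whose exact architecture is not shown in the snippet), the sketch below implements a small convolutional autoencoder for 28×28 MNIST digits in PyTorch, assuming an SE-style channel-attention block as the CAB design; the layer sizes and `latent_dim` are illustrative choices, not the notebook's.

```python
# Hedged sketch: convolutional autoencoder with a channel-attention CAB.
# The CAB here is an assumed SE-style block; the notebook's variant may differ.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(self.pool(x))  # reweight channels

class CAB(nn.Module):
    """Conv -> Conv -> channel attention, wrapped in a residual connection."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            ChannelAttention(channels),
        )

    def forward(self, x):
        return x + self.body(x)

class CABAutoencoder(nn.Module):
    def __init__(self, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1),   # 28 -> 14
            nn.ReLU(inplace=True),
            CAB(16),
            nn.Conv2d(16, 32, 3, stride=2, padding=1),  # 14 -> 7
            nn.ReLU(inplace=True),
            CAB(32),
            nn.Flatten(),
            nn.Linear(32 * 7 * 7, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * 7 * 7),
            nn.Unflatten(1, (32, 7, 7)),
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1, output_padding=1),  # 7 -> 14
            nn.ReLU(inplace=True),
            CAB(16),
            nn.ConvTranspose2d(16, 1, 3, stride=2, padding=1, output_padding=1),   # 14 -> 28
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

if __name__ == "__main__":
    model = CABAutoencoder()
    x = torch.rand(8, 1, 28, 28)                 # stand-in batch of MNIST digits
    recon = model(x)
    loss = nn.functional.mse_loss(recon, x)      # reconstruction objective
    print(recon.shape, loss.item())
```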
Generating synthetic data is useful when your training data is imbalanced for a particular class: for example, generating synthetic female records in an employee dataset that contains many males but few ...
The Improved Autoencoder Model with Memory Module for Anomaly Detection (IAEMM) is an unsupervised anomaly detection algorithm that enhances traditional autoencoders with a memory module and a hypersphere ...
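The snippet names IAEMM's key ingredients (a memory module and a hypersphere constraint) without full details, so the following is only a hedged PyTorch sketch of the generic memory-addressing idea used by MemAE-style memory-augmented autoencoders; the `MemoryModule` class, item count, and dimensions are illustrative, and the hypersphere term is omitted.

```python
# Hedged sketch of a memory-augmented autoencoder for anomaly detection.
# This shows only the memory addressing step, not IAEMM's full objective.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryModule(nn.Module):
    def __init__(self, num_items=50, dim=16):
        super().__init__()
        self.memory = nn.Parameter(torch.randn(num_items, dim))  # learned prototypes

    def forward(self, z):
        # Cosine-similarity addressing: each latent is rebuilt as a
        # softmax-weighted sum of memory items, which limits the model's
        # ability to reconstruct anomalous inputs.
        attn = F.softmax(F.normalize(z, dim=1) @ F.normalize(self.memory, dim=1).T, dim=1)
        return attn @ self.memory, attn

class MemoryAutoencoder(nn.Module):
    def __init__(self, in_dim=784, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, latent_dim))
        self.memory = MemoryModule(dim=latent_dim)
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, in_dim))

    def forward(self, x):
        z = self.encoder(x)
        z_hat, attn = self.memory(z)
        return self.decoder(z_hat), attn

if __name__ == "__main__":
    model = MemoryAutoencoder()
    x = torch.rand(4, 784)
    recon, attn = model(x)
    # At test time, the per-sample reconstruction error serves as the anomaly score.
    score = ((recon - x) ** 2).mean(dim=1)
    print(score)
```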
Abstract: In this paper, we propose a novel Transformer-based approach, namely Cross-modal Contrastive Masked AutoEncoder (C2MAE), for Self-Supervised Learning (SSL) on compressed videos. A unified ...
Abstract: The variational autoencoder (VAE) is widely used as a data augmentation technique. However, it faces challenges with an inaccurate latent-space distribution and poor reconstruction quality when ...
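For context on the baseline that abstract critiques, here is a minimal standard VAE sketch in PyTorch (not the paper's improved model), showing the reparameterization trick and the two ELBO terms; all names and dimensions are illustrative.

```python
# Minimal standard VAE sketch, assuming flattened 28x28 inputs in [0, 1].
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, in_dim=784, latent_dim=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, latent_dim)
        self.logvar = nn.Linear(256, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                                 nn.Linear(256, in_dim), nn.Sigmoid())

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.dec(z), mu, logvar

def vae_loss(x, recon, mu, logvar):
    # Reconstruction term plus KL divergence to the standard-normal prior.
    rec = F.binary_cross_entropy(recon, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kld

if __name__ == "__main__":
    model = VAE()
    x = torch.rand(8, 784)
    recon, mu, logvar = model(x)
    print(vae_loss(x, recon, mu, logvar).item())
    # Once trained, sampling z ~ N(0, I) and decoding yields synthetic data,
    # which is what makes the VAE usable for data augmentation.
    samples = model.dec(torch.randn(8, 16))
```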
Sparse autoencoders are central tools for analyzing how large language models work internally. By translating complex internal states into interpretable components, they allow researchers to break down ...
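A hedged sketch of what such a sparse autoencoder typically looks like: an overcomplete linear dictionary trained on model activations with an L1 sparsity penalty. The activation width, dictionary size, and coefficient below are illustrative placeholders, not values from any particular study.

```python
# Hedged sketch of a sparse autoencoder (SAE) for interpretability work,
# trained to reconstruct activation vectors with sparse feature activations.
import torch
import torch.nn as nn

class SparseAutoencoder(nn.Module):
    def __init__(self, act_dim=768, dict_size=8192):
        super().__init__()
        self.encoder = nn.Linear(act_dim, dict_size)
        self.decoder = nn.Linear(dict_size, act_dim)

    def forward(self, x):
        f = torch.relu(self.encoder(x))   # sparse feature activations
        return self.decoder(f), f

model = SparseAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
l1_coef = 1e-3                             # illustrative sparsity coefficient

acts = torch.randn(64, 768)                # stand-in for residual-stream activations
recon, feats = model(acts)
loss = ((recon - acts) ** 2).mean() + l1_coef * feats.abs().mean()
opt.zero_grad()
loss.backward()
opt.step()
# After training, each decoder column is a candidate interpretable "feature" direction.
```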