Posts by Tag

Catastrophic forgetting

Characterizing catastrophic forgetting via the Neural Tangent Kernel

less than 1 minute read

Published:

Neural networks have achieved near-optimal performance on supervised learning tasks. However, when facing a sequence of tasks whose data distribution changes over time, they tend to forget what was learned in the past, a failure known as Catastrophic Forgetting (CF). This is one of the critical problems Continual Learning (CL) aims to solve. Although there has been plenty of empirical work studying this pathology, very few studies have tackled it from a theoretical angle. In this work, we provide a theoretical analysis of CF under the Neural Tangent Kernel (NTK) regime, where neural networks behave linearly.
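As standard NTK background (not a result from the paper itself): in this regime a sufficiently wide network $f(x;\theta)$ stays close to its first-order Taylor expansion around the initialization $\theta_0$,

$$f(x;\theta) \approx f(x;\theta_0) + \nabla_\theta f(x;\theta_0)^\top (\theta - \theta_0),$$

so training reduces to a linear model in $\theta$ whose kernel is the NTK, $\Theta(x,x') = \nabla_\theta f(x;\theta_0)^\top \nabla_\theta f(x';\theta_0)$.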

Comet.ml

Introduction to Comet.ml

3 minute read

Published:

Comet is a neat alternative to the classic TensorBoard experiment tracker. It supports logging of experiment metrics, plots, gradients, and model weights, and provides an online dashboard similar to TensorBoard's. There are similar products such as Weights & Biases, but Comet has the best support on the Compute Canada clusters.
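As a rough sketch of what that logging looks like in practice (the API key, project name, and metric values below are placeholders, not taken from the post):

```python
# Minimal Comet logging sketch; API key and project name are placeholders.
from comet_ml import Experiment

experiment = Experiment(
    api_key="YOUR_API_KEY",       # placeholder; created on the comet.ml dashboard
    project_name="my-project",    # placeholder project name
)

# Log a hyperparameter and per-step training metrics;
# both appear live in the online dashboard.
experiment.log_parameter("learning_rate", 1e-3)
for step in range(100):
    loss = 1.0 / (step + 1)       # stand-in for a real training loss
    experiment.log_metric("train_loss", loss, step=step)

experiment.end()
```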

Deep InfoMax

Density estimation

Normalizing flows in Pyro (PyTorch)

10 minute read

Published:

NFs (or more generally, invertible neural networks) have been used in the following applications (a minimal Pyro sketch follows the list):

  • Generative models with $1\times1$ invertible convolutions (link to paper)
  • Reinforcement learning, to improve upon the (not always optimal) Gaussian policy (link to paper)
  • Simulating attraction-repulsion forces in actor-critic (link to paper)
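Underlying all of these applications is the same basic pattern the post builds on: compose an invertible transform with a base distribution. Here is a minimal sketch; the dimensionality and the choice of a planar layer are illustrative, not taken from the post:

```python
# Minimal normalizing flow in Pyro: one planar layer over a 2-D Gaussian base.
import torch
import pyro.distributions as dist
import pyro.distributions.transforms as T

base = dist.Normal(torch.zeros(2), torch.ones(2))
planar = T.Planar(input_dim=2)           # learnable planar flow layer
flow = dist.TransformedDistribution(base, [planar])

x = flow.sample(torch.Size([5]))         # base samples pushed through the flow
log_p = flow.log_prob(x)                 # density including the log-Jacobian term

# Planar has no analytic inverse, so log_prob is valid only on samples drawn
# from the flow itself (Pyro caches the forward pass). That is exactly what
# variational inference needs, and planar.parameters() can be optimized with
# any torch optimizer.
```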

InfoMax

Introduction

Introduction to Comet.ml

3 minute read

Published:

Comet is a neat alternative to the classic TensorBoard experiment tracker. It supports logging of experiment metrics, plots, gradients, and model weights, and provides an online dashboard similar to TensorBoard's. There are similar products such as Weights & Biases, but Comet has the best support on the Compute Canada clusters.

Inverse autoregressive flow

Normalizing flows in Pyro (PyTorch)

10 minute read

Published:

NFs (or more generally, invertible neural networks) have been used in:

  • Generative models with $1\times1$ invertible convolutions (link to paper)
  • Reinforcement learning, to improve upon the (not always optimal) Gaussian policy (link to paper)
  • Simulating attraction-repulsion forces in actor-critic (link to paper)

Jacobian

Normalizing flows in Pyro (PyTorch)

10 minute read

Published:

NFs (or more generally, invertible neural networks) have been used in:

  • Generative models with $1\times1$ invertible convolutions (link to paper)
  • Reinforcement learning, to improve upon the (not always optimal) Gaussian policy (link to paper)
  • Simulating attraction-repulsion forces in actor-critic (link to paper)

Machine learning

Characterizing catastrophic forgetting via the Neural Tangent Kernel

less than 1 minute read

Published:

Neural networks have achieved near-optimal performance on supervised learning tasks. However, when facing a sequence of tasks whose data distribution changes over time, they tend to forget what was learned in the past, a failure known as Catastrophic Forgetting (CF). This is one of the critical problems Continual Learning (CL) aims to solve. Although there has been plenty of empirical work studying this pathology, very few studies have tackled it from a theoretical angle. In this work, we provide a theoretical analysis of CF under the Neural Tangent Kernel (NTK) regime, where neural networks behave linearly.

Introduction to Comet.ml

3 minute read

Published:

Comet is a neat alternative to the classic TensorBoard experiment tracker. It supports logging of experiment metrics, plots, gradients, and model weights, and provides an online dashboard similar to TensorBoard's. There are similar products such as Weights & Biases, but Comet has the best support on the Compute Canada clusters.

Neural tangent kernel

Characterizing catastrophic forgetting via the Neural Tangent Kernel

less than 1 minute read

Published:

Neural networks have achieved near-optimal performance on supervised learning tasks. However, when facing a sequence of tasks whose data distribution changes over time, they tend to forget what was learned in the past, a failure known as Catastrophic Forgetting (CF). This is one of the critical problems Continual Learning (CL) aims to solve. Although there has been plenty of empirical work studying this pathology, very few studies have tackled it from a theoretical angle. In this work, we provide a theoretical analysis of CF under the Neural Tangent Kernel (NTK) regime, where neural networks behave linearly.

Noise contrastive estimation

Normalizing flows

Normalizing flows in Pyro (PyTorch)

10 minute read

Published:

NFs (or more generally, invertible neural networks) have been used in:

  • Generative models with $1\times1$ invertible convolutions (link to paper)
  • Reinforcement learning, to improve upon the (not always optimal) Gaussian policy (link to paper)
  • Simulating attraction-repulsion forces in actor-critic (link to paper)

Orthogonal gradient descent

Characterizing catastrophic forgetting via the Neural Tangent Kernel

less than 1 minute read

Published:

Neural networks have achieved near-optimal performance on supervised learning tasks. However, when facing a sequence of tasks whose data distribution changes over time, they tend to forget what was learned in the past, a failure known as Catastrophic Forgetting (CF). This is one of the critical problems Continual Learning (CL) aims to solve. Although there has been plenty of empirical work studying this pathology, very few studies have tackled it from a theoretical angle. In this work, we provide a theoretical analysis of CF under the Neural Tangent Kernel (NTK) regime, where neural networks behave linearly.

PacMan

Procgen

PyTorch

Characterizing catastrophic forgetting via the Neural Tangent Kernel

less than 1 minute read

Published:

Neural networks have achieved near-optimal performance on supervised learning tasks. However, when facing a sequence of tasks whose data distribution changes over time, they tend to forget what was learned in the past, a failure known as Catastrophic Forgetting (CF). This is one of the critical problems Continual Learning (CL) aims to solve. Although there has been plenty of empirical work studying this pathology, very few studies have tackled it from a theoretical angle. In this work, we provide a theoretical analysis of CF under the Neural Tangent Kernel (NTK) regime, where neural networks behave linearly.

Introduction to Comet.ml

3 minute read

Published:

Comet is a neat alternative to the classic TensorBoard experiment tracker. It supports logging of experiment metrics, plots, gradients, and model weights, and provides an online dashboard similar to TensorBoard's. There are similar products such as Weights & Biases, but Comet has the best support on the Compute Canada clusters.

Normalizing flows in Pyro (PyTorch)

10 minute read

Published:

NFs (or more generally, invertible neural networks) have been used in:

  • Generative models with $1\times1$ invertible convolutions (link to paper)
  • Reinforcement learning, to improve upon the (not always optimal) Gaussian policy (link to paper)
  • Simulating attraction-repulsion forces in actor-critic (link to paper)

Pyro

Normalizing flows in Pyro (PyTorch)

10 minute read

Published:

NFs (or more generally, invertible neural networks) have been used in:

  • Generative models with $1\times1$ invertible convolutions (link to paper)
  • Reinforcement learning, to improve upon the (not always optimal) Gaussian policy (link to paper)
  • Simulating attraction-repulsion forces in actor-critic (link to paper)

Python

Characterizing catastrophic forgetting via the Neural Tangent Kernel

less than 1 minute read

Published:

Neural networks have achieved near-optimal performance on supervised learning tasks. However, when facing a sequence of tasks whose data distribution changes over time, they tend to forget what was learned in the past, a failure known as Catastrophic Forgetting (CF). This is one of the critical problems Continual Learning (CL) aims to solve. Although there has been plenty of empirical work studying this pathology, very few studies have tackled it from a theoretical angle. In this work, we provide a theoretical analysis of CF under the Neural Tangent Kernel (NTK) regime, where neural networks behave linearly.

Introduction to Comet.ml

3 minute read

Published:

Comet is a neat alternative to the classic TensorBoard experiment tracker. It supports logging of experiment metrics, plots, gradients, and model weights, and provides an online dashboard similar to TensorBoard's. There are similar products such as Weights & Biases, but Comet has the best support on the Compute Canada clusters.

REST API

Introduction to Comet.ml

3 minute read

Published:

Comet is a neat alternative to the classic TensorBoard experiment tracker. It supports logging of experiment metrics, plots, gradients, and model weights, and provides an online dashboard similar to TensorBoard's. There are similar products such as Weights & Biases, but Comet has the best support on the Compute Canada clusters.

Radial flows

Normalizing flows in Pyro (PyTorch)

10 minute read

Published:

NFs (or more generally, invertible neural networks) have been used in:

  • Generative models with $1\times1$ invertible convolutions (link to paper)
  • Reinforcement learning, to improve upon the (not always optimal) Gaussian policy (link to paper)
  • Simulating attraction-repulsion forces in actor-critic (link to paper)

Reinforcement learning

Representation learning

Characterizing catastrophic forgetting via the Neural Tangent Kernel

less than 1 minute read

Published:

Neural networks have achieved near-optimal performance on supervised learning tasks. However, when facing a sequence of tasks whose data distribution changes over time, they tend to forget what was learned in the past, a failure known as Catastrophic Forgetting (CF). This is one of the critical problems Continual Learning (CL) aims to solve. Although there has been plenty of empirical work studying this pathology, very few studies have tackled it from a theoretical angle. In this work, we provide a theoretical analysis of CF under the Neural Tangent Kernel (NTK) regime, where neural networks behave linearly.

Self-supervised learning

Supervised learning

Characterizing catastrophic forgetting via the Neural Tangent Kernel

less than 1 minute read

Published:

Neural networks have achieved near-optimal performance on supervised learning tasks. However, when facing a sequence of tasks whose data distribution changes over time, they tend to forget what was learned in the past, a failure known as Catastrophic Forgetting (CF). This is one of the critical problems Continual Learning (CL) aims to solve. Although there has been plenty of empirical work studying this pathology, very few studies have tackled it from a theoretical angle. In this work, we provide a theoretical analysis of CF under the Neural Tangent Kernel (NTK) regime, where neural networks behave linearly.

Variational inference

Normalizing flows in Pyro (PyTorch)

10 minute read

Published:

NFs (or more generally, invertible neural networks) have been used in:

  • Generative models with $1\times1$ invertible convolutions (link to paper)
  • Reinforcement learning, to improve upon the (not always optimal) Gaussian policy (link to paper)
  • Simulating attraction-repulsion forces in actor-critic (link to paper)