# Implicit Learning for Neuronal Representations

## Motivation

We train models to predict visual cortex responses in mice: a model takes an image or a video as input and predicts the neuronal responses. We treat part of the model weights as neuronal embeddings, where each embedding vector corresponds to one of the recorded cells. The idea is that unsupervised clustering on top of these cell embeddings can yield the desired cell-type classification.

## Project

Currently, a well-performing predictive model requires high-dimensional embeddings, which is problematic for clustering due to the curse of dimensionality. However, a rotation-equivariant CNN performs almost identically to a classical CNN, even though its internal dimensionality is eight times smaller. In this project we would like to use implicit learning to address the dimensionality issue. In particular, the goal is to learn low-dimensional parameters for each neuron together with a shared transformation that maps them to the original dimensionality; the shared transformation could be an MLP, for instance. Once the implicit network is sufficiently close to the non-implicit version in terms of performance, the goal would be to look into the structure of the learnt low-dimensional neuronal representations and compare them with the high-dimensional representations from the explicit network.
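The idea above could be sketched roughly as follows in PyTorch. All class names, dimensions, and the MLP architecture here are illustrative assumptions, not the project's actual implementation: each neuron owns a small learnable latent vector, and a single shared MLP maps all latents to the embedding dimensionality that the predictive model's readout expects.

```python
import torch
import torch.nn as nn

class ImplicitNeuronEmbedding(nn.Module):
    """Illustrative sketch (hypothetical names and sizes): per-neuron
    low-dimensional latents plus a shared MLP that lifts them to the
    original high-dimensional embedding space."""

    def __init__(self, n_neurons, latent_dim=4, embed_dim=64, hidden=128):
        super().__init__()
        # per-neuron low-dimensional parameters (the implicit representation)
        self.latents = nn.Parameter(torch.randn(n_neurons, latent_dim) * 0.1)
        # shared transformation from latent to the original dimensionality
        self.shared_mlp = nn.Sequential(
            nn.Linear(latent_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, embed_dim),
        )

    def forward(self):
        # returns a (n_neurons, embed_dim) matrix of embeddings,
        # usable wherever the explicit per-neuron embeddings were
        return self.shared_mlp(self.latents)

emb = ImplicitNeuronEmbedding(n_neurons=1000)
weights = emb()  # shape (1000, 64)
```

Both the latents and the shared MLP are trained end-to-end with the rest of the predictive model; after training, clustering would operate on the low-dimensional `latents` instead of the full embeddings.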

No prior knowledge in biology is required.

## Thesis

Within the context of this project, several research questions could be explored:

- Is it possible to match the performance of the explicit model with implicit learning? Is it possible to improve on it?
- Which parts of the shared transformation matter most for network performance? Nonlinearity? Depth?
- How reproducible are the latent neuronal representations (implicit parameters) across clustering runs and/or across model fits? (inspired by this work)

## Contact

To apply, please email Polina Turishcheva stating your interest in this project and detailing your relevant skills. A part of this project could also be done as a lab rotation.