Time-invariant embeddings to find connections between neuronal function and morphology
Motivation
We train models to predict the visual cortex responses of mice (a model takes an image or a video as input and predicts the neuronal responses). We treat part of the model weights as neuronal embeddings, where each embedding vector corresponds to an actual recorded cell. The idea is that, using these cell embeddings, we can derive the desired cell type classification via unsupervised clustering on top of the embeddings. Understanding cell types is important for studying how the visual system works, as cells are the building blocks of this system.
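As a concrete illustration, here is a minimal sketch of the clustering step, assuming the embeddings have already been extracted from a trained response-prediction model; the `embeddings` array, the embedding dimension, and the choice of a Gaussian mixture with 20 components are all hypothetical placeholders, not the project's fixed method:

```python
# Minimal sketch: unsupervised clustering of learned cell embeddings.
# `embeddings` is a hypothetical (n_cells, embedding_dim) array standing in
# for per-neuron weights extracted from a trained response-prediction model.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(10_000, 64))  # placeholder for real model weights

# Standardize features, then fit a mixture model; each mixture component
# is a candidate "functional cell type".
X = StandardScaler().fit_transform(embeddings)
gmm = GaussianMixture(n_components=20, covariance_type="full", random_state=0)
cluster_labels = gmm.fit_predict(X)
```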
Project
Recent work from the Allen Institute showed that functional embeddings can be learned and mapped to transcriptomically defined cell types. However, another paper from the Allen Institute emphasizes that cell types should be defined using all three modalities: function, morphology, and transcriptomics. While the Allen Brain dataset has limited morphology, the MICrONS dataset has both functional and morphological recordings for ~10,000 cells.
In this project we aim to adopt the training approach from the first paper and investigate how much information is shared between function and morphology using the MICrONS dataset.
The morphological embeddings and measurements will be provided; only the functional models need to be trained. No prior knowledge of biology is required.
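One simple way to quantify shared information between the two modalities is canonical correlation analysis (CCA) on held-out cells. The sketch below is one possible baseline, not the prescribed method; both arrays are hypothetical stand-ins for the learned functional embeddings and the provided morphological embeddings, aligned row-wise by cell ID:

```python
# Minimal sketch: shared information between functional and morphological
# embeddings via CCA, evaluated on held-out cells.
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
functional = rng.normal(size=(10_000, 64))    # hypothetical functional embeddings
morphological = rng.normal(size=(10_000, 32)) # hypothetical morphological embeddings

F_train, F_test, M_train, M_test = train_test_split(
    functional, morphological, test_size=0.2, random_state=0
)
cca = CCA(n_components=10).fit(F_train, M_train)
F_c, M_c = cca.transform(F_test, M_test)

# Per-component canonical correlations on the test split; higher values
# indicate more shared structure between the two modalities.
corrs = [np.corrcoef(F_c[:, i], M_c[:, i])[0, 1] for i in range(F_c.shape[1])]
print(corrs)
```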
Thesis
Within the context of this project, several research questions could be explored:
- Are the embeddings from this approach better than embeddings from CNN models [1, 2]?
  - in terms of shared information with morphology?
  - in terms of reproducibility across clusterings and/or across model fits (inspired by this work; see the sketch after this list)?
- What is the best way to align information across modalities?
- Which axes of information seem most aligned between the modalities?
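For the reproducibility question above, a common score is the adjusted Rand index (ARI) between cluster assignments obtained from independently trained models. This is a sketch under assumed names: `embeddings_by_seed`, the k-means clustering, and the cluster count are illustrative choices, not fixed parts of the project:

```python
# Minimal sketch: reproducibility of clusterings across model fits,
# scored with the adjusted Rand index (ARI).
import numpy as np
from itertools import combinations
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(0)
# Hypothetical embeddings from three independently trained model fits,
# each a (n_cells, embedding_dim) array over the same cells.
embeddings_by_seed = [rng.normal(size=(1_000, 64)) for _ in range(3)]

labelings = [
    KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(E)
    for E in embeddings_by_seed
]

# Pairwise ARI: 1.0 means identical partitions, ~0.0 means chance agreement.
for (i, a), (j, b) in combinations(enumerate(labelings), 2):
    print(f"fits {i} vs {j}: ARI = {adjusted_rand_score(a, b):.3f}")
```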
Contact
To apply, please email Polina Turishcheva stating your interest in this project and detailing your relevant skills. A part of this project could also be done as a lab rotation.