A unifying view of human induced pluripotent stem cells
The Allen Integrated Cell is a predictive, 3D model of human induced pluripotent stem cell (hiPSC) organization. It provides a realistic, data-driven 3D visualization of a living hiPSC in its pluripotent state, showing the many molecular machines and structures (organelles) inside the cell simultaneously. This integrated organization drives the cell’s basic functions, and these models provide a baseline for new models of different cell types, diseases, drug responses, and cellular environments.
The Allen Integrated Cell unites two technologies to improve our understanding of how human induced pluripotent stem cells (hiPSCs) vary in both shape and organization. One is a deterministic model, which shows the organization of organelles in particular individual cells; the other is a probabilistic model, which shows the likely shapes and locations of organelles in any cell, even cells we have not studied.
Below, you can interactively explore many examples of our cells - including some that are preparing to divide(!) - in both the 3D deterministic and probabilistic views of the Allen Integrated Cell.
How does the model work?
As described in our cell methods and microscopy methods pages, we collect 3D images of thousands of cells from our collection of cell lines. Each cell has one of 14 different proteins endogenously tagged with a fluorescent label, so that we can see a particular structure of interest inside the cell. We use this data as input for both the deterministic and probabilistic models.
In the deterministic model, our deep learning algorithms learn the relationship between the cell structure outlined by tagged voxels and an unlabeled brightfield image, by mining these relationships from our vast database of fluorescence images for each structure. The model can then predict where an organelle is from a single unlabeled brightfield image alone, allowing us to detect, and therefore integrate, many organelles from the same input image.
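The idea of learning a mapping from brightfield voxels to a tagged-structure channel can be sketched in miniature. The real model is a deep 3D convolutional network trained on real image pairs; the toy below is only a hypothetical stand-in that fits a linear map from small 2D brightfield patches to a synthetic "fluorescence" channel, then applies it to a new unlabeled image. All data, sizes, and the patch-regression approach here are illustrative assumptions, not the actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_patches(img, size=3):
    """Flatten the size x size neighborhood around each interior pixel."""
    h, w = img.shape
    r = size // 2
    return np.array([
        img[i - r:i + r + 1, j - r:j + r + 1].ravel()
        for i in range(r, h - r)
        for j in range(r, w - r)
    ])

# Synthetic "training pair": a brightfield-like image and a fluorescence
# channel that is, by construction, a simple function of it.
brightfield = rng.random((32, 32))
fluorescence = 0.5 * brightfield + 0.1  # hypothetical ground-truth relationship

patches = extract_patches(brightfield)                  # inputs: local patches
X = np.hstack([patches, np.ones((len(patches), 1))])    # add a bias column
y = fluorescence[1:-1, 1:-1].ravel()                    # target: tagged intensity
w, *_ = np.linalg.lstsq(X, y, rcond=None)               # "train" by least squares

# Predict the fluorescence channel for a new, unlabeled brightfield image.
new_bf = rng.random((32, 32))
new_patches = extract_patches(new_bf)
pred = np.hstack([new_patches, np.ones((len(new_patches), 1))]) @ w
true = (0.5 * new_bf + 0.1)[1:-1, 1:-1].ravel()
```

Once trained, the same input image can be pushed through many such structure-specific predictors, which is what makes it possible to integrate multiple organelles from one brightfield stack.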
The probabilistic model, also machine learning-based, allows us to generate a probabilistic view that shows the most probable location of all of the organelles of interest for any cell, based on the location and morphologies of the cell boundary and the nucleus. This model provides an estimate of where organelles are most likely to be located in a cell and how they will look.
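A minimal way to picture the probabilistic view: pool many training cells into a common cell-centered frame, average their organelle masks into a probability map, and read off the most likely location for any new cell of that geometry. The sketch below is a hypothetical 2D numpy toy with fabricated masks; the actual model is a machine learning model conditioned on the measured cell boundary and nucleus.

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (16, 16)

def fake_organelle_mask():
    """Fabricated training cell: the organelle tends to sit at one spot
    (biased to the right of the image center), with some jitter."""
    mask = np.zeros(shape)
    i = 8 + int(rng.choice([-1, 0, 0, 1]))   # staying put is twice as likely
    j = 11 + int(rng.choice([-1, 0, 0, 1]))
    mask[i, j] = 1.0
    return mask

# Average many aligned cells into a per-voxel probability map.
prob_map = np.mean([fake_organelle_mask() for _ in range(500)], axis=0)

# Most probable organelle location for any cell with this geometry.
peak = np.unravel_index(np.argmax(prob_map), shape)
```

The map answers "where is this organelle most likely to be, and how is that likelihood spread out?" rather than "where is it in this one cell", which is exactly the distinction drawn in the next question.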
It seems like the predicted channels from the probabilistic model do not always match up with the observed channels – why is that?
The probabilistic model provides an organelle’s most likely location and shape in any given cell, based on what has been observed for all cells. Therefore, the specific prediction will likely not match up with any particular single instance of measured data. This is expected behavior for models of this type.
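This behavior can be demonstrated with a toy numpy example (purely illustrative, not the actual model): a "most likely" prediction sits near the center of the population distribution, so it tracks the population well while matching no single measured cell exactly.

```python
import numpy as np

rng = np.random.default_rng(42)

# Fabricated measurements: each "cell" places an organelle at a position
# drawn from a shared distribution (true center 10.0, spread 2.0).
positions = rng.normal(loc=10.0, scale=2.0, size=200)

# A model-style "most likely" answer: the population's central tendency.
prediction = positions.mean()

# Per-cell mismatch between the shared prediction and each observation.
errors = np.abs(positions - prediction)
```

The prediction lands close to the true center, yet every individual cell deviates from it by a nonzero amount, which is the expected behavior described above.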
I’d really like to see the two types of models used by the Allen Integrated Cell in the same viewport...
We want you to be able to explore deterministic and probabilistic model predictions in the same viewer. We are currently implementing this feature and will make it available later this year.