Simulating microorganisms with machine learning
Currently training on this video. The ProcessVideos notebook expects the video (downloaded with youtube-dl) to be in notebooks/datasets/jams-germs/raw-videos, and exports the frames to ../frames/{title}.
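A minimal sketch of that frame-export step, assuming OpenCV and .mp4 input; the actual ProcessVideos notebook may differ:

```python
# Hedged sketch of the frame-export step, not the actual ProcessVideos code.
# Assumes OpenCV and .mp4 files; youtube-dl may also produce .mkv/.webm.
from pathlib import Path
import cv2

RAW_DIR = Path("notebooks/datasets/jams-germs/raw-videos")

for video_path in RAW_DIR.glob("*.mp4"):
    out_dir = RAW_DIR.parent / "frames" / video_path.stem  # ../frames/{title}
    out_dir.mkdir(parents=True, exist_ok=True)
    cap = cv2.VideoCapture(str(video_path))
    idx = 0
    while True:
        ok, frame = cap.read()  # read frames until the stream ends
        if not ok:
            break
        cv2.imwrite(str(out_dir / f"{idx:06d}.png"), frame)
        idx += 1
    cap.release()
```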
VideoModelUNet takes these frames and trains a recurrent diffusion model to generate videos.
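In outline, a recurrent video diffusion model denoises each frame while conditioning on the preceding frames. Here is a sketch of one such training step in PyTorch; the `unet` call signature, the cosine noise schedule, and conditioning by channel concatenation are illustrative assumptions, not necessarily VideoModelUNet's actual design:

```python
# Sketch of a next-frame diffusion training step (DDPM-style); the `unet`
# signature and noise schedule are assumptions, not the repo's actual code.
import torch
import torch.nn.functional as F

def train_step(unet, frames, optimizer, num_steps=1000):
    """frames: (batch, time, 3, H, W); the last frame is the prediction target."""
    context = frames[:, :-1].flatten(1, 2)      # previous frames as channels
    target = frames[:, -1]                      # (B, 3, H, W)
    t = torch.randint(0, num_steps, (frames.size(0),), device=frames.device)
    alpha_bar = torch.cos(t / num_steps * torch.pi / 2) ** 2   # cosine schedule
    a = alpha_bar.view(-1, 1, 1, 1)
    noise = torch.randn_like(target)
    noisy = a.sqrt() * target + (1 - a).sqrt() * noise         # forward process
    pred_noise = unet(torch.cat([noisy, context], dim=1), t)   # predict noise
    loss = F.mse_loss(pred_noise, noise)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

At sampling time, each generated frame is fed back in as context for the next one, which is what makes the model recurrent over time.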
Visualization of what the model sees during training:
Model output:
A goal for this project is to make a real-time simulation with a controllable organism. This requires a video with body movement, orientation, and camera movement tracked. The datasets
folder contains a Love2D project for manually annotating videos in this way and a tracking file for this video, though something like DeepLabCut will likely be used in the future.
These tracking points are passed to the model as a 3-channel image, where the first two channels encode the head orientation and the last channel encodes distance from the body. RGB and tracking are combined here only for visualization.
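One plausible way to rasterize a tracked pose into that 3-channel conditioning image; the function and encoding details below are hypothetical, and the annotation format in the datasets folder may differ:

```python
# Hypothetical rasterization of tracking points into the 3-channel
# conditioning image: channels 0-1 carry the head-orientation unit vector,
# channel 2 a normalized distance field from the body point.
import numpy as np

def tracking_image(head_xy, body_xy, size=(64, 64)):
    h, w = size
    img = np.zeros((3, h, w), dtype=np.float32)
    # Orientation as a unit vector pointing from body to head
    d = np.asarray(head_xy, np.float32) - np.asarray(body_xy, np.float32)
    d /= np.linalg.norm(d) + 1e-8
    img[0], img[1] = d[0], d[1]
    # Per-pixel distance from the body point, normalized by the diagonal
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    img[2] = np.hypot(xs - body_xy[0], ys - body_xy[1]) / np.hypot(w, h)
    return img

cond = tracking_image(head_xy=(40, 20), body_xy=(32, 32))  # (3, 64, 64)
```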
Output of the Controllable notebook: