Conda Environment Muckery
We have had to hack around `conda env` and `conda` behaving in ways that are inconsistent with how we want them to work for predictably shareable environments. Here are the workarounds and why we need them:
We use `scripts/split_pip.py` to split the original `environment.yml` file into a set of files that are installed separately. In the first version we defaulted to installing from the conda default channel, then conda-forge, then pip. In subsequent versions, we extend `environment.yml` to include a section called `channel-order` that gives the order in which to install via channels. This works around the weirdness that can be introduced when conda-forge is added into the mix and gives a clean defaults solve to build off of.
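A rough sketch of what such an extended `environment.yml` could look like (only the `channel-order` section name comes from this page; the exact layout and the package names are assumptions for illustration):

```yaml
name: analysis
# Assumed layout: channels are installed in this order, each as its own
# solve, so the defaults solve completes before conda-forge is mixed in.
channel-order:
  - defaults
  - conda-forge
  - pip
dependencies:
  - numpy=1.24
  - pandas>=2
  - pip:
      - some-pure-python-pkg
```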
We tried using derivative `.yml` files and `conda env`, but `conda env create` and `conda env update` frustratingly don't allow all the parameters that `conda create` and `conda install` do. In particular, `conda install` by default tries a solve in which it freezes the existing packages, but if that doesn't resolve, it drops to a flexible solve where already-installed packages may be altered. This can be managed with a `--freeze-installed` or `--no-update-deps` flag. `conda env`, however, seems to use the frozen solver by default. As far as I can tell, it doesn't use exactly the same default solver settings as `conda install`, and there are precious few parameters that `conda env` takes to modify this. Given how harebrained and difficult it is to get a solve with some of our heavy data science dependencies, this constraint is not worth having.
In short, with the exact same requirements, `conda install` will solve, but `conda env update` will not.