BLT/CMAKE build system #6
Conversation
@jmikeowen I have updated this branch to build the latest changes in master.
OK, I tried to start the build, but when I run cmake I quickly get this error:

CMake Error at CMakeLists.txt:23 (message): call cmake with -DSPHERAL_BLT_DIR=/your/installation/of/blt

So how are we supposed to get BLT? If it's not automated, we should figure out how to do that so users have a simple install.
@jmikeowen BLT is pulled in as a submodule of the project, so you need to use --recursive when cloning the repo branch. If you've already cloned without it, run git submodule init followed by git submodule update.
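For reference, a minimal sketch of both routes (the URL and branch name are taken from the build walkthrough later in this thread):

```sh
# Clone the branch with submodules pulled in one step:
git clone --recursive -b blt-cmake-overhall https://github.com/jmikeowen/spheral

# Or, if the repo was already cloned without --recursive:
cd spheral
git submodule update --init --recursive
```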
Where are the host-config files for each build (intel, gcc, clang, etc.), or where are you setting the compilers?
@ptsuji I was just setting the compilers with -DCMAKE_CXX_COMPILER during the config stage.
I think having a host-config directory with specified compilers and MPI would be really helpful; you could choose one of the builds to be the default on a given platform if no host-config file is specified. There are a few in the cmake/blt directory, but I'd take a look at some of the ALE3D toss3 host-config files (I gave you these on rzgenie).

Otherwise, BLT seems to grab whatever MPI is in the path set by the modules, which may not match the default compilers. I didn't specify anything, but when I called cmake, BLT grabbed GCC 4.9.3 as the compiler while looking at Intel 19.0.4 / mvapich2-2.3 libraries for MPI. My CMakeCache.txt file shows the following:

BLT_MPI_LIBRARIES:STRING=/usr/tce/packages/mvapich2/mvapich2-2.3-intel-19.0.4/lib/libmpicxx.so;/usr/tce/packages/mvapich2/mvapich2-2.3-intel-19.0.4/lib/libmpi.so
BLT_MPI_LINK_FLAGS:STRING=-Wl,-rpath -Wl,/usr/tce/packages/mvapich2/mvapich2-2.3-intel-19.0.4/lib
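Until host-config files land, one workaround sketch is to pin the compiler and the matching MPI wrapper explicitly at configure time (the compiler path below is hypothetical, and MPI_CXX_COMPILER is the standard CMake FindMPI hint rather than anything Spheral-specific):

```sh
# Pin both the C++ compiler and the matching MPI wrapper so BLT/FindMPI
# can't mix toolchains pulled in from the module environment:
cmake \
  -DCMAKE_CXX_COMPILER=/usr/tce/packages/intel/intel-19.0.4/bin/icpc \
  -DMPI_CXX_COMPILER=/usr/tce/packages/mvapich2/mvapich2-2.3-intel-19.0.4/bin/mpicxx \
  ..
```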
Another question I had: when third-party libraries are built, they seem to be installed into a directory called tpl in the top-level directory. Is there any way to append to or specify this directory name? One problem I can see someone running into: if I build with one compiler set, say Intel 19.0.4, the libraries get built with the Intel compilers and installed into tpl. If I then do another build with a different compiler set, say GCC 8.1.0, I would make another build directory in src and call cmake; it will look in the tpl directory and find the TPLs already built, even though I don't want to link against them (since they were built with a different compiler).
The host-config files do sound like a good idea; I'll take a look at those files and attempt to mirror something similar for Spheral. On the subject of the TPL location, I agree this could cause issues; it's something Mike and I discussed offline and were already planning to integrate. Thinking about it now, and given your first point, auto-generating a directory name from the compiler and distro would probably be a useful feature too.
Okay, I think we're getting closer to being able to merge this in. There is some documentation on the Spheral Confluence page that outlines some of the more common build options, if you're interested. As for the most recent changes:

@ptsuji You can now specify an out-of-source install location for the TPLs and include/lib files with -DSPHERAL_INSTALL_DIR=<path>; let me know if this works for you as expected. As for the host-config files, we're thinking of doing those in a separate PR.

@jmikeowen After the first build you can now build locally from cached files without a network connection. I believe I have this feature working properly; I built offline and ran tests successfully. Please give it a go yourself and let me know if I've missed anything or if something breaks.
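A sketch of how per-toolchain builds could look with that option (the install path and build-dir name here are just examples):

```sh
# One build tree and one install prefix per compiler, so TPLs built with
# different toolchains never collide:
mkdir BUILD-gcc-8.1.0 && cd BUILD-gcc-8.1.0
cmake -DCMAKE_CXX_COMPILER=g++ \
      -DSPHERAL_INSTALL_DIR=$HOME/spheral-installs/gcc-8.1.0 \
      ..
```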
* commit '6a971c4b578bb5f2c53d9f63ffcddff57c20979b': Added unordered_map as a possible FieldList type.
This overall looks pretty good. I think we're still missing some of the third party libraries built by default by the autoconf system, but those can be added as we go.
Okay, I think this is finally ready for an initial review.
This PR presents a CMake-based build system for the entirety of Spheral. It is similar in function to the autotools implementation in that it handles the download and install of all necessary TPLs and builds the C++ libs as well as the Python binding libs. There are, however, some notable differences between the operation of this CMake system and the existing autotools system:
TPL Installations -
TPLs are downloaded and built during the configuration stage of the CMake process. If the user wishes to point at a pre-built install of a given TPL, they can do so at configuration time with -D<TPL>_DIR=<path>; the build system will then attempt to find the necessary libraries and header files in that directory tree. Users can also turn off TPL installation entirely with -DINSTALL_TPLS=Off. Of course, once the TPLs are installed they will not need to be installed again.
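For example (Boost here is just a stand-in for whichever TPL you care about; the variable follows the -D<TPL>_DIR pattern above):

```sh
# Point the build at a pre-built TPL instead of downloading it:
cmake -DBOOST_DIR=/path/to/prebuilt/boost ..

# Or skip TPL installation entirely:
cmake -DINSTALL_TPLS=Off ..
```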
TPL Location -
Due to the nature of CMake, the TPLs are not downloaded inside the BUILD directory as they are with the autotools system, since targets cannot be pointed at paths inside CMake's build directory at configure time. They are therefore installed into the Spheral root folder at Spheral/tpl. This location can be changed in the future, or we can make it a user-defined directory via a CMake option.
BUILD Dir -
As with the autotools system, the user should create their BUILD dir in Spheral/src/BUILD. Originally I would have preferred Spheral/BUILD, but to keep the TPLs within the project I had to move it into src.
C++ package libs -
Currently this system only builds shared libraries for the C++ libs, as those are needed to create the Python binding package libs. @ptsuji I noticed you were building static libs. Do you need static libs, or are shared libs okay? I can build both, make static-only an option when ENABLE_CXXONLY=On, or create a BUILD_STATIC flag.
BLT -
BLT is now added as a submodule of the project and lives in the Spheral project root directory.
MPI/OpenMP -
MPI and OpenMP are enabled by default. To disable them, use -DENABLE_MPI=Off and/or -DENABLE_OPENMP=Off.
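For example, a configure line with both disabled:

```sh
# Configure with MPI and OpenMP both turned off:
cmake -DENABLE_MPI=Off -DENABLE_OPENMP=Off ..
```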
Installation -
If ENABLE_CXXONLY=On, all libs and includes for Spheral are installed at <BUILD_DIR>/Spheral.
If ENABLE_CXXONLY=Off, libs and includes can still be found in <BUILD_DIR>/Spheral; however, all libs and the appropriate Python targets will also be installed to the given Python site-packages dir, similar to the autotools system.
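For example, a C++-only configure and install, run from the build dir (assuming the lib/include layout described above):

```sh
# C++-only build: all libs and headers land under <BUILD_DIR>/Spheral
cmake -DENABLE_CXXONLY=On ..
make install
ls Spheral/lib Spheral/include
```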
Build Start2Finish -
```sh
git clone --recursive -b blt-cmake-overhall https://github.com/jmikeowen/spheral
cd spheral/src
mkdir BUILD && cd BUILD
cmake ..
make -j72 install
../../python/bin/python2.7 -c "import Spheral"
```
@ptsuji I can't seem to add you as a reviewer for this PR (maybe @jmikeowen knows how we can do this?), but if you could look it over and provide feedback that would be brilliant. I want to make sure that when we merge this we have made all the necessary changes to ensure it is compatible with your work.