diff --git a/src/docs/sphinx/dev_guide/release.rst b/src/docs/sphinx/dev_guide/release.rst
index 03bfb9e0a1..aece0f2635 100644
--- a/src/docs/sphinx/dev_guide/release.rst
+++ b/src/docs/sphinx/dev_guide/release.rst
@@ -39,41 +39,65 @@ Here are the steps to follow for an Axom release.
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 Create a release candidate branch off of the develop branch to initiate a
-release. The name of a release branch must contain the associated release version
-number. Typically, we use a name like v0.5.0-rc
+release. The name of a release branch should contain the associated release
+version number. Typically, we use a name like v0.5.0-rc
 (i.e., version 0.5.0 release candidate). See :ref:`semver-label` for a
 description of how version numbers are chosen.
 
-2: Issue a Pull Request
+2: Create a Pull Request
 ^^^^^^^^^^^^^^^^^^^^^^^^
 
+The release candidate branch, when complete, reviewed, and approved, will be
+merged into main so that a release tag can be generated for that merge commit.
 Create a pull request to merge the release candidate branch into main so that
-release changes can be reviewed. Such changes include:
+release changes can be made and reviewed.
+
+.. note:: Typically, when a release is being prepared it will have been months
+          since the previous release and the main branch has changed. Thus,
+          the number of files changed by the release candidate merge into main
+          will be large. Fortunately, most of those changes have been reviewed
+          and merged into the develop branch and do not require
+          additional review. Therefore, it is helpful to create a companion
+          pull request to merge the release candidate branch into develop.
+          This pull request will not be merged, but will be much easier for
+          the team to review. To facilitate the process, cross reference the
+          two pull requests, and note in the summary of the one into main
+          that the team should review the one into develop but approve the
+          one into main. (Whew! Hopefully, that is clear!)
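As a small, hedged sketch of the branch-naming convention described above (the version number ``0.5.0`` is only a stand-in for whatever the next release actually is), deriving the release candidate branch name might look like:

```shell
# Hypothetical sketch of the naming convention above; "0.5.0" stands in
# for the actual next release version.
version="0.5.0"
rc_branch="v${version}-rc"
echo "${rc_branch}"    # prints: v0.5.0-rc

# Creating the branch off develop would then look like the following
# (shown as comments, since this sketch is not run inside a clone):
#   git checkout develop && git pull
#   git checkout -b "${rc_branch}"
#   git push -u origin "${rc_branch}"
```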
+ +Typical changes that are made in a release candidate branch include: #. Update the version information (major, minor, and patch version numbers) at the top of the ``axom/src/cmake/AxomVersion.cmake`` file and in the ``axom/RELEASE`` file. -#. Update the release notes in ``axom/RELEASE-NOTES.md`` by adding the - release version number and release date in the heading, as well as, - the corresponding link to the version on GitHub. +#. Update the notes in the section for the new release in the file + ``axom/RELEASE-NOTES.md``. Add the release version number and release date + in the section heading and add a link to the new version on GitHub at the + bottom of the file. -#. Update the mail map in ``axom/.mailmap`` by adding the names and emails - of new contributors since the last release. +#. Update the mail map in ``axom/.mailmap``, if needed, by adding names and + emails of new contributors since the last release. #. Update the citations in ``axom/CITATION.cff`` by adding the names of new LLNL contributors since the last release. #. Test the code by running it through all continuous integration tests - and builds. This will ensure that all build configurations are working - properly and all tests pass. - -#. Fix any issues discovered during final release testing if code changes - are reasonably small and re-run appropriate tests to ensure issues are - resolved. If a major bug is discovered, and it requires significant - code modifications to fix, do not fix it on the release branch. - `Create a new GitHub issue for it `_ - and note it in the ``known bugs`` section of the release notes. + and builds. This will be done automatically when the release pull request is + made. All build configurations must compile properly and all tests must pass + before the pull request can be merged. + +#. 
Fix any issues discovered during final release testing in the release
+   candidate branch if code changes are reasonably small, and re-run
+   appropriate tests to ensure issues are resolved. If a major bug is
+   discovered, and it requires significant code modifications to fix,
+   do not fix it on the release branch. `Create a new GitHub issue for it
+   `_ and note it in the ``known bugs``
+   section of the release notes. Alternatively, if time permits, fix the
+   bug in a different branch and create a pull request as you would do during
+   regular development. After the bug is resolved and that pull request is
+   merged into develop, merge develop into the release candidate branch,
+   where CI checks will run again.
 
 #. Make sure all documentation (source code, user guides, etc.) is
    updated and reviewed. This should not be a substantial undertaking as
@@ -84,12 +108,19 @@ release changes can be reviewed. Such changes include:
    should be updated during the regular development cycle. See
    :ref:`release-notes-label` for information about release notes.
 
+.. important:: It is good practice to have everyone on the team review the
+               release notes to ensure that they are complete, correct, and
+               sufficiently descriptive so that users understand the content
+               of the release. **Please make sure the section for the new
+               release follows the same organization as in previous release
+               sections.**
+
 3: Merge Release Candidate
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
 Merge the release candidate branch into the main branch once it is ready and
-approved. Do not "squash merge:" that will make the histories of main and
-release branches disagree, and we want to preserve the history. After
+approved. Do not "squash merge" as it will make the histories of main and
+develop branches disagree, and we want to preserve the history. After
 merging, the release candidate branch can be deleted.
 
@@ -140,7 +171,9 @@ merging, the release candidate branch can be deleted.
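The version-update step in the release-candidate checklist above can be sketched as follows. The file contents and variable names below are assumptions for illustration only; check ``axom/src/cmake/AxomVersion.cmake`` for the real ones:

```shell
# Hedged sketch: bump version fields in a CMake-style version file.
# The variable names are hypothetical, not verified against AxomVersion.cmake.
workdir=$(mktemp -d)
cat > "${workdir}/AxomVersion.cmake" <<'EOF'
set(AXOM_VERSION_MAJOR 0)
set(AXOM_VERSION_MINOR 4)
set(AXOM_VERSION_PATCH 1)
EOF

# Bump 0.4.1 -> 0.5.0 by rewriting the minor and patch fields.
sed -e 's/\(AXOM_VERSION_MINOR\) [0-9]*/\1 5/' \
    -e 's/\(AXOM_VERSION_PATCH\) [0-9]*/\1 0/' \
    "${workdir}/AxomVersion.cmake" > "${workdir}/AxomVersion.cmake.new"
mv "${workdir}/AxomVersion.cmake.new" "${workdir}/AxomVersion.cmake"
cat "${workdir}/AxomVersion.cmake"
```

The same edit would also need to be mirrored in the ``axom/RELEASE`` file, as the checklist notes.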
6: Merge Main to Develop ^^^^^^^^^^^^^^^^^^^^^^^^^^^ -Create a pull request to merge main into develop. When approved, merge it. +Create a pull request to merge main into develop so that changes in the +release candidate branch are integrated into subsequent Axom development. +When approved, merge it. .. _release-notes-label: diff --git a/src/docs/sphinx/dev_guide/updating_tpls.rst b/src/docs/sphinx/dev_guide/updating_tpls.rst index 3b1e809954..260fd9ca8a 100644 --- a/src/docs/sphinx/dev_guide/updating_tpls.rst +++ b/src/docs/sphinx/dev_guide/updating_tpls.rst @@ -5,14 +5,15 @@ .. _tpls-label: -********************* -Third-party Libraries -********************* +**************************** +Third-party Libraries (TPLs) +**************************** Axom dependencies are grouped into four categories: Git submodules, -built-in Third-party Libraries (TPLs) in the Axom source tree, system-level +built-in TPLs in the Axom source tree, system-level TPLs, and other TPL libraries. The following sections describe how to -install and update these dependencies for Axom. +install and update these dependencies for Axom. Specifically, the sections +should provide Axom developers with answers to questions such as: * How does one add a new compiler or platform to the mix? * How does one build a new set of TPLs for a single platform or compiler @@ -22,77 +23,82 @@ install and update these dependencies for Axom. different branches? * How to use the scripts for team TPL support vs. local development and experimentation? - * Others? Determinism ----------- We strive for as close to deterministic behavior in our builds as possible. 
-By this, we mean that repeated builds should act the same in the following
-regards:
+By this, we mean that repeated builds should be the same with respect to the
+following:
 
-* Set of libraries with their options and versions
-* Compilers, compiler flags, and versions
-* Installed file and directory structure with permissions
+* Set of libraries, with the version and compile-time options of each
+* Compilers, versions, and compiler flags
+* Structure and permissions of the installed directories and files
 
 ===========================================
-Build Scripts and Their Configuration Files
+Build Scripts and Configuration Files
 ===========================================
 
-There are three levels of build scripts or programs that drive TPL builds.
-As you move up the levels, and away from Spack, the scripts require less
-configuration and even build multiple sets of TPLs and/or Axom configurations
-at a time.
+There are three levels of build scripts or programs that drive Axom TPL builds.
+As you move up levels, away from Spack, the scripts require less configuration
+and can build multiple sets of TPLs and/or Axom configurations with a single
+script invocation.
 
-Here is a brief description of what the levels are handling and what important
-configuration and input files they use, from lowest level to highest.
+The following sections provide brief descriptions of what each level does and
+what important configuration and input files it uses. The sections appear from
+lowest level to highest.
+
+After these levels are described, we discuss Axom development processes that use them.
 
 Level 1: Spack
 --------------
 
 Spack is a multi-platform package manager that builds and installs multiple versions
-and configurations of software packages. It has recipes on how to build each package
-with variants on each package to customize them to your needs. For example, Axom
-has variants for Fortran and MPI, among others.
These recipes handle how to drive -the individual packages build systems, as well as any packages they depend on. +and configurations of software packages. It has recipes for building each package +with available variants to customize it to your needs. For example, Axom +has variants for Fortran, MPI, and others. The recipes drive +the individual package build systems and manage packages they depend on. Spack also handles system level packages, so you can describe where they are on your -system instead of building them from scratch. You will need to describe which compilers +system instead of building them from scratch. You will need to describe which compilers are available on your system as well. -* Platform specific configuration files live under ``scripts/spack/configs/``. - There is one file (``spack.yaml``) per platform that handles the following: +* Platform specific configuration files live under ``axom/scripts/spack/configs/``. + There is one file (``spack.yaml``) per platform that has sections describing: * ``compilers``: This section contains the compiler specs that describe the location - and any other required information about that compiler. For example, compiler or - linker flags. - * ``packages``: This section describes the system level packages. For example, - where they are located and what version they are. This file is very important - due to its ability to drastically reduce the amount of packages that Spack builds. - -* Axom specific Spack package files live under ``scripts/spack/packages``. These override - the package files that live in Spack's repository here ``var/spack/repos/builtin/packages``. - We try to minimize these but we have had to alter the existing packages to apply fixes before - pushing them up to Spack proper or alterations to the recipes that are Axom specific. - This overriding does not happen at the Spack level, but at the next level, Uberenv. 
-* `Spack's GitHub repo `_
-* `Spack's documentation `_
+    and any other required information for each compiler. For example, a compiler spec
+    contains compiler and version, build and linker flags, etc.
+  * ``packages``: This section describes system level packages. For example, version and
+    location on the filesystem. This file is very important
+    due to its ability to drastically reduce the number of packages that Spack builds.
+
+* Axom specific Spack package files live under ``axom/scripts/spack/packages``. These override
+  the package files in Spack's repository under ``var/spack/repos/builtin/packages``.
+  We try to minimize these, but we often have to alter the existing Spack packages to apply fixes
+  before pushing them up to Spack proper or alter recipes in ways that are Axom specific.
+  Such overrides do not happen at the Spack level, but at the next level, Uberenv,
+  described below.
+
+* More detailed information about Spack can be found in the
+  `Spack GitHub repo `_
+  or in the `Spack documentation `_
 
 .. note::
-   Spack does not stop at the first error. It attempts to build as many packages
-   as possible. Due to this, finding the actual error can sometimes be hard but looking
-   through the log for a large indented section will help. The error will
-   be in that section and also a message with a path to the full log will be printed
-   by Spack afterwards. Searching for ``-build-out.txt`` in your output should
+   Spack does not stop at the first error it encounters. It attempts to build as many packages
+   as possible. As a result, finding the root cause of an error can be difficult. However, looking
+   through the log file, whose name will appear in the screen output, for a large indented section
+   will help. The error will be in that section and a message with a path to the full log
+   file will be printed by Spack afterwards. Searching for ``-build-out.txt`` in your output will
    help.
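To illustrate the note above, here is a small, self-contained sketch of filtering saved Spack output for the failing package's build log. The log text below is fabricated for the example; real Spack output differs, but the ``-build-out.txt`` pointer is the part worth searching for:

```shell
# Hedged sketch: the Spack output quoted here is made up for illustration.
# Grepping for "build-out.txt" locates the path to the failing package's
# full build log.
log=$(mktemp)
cat > "${log}" <<'EOF'
==> Installing hdf5
==> Installing fmt
    1 error found in build log:
      See build log for details:
        /tmp/spack-stage/fmt-build-out.txt
EOF
grep 'build-out.txt' "${log}"   # prints the line with the build-log path
```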
Level 1: Vcpkg
--------------
 
-Vcpkg is an open-source C++ Library Manager for Windows, Linux, and MacOS by Microsoft.
-Axom only uses it for our Windows TPL builds.
+Vcpkg is an open-source C++ library manager for Windows, Linux, and MacOS by Microsoft.
+For Axom, we use it only for Windows TPL builds.
 
-* Project specific package files live under ``develop/scripts/vcpkg_ports``. There are
+* Project specific package files live under ``axom/scripts/vcpkg_ports``. There are
   two different files for each package:
 
   * ``portfile.cmake``: This file is the recipe on how to build the package. Vcpkg
@@ -101,68 +107,76 @@ Axom only uses it for our Windows TPL builds.
   * ``vcpkg.json``: This is the manifest file that describes information about
     the package. For example, dependencies, license information, and optional features.
 
-* `Vcpkg's GitHub repo `_
-* `Vcpkg's documentation `_
+* More detailed information about Vcpkg can be found in the
+  `Vcpkg GitHub repo `_
+  and in the `Vcpkg documentation `_
 
 Level 2: Uberenv
 ----------------
 
-Uberenv simplifies the use of two level 1 package managers, Spack and Vcpkg.
-We rely on Uberenv for two major points: reducing multiple commands into one
-and adding as much determinism as possible. The basic workflow in Uberenv is
-the following:
+Uberenv simplifies the use of the two level 1 package managers, Spack and Vcpkg.
+We rely on Uberenv for two important things: to collapse multiple Spack commands into
+one, and to add as much determinism as possible to our use of the level 1 package managers.
+The basic workflow in Uberenv is the following:
 
-#. Setup necessary paths and directories like the base directory where the
+#. Set up necessary paths and directories such as the base directory where the
    package manager will be installed.
-#. Clone the package manager to the specific Git commit.
-#. Apply patches to package manager. For example, disabling extra config scopes in Spack.
-#. 
Adds our repositories package repository to Spack, so our packages take precedence.
-#. Clean previous temporary information from previous runs that may bleed into this run.
+#. Clone the package manager to a specific Git commit.
+#. Apply patches to the package manager. For example, disable extra config scopes in Spack.
+#. Add Axom's package repository to Spack, so our packages take precedence.
+#. Clean temporary information from previous runs that may bleed into a new run.
 #. Optionally create a package source mirror.
 #. Install packages via the selected package manager.
 
-* ``.uberenv_config.json``: This file describes project specific configurations,
-  such as, where to download the package manager, what git commit to use, and
-  the top level package to install.
-* `Uberenv's GitHub repo `_
-* `Uberenv's documentation `_
+* The information provided to Uberenv to start this workflow is defined in one file in
+  the top-level Axom source directory:
+
+  * ``.uberenv_config.json``: This file describes project specific configurations,
+    such as where to download the package manager, what git commit to use, and
+    the top level package to install.
+
+* More detailed information about Uberenv can be found in the
+  `Uberenv GitHub repo `_
+  and in the `Uberenv documentation `_
 
 .. note::
-   Uberenv's warnings and errors are easy to find by searching the output for ``[ERROR:``
-   or ``[Warning:``. Uberenv will stop at the first error.
+   Uberenv warnings and errors are easy to find by searching the output for ``[ERROR:``
+   or ``[Warning:``. Unlike Spack, Uberenv will stop at the first error it encounters.
 
 Level 3: Build Scripts
 ----------------------
 
 The file ``axom/scripts/spack/specs.json`` contains a list of all specs
-required per platform or machine name. These specs automatically handle
+that we share for Axom development and GitLab CI testing on the LC platforms
+we use. 
The specs automatically handle
 platform differences and contain the full list of compilers and package specs
 required to build.
 
 The directory ``axom/scripts/llnl_scripts`` contains three "build" scripts that
-are designed to handle building suites of TPLs via Uberenv and Spack.
-
-* ``build_tpls.py``: This script starts by building all TPLs listed in the file
-  ``specs.json``. It will generate host-config files and copy them to the base
-  of the Axom repository. After building all of the TPLs, it will test Axom
-  against those built TPLs as well as test the installed ``using-with-cmake``
-  example for correctness. This script stops at the first failed TPL build but
-  attempts to build all host-configs against the Axom source with a summary at
-  the end of which succeeded or failed.
-* ``build_src.py``: This script takes the existing host-configs, or the
-  specific one you point at, and builds and tests Axom against them. It also
-  tests the ``using-with-cmake`` examples.
+are designed to build suites of TPLs via Uberenv and Spack.
+
+* ``build_tpls.py``: First, this script builds a set of TPLs for each of the specs
+  listed in the ``specs.json`` file for the platform on which it is run. For each TPL set,
+  it will generate a host-config file and copy it to the top-level directory of the local
+  copy of the Axom repository in which the script is run. After building all of the TPL
+  sets for a platform, it will attempt to build Axom and the ``using-with-cmake`` example
+  against each set of TPLs. This script stops at the first failed TPL build but
+  attempts to build the Axom source with each host-config. It will output a summary at
+  the end indicating which Axom builds succeeded or failed.
+* ``build_src.py``: This script uses the existing host-configs in your local clone of the
+  Axom repo, or a specific one you point at, and builds and tests Axom against them. It also
+  tests the Axom installation via the ``using-with-cmake`` example and Axom tutorials.
* ``build_devtools.py``: This script builds and installs the developer tools listed in the ``axom/scripts/spack/packages/axomdevtools/package.py`` Spack - package. It also uses a different set of Spack configs located in the - ``scripts/spack/devtools_config`` directory, so that the regular Spack configs + package. It uses the set of Spack configs located in the + ``axom/scripts/spack/devtools_config`` directory, so that the regular Spack configs can reuse previously built developer tools. .. note:: - Due to the large amount of information printed to the screen over a full build, the build scripts - redirect most build step output to log files. They will not only tell you what command is being run, - i.e., ``[exe: some/command --with-options]``, but it will tell you the log file being written - to before it redirects the output from the command, i.e., ``[[log file: /path/to/log``. + Due to the large amount of information printed to the screen during a full build, the build scripts + redirect most build step output to log files. This output will tell you what command is being run, + i.e., ``[exe: some/command --with-options]``, and will tell you the log file being written + to before it redirects the output from a command, i.e., ``[[log file: /path/to/log``. ============= @@ -172,105 +186,119 @@ Updating TPLs Git submodules -------------- -Currently, Axom uses three external packages that appear in the repo -as Git submodules. These are the following, including the location of the -package in the Axom source tree: +Currently, Axom uses four external packages that appear in the project repo +as Git submodules. These are: - * `BLT `_, which is the CMake-based build + * `BLT `_, the CMake-based build system we use. It is located in ``axom/src/cmake/blt``. - * `Axom Data `_, which is a collection + * `Axom Data `_, a collection of data files used in testing Axom. It is located in ``axom/data``. 
  * `Uberenv `_, which contains Python
    scripts we use to help automate building third-party dependencies for
    development and deployment. It is located in ``axom/scripts/uberenv``.
+  * `RADIUSS Spack Configs `_,
+    which contains Spack packages for some of our LLNL-developed TPLs. It is
+    located in ``axom/scripts/spack/radiuss-spack-configs``.
 
 There is no software installation process for these dependencies in the
 traditional sense. To update one of these packages in Axom, simply go into
-its directory in Axom and check out a new version. If a version is intended
-to be changed in the Axom repo, make the version change on a branch and
-submit a GitHub pull request as you would do for other software changes.
-More info on :ref:`building-axom-label`.
+the directory where the submodule lives in Axom and check out a new version.
+If a version is intended to be changed in the Axom repo, make the version change
+on a branch and submit a GitHub pull request as you would do for other software
+changes. More info on :ref:`building-axom-label`.
 
 Built-in TPLs
 -------------
 
 Axom uses several lightweight, header-only libraries internally, which are
-exposed for downstream customers to use if they wish.
+exposed for downstream customers to use if they wish. These are:
 
-  * `CLI11 `_ is a command line parser
-    for C++11 and beyond that provides a rich feature set with a simple and
+  * `CLI11 `_, a command line parser
+    for C++11 and beyond that provides a rich feature set with a simple and
     intuitive interface.
-  * `fmt `_ is an open-source formatting
+  * `fmt `_, an open-source formatting
     library providing a fast and safe alternative to C stdio and C++ iostreams.
-  * `sol `_ is a C++ library binding to Lua.
-  * `Sparsehash `_ contains several
-    hash-map implementations.
+  * `sol `_, a C++ library binding to Lua.
+  * `Sparsehash `_, which contains
+    several hash-map implementations.
 
-.. note:: Axom patches all built-in TPLs to be under the ``axom`` namespace.
-          This is to prevent symbol collisions with other projects, either our
-          dependencies or downstream customers who wish their own versions. For
+.. note:: Axom patches its built-in TPLs so that they reside in the ``axom`` namespace,
+          which prevents symbol collisions with other projects, either our
+          dependencies or downstream customers who wish to use their own versions. For
           example, ``fmt::format("foo")`` is ``axom::fmt::format("foo")``.
 
-They can be found in the directory: ``axom/src/thirdparty/axom``. The basic
-instructions on how to update a built-in TPL are as follows:
+These TPLs are located in the directory: ``axom/src/thirdparty/axom``. The basic
+instructions on how to update a built-in TPL are:
 
 #. Download the new release and override the source that is already there.
-   This can often involve removing files no-longer needed but most of the
-   current ones are a single header file.
+   This may involve removing files no longer needed.
 
-#. Review and apply the existing patch files. More than likely, you will not
-   be able to directly apply the patch but it will give you the general idea
-   on what needs to be applied. For example, the namespace update mentioned above.
+#. Review and apply the existing patch files in the ``axom/src/thirdparty/axom``
+   directory. More than likely, you will not be able to directly apply the patch
+   file because the source of the library is different from the current version.
+   However, the patch files give the general idea of what needs to be changed.
+   For example, inclusion in the ``axom`` namespace mentioned above.
 
-#. Ensure that the build and tests still pass. More info on :ref:`testing-label`.
+#. Ensure that the code builds and tests pass. For more information, please see :ref:`testing-label`.
 
-#. Follow the normal pull request work flow. More info on :ref:`pullrequest-label`.
+#. Follow the normal pull request workflow. For more information, please see :ref:`pullrequest-label`.
 
..
_local-tpls-label: Local Third-party Library Installation -------------------------------------- -It is often useful to have a different set of TPLs during the development process. -For example, you may want to try out a new library or version of an existing library. +It is often useful to build a new set of TPLs, other than what we have for GitLab CI testing +and regular development. For example, you may want to try out a new library or version of an +existing library. -From the top-level Axom directory, run the following script to build all TPLs -for all existing compiler specs on the platform you are currently on:: +.. important:: Running Spack and building TPLs typically requires much more storage + than you have available in your home directory on an LC system. To + avoid surpassing your disk space quota, you should run TPL builds + in a filesystem location with sufficient space. For example, + ``/usr/workspace/`` is usually appropriate for this. + +From the top-level Axom directory in a local clone of the repo, run the following command +to build all TPLs for all existing compiler specs on the platform you are currently on:: $ ./scripts/llnl_scripts/build_tpls.py -d local/install/path where ``local/install/path`` is a directory location where you want the libraries to be installed. -It will output whether the TPL install succeeded and, +The TPL build script will output whether each TPL install succeeded and, subsequently, whether an Axom build against the TPL install succeeded. +.. note:: When Spack runs, you may see what looks like an error related to ``axom@develop`` + being unable to download. This is not an actual error, but a "feature" of how + Spack reports what it's doing, and can be ignored. + Running the script produces new host-config files (i.e., CMake cache files) -that you can use to build and test Axom with the installation, if issues -arise. 
The generated host-config files will be located in the top-level Axom
+that you can use to build and test Axom against the installation for development or
+if issues arise. The generated host-config files will be placed in the top-level Axom
 directory of your local clone of the repo. If any changes to Axom code are
-needed to work with the TPL update(s), make the changes and test them.
+needed to work with the TPL update(s), make the changes there and test them.
 
-.. note:: You can build a subset of TPLs for a platform, by using
-          the ``uberenv.py`` script in the top-level Axom directory.
-          For example::
+.. note:: You can build a subset of TPLs for a platform by using the ``uberenv.py``
+          script in the top-level Axom directory. For example::
 
             python3 ./scripts/uberenv/uberenv.py --prefix /my/tpl/path --spec clang@10.0.0~cpp14+devtools+mfem+c2c
 
          will build the TPLs for the clang 10.0.0 compiler, install them
          to the ``/my/tpl/path`` directory, and generate a host-config
          file that you can use to build Axom and its tests. Please see the
-         ``scripts/spack/specs.json`` file for a current list of tested specs.
+         ``scripts/spack/specs.json`` file for a current list of TPL specs
+         we use for GitLab CI testing.
 
 
Shared Third-party Library Installation Steps
---------------------------------------------
 
-The following instructions describe how to install local copies of Axom
-TPLs on Livermore Computing (LC) platforms and recreate our Docker containers
+The following instructions describe how to install copies of Axom TPL builds
+on Livermore Computing (LC) platforms and recreate our Docker containers
 with a new set of TPLs. Typically, this process is followed when you want to
-update one or more TPLs which Axom depends on. After they are built and
-the required changes are merged into develop, they will be available for
+update one or more TPLs. 
After they are built and +the associated changes are merged into develop, they will be available for other Axom developers to use during development, in Axom GitLab CI testing, etc. #. **Working on a local branch.** @@ -286,7 +314,7 @@ other Axom developers to use during development, in Axom GitLab CI testing, etc. change the version number. Do this for each system you want to test/change, including configurations in the ``docker`` subdirectory. - .. note:: Inside of the ``spack.yaml`` for each system package directory, + .. note:: Inside of the ``spack.yaml`` file for each system package directory, there is a ``compilers`` section containing compiler and version information for compilers we use for development and testing. If you wish to test and build with a new compiler or @@ -305,9 +333,18 @@ other Axom developers to use during development, in Axom GitLab CI testing, etc. #. **Install TPLs on all required LC machines.** - This step needs to be run on each of the machines named in Axom's standard host-configs. - When you are confident that everything is correct, become the service user - ``atk`` via the following command:: + + When you are confident that everything is correct and working, you will need + to perform this step on each of the machines named in Axom's standard host-configs. + + .. important:: To install TPL builds to be shared by all Axom developers and used + in our GitLab CI, you will need to become the Axom service user ``atk``. + There is a clone of the Axom repo in the ``/usr/workspace/atk/axom_repo`` + directory. After becoming ``atk``, you can go into that directory and + switch to the branch you made to test your changes. Before running the + TPL builds, make sure the branch is updated, including all submodules. + + Become the service user ``atk`` via the following command:: $ xsu atk @@ -316,28 +353,30 @@ other Axom developers to use during development, in Axom GitLab CI testing, etc. 
Run the corresponding command for the system you are on::
 
     # blueos
-    $ lalloc 1 -W 120 scripts/llnl_scripts/build_tpls.py
+    $ lalloc 1 -W 240 scripts/llnl_scripts/build_tpls.py
 
     # toss_4
-    $ srun -N1 --interactive -t 120 scripts/llnl_scripts/build_tpls.py
+    $ srun -N1 --interactive -t 180 scripts/llnl_scripts/build_tpls.py
+
+   .. note:: You may have to adjust the allocation time you request so that the script can complete.
 
-   This script will build all third-party libraries for all compilers specs
-   for the machine you are on. These will be installed into the shared LC directory
+   The ``build_tpls.py`` script will build all third-party libraries for all compiler specs
+   for the machine you are on. These will be installed in the shared LC directory
   ``/usr/workspace/axom/libs//