\cond NEVER
Distributed under the MIT License.
See LICENSE.txt for details.
\endcond
# Writing Unit Tests {#writing_unit_tests}

\tableofcontents

Unit tests are placed in the appropriate subdirectory of `tests/Unit`, which
mirrors the directory hierarchy of `src`. Typically there should be one test
executable for each production code library. For example, we have a
`DataStructures` library and a `Test_DataStructures` executable. When adding a
new test there are several scenarios that can occur, which are outlined below.

- You are adding a new source file to an existing test library:<br>
  If you are adding a new source file in a directory that already has a
  `CMakeLists.txt`, simply create the source file, which should be named
  `Test_ProductionCodeFileBeingTested.cpp`, and add it to the `LIBRARY_SOURCES`
  in the `CMakeLists.txt` file in the same directory as the new `cpp` file.<br>
  If you are adding a new source file to a library but want to place it in a
  subdirectory, you must first create the subdirectory. To provide a concrete
  example, say you are adding the directory `TensorEagerMath` to
  `tests/Unit/DataStructures`. After creating the directory you must add a call
  to `add_subdirectory(TensorEagerMath)` to
  `tests/Unit/DataStructures/CMakeLists.txt` *before* the call to
  `add_test_library` and *after* the `LIBRARY_SOURCES` are set. Next add the
  file `tests/Unit/DataStructures/TensorEagerMath/CMakeLists.txt`, which should
  add the new source files by calling `set`, e.g.
  ```
  set(LIBRARY_SOURCES
      ${LIBRARY_SOURCES}
      Test_ProductionCodeFileBeingTested.cpp
      PARENT_SCOPE)
  ```
  The `PARENT_SCOPE` flag tells CMake to make the changes visible in the
  `CMakeLists.txt` file that called `add_subdirectory`. You can now add the
  `Test_ProductionCodeFileBeingTested.cpp` source file.
- You are adding a new directory:<br>
  If the directory is a new lowest-level directory you must add an
  `add_subdirectory` call to `tests/Unit/CMakeLists.txt`. If it is a new
  subdirectory you must add an `add_subdirectory` call to the
  `CMakeLists.txt` file in the directory where you are adding the
  subdirectory. Next you should read the part on adding a new test library.
- You are adding a new test library:<br>
  After creating the subdirectory for the new test library you must add a
  `CMakeLists.txt` file. See `tests/Unit/DataStructures/CMakeLists.txt` for
  an example of one. The `LIBRARY` and `LIBRARY_SOURCES` variables set the name
  of the test library and the source files to be compiled into it. The library
  name should be of the format `Test_ProductionLibraryName`, for example
  `Test_DataStructures`. The library sources should be only the source files in
  the current directory. The `add_subdirectory` command can be used to add
  source files in subdirectories to the same library, as is done in
  `tests/Unit/CMakeLists.txt`. The `CMakeLists.txt` in
  `tests/Unit/DataStructures/TensorEagerMath` is an example of how to add source
  files to a library from a subdirectory of the library. Note that the setting
  of `LIBRARY_SOURCES` there first includes the current `LIBRARY_SOURCES` and at
  the end specifies `PARENT_SCOPE`. The `PARENT_SCOPE` flag tells CMake to
  modify the variable in a scope that is visible to the parent directory,
  i.e. the `CMakeLists.txt` that called `add_subdirectory`.<br>
  Finally, in the `CMakeLists.txt` of your new library you must call
  `add_test_library`. Again, see `tests/Unit/DataStructures/CMakeLists.txt` for
  an example. The `add_test_library` function adds a test executable with the
  name of the first argument and the source files of the third argument.
  Remember to use `target_link_libraries` to link any libraries your test
  executable uses (see \ref spectre_build_system). A sketch of how these pieces
  fit together is shown after this list.
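
As a rough illustration, here is a minimal sketch of how the pieces described
above fit together. All names (`MyLibrary`, `MySubdir`, the test sources) are
hypothetical, and the exact argument list of `add_test_library` should be
copied from an existing file such as `tests/Unit/DataStructures/CMakeLists.txt`;
only the first (library name) and third (source files) arguments are described
in this guide.
```
# --- tests/Unit/MyLibrary/CMakeLists.txt (hypothetical) ---
set(LIBRARY "Test_MyLibrary")

set(LIBRARY_SOURCES
    Test_MyClass.cpp
    )

# Must come *after* LIBRARY_SOURCES is set and *before* add_test_library so
# the subdirectory can append its sources via PARENT_SCOPE.
add_subdirectory(MySubdir)

# First argument: name of the test library/executable; third argument: the
# accumulated test sources. Check an existing CMakeLists.txt for the full
# argument list, which may differ from this sketch.
add_test_library(${LIBRARY} "MyLibrary" "${LIBRARY_SOURCES}")

# Link the production libraries the tests use.
target_link_libraries(${LIBRARY} PRIVATE MyLibrary)

# --- tests/Unit/MyLibrary/MySubdir/CMakeLists.txt (hypothetical) ---
set(LIBRARY_SOURCES
    ${LIBRARY_SOURCES}
    Test_MyOtherClass.cpp
    PARENT_SCOPE)
```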

All tests must start with
```cpp
// Distributed under the MIT License.
// See LICENSE.txt for details.

#include "Framework/TestingFramework.hpp"
```
The file `tests/Unit/Framework/TestingFramework.hpp` must always be the first
include in the test file and must be separated from the STL includes by a blank
line. All classes and free functions should be in an anonymous/unnamed
namespace, e.g.
```cpp
namespace {
class MyFreeClass {
  /* ... */
};

void my_free_function() {
  /* ... */
}
}  // namespace
```
This is necessary to avoid symbol redefinition errors during linking.

Test cases are added by using the `SPECTRE_TEST_CASE` macro. The first argument
to the macro is the test name, e.g. `"Unit.DataStructures.Tensor"`, and the
second argument is a list of tags. The tags list is a string where each element
is in square brackets. For example, `"[Unit][DataStructures]"`. The tags should
only be the type of test, in this case `Unit`, and the library being tested, in
this case `DataStructures`. The `SPECTRE_TEST_CASE` macro should be treated as a
function, which means that it should be followed by `{ /* test code */ }`. For
example,
\snippet Test_Tensor.cpp example_spectre_test_case
From within a `SPECTRE_TEST_CASE` you are able to do all the things you would
normally do in a C++ function, including calling other functions, setting
variables, using lambdas, etc.

The `CHECK` macro in the above example is provided by
[Catch2](https://github.com/catchorg/Catch2) and is used to check conditions. We
also provide the `CHECK_ITERABLE_APPROX` macro which checks if two `double`s or
two iterable containers of `double`s are approximately
equal. `CHECK_ITERABLE_APPROX` is especially useful for comparing `Tensor`s,
`DataVector`s, and `Tensor<DataVector>`s since it will iterate over nested
containers as well.
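
For illustration, a minimal sketch of a test case exercising both macros might
look like the following; the test name and the use of `DataVector` here are
chosen for this example only:
```cpp
// Distributed under the MIT License.
// See LICENSE.txt for details.

#include "Framework/TestingFramework.hpp"

#include "DataStructures/DataVector.hpp"

SPECTRE_TEST_CASE("Unit.DataStructures.ExampleSketch",
                  "[Unit][DataStructures]") {
  // Exact comparison of integers is fine with CHECK
  CHECK(2 + 2 == 4);

  // For floating-point containers, compare up to a numerical tolerance
  const DataVector expected{1.0, 2.0, 3.0};
  const DataVector computed = expected * (1.0 + 1.0e-16);
  CHECK_ITERABLE_APPROX(computed, expected);
}
```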

\warning Catch's `CHECK` statement only prints numbers out to approximately 10
digits at most, so you should generally prefer `CHECK_ITERABLE_APPROX` for
checking double precision numbers, unless you want to check that two numbers are
bitwise identical.

All unit tests must finish within a few seconds; the hard limit is 5 seconds,
but unit tests that long are strongly discouraged. They should typically
complete in less than half a second. Tests that take longer are often no longer
testing a small enough unit of code and should either be split into several
unit tests or moved to an integration test.

#### Discovering New and Renamed Tests

When you add a new test to a source file or rename an existing test, the change
needs to be discovered by the testing infrastructure. This is done by building
the target `rebuild_cache`, e.g. by running `make rebuild_cache`.
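
For example, a typical workflow in the build directory (`SPECTRE_BUILD_DIR`)
might look like the following; the test name is hypothetical:
```
make rebuild_cache   # let CTest discover the new or renamed test
ctest -R Unit.MyLibrary.MyNewTest
```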

#### Testing Pointwise Functions

Pointwise functions should generally be tested in two different ways. The first
is by taking input from an analytic solution and checking that the computed
result is correct. The second is to use the infrastructure for comparing against
a Python implementation with randomly generated input values. In this approach
the C++ function being tested is re-implemented in Python and the results are
compared. Please follow these guidelines:

- The Python implementation should be in a file with the same name as the source
  file that is being re-implemented and placed in the same directory as its
  corresponding `Test_*.cpp` source file.
- The functions should have the same names as the C++ functions they
  re-implement.
- If a function sums over tensor indices, then
  [`numpy.einsum`](https://docs.scipy.org/doc/numpy/reference/generated/numpy.einsum.html)
  should be used in Python to provide an alternative implementation of the loop
  structure (see the short sketch after this list).
- You can import Python functions from other re-implementations in the
  `tests/Unit/` directory to reduce code duplication. Note that the path you
  pass to `pypp::SetupLocalPythonEnvironment` determines the directory from
  which you can import Python modules. Either import modules directly from the
  `tests/Unit/` directory (e.g. `import
  PointwiseFunction.GeneralRelativity.Christoffel as christoffel`) or use
  relative imports like `from . import Christoffel as christoffel`. Don't assume
  the Python environment is set up in a subdirectory of `tests/Unit/`.
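
As a brief illustration of the `numpy.einsum` guideline, a hypothetical Python
re-implementation of a function computing \f$v_a = A_{ab} w^b\f$ might look
like this (the function name is made up for this example):
```py
import numpy as np


def contract_second_index(matrix, vector):
    # Re-implement the C++ loop over the second index as an einsum contraction:
    # result[a] = sum_b matrix[a, b] * vector[b]
    return np.einsum("ab,b->a", matrix, vector)
```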

It is possible to test C++ functions that return by value and ones that return
by `gsl::not_null`. In the latter case, since it is possible to return multiple
values, one Python function taking all non-`gsl::not_null` arguments must be
supplied for each `gsl::not_null` argument to the C++ function. To perform the
test the `pypp::check_with_random_values()` function must be called. For
example, the following checks various C++ functions by calling into `pypp`:

\snippet Test_PyppRandomValues.cpp cxx_two_not_null

The corresponding Python functions are:

\snippet PyppPyTests.py python_two_not_null

#### Writing and Fixing Random-Value Based Tests

Many tests in SpECTRE make use of randomly generated numbers in order to
increase the parameter space covered by the tests. The random number generator
is set up using:
```cpp
MAKE_GENERATOR(gen);
```
The generator `gen` can then be passed to distribution classes such as
`std::uniform_real_distribution` or `UniformCustomDistribution`.
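
For example, inside a test case one might draw random values like this (a
sketch only; the bounds and the use of `std::uniform_real_distribution` are
chosen for illustration):
```cpp
MAKE_GENERATOR(gen);
// Draw a double uniformly from [-1, 1]; requires <random> (and <cmath> for
// std::abs).
std::uniform_real_distribution<double> dist{-1.0, 1.0};
const double random_value = dist(gen);
CHECK(std::abs(random_value) <= 1.0);
```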

Each time the test is run, a different random seed will be used.  When writing a
test that uses random values, it is good practice to run the test at least
\f$10^4\f$ times in order to set any tolerances on checks used in the test.
This can be done by using the following command in the build directory
(SPECTRE_BUILD_DIR):
```
ctest --repeat-until-fail 10000 -R TEST_NAME
```
where `TEST_NAME` is the test name passed to `SPECTRE_TEST_CASE`
(e.g. `Unit.Evolution.Systems.CurvedScalarWave.Characteristics`).
If a test case fails when using a random number generated by `MAKE_GENERATOR`,
the output from the failed test will include the text
```
Seed is:  SEED from FILE_NAME:LINE_NUMBER
```
Note that the output of tests can be found in
`SPECTRE_BUILD_DIR/Testing/Temporary/LastTest.log`.

The failing test case can then be reproduced by changing the `MAKE_GENERATOR`
call at the provided line in the given file to
```cpp
MAKE_GENERATOR(gen, SEED);
```
If the `MAKE_GENERATOR` is within `CheckWithRandomValues.hpp`, the failure most
likely occurred within a call to `pypp::check_with_random_values()`.  In such a
case, additional information should have been printed to help you determine
which call to `pypp::check_with_random_values()` has failed.  The critical
information is the line
```
function:  FUNCTION_NAME
```
where `FUNCTION_NAME` should correspond to the third argument of a call to
`pypp::check_with_random_values()`.  The seed that caused the test to fail can
then be passed as an additional argument to `pypp::check_with_random_values()`,
where you may also need to pass in the default value of the comparison
tolerance.

Typically, you will need to adjust a tolerance used in a `CHECK` somewhere in
the test in order to get the test to succeed reliably.  The function
`pypp::check_with_random_values()` takes an argument that specifies the lower
and upper bounds of the random quantities.  These should typically be chosen to
be of order unity in order to decrease the chance of occasionally generating
large numbers through multiplications, which can cause errors above a
reasonable tolerance.

#### Testing Failure Cases {#testing_failure_cases}

`ASSERT`s and `ERROR`s can be tested with the `CHECK_THROWS_WITH` macro. This
macro takes two arguments. The first is either an expression or a lambda that
is expected to trigger an exception (which `ASSERT` and `ERROR` now throw);
note that you may need to add `()` wrapping the lambda in order for it to
compile. The second is a Catch matcher (see
[Catch2](https://github.com/catchorg/Catch2) for complete documentation),
usually `Catch::Matchers::ContainsSubstring()` matching a substring of the
error message of the thrown exception.

When testing `ASSERT`s, the `CHECK_THROWS_WITH` should be enclosed between
`#%ifdef SPECTRE_DEBUG` and an `#%endif`. If the `#%ifdef SPECTRE_DEBUG` block
is omitted then compilers will correctly flag the code as being unreachable,
which results in warnings.
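
A minimal sketch of such a check might look like the following, where
`my_function_with_assert` is a hypothetical function whose `ASSERT` message
contains the quoted substring:
```cpp
#ifdef SPECTRE_DEBUG
CHECK_THROWS_WITH(
    // Wrap the lambda in parentheses and invoke it so the macro sees a single
    // expression that triggers the ASSERT.
    ([]() { my_function_with_assert(-1.0); })(),
    Catch::Matchers::ContainsSubstring("must be non-negative"));
#endif  // SPECTRE_DEBUG
```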

Adding the "attribute" `// [[OutputRegex, Regular expression to match]]`
before the `SPECTRE_TEST_CASE` macro will force ctest to only pass the
particular test if the regular expression is found in the output of the
test.  In this case, the first line of the test should call the macro
`OUTPUT_TEST();`.
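
A sketch of such an output-matching test might look like the following; the
test name, the printed message, and the use of `Parallel::printf` (from
SpECTRE's `Parallel` printing facilities) are assumptions made for this
example:
```cpp
// [[OutputRegex, hello from my output test]]
SPECTRE_TEST_CASE("Unit.MyLibrary.Output", "[Unit][MyLibrary]") {
  OUTPUT_TEST();
  // The printed text must match the regular expression above for ctest to
  // pass the test.
  Parallel::printf("hello from my output test\n");
}
```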

### Testing Actions

The action testing framework is documented as part of the `ActionTesting`
namespace.

## Input file tests

We have a suite of input file tests in addition to unit tests. Every input file
in the `tests/InputFiles/` directory is added to the test suite automatically.
If you don't want your input file tested at all, add the relative input file
path to the whitelist in `cmake/AddInputFileTests.cmake`. If the input file is
being tested, it must specify the `Executable` it should run with in the input
file metadata (above the `---` marker in the input file). Properties of the test
are controlled by the `Testing` section in the input file metadata. The
following properties are available (a sketch of a metadata block is shown after
this list):

- `Check`: Semicolon-separated list of checks, e.g. `parse;execute`. The
  following checks are available:
    - `parse`: Just check that the input file passes option parsing.
    - `execute`: Run the executable. If the input file metadata has an
      `ExpectedOutput` field, check that these files have been written. See
      `spectre.tools.CleanOutput` for details.
    - `execute_check_output`: In addition to `execute`, check the contents of
      some output files. The checks are defined by the `OutputFileChecks` in the
      input file metadata. See `spectre.tools.CheckOutputFiles` for details.
- `CommandLineArgs` (optional): Additional command-line arguments passed to the
  executable.
- `ExpectedExitCode` (optional): The expected exit code of the executable.
  Default: `0`. See `Parallel::ExitCode` for possible exit codes.
- `Timeout` (optional): Timeout for the test. Default: 2 seconds.
- `Priority` (optional): Priority of running this test on CI. Possible values
  are: `low` (not usually run on CI), `normal` (run at least once on CI), `high`
  (run always on CI). Default: `normal`.
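
For illustration, the metadata block at the top of an input file (everything
above the `---` marker) might look like the following sketch; the executable
name, expected output file, and timeout value are hypothetical:
```
# Distributed under the MIT License.
# See LICENSE.txt for details.

Executable: EvolveSomething
Testing:
  Check: parse;execute
  Timeout: 10
  Priority: normal
ExpectedOutput:
  - SomethingVolume0.h5

---
# Input file options follow below the marker.
```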
