Contributor Notes

Hi! Thanks for contributing. This page contains all the details about getting your dev environment set up.


This is documentation for contributors developing nunavut. If you are a user of this software you can ignore everything here.


When committing to main, you must bump at least the patch number in src/nunavut/ or the build will fail on the upload step.
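To illustrate the rule, the sketch below is a hypothetical stand-in for the gate on the upload step; the `is_bumped` helper and the example version strings are illustrative only, not the actual CI code:

```python
# Hypothetical sketch of the version gate on the upload step.
# is_bumped and the example versions are illustrative, not the real CI logic.
def is_bumped(local_version: str, published_version: str) -> bool:
    def as_tuple(version: str):
        return tuple(int(part) for part in version.split("."))
    return as_tuple(local_version) > as_tuple(published_version)

assert is_bumped("1.0.1", "1.0.0")      # patch bumped: upload can proceed
assert not is_bumped("1.0.0", "1.0.0")  # unchanged: upload step fails
```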


tox -e local

I highly recommend using the local tox environment when doing python development. It’ll save you hours of lost productivity the first time it keeps you from pulling in an unexpected dependency from your global python environment. You can install tox from brew on macOS or apt-get on GNU/Linux. I’d recommend the following environment for vscode:

git submodule update --init --recursive
tox -e local
source .tox/local/bin/activate


Our language generation verification suite uses CMake to build and run unit tests. If you are working with a native language see Nunavut Verification Suite for details on manually running these builds and tests.

Visual Studio Code

To use vscode you’ll need:

  1. vscode
  2. install vscode command line (Shell Command: Install)
  3. tox
  4. cmake (and an available GCC or Clang toolchain, or Docker to use our toolchain-as-container)


cd path/to/nunavut
git submodule update --init --recursive
tox -e local
source .tox/local/bin/activate
code .

Then install recommended extensions.

Running The Tests

To run the full suite of tox tests locally you’ll need docker. Once you have docker installed and running, do:

git submodule update --init --recursive
docker pull uavcan/toxic:py35-py39-sq
docker run --rm -v $PWD:/repo uavcan/toxic:py35-py39-sq tox

To run a limited suite using only locally available interpreters directly on your host machine, skip the docker invocations and use tox -s.

To run the language verification build you’ll need to use a different docker container:

docker pull uavcan/c_cpp:ubuntu-20.04
docker run --rm -it -v $PWD:/repo uavcan/c_cpp:ubuntu-20.04
./.github/ -l c
./.github/ -l cpp

The script is a simple command-line generator for our cmake scripts. Use its --help option for details:

./.github/ --help

Files Generated by the Tests

Given that Nunavut is a file generator, our tests do have to write files. Normally these files are temporary and therefore automatically deleted after the test completes. If you want to keep the files so you can debug an issue, provide the --keep-generated argument.


pytest -k test_namespace_stropping --keep-generated

You will see each test’s output under “build/{test name}”.


Don’t use this option when running tests in parallel. You will get errors.
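For context, a flag like --keep-generated is typically registered through pytest’s pytest_addoption hook. The sketch below is generic pytest plumbing, not a copy of Nunavut’s actual test configuration:

```python
# Generic sketch of wiring a --keep-generated style flag into pytest.
# Illustrative only; see Nunavut's own test configuration for the real thing.
def pytest_addoption(parser):
    parser.addoption(
        "--keep-generated",
        action="store_true",
        default=False,
        help="Keep (do not delete) files generated by the tests for debugging.",
    )
```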

Sybil Doctest

This project makes extensive use of Sybil doctests. These take the form of docstrings with a structure like this:

.. invisible-code-block: python

    from nunavut.lang.c import filter_to_snake_case

.. code-block:: python

    # an input like this:
    input = "scotec.mcu.Timer"

    # should yield:
    >>> filter_to_snake_case(input)
    'scotec_mcu_timer'

The invisible code block is executed but not displayed in the generated documentation. Conversely, a code-block is both rendered with proper syntax formatting in the documentation and executed. REPL syntax works the same as it does for doctest, but assert is also a valid way to ensure the example is correct, especially when used in a trailing invisible-code-block:

.. invisible-code-block: python

    assert 'scotec_mcu_timer' == filter_to_snake_case(input)
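To make the pattern concrete outside of Sybil, the same idea reads as plain, runnable Python. The to_snake_case helper below is a hypothetical stand-in for Nunavut’s filter_to_snake_case, written here only to show how the visible example and the trailing assertion relate:

```python
import re

# Hypothetical stand-in for nunavut.lang.c.filter_to_snake_case.
def to_snake_case(name: str) -> str:
    dotted = name.replace(".", "_")  # namespace separators become underscores
    # insert an underscore at camelCase boundaries, then lowercase everything
    return re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "_", dotted).lower()

# the visible code-block's input...
input = "scotec.mcu.Timer"

# ...and the trailing invisible-code-block's assertion
assert to_snake_case(input) == "scotec_mcu_timer"
```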

These tests are run as part of the regular pytest build. You can see the Sybil setup in the pytest configuration found under the src directory, but otherwise you shouldn’t need to worry about it. The simple rule is: if the docstring ends up in the rendered documentation, then your code-block tests will be executed as unit tests.
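If you are curious what that setup looks like, a typical Sybil-over-pytest wiring is roughly the following. This is a generic configuration fragment based on Sybil’s documented pytest integration, not a copy of Nunavut’s actual file, and the parser module paths vary between Sybil versions:

```python
# conftest.py — generic Sybil + pytest wiring (illustrative only).
from sybil import Sybil
from sybil.parsers.rest import DocTestParser, PythonCodeBlockParser

pytest_collect_file = Sybil(
    parsers=[DocTestParser(), PythonCodeBlockParser()],
    patterns=["*.py"],
).pytest()
```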

import file mismatch

If you get an error like the following:

_____ ERROR collecting test/gentest_dsdl/ _______________________________________
import file mismatch:
imported module 'test_dsdl' has this __file__ attribute:
which is not the same as the test file we want to collect:
HINT: remove __pycache__ / .pyc files and/or use a unique basename for your test file modules

Then you are probably a wonderful developer who is running the unit tests locally. Pytest’s cache is interfering with your docker test run. To work around this, simply delete the pycache files. For example:

#! /usr/bin/env bash
clean_dirs="src test"

for clean_dir in $clean_dirs
    find "$clean_dir" -name __pycache__ | xargs rm -rf
    find "$clean_dir" -name '.coverage*' | xargs rm -f

Note that we also delete the .coverage intermediates since they may contain different paths between the container and the host build.

Alternatively just nuke everything temporary using git clean:

git clean -X -d -f

Building The Docs

We rely on Read the Docs to build our documentation from github, but we also verify this build as part of our tox build. This means you can view a local copy after completing a full, successful test run (see Running The Tests), or you can do docker run --rm -t -v $PWD:/repo uavcan/toxic:py35-py39-sq /bin/sh -c "tox -e docs" to build just the docs target. You can open the index.html under .tox/docs/tmp/index.html or run a local web-server:

python3 -m http.server --directory .tox/docs/tmp &
open http://localhost:8000/index.html

Of course, you can just use Visual Studio Code to build and preview the docs using > reStructuredText: Open Preview.


We manually generate the api doc using sphinx-apidoc. To regenerate, use tox -e gen-apidoc.


tox -e gen-apidoc will start by deleting the docs/api directory.

Coverage and Linting Reports

We publish our coverage data to sonarcloud, and the tox build will fail for any mypy or black errors, but you can also view additional reports locally under the .tox dir.


We generate a local html coverage report. You can open the index.html under .tox/report/tmp or run a local web-server:

python -m http.server --directory .tox/report/tmp &
open http://localhost:8000/index.html


At the end of the mypy run we generate the following summaries:

  • .tox/mypy/tmp/mypy-report-lib/index.txt
  • .tox/mypy/tmp/mypy-report-script/index.txt

Nunavut Verification Suite

Nunavut has built-in support for several languages. Included with this is a suite of tests using typical test frameworks and language compilers, interpreters, and/or virtual machines. While each release of Nunavut is gated on automatic and successful completion of these tests, this guide is provided to give system integrators information on how to customize these verifications to use other compilers, interpreters, and/or virtual machines.

CMake scripts

Our language generation verification suite uses CMake to build and run unit tests. Instructions for reproducing the CI automation steps are below; this section also tells you how to manually build and run individual unit tests as you develop them.


git submodule update --init --recursive
cd verification
mkdir -p build && cd build
cmake ..
cmake --build . --target help

Try running a test target, which will first compile the test. For example, in the C language build:

cmake --build . --target run_test_serialization

To run the C++ test use the same steps shown in the TLDR above but set NUNAVUT_VERIFICATION_LANG to “cpp” first.

In the list printed by the cmake --build . --target help command, targets that build tests are prefixed with test_ and the pseudo-targets that also execute a test are prefixed with run_test_. You should avoid the _with_lcov variants when you are manually building tests.

cmake build options

The following options are supported when configuring your build. These can be specified by using -D arguments to cmake. For example:

cmake -DNUNAVUT_VERIFICATION_LANG=cpp ..

CMAKE_BUILD_TYPE
    Values: Debug, Release, MinSizeRel
    Compiler optimizations are set based on the CMake build type.

NUNAVUT_VERIFICATION_LANG (string, no default *)
    Values: c, cpp
    Specifies the language for source code generated by nnvg.

…
    Values: little, big, any
    Modifies generated serialization code and support code to support
    various CPU architectures. Other than endianness, Nunavut serialization
    and support code should be generic.

…
    Values: native32, native64
    The target platform to compile for. In future releases we hope to
    support ppc (big-endian), AVR8, RISC-V, and ARM.

…
    Enable or disable asserts in generated serialization and support code.

…
    Enable to omit floating-point serialization routines.

NUNAVUT_VERIFICATION_LANG_STANDARD (string, empty by default)
    Values: c++17, c99 (etc.)
    Override value for the -std compiler flag of the target language.

* Because this option has no default, a value must be provided by the user.

VSCode Remote Container Development of Verification Tests

To write and debug verification tests using VSCode Remote Containers you’ll need to use the “Open Folder in Container…” option:


Open the “verification” folder:


We play a little trick here where we dump you back into the Nunavut repo root when you reopen in the container. This lets you also work with the Python source. If you “reopen locally” while in this state, however, you’ll find yourself back in the verification folder, which can be a little disorienting. Write to Microsoft asking them to allow multiple images in the .devcontainer json and we can get rid of this ugly hack. Sorry.