Plugin Guide

Whitebox's plugin system allows for easy extension of functionality through self-contained modules. Official Whitebox plugins are available on PyPI.

Plugin Structure

A typical plugin structure looks like this:

whitebox_plugin_<plugin_name>
├── whitebox_plugin_<plugin_name>
│   ├── static
│   │   ├── whitebox_plugin_<plugin_name>
│   │   │   ├── whitebox_plugin_<plugin_name>.css
│   │   │   └── whitebox_plugin_<plugin_name>.js
│   │   └── assets
│   │       ├── logo.svg
│   │       └── my_image.png
│   ├── templates
│   │   └── whitebox_plugin_<plugin_name>
│   │       └── whitebox_plugin_<plugin_name>.html
│   ├── jsx
│   │   ├── SomeComponent.jsx
│   │   ├── SomeComponent.test.jsx
│   │   ├── SomeOtherComponent.jsx
│   │   └── SomeOtherComponent.test.jsx
│   ├── __init__.py
│   ├── whitebox_plugin_<plugin_name>.py
│   └── any_other_file.xyz
├── whitebox_test_plugin_<plugin_name>
│   ├── __init__.py
│   ├── test_whitebox_plugin_<plugin_name>.py
│   ├── test_whitebox_plugin_<plugin_name>_browser.py
│   └── test_whitebox_plugin_<plugin_name>_integration.py
├── LICENSE
├── Makefile
├── README.md
├── pyproject.toml
└── poetry.lock

For Whitebox to be able to discover and load plugins dynamically, the plugin package must be named whitebox_plugin_<plugin_name>. The plugin package must contain a whitebox_plugin_<plugin_name>.py file that exports the plugin class through a plugin_class attribute.

Additionally, for plugin tests to be discovered and loaded dynamically, the test package must be named whitebox_test_plugin_<plugin_name>. The test files must start with test_, and may end with _browser or _integration to indicate the type of test and keep the suite readable.

JSX files should be placed in the jsx directory within the plugin package. These files will be transpiled and made available as frontend components via module federation, using vite-plugin-federation. JSX tests need to end with .test.jsx to be discovered correctly. Test JSX files will not be transpiled by default, and will only be available for testing purposes (more on this in the Testing JSX code section).

To initialize a new plugin project, run:

poetry new whitebox_plugin_<plugin_name>

Each plugin is a Python package with its own set of resources, including static files, templates and JSX files. If any additional assets are required, they should be placed in the assets directory within the static folder.

If any additional files are required for the plugin to function (for example, a text file the plugin needs to read), they should be placed in the root directory of the plugin package, not in the root directory of the project. This ensures that Poetry can package them correctly when the plugin is published to PyPI.
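
As a minimal sketch of reading such a packaged file at runtime (the package and file names below are hypothetical examples), importlib.resources resolves files that live inside the installed plugin package:

from importlib import resources

def load_defaults() -> str:
    # Read a data file that ships inside the plugin package; both the package
    # name and the file name here are hypothetical examples.
    return (
        resources.files("whitebox_plugin_my_plugin")
        .joinpath("config_defaults.txt")
        .read_text()
    )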

This structure allows for clear separation of concerns and makes it easy to distribute and install plugins. Additionally, each plugin is expected to have its own repository for version control, CI and documentation if needed.

Plugin API

Plugins must implement the base plugin class provided by Whitebox. Depending on the plugin's requirements, they can export some or all of the following attributes and methods:

import whitebox

class MyPlugin(whitebox.Plugin):
    name = "My Plugin"
    plugin_template = "plugin_name/plugin_name.html"
    plugin_template_embed = "plugin_name/plugin_name_embed.html"
    plugin_css = [
       "/static/plugin_name/plugin_name.css",
    ]
    plugin_js = [
       "/static/plugin_name/plugin_name.js",
    ]


plugin_class = MyPlugin

If a plugin needs to do some processing before sending template or static files, it can override the following methods:

import whitebox

class MyPlugin(whitebox.Plugin):
    name = "My Plugin"

    def get_template(self) -> str:
        """Return the name of the plugin's main template."""
        pass

    def get_template_embed(self) -> str:
        """
        Return the path to the HTML template file, that will be embedded in an
        iframe for the plugin.
        """
        pass

    def get_css(self) -> list:
        """Return the path to the plugin's CSS files."""
        pass

    def get_js(self) -> list:
        """Return the path to the plugin's JavaScript files."""
        pass

plugin_class = MyPlugin
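
As a minimal sketch, an override might build the list of files dynamically; the extra_styles flag below is a hypothetical plugin option, not part of the Whitebox API:

import whitebox

class MyPlugin(whitebox.Plugin):
    name = "My Plugin"
    extra_styles = True  # hypothetical option, not part of the Whitebox API

    def get_css(self) -> list:
        # Always serve the base stylesheet, and add an extra one conditionally.
        css = ["/static/plugin_name/plugin_name.css"]
        if self.extra_styles:
            css.append("/static/plugin_name/plugin_name_extra.css")
        return css

plugin_class = MyPlugin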

To ensure that all static file paths are resolved correctly, it is recommended to use django.templatetags.static.static. This function resolves the correct URL path for serving static files, such as images, exported by plugins. For example:

import whitebox
from django.templatetags.static import static

class MyDevicePlugin(whitebox.Plugin):
    device_image_url = static("whitebox_plugin_device_xyz/path/to/image.webp")

plugin_class = MyDevicePlugin

In the end, the plugin class must be exported as plugin_class. If this attribute is not present, the plugin will not be loadable by Whitebox.

Standard API

Whitebox provides a standard API for plugins to interact with the system and other plugins on the backend side. This API includes methods for:

  • Registering event callbacks
  • Unregistering event callbacks
  • Emitting events
  • Accessing shared resources
  • Interacting with the database

Example of registering an event callback:

import whitebox

class MyPlugin(whitebox.Plugin):
    def __init__(self):
        self.whitebox.register_event_callback("flight_start", self.on_flight_start)

    async def on_flight_start(self, data):
        print("Flight started")

Example of emitting an event:

import whitebox

class MyPlugin(whitebox.Plugin):
    async def update_location(self, lat, lon, alt):
        # Emit a location update
        await self.whitebox.api.location.emit_location_update(lat, lon, alt)

        # Emit a custom event
        await self.whitebox.emit_event("custom_event", {"data": "example"})

Refer to Plugin API Reference for more details on the available methods and properties.

JSX API

In addition to Python code, plugins can also supply JSX code defining React components. Using this feature, plugins can provide custom UI components that extend or augment other plugins, making full use of the shared design elements and of React features such as state management.

Defining the JSX component

To define a JSX component, create a file in the jsx directory of the plugin, inside a directory named after the plugin. For example, if the plugin name is whitebox_plugin_r2d2, the JSX component file should be placed at whitebox_plugin_r2d2/jsx/whitebox_plugin_r2d2/MyComponent.jsx. This mirrors how static files are resolved in the plugin system.

The JSX component file should export a React component, both as a default export, and a named one. For example:

import { useState } from "react";

const MyComponent = () => {
  const [isTranslated, setIsTranslated] = useState(false);

  return (
      <>
        <p>
          R2D2 says: {isTranslated ? "Beep Boop" : "Hello there"}
        </p>

        <button onClick={() => setIsTranslated(!isTranslated)}>
          Translate
        </button>
      </>
  );
};

export {
  MyComponent,
};
export default MyComponent;

An app rarely uses only a single component, and your plugin can define as many components as needed. Each component should be defined in a separate file, and exported in the same manner as the example above.

You can easily import JSX components from the same plugin with the usual import statement. For example, if you had the following files:

whitebox_plugin_r2d2/jsx/whitebox_plugin_r2d2/First.jsx
whitebox_plugin_r2d2/jsx/whitebox_plugin_r2d2/Second.jsx

you could import them like this:

// from within `First.jsx`
import Second from "./Second.jsx";

// or from within `Second.jsx`
import First from "./First.jsx";

The JSX component will be transpiled and made available to core and other plugins via the module federation registry. To use components from the core or other plugins, you can use the methods below.

Using JSX components through capabilities

Plugins may define capabilities that they provide, or require capabilities that they need to work. This allows the core and other plugins to interact in a plugin-agnostic way. For example, a plugin may require the map capability to render a map component, or provide the map-tiles capability to augment rendered maps with custom tiles.

Defining capabilities & slot components

A plugin may define capabilities that it provides, along with the JSX components that are provided through those capabilities. To define a capability, add a provides_capabilities attribute to the plugin class. You can then define a mapping of capability names to JSX components that are provided through those capabilities, as well as define a mapping of components exposed directly by a specific name:

import whitebox

class MyPlugin(whitebox.Plugin):
    ...
    provides_capabilities = ["map"]
    slot_component_map = {
        "map.display": "my_plugin/MyMapComponent",
    }
    exposed_component_map = {
        "map": {
            "SpecificMapDisplay": "my_plugin/MyMapComponent",
        }
    }

This will make MyMapComponent, located at PLUGIN_ROOT/jsx/my_plugin/MyMapComponent.jsx, available to the core and other plugins, both as the map.display slot component implementation and as a map.SpecificMapDisplay component that can be used directly, as explained in the sections below.

Using other plugins' slot components

To use other plugins' capabilities, Whitebox offers a SlotLoader component, which is available globally in the window.Whitebox object. It takes a name prop defining which slot component to load, and passes all other props through to the slot component.

For example, suppose you'd like to render a map somewhere within your plugin's UI, but you don't want to hardcode any specific map component. Instead, you'd like to use whichever component is provided through the map.display slot.

First, you would want to ensure that the map capability is available, by defining it in your plugin's requires_capabilities attribute:

import whitebox

class MyPlugin(whitebox.Plugin):
    ...
    requires_capabilities = ["map"]

Then, you would use the SlotLoader component to render the map:

const { SlotLoader } = Whitebox;

const MyComponent = () => {
  return (
    <div>
      <h1>Look at this shiny map below!</h1>
      <SlotLoader name="map.display" />
    </div>
  );
};

This will render the map component supplied by whichever plugin implements the map.display slot. If the component accepts a darkMode prop, you can pass it along with the other props:

const MyComponent = () => {
  const [darkMode, setDarkMode] = useState(false);

  return (
    <div>
      <h1>Look at this shiny map below!</h1>
      <SlotLoader name="map.display" darkMode={darkMode} />
      <button onClick={() => setDarkMode(!darkMode)}>Toggle dark mode</button>
    </div>
  );
};

In the same way that you've used map.display to render the slot component implementation, you can use an exposed component (map.SpecificMapDisplay in the example above) to render that specific component directly, regardless of the capability:

<SlotLoader name="map.SpecificMapDisplay" darkMode={darkMode} />

Using the JSX components directly

In addition to using JSX components through the SlotLoader, you can also use a component directly, as if you had imported it. This is useful when you want more fine-grained control over the component, or when you want to use it in a more complex way.

To use a component from another plugin or from the core, you'll need the importWhiteboxComponent utility, which is available globally in the window.Whitebox object. This utility takes a slot name as an argument and returns the component registered under that name.

For example, if you wanted to render the MyMapComponent from the example above, you could use:

const { importWhiteboxComponent } = Whitebox;

const MapDisplay = importWhiteboxComponent("map.display");

const MyComponent = () => {
  const [darkMode, setDarkMode] = useState(false);

  return (
    <div>
      <h1>Look at this shiny map below!</h1>
      <MapDisplay darkMode={darkMode} />
      <button onClick={() => setDarkMode(!darkMode)}>Toggle dark mode</button>
    </div>
  );
};

In the same way that you've imported map.display to get the slot component implementation, you can import an exposed component (map.SpecificMapDisplay in the example above) to get that specific component directly, regardless of the capability:

const MapDisplay = importWhiteboxComponent("map.SpecificMapDisplay");

In addition to the plugins' JSX components, you can also use the core's JSX components. They are "top-level" components that are not under a capability namespace. Some of the available components are:

  • PrimaryButton
  • SecondaryButton
  • Logo

For example, if you wanted to render the PrimaryButton component from the kernel, you could use:

const PrimaryButton = importWhiteboxComponent("PrimaryButton");

const MyComponent = () => {
  const [count, setCount] = useState(0);

  return (
    <div>
      <h1>Click the button below! (clicked {count} times so far)</h1>
      <PrimaryButton text="Click me" onClick={() => setCount(count + 1)} />
    </div>
  );
};

To see the available components from the kernel, check the whitebox/frontend/src/utils/components.jsx file, which has the full list. You can also follow the components it links to for each component's full spec.

Using slots

In addition to using the SlotLoader component, you can also define slots in your components, allowing other plugins to augment your components with their own content. This is useful when you want to provide a way for other plugins to extend your components.

To define slots in your component, use the useComponentSlots hook, which is available globally in the window.Whitebox object. This hook returns a Slot component that you can use to render default content, or the content provided by a parent component.

For example, if you wanted to define a slot in your component, you could use:

const { useComponentSlots } = Whitebox;

const MyComponent = ({children}) => {
  const [Slot] = useComponentSlots(children);

  return (
    <div>
      <h1>My component</h1>
      <Slot name="my_slot">
        <p>This is the default content for the slot</p>
      </Slot>
    </div>
  );
};

This will define a slot named my_slot in your component, with the default content being the <p> element. When another plugin uses your component, it can provide its own content for the slot, which will replace the default content.

For example, to replace the default content of the my_slot slot, another plugin could use your component like this:

const MyComponent = importWhiteboxComponent("...");  // Import the component

const MyOtherComponent = () => {
  return (
    <div className="outer">
      <MyComponent>
        <h1 slot="my_slot">This is an override</h1>
      </MyComponent>
    </div>
  );
};

This will replace the default content of the my_slot slot (<p>) with the provided <h1> override. The resulting output will be:

<div class="outer">
  <div>
    <h1>My component</h1>
    <h1>This is an override</h1>
  </div>
</div>

Understanding the Plugin System

Whitebox employs a dynamic plugin discovery and management system. The process of loading and unloading plugins is as follows:

  1. On startup, the system scans the environment for installed plugins with the whitebox_plugin_ prefix.
  2. Each discovered plugin is instantiated and registered with the system.
  3. Plugin resources (templates, static files) are registered with Django's asset pipeline.
  4. Event callbacks registered by plugins are added to the event system.
  5. Device classes available in the plugin are registered with the device manager.
  6. JSX components available in the plugin are transpiled and registered with the module federation registry, making them available to the frontend.
  7. Plugins can be unloaded at runtime, removing their resources and event callbacks from the system, by simply removing the plugin package (poetry remove <plugin_name>) and calling the /plugins/refresh/ endpoint.

When /plugins/refresh/ is called, the system rescans the environment and adds or removes plugins without requiring a server restart or disturbing what is currently running. This process ensures that plugins are properly integrated into the system without requiring manual configuration for each new plugin.
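
As a minimal sketch of triggering a refresh programmatically (assuming Whitebox is served locally on port 8000 and that the endpoint accepts a plain POST; adjust host, port, and method to your deployment):

import requests

# Ask Whitebox to rescan the environment for installed plugins.
response = requests.post("http://localhost:8000/plugins/refresh/")
response.raise_for_status()
print("Plugin registry refreshed")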

Plugin Development Workflow

A plugin must be initialized using Poetry and should adhere to the structure outlined in the Plugin Structure section. It should implement the base plugin class provided by Whitebox and export it as plugin_class, as outlined in the Plugin API section. The plugin can then interact with the system using the Standard API provided by Whitebox.

To set up a development environment for a plugin, follow these steps:

  1. Run the development environment container for Whitebox.
  2. In the plugins folder, create a new plugin project using Poetry.
  3. Add the plugin to Whitebox by running poetry add -e path/to/plugin within the backend development container.
  4. Run the Whitebox server.

This installs the plugin in editable mode, allowing you to make changes to the plugin code and see the effects immediately without reinstalling the plugin in Whitebox.

Handling Version Conflicts for Local Plugins

In some rare cases, you might encounter an error when trying to add a local version of a plugin that is behind the pinned version in Whitebox's pyproject.toml. For example:

$ docker exec backend-dev poetry add -e /plugins/whitebox-plugin-device-insta360

Updating dependencies
Resolving dependencies...

Incompatible constraints in requirements of whitebox (0.1.55):
whitebox-plugin-device-insta360 @ file:///plugins/whitebox-plugin-device-insta360 (0.1.6)
whitebox-plugin-device-insta360 (>=0.1.7,<0.2.0)

This happens because the local version of the plugin (e.g., 0.1.6) is older than the version pinned in Whitebox's pyproject.toml (e.g., >=0.1.7,<0.2.0). This can occur if:

  • You branched off an older version of the plugin.
  • Someone merged a newer version of the plugin into the main branch.
  • You pulled the latest changes from Whitebox's main branch, but your local plugin branch is still behind.

To resolve this issue, you need to update your local plugin branch to match the pinned version in Whitebox's pyproject.toml. Follow these steps:

  1. Pull the Latest Changes for the Plugin: Navigate to the plugin's directory and pull the latest changes from the main branch:

    cd /plugins/whitebox-plugin-device-insta360
    git pull origin main
    

  2. Ensure the Plugin Version Matches: Check the version of the plugin in its pyproject.toml file and ensure it matches or exceeds the pinned version in Whitebox's pyproject.toml. For example:

    # In /plugins/whitebox-plugin-device-insta360/pyproject.toml
    version = "0.1.7"
    
  3. Re-add the Plugin: After updating the plugin, re-add it to Whitebox:

    docker exec backend-dev poetry add -e /plugins/whitebox-plugin-device-insta360
    

Testing plugins on CI environment (including sandbox)

To include a plugin during CI runs, you need to add it to the Whitebox project as a dependency via a specific Git ref, which means that, after testing, you'd have to replace that dependency with the actual plugin version before merging.

As this is tedious and prone to human error, the CI environment allows you to add a "temporary" dependency that will be used in every CI step where Whitebox is installed. This is done through Poetry's optional temporary-dependencies dependency group.

This mechanism will install those dependencies to use for testing and sandbox deploys, and they will be removed by the CI upon merge.

To add a plugin to the temporary group, you can run:

poetry add --group temporary-dependencies git+https://gitlab.com/whitebox-aero/whitebox-plugin-name.git#feature/whitebox-1337

This will add the plugin by git repository and branch. To test that everything works well, you can run:

poetry install --with temporary-dependencies

Take note that, when you include --with temporary-dependencies, those dependencies will take precedence over the ones defined in the standard groups. That means that you can freely add the plugin to the temporary-dependencies group with a specific git ref, without needing to remove it from the standard groups.

Similar to temporary-dependencies, Whitebox also provides an all-official-plugins group. This group includes all the official plugins that are published on PyPI for Whitebox.

Whitebox CI pipeline runs tests against all the official plugins to ensure compatibility. You can also run tests locally against all the official plugins by including the --with all-official-plugins flag during installation.

To install Whitebox with all the official plugins, use the following command:

poetry install --with all-official-plugins

This will install all the official plugins, allowing you to test them in your local environment.


Sometimes, a feature being developed introduces changes in both the kernel and a plugin. In such cases, you want the plugin's CI to run against the kernel branch you're working on. To do this, add a line to the plugin's MR description starting with KERNEL: (case-sensitive), followed by the Git branch (and optionally the repo URL) of the kernel that the plugin should be tested against. For example:

KERNEL: #feature/whitebox-1337

This ensures that the CI installs the kernel from the feature/whitebox-1337 branch and tests the plugin against it. If your changes live in a different repository, you should also specify the full URL, like so:

KERNEL: https://gitlab.com/whitebox-aero/whitebox.git#feature/whitebox-1337

The repo/branch override is temporary and will not affect the branch you're merging into after the merge.


Additionally, the temporary-dependencies group works only within the kernel and is not applied transitively (a plugin's own temporary-dependencies are not taken into account). If a plugin branch depends on another plugin's branch, specify both branches in the kernel's temporary-dependencies and use the KERNEL: override above to ensure they are both pulled and tested within the CI runs.

Augmentation through the Frontend API

In addition to extending the backend, plugins can also extend the frontend by providing custom templates, styles, and scripts, which are loaded into the frontend when the plugin is active.

Plugin scripts have access to the global Whitebox object, which allows plugins to interact with the core as well as with other plugins.

Registering plugins

To register a plugin, create an object representing the plugin and register it with Whitebox:

const init = () => {
  console.log('Plugin loaded!')
}

const module = {
  name: 'my_first_plugin',

  providesCapabilities: ['map'],
  requiresSockets: ['flight'],

  init: init,
}

Whitebox.plugins.registerPlugin(module)

Plugin capabilities

Plugins can provide and require capabilities. Capabilities describe what functionality a plugin offers or depends on, allowing other plugins to interact with the plugin they want to extend based on the capabilities it provides.

If you'd like to augment the behavior of a plugin that provides a map capability, or your plugin requires the capability to be present in order to work, you can require that capability in your plugin:

const module = {
  name: 'my_second_plugin',

  providesCapabilities: ['capability_1'],
  requiresCapabilities: ['map'],

  init: init,
}

For example, the gps-display plugin provides the map capability, which can be used by other plugins to augment the map. You can see its implementation of the map capability through the MapExtension here.

At the moment, there are no mechanisms to ensure that a plugin can or cannot be loaded based on the available capabilities, but this is a planned feature.

Plugin sockets

Plugins can also require sockets. As multiple plugins can require the same socket connection, Whitebox ensures that the socket is only connected once, and all plugins requiring the same socket receive the same connection.

The plugin can then use the socket directly to send events, as well as add event listeners to receive events from the socket:

const init = () => {
  Whitebox.sockets.addEventListener('flight', 'message', (event) => {
    const data = JSON.parse(event.data);
    if (data.type === "location_update") {
      console.log('We are now located at ', data.lat, data.lon);
    }
  })

  Whitebox.sockets['flight'].send(JSON.stringify({ type: 'get_location' }))
}

Plugin extensions

Plugins can also extend the core functionality by adding extensions that standardize the way plugins interact with the core and with each other. Extensions define a set of methods that plugins can use to interact with the core or with other plugins.

Extensions are abstract classes that provide an extensible interface for plugins to implement: a contract that a plugin fulfils so that the core can use it and other plugins can interact with it. This approach allows any plugin to fully implement a core feature. For example, gps-display implements the map through Leaflet.js, using the MapExtension, allowing gps-display-icons to augment the map with custom icons. Another plugin may want to augment the map with custom layers, or reimplement the map using a different map library in place of gps-display. As long as the plugin implements the MapExtension properly, it can be used as a map provider, and gps-display-icons will be able to augment its map without any additional changes.

At the moment, extensions are defined in the core, in frontend/src/bridge/extensions.js. In the future, these will be moved into the plugins themselves, to allow for a clean separation of concerns between the core and plugins [GitLab issue].

You can see an example of a map extension implementation in the GPS Display plugin, and how it's being interacted with in the GPS Display Icons plugin.

Helper utils

Additionally, Whitebox provides a set of helper utilities that plugins can use:

  • Whitebox.apiURL (string): URL to the Whitebox API

External assets

Large static assets

In some cases, plugins may require packaging of large assets. As PyPI imposes a 100 MB limit on the packages it hosts, Whitebox offers a way for plugins to depend on externally hosted assets, which, from the runtime's perspective, can be considered part of the package itself.

Upon plugin loading, Whitebox will ensure that all the external files are downloaded and ready to be served. To specify external files, create a file called external-asset-manifest.json in the plugin package's root directory, for example:

whitebox_plugin_<plugin_name>
├── whitebox_plugin_<plugin_name>
│   ├── __init__.py
│   ├── whitebox_plugin_<plugin_name>.py
│   └── external-asset-manifest.json     <--- this one
├── pyproject.toml

Every external file needs to have 3 components:

  1. URL from which it will be sourced

  2. Integrity hash

     Every file must have an integrity hash to verify that the downloaded file is correct. Within the integrity string, you specify which hashing algorithm is used, in the format [ALGORITHM]-[INTEGRITY_HASH].

     Supported hashing algorithms: sha1, sha256.

     Upon plugin loading, all the files will be checked, and:

       • If a file does not exist, it will be downloaded.
       • If a file exists but its hash does not match, it will be re-downloaded, replacing the existing file. This behavior allows you to freely update your manifest file with new files without worrying about whether existing files will become stale.

  3. Target path where the file will be saved locally and served from

     Whitebox saves these files in a special location for asset files, and they are available to plugins in the same manner as ordinary static files within the plugin's static/ folder.

For example, if your plugin's package name is whitebox_plugin_r2d2 and the target path is voices/beep-boop.mp3, the file will be available to the plugin at /static/whitebox_plugin_r2d2/voices/beep-boop.mp3.

Additionally, you can freely use {% static "whitebox_plugin_r2d2/voices/beep-boop.mp3" %} template tags in the templates to reference these files, or alternatively use django.templatetags.static.static("whitebox_plugin_r2d2/voices/beep-boop.mp3") for the same purpose from within the code.

For the above example, the asset manifest file would look like this:

{
  "sources": [
    {
        "url": "https://example.org/r2d2/asset-file.mp3",
        "integrity": "sha1-8dfa2f3e56f3abd46119b698bf6a91cb18482c85",
        "target_path": "voices/beep-boop.mp3"
    },
    ... more files
  ]
}
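
The integrity value for an asset can be generated by hashing the file locally. A minimal sketch using Python's hashlib (the file name below is just an example):

import hashlib

def integrity_string(path: str, algorithm: str = "sha1") -> str:
    # Compute the [ALGORITHM]-[INTEGRITY_HASH] string for an asset file.
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return f"{algorithm}-{digest.hexdigest()}"

print(integrity_string("asset-file.mp3"))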

To verify that the manifest file is correct, you can use the verify_external_asset_manifest Django management command, providing either the installed plugin's module name or the path to the manifest file, e.g.

  • poetry run python whitebox/manage.py verify_external_asset_manifest --module-name whitebox_plugin_r2d2, or
  • poetry run python whitebox/manage.py verify_external_asset_manifest path/to/plugin-root/whitebox_plugin_r2d2/external-asset-manifest.json

Testing Plugins

Plugins can only be tested from within the Whitebox environment. To run tests, you need to have Whitebox running locally. As plugins can contain both Python and JSX code, the testing process is divided into two parts:

  • testing Python code, using the backend container, and
  • testing JSX code, using the frontend container.

Testing Python code

The backend test runner will automatically discover and run all tests in the whitebox_test_plugin_<plugin_name> package as long as they follow the naming convention outlined in the Plugin Structure section.

Unit & Integration tests would usually begin with the following structure:

from django.test import TestCase
from plugin.manager import plugin_manager

class TestWhiteboxPluginExamplePlugin(TestCase):
    def setUp(self) -> None:
        self.plugin = next(
            (
                x
                for x in plugin_manager.plugins
                if x.__class__.__name__ == "WhiteboxPluginExamplePlugin"
            ),
            None,
        )
        return super().setUp()

    def test_plugin_loaded(self):
        self.assertIsNotNone(self.plugin)

    def test_plugin_name(self):
        self.assertEqual(self.plugin.name, "Example Plugin")

    # Add more tests here

While browser tests would usually begin with the following structure:

import os
import logging

from django.contrib.staticfiles.testing import StaticLiveServerTestCase
from django.urls import reverse
from playwright.sync_api import sync_playwright

# Disable warnings
logging.basicConfig(level=logging.ERROR)
logger = logging.getLogger(__name__)
logging.getLogger("django.request").setLevel(logging.ERROR)
logging.getLogger("django.server").setLevel(logging.ERROR)

class TestWhiteboxPluginExamplePluginBrowser(StaticLiveServerTestCase):
    @classmethod
    def setUpClass(cls):
        os.environ["DJANGO_ALLOW_ASYNC_UNSAFE"] = "true"
        super().setUpClass()
        cls.playwright = sync_playwright().start()
        cls.browser = cls.playwright.chromium.launch(headless=True)
        cls.context = cls.browser.new_context()
        cls.page = cls.context.new_page()

    @classmethod
    def tearDownClass(cls):
        cls.page.close()
        cls.context.close()
        cls.browser.close()
        cls.playwright.stop()
        super().tearDownClass()

    def setUp(self):
        self.page.goto(f"{self.live_server_url}{reverse('index')}")

    # Add more tests here

Sometimes, you'll want a base class containing certain tests that should not be run on that class itself, but only on classes that inherit from it. In that case, assign __test__ = False to the base class, and its tests will only run on the inheriting classes, not on the base class itself.

For example, for some device, you might want to test connection types based on the class setup:

class BaseInsta360TestCase(TestCase):
    __test__ = False

    @property
    def device_class(self):
        raise NotImplementedError

    def test_get_connection_types(self):
        connection_types = self.device_class.get_connection_types()
        # make some asserts here that should apply for both classes

class TestInsta360X3(BaseInsta360TestCase):
    device_class = Insta360X3
    # You can add more tests specific to Insta360X3 here

class TestInsta360X4(BaseInsta360TestCase):
    device_class = Insta360X4
    # You can add more tests specific to Insta360X4 here

Additionally, to ensure browser tests run correctly, you have to install Playwright's additional dependencies. To do so, follow the steps below:

  1. Run: poetry run playwright install
  2. Run: poetry run playwright install-deps (optional, for Linux systems only)
  3. Ensure you have added the plugin to Whitebox: poetry add -e path/to/plugin.

Finally, run the tests on Whitebox using the following command within the backend dev container:

make test

Testing JSX code

Unit testing

The frontend's federation unit testing system will automatically discover and run all tests that the plugin provides within the whitebox_plugin_<plugin_name>/jsx/ directory, as long as the files end with .test.jsx.

Tests usually begin with the following structure:

import { render, screen } from "@testing-library/react";
import { MyComponent } from "./MyComponent";

describe("MyComponent", () => {
  it("does something", () => {
    render(<MyComponent />);
    expect(screen.getByText(/R2D2 says:/i)).toBeInTheDocument();
  });

  // Add more tests here
});

The JSX test configuration shares most of the configuration with the kernel's configuration, so you have the same global variables available as in the kernel tests (e.g. beforeEach, afterEach).

Due to the complexity of the federated module system, testing JSX code is slightly different from the usual testing approach. To ensure that the behavior exhibited during testing matches the behavior in production (making tests as reliable as possible), the JSX testing process involves building the federated module system through the backend's apparatus, and then serving the built code through the frontend server.

To run the tests, you first need to run the backend server with the testing flag enabled:

docker exec -it backend-dev make run-federation-test

Then, in a separate shell, run:

docker exec -it frontend-dev make federation_test

Changes in the unit tests require a federation rebuild to be picked up. To help speed up test development, you can, in a separate terminal, run the federation build command in watch mode:

docker exec -it backend-dev poetry run python whitebox/manage.py build_federation_modules --watch --include-tests 

You can read more about this mechanism in JSX testing explained.

Integration testing

Similarly to Python code testing, federation integration tests will be automatically discovered from the whitebox_test_plugin_<plugin_name> package. They need to be located within the whitebox_test_plugin_<plugin_name>/federation/ directory, and all files ending with .spec.js will be picked up by the test runner. The federation integration tests are run using Playwright.

The integration tests usually begin with the following structure:

import { test } from "@tests/setup";
import { expect } from "@playwright/test";

test.describe("Big cube with hearts", () => {
  test.beforeEach(async ({ page }) => {
    await page.goto("/testing-chamber");
  });

  test("should relocate in space", async ({ page }) => {
    const element = await page.locator("button");
    expect(await element.innerText()).toBe("Relocate cube");

    await element.click();
    // implement logic to check if the cube has been relocated
  });

  // Add more tests here
});

Note that test is imported from @tests/setup, not @playwright/test, which is required in order to run the tests properly.

To run the integration tests, first run both servers in the testing mode (the two commands need to be run in separate terminals):

docker exec -it backend-dev make run-federation-test
docker exec -it frontend-dev make run

Then, in a separate terminal, you can run the integration tests for the federation modules:

docker exec -it frontend-dev make federation_integration_test

Changes in the integration tests are picked up automatically on every test run.

Writing CI

The CI for a plugin would almost always extend the Whitebox shared CI file. It would usually look like this:

image: python:3.10

stages:
  - setup
  - lint
  - test
  - update_version
  - publish

include:
  - project: "whitebox-aero/whitebox"
    ref: "main"
    file: ".gitlab/config/shared-ci.yml"

variables:
  PIP_CACHE_DIR: "$CI_PROJECT_DIR/.pip-cache"
  POETRY_HOME: "$CI_PROJECT_DIR/.poetry"

cache:
  paths:
    - .pip-cache/
    - .poetry/
    - .venv/

run_setup:
  extends: .shared_plugin_setup

run_lint:
  extends: .shared_plugin_lint

run_test_python:
  extends: .shared_plugin_test_python

# Include this if your plugin has JSX code
run_test_jsx:
  extends: .shared_plugin_test_jsx

update_version:
  extends: .shared_plugin_update_version

publish:
  extends: .shared_plugin_publish

Versioning Plugins

Whitebox uses Semantic Versioning for versioning plugins. In the CI file above, the update_version stage is responsible for updating the version of the plugin.

When a merge request is merged to main, the patch version is first bumped in the pyproject.toml file. A commit is then made with the new version, along with a new tag, which the CI pushes to the repository. After that, the plugin is published, as outlined below.
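
As an illustration of what the patch bump amounts to (a minimal sketch, not the CI's actual implementation), 0.1.7 becomes 0.1.8:

def bump_patch(version: str) -> str:
    # Split a semantic version string and increment the patch component.
    major, minor, patch = version.split(".")
    return f"{major}.{minor}.{int(patch) + 1}"

assert bump_patch("0.1.7") == "0.1.8"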

It is important to ensure that the version is incrementally updated, so that the plugin can be published correctly to PyPI.

Automatic versioning setup

The following steps apply for setting up automatic versioning on GitLab, orchestrated by .gitlab-ci.yml.

  1. Set up an Access Token that CI will use to update the version:

     • Go to your repository's settings on GitLab: Settings > Access Tokens
     • Create a new token with the following settings:
       • Name: the name that you want to appear as the committer (e.g. Whitebox CI)
       • Role: Maintainer
       • Scopes: read_repository, write_repository
     • Copy the token, as it won't be displayed again

  2. Add the token to your repository's CI/CD settings:

     • Go to your repository's settings on GitLab: Settings > CI/CD
     • Open Variables
     • Add a new variable with the following settings:
       • Type: Variable
       • Environment scope: All
       • Visibility: Masked
       • Key: PUSH_TOKEN
       • Value: the token you copied

  3. Ensure that the update_version stage is set up in your .gitlab-ci.yml. The stage should include the following job:

     update_version:
       extends: .shared_plugin_update_version

After this is set up, the CI will automatically update the version of the plugin when a merge request is merged to the default branch (main). This version will automatically be used when the plugin is published.

Publishing Plugins

  1. Initial Setup for New Plugins/Repositories:

    • If the plugin is new and does not have a PyPI project, you need to create it using your PyPI account.
    • Perform the initial publish using the command:

      poetry publish --build
      
    • This should be done from your local machine to automatically create the package on PyPI.

    • Once the project is set up on PyPI, add antoviaque as an owner to the project to share access and management.
  2. Setting Up PyPI Access Tokens for CI:

    • Create a new access token on your PyPI account. This token should be scoped specifically for the project to limit permissions effectively.
    • Add this project-scoped token to your CI environment configuration to enable automated publishing for future releases, similarly to the PUSH_TOKEN setup above:
    • Type: Variable
    • Environment scope: All
    • Visibility: Masked
    • Key: PYPI_TOKEN
    • Value: the token you created
  3. For Existing PyPI Projects:

    • If there is already a PyPI project for the repository, you will need to request access from one of the current maintainers. You can find the maintainers listed on the project’s page on the PyPI website.

Next Steps