# How the pxl-pipeline CLI works
## Pipeline lifecycle
- `init` → generates a pipeline from a template
- `test` → runs the pipeline locally in `.venv/`
- `smoke-test` → runs the pipeline in a container before deploying
- `deploy` → builds & pushes the Docker image and registers the pipeline in Picsellia
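A typical session might look like the following. This is a sketch that assumes each subcommand takes the pipeline folder name as its argument, as `test` does later on this page:

```bash
pxl-pipeline init my_pipeline        # scaffold a new pipeline from a template
pxl-pipeline test my_pipeline        # run it locally in .venv/
pxl-pipeline smoke-test my_pipeline  # run it in a container before deploying
pxl-pipeline deploy my_pipeline      # build & push the Docker image, register in Picsellia
```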
## Project structure
Here is a typical pipeline folder structure:
```
my_pipeline/
├── config.toml
├── pyproject.toml
├── uv.lock
├── Dockerfile
├── picsellia_pipeline.py
├── local_pipeline.py
├── steps.py
├── utils/
│   └── parameters.py
├── runs/
│   └── run1/
│       └── run_config.toml
└── .venv/
```
### Key files

- `config.toml`: describes the pipeline metadata, entrypoint files, requirements file, and model metadata. This makes pipelines easily portable and shareable (see the sketch after this list).
- `pyproject.toml` / `uv.lock`: managed by `uv` to declare dependencies. You don't need to manually install anything; just run the CLI.
- `picsellia_pipeline.py`: entrypoint when running on Picsellia (inside Docker).
- `local_pipeline.py`: entrypoint for running and testing the pipeline locally.
- `steps.py`: contains the `@step`-decorated functions that define the logic of your pipeline.
- `utils/parameters.py`: contains the parameter class (`TrainingHyperParameters`, `ProcessingParameters`, etc.) used to extract configuration at runtime.
- `.venv/`: created automatically by the CLI when you run `pxl-pipeline test`.
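For illustration only, here is a hedged sketch of what such a `config.toml` could contain. Apart from the `[execution]` table and its `parameters_class` key, which appear verbatim later on this page, every section and key name below is an assumption:

```toml
# Hypothetical sketch of a pipeline's config.toml; real key names may differ.
[metadata]                      # assumed section: pipeline metadata
name = "my_pipeline"
version = "0.1.0"

[model]                         # assumed section: model metadata
name = "my-model"

[execution]
parameters_class = "utils/parameters.py:ProcessingParameters"
# Entrypoint and requirements files described above (assumed key names):
picsellia_entrypoint = "picsellia_pipeline.py"
local_entrypoint = "local_pipeline.py"
requirements = "pyproject.toml"
```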
## Environment variables
The CLI requires:
```
PICSELLIA_API_TOKEN
PICSELLIA_ORGANIZATION_NAME
PICSELLIA_HOST   # optional, defaults to https://app.picsellia.com
```
They are:
- Prompted once during `init`, `test`, or `deploy`
- Saved in `~/.config/picsellia/.env`
- Automatically loaded on future runs

You can:

- Manually edit that file
- Or override any value in the current terminal session with `export VAR=...`
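Concretely, the saved file uses plain `KEY=value` dotenv lines; the values below are placeholders:

```
# ~/.config/picsellia/.env (placeholder values)
PICSELLIA_API_TOKEN=your-api-token
PICSELLIA_ORGANIZATION_NAME=your-organization
PICSELLIA_HOST=https://app.picsellia.com
```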
## Dependency management with uv
Each pipeline uses `uv` as its dependency manager. It handles package resolution and installation via `pyproject.toml`, without needing pip or poetry.
### What happens during `pxl-pipeline test`?
When you run:
```bash
pxl-pipeline test my_pipeline
```
The following is automatically done for you:
- `uv lock` resolves all dependencies and generates/updates `uv.lock`
- `uv sync` installs packages into `.venv/` based on the lock file
You don't need to install or activate anything manually; the CLI ensures the right environment is built.
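To reproduce those two steps by hand (e.g. for debugging), the equivalent `uv` commands would be the following, using the same `--project` flag as the `uv add` examples below:

```bash
uv lock --project my_pipeline   # resolve dependencies and write/update uv.lock
uv sync --project my_pipeline   # install the locked packages into .venv/
```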
### Adding dependencies
To install a PyPI package:
```bash
uv add opencv-python --project my_pipeline
```
To add a Git-based package:
```bash
uv add git+https://github.com/picselliahq/picsellia-cv-engine.git --project my_pipeline
```
This updates the `pyproject.toml` and `uv.lock` files inside your pipeline folder, as the sketch below illustrates.

Tip: the `--project` flag ensures the package is added to the correct pipeline folder.
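After the two `uv add` commands above, the pipeline's `pyproject.toml` would contain entries along these lines (a sketch: the project metadata and version bounds are placeholders):

```toml
# Sketch of my_pipeline/pyproject.toml after the uv add commands above.
[project]
name = "my-pipeline"            # placeholder metadata
version = "0.1.0"
dependencies = [
    "opencv-python",
    "picsellia-cv-engine",
]

[tool.uv.sources]
# uv records Git-based dependencies as sources:
picsellia-cv-engine = { git = "https://github.com/picselliahq/picsellia-cv-engine.git" }
```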
## How `runs/` works

Each test run creates a new directory under `runs/`:

```
runs/
├── run1/
├── run2/
└── run3/
    └── run_config.toml
```
Inside each run folder:
- `run_config.toml` stores the parameters used for that run (e.g. `experiment_id`, `model_version_id`, etc.)
- The dataset and model will be downloaded into this folder
- Logs, annotations, and any outputs will be saved here
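For example, a run's `run_config.toml` might hold little more than the IDs it was launched with (all values below are placeholders):

```toml
# Sketch of runs/run1/run_config.toml (placeholder values)
experiment_id = "0192aaaa-bbbb-cccc-dddd-eeeeffff0000"
model_version_id = "0192aaaa-bbbb-cccc-dddd-eeeeffff1111"
```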
### Reusing configurations

- If a previous run exists, the CLI will prompt: `Reuse previous config? experiment_id=... [Y/n]`
- Choosing `Y` reuses the last config (but creates a new folder and re-downloads assets).
- Use the `--reuse-dir` flag to reuse the same directory and config, without downloading again.
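For example, assuming the flag is passed to the `test` command:

```bash
pxl-pipeline test my_pipeline --reuse-dir   # reuse the latest run folder, skip re-downloading
```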
## Working with pipeline parameters

### Adding a custom parameter

Each pipeline includes a `utils/parameters.py` file containing a parameter class that extracts and validates values from Picsellia metadata (experiment or processing).
#### 1. Locate your parameters file

```
my_pipeline/
└── utils/
    └── parameters.py   ← edit this file
```
#### 2. Edit the parameter class

Inside `parameters.py`, you'll find a class that inherits from:

- `Parameters` (for processing pipelines)
- `HyperParameters` (for training pipelines)

Add your new fields by calling `self.extract_parameter(...)` in the constructor.
```python
from picsellia_cv_engine.core.parameters import Parameters


class ProcessingParameters(Parameters):
    def __init__(self, log_data):
        super().__init__(log_data)

        # Add your custom parameters here
        self.threshold = self.extract_parameter(
            keys=["threshold"],
            expected_type=float,
            default=0.5,
        )
        self.use_filter = self.extract_parameter(
            keys=["use_filter"],
            expected_type=bool,
            default=True,
        )
```
#### 3. Link the class in `config.toml`

Make sure the class is declared in your pipeline's `config.toml`:

```toml
[execution]
parameters_class = "utils/parameters.py:ProcessingParameters"
```
### What you can define

Each parameter can include:

| Field | Description |
|---|---|
| `keys` | One or more fallback keys (e.g. `["lr", "learning_rate"]`) |
| `expected_type` | Type validation (`int`, `float`, `bool`, `str`, `Optional[...]`) |
| `default` | Optional default value (or `...` to mark as required) |
| `range_value` | Value bounds: `(min, max)` for numeric parameters |
Advanced use cases (enums, optional types, dynamic validation) are documented in the base `Parameters` class via `extract_parameter(...)`.
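Putting those fields together, here is a hedged sketch of a required, range-checked parameter. It belongs inside a constructor like the one shown in step 2 above, and `learning_rate` is a hypothetical parameter name:

```python
# Hypothetical parameter combining the fields from the table above.
self.learning_rate = self.extract_parameter(
    keys=["lr", "learning_rate"],  # fallback keys, tried in order
    expected_type=float,
    default=...,                   # Ellipsis marks the parameter as required
    range_value=(0.0, 1.0),        # numeric bounds (min, max)
)
```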
## Summary

- Pipelines are self-contained and shareable via `config.toml`
- Dependencies are isolated and reproducible with `uv`
- The CLI stores runs in `runs/`, with config and outputs
- Parameters are centralized and easy to extend
- You can deploy to Picsellia with `pxl-pipeline deploy ...`
For template-specific usage, see: