Validation of pipeline parameters
validateParameters()
This function takes all pipeline parameters and checks that they adhere to the specifications defined in the JSON Schema.

- It does not return anything, but logs errors or warnings, indicating the parameters that failed, to the command line.
- If any parameter validation has failed, it throws a `SchemaValidationException` exception to stop the pipeline.
- If any parameters in the schema reference a sample sheet schema with `schema`, that file is loaded and validated.
The function takes two optional arguments:

- The filename of a JSON Schema file (optional, default: `nextflow_schema.json`). File paths should be relative to the root of the pipeline directory.
- A boolean to disable coloured output (optional, default: `false`). The output is coloured using ANSI escape codes by default.
You can provide the parameters as follows:
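A call passing both optional arguments might look like the sketch below. The named arguments (`parameters_schema`, `monochrome_logs`) and the plugin include path are assumptions based on the descriptions above; check the plugin reference for the exact names.

```nextflow
// main.nf -- sketch; argument names and plugin name are assumptions
include { validateParameters } from 'plugin/nf-validation'

workflow {
    // Use a custom schema filename and disable coloured log output
    validateParameters(parameters_schema: 'custom_nextflow_schema.json', monochrome_logs: true)
}
```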
Monochrome logs can also be enabled globally by providing the parameter `--monochrome_logs` on the command line, or by adding `params.monochrome_logs = true` to a configuration file. The camelCase form `--monochromeLogs` is also supported.
Tip

As much of the Nextflow ecosystem assumes the `nextflow_schema.json` filename, it's recommended to stick with the default if possible.
See the Schema specification for information about what validation data you can encode within the schema for each parameter.
Example
The example below has a deliberate typo in `params.input` (`.txt` instead of `.csv`).
The validation function catches this for two reasons:
- The filename doesn't match the expected pattern (here checking file extensions)
- The supplied file doesn't exist
The function causes Nextflow to exit immediately with an error.
```console
N E X T F L O W ~ version 23.04.1
Launching `pipeline/main.nf` [amazing_crick] DSL2 - revision: 53bd9eac20
ERROR ~ Validation of pipeline parameters failed!
-- Check '.nextflow.log' file for details
The following invalid input values have been detected:

* --input (samplesheet.txt): "samplesheet.txt" does not match regular expression [^\S+\.(csv|tsv|yml|yaml)$]
* --input (samplesheet.txt): the file or directory 'samplesheet.txt' does not exist
```
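For reference, checks like the two above could come from a schema property along these lines. This is a sketch: the `pattern` is taken from the error message, but the `format` and `exists` keywords are assumptions about the plugin's JSON Schema extensions.

```json
{
  "input": {
    "type": "string",
    "pattern": "^\\S+\\.(csv|tsv|yml|yaml)$",
    "format": "file-path",
    "exists": true,
    "description": "Path to the input sample sheet"
  }
}
```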
Failing for unrecognized parameters
When parameters that are not specified in the JSON Schema are provided, the parameter validation function emits a `WARNING`.
This is because user-specific institutional configuration profiles may make use of params that are unknown to the pipeline.
The downside of this is that warnings about typos in parameters can go unnoticed.
To force the pipeline execution to fail with an error instead, you can set the `validation.failUnrecognisedParams = true` configuration option.
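For example, in `nextflow.config`:

```nextflow
// nextflow.config
// Turn unrecognised-parameter warnings into hard errors
validation.failUnrecognisedParams = true
```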
Ignoring unrecognized parameters
Sometimes, a parameter that you want to set may not be described in the pipeline schema for a good reason. Maybe it's something you're using in your Nextflow configuration setup for your compute environment, or it's a complex parameter that cannot be handled in the schema, such as nested parameters.
In these cases, to avoid getting warnings when an unrecognised parameter is set, you can use `--validationSchemaIgnoreParams` / `params.validationSchemaIgnoreParams`.
This should be a comma-separated list of strings that correspond to parameter names.
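For example, to suppress warnings for two parameters (the parameter names here are hypothetical):

```nextflow
// nextflow.config -- 'myparam' and 'institutional_config' are hypothetical names
params.validationSchemaIgnoreParams = 'myparam,institutional_config'
```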
Variable type checking
By default, `validateParameters()` is strict about expecting parameters to adhere to their expected type.
If the schema says that `params.foo` should be an `integer` and the user sets `params.foo = "12"` (a string containing a number), validation will fail.
If this causes problems, the user can run validation in "lenient mode", whereby the JSON Schema validation tries to cast parameters to their correct type. For example, providing an integer as a string will no longer fail validation.
Note
The validation does not affect the parameter variable types in your pipeline; it only casts a temporary copy of the params during the validation step.
To enable lenient validation mode, set `validation.lenientMode = true` in your configuration file.
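For example:

```nextflow
// nextflow.config
// Allow type coercion during validation (e.g. "12" accepted for an integer)
validation.lenientMode = true
```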