Commandline Usage¶
The MIALSRTK BIDS App adopts the BIDS standard for data organization and takes as principal input the path of the dataset to be processed. The input dataset must be in valid BIDS format and must include at least one T2w scan with anisotropic resolution per anatomical direction. See the BIDS and BIDS App standards page, which provides links to more information about BIDS and BIDS Apps as well as an example of dataset organization and naming.
Commandline Arguments¶
The command to run the MIALSRTK BIDS App follows the BIDS-Apps definition standard, with an additional option for loading the pipeline configuration file.
Argument parser of the MIALSRTK BIDS App
usage: mialsuperresolutiontoolkit-bidsapp [-h]
[--participant_label PARTICIPANT_LABEL [PARTICIPANT_LABEL ...]]
[--param_file PARAM_FILE]
[--openmp_nb_of_cores OPENMP_NB_OF_CORES]
[--nipype_nb_of_cores NIPYPE_NB_OF_CORES]
[--manual] [-v]
bids_dir output_dir {participant}
Positional Arguments¶
bids_dir
  The directory with the input dataset formatted according to the BIDS standard.

output_dir
  The directory where the output files should be stored. If you are running group-level analysis, this folder should be prepopulated with the results of the participant-level analysis.

analysis_level
  Possible choices: participant. Level of the analysis that will be performed. Only participant is available.
Named Arguments¶
--participant_label
  The label(s) of the participant(s) that should be analyzed. The label corresponds to sub-<participant_label> from the BIDS spec (so it does not include "sub-"). If this parameter is not provided, all subjects will be analyzed. Multiple participants can be specified with a space-separated list.

--param_file
  Path to a JSON file containing subjects' exam information and super-resolution total-variation parameters. Default: "/bids_dir/code/participants_params.json"

--openmp_nb_of_cores
  Specify the number of cores used by OpenMP threads. Especially useful for NLM denoising and slice-to-volume registration. Default: 0 (determined automatically)

--nipype_nb_of_cores
  Specify the number of cores used by the Nipype workflow library to distribute the execution of independent processing workflow nodes (i.e. interfaces). Especially useful for the slice-by-slice bias field correction and intensity standardization steps, for example. Default: 0 (determined automatically)

--manual
  Use manual brain masks. Default: False

-v, --version
  Show the program's version number and exit.
BIDS App configuration file¶
The BIDS App configuration file specified by the input flag –param_file adopts the following JSON schema:
{
  "01": [
    {
      "sr-id": 1,
      "session": "01",
      "stacksOrder": [1, 3, 5, 2, 4, 6],
      "paramTV": {
        "lambdaTV": 0.75,
        "deltatTV": 0.01
      }
    },
    {
      "sr-id": 2,
      "session": "01",
      "stacksOrder": [2, 3, 5, 4],
      "paramTV": {
        "lambdaTV": 0.75,
        "deltatTV": 0.01
      }
    }
  ],
  "02": [
    {
      "sr-id": 1,
      "session": "01",
      "stacksOrder": [3, 1, 2, 4],
      "paramTV": {
        "lambdaTV": 0.7,
        "deltatTV": 0.01
      }
    }
  ],
  ...
}
where:

"sr-id"
  distinguishes between runs with different configurations of the same acquisition set.

"stacksOrder"
  defines the list and order of scans to be used in the reconstruction.

"lambdaTV"
  (regularization) and "deltatTV" (optimization time step) are parameters of the TV super-resolution algorithm.

"session"
  MUST be specified if you have a BIDS dataset composed of multiple sessions with the sub-XX/ses-YY structure; it is optional for single-session datasets.
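Because a malformed configuration file is a common source of failed runs, it can help to check its structure before launching the pipeline. The following Python sketch is a hypothetical helper (not part of MIALSRTK) that verifies each run entry carries the keys described above; "session" is optional and therefore not enforced:

```python
import json

# Keys every run entry must provide, per the schema above.
REQUIRED_KEYS = {"sr-id", "stacksOrder", "paramTV"}
REQUIRED_TV_KEYS = {"lambdaTV", "deltatTV"}

def validate_params(config):
    """Return a list of human-readable problems found in the config.

    An empty list means the file looks structurally valid.
    """
    problems = []
    for subject, runs in config.items():
        for run in runs:
            missing = REQUIRED_KEYS - set(run)
            if missing:
                problems.append(f"sub-{subject}: missing {sorted(missing)}")
                continue
            missing_tv = REQUIRED_TV_KEYS - set(run["paramTV"])
            if missing_tv:
                problems.append(
                    f"sub-{subject}: paramTV missing {sorted(missing_tv)}")
    return problems

if __name__ == "__main__":
    sample = json.loads("""
    {
      "01": [
        {"sr-id": 1,
         "stacksOrder": [1, 3, 5, 2, 4, 6],
         "paramTV": {"lambdaTV": 0.75, "deltatTV": 0.01}}
      ]
    }
    """)
    print(validate_params(sample))  # prints [] for this valid sample
```

Running it against your own participants_params.json (e.g. via json.load on the file) before invoking the BIDS App catches missing keys early, before any processing time is spent.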
Important
Before using any BIDS App, we highly recommend that you validate your BIDS-structured dataset with the free, online BIDS Validator.
Running the MIALSRTK BIDS App¶
You can run the MIALSRTK BIDS App using a lightweight wrapper we created for convenience, or you can interact directly with the Docker Engine via the docker run command line (see Installation Instructions for Users).
With the mialsuperresolutiontoolkit_bidsapp wrapper¶
When you run mialsuperresolutiontoolkit_bidsapp, it generates a Docker command line for you, prints it out for reporting purposes, and then executes it without further action needed, e.g.:
$ mialsuperresolutiontoolkit_bidsapp \
    /home/localadmin/data/ds001 /media/localadmin/data/ds001/derivatives \
    participant --participant_label 01 \
    --param_file /home/localadmin/data/ds001/code/participants_params.json \
    (--openmp_nb_of_cores 4) \
    (--nipype_nb_of_cores 4)
With the Docker Engine¶
If you need finer control over the container execution, or you feel comfortable with the Docker Engine, avoiding the extra software layer of the wrapper might be a good decision. For instance, the previous call to the mialsuperresolutiontoolkit_bidsapp wrapper corresponds to:
$ docker run -t --rm -u $(id -u):$(id -g) \
    -v /home/localadmin/data/ds001:/bids_dir \
    -v /media/localadmin/data/ds001/derivatives:/output_dir \
    sebastientourbier/mialsuperresolutiontoolkit-bidsapp:2.0.0 \
    /bids_dir /output_dir participant --participant_label 01 \
    --param_file /bids_dir/code/participants_params.json \
    (--openmp_nb_of_cores 4) \
    (--nipype_nb_of_cores 4)
Note
We use the -v /path/to/local/folder:/path/inside/container docker run option to make local files and folders accessible inside the container: the local directory of the input BIDS dataset (here: /home/localadmin/data/ds001) and the output directory (here: /media/localadmin/data/ds001/derivatives) are mapped to the container folders /bids_dir and /output_dir, respectively.
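The translation the wrapper performs, bind-mounting the local directories onto the fixed container paths and forwarding the BIDS-Apps arguments, can be sketched in Python. This is a simplified illustration of the idea, not the wrapper's actual source; the user-ID mapping (-u) and optional flags are omitted for brevity:

```python
def build_docker_command(bids_dir, output_dir, participant_label):
    """Assemble a docker run command equivalent to a wrapper call.

    The local input and output directories are bind-mounted to the
    fixed container paths /bids_dir and /output_dir.
    """
    image = "sebastientourbier/mialsuperresolutiontoolkit-bidsapp:2.0.0"
    return [
        "docker", "run", "-t", "--rm",
        # Bind-mount local folders to the paths the container expects.
        "-v", f"{bids_dir}:/bids_dir",
        "-v", f"{output_dir}:/output_dir",
        image,
        # BIDS-Apps positional arguments, expressed in container paths.
        "/bids_dir", "/output_dir", "participant",
        "--participant_label", participant_label,
    ]

if __name__ == "__main__":
    cmd = build_docker_command(
        "/home/localadmin/data/ds001",
        "/media/localadmin/data/ds001/derivatives",
        "01",
    )
    print(" ".join(cmd))
```

Note that the positional arguments passed inside the container refer to the mapped paths (/bids_dir, /output_dir), not to the local ones: this is why --param_file in the Docker invocation above uses the /bids_dir prefix.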
Debugging¶
Logs are output to <output_dir>/nipype/sub-<participant_label>/anatomical_pipeline/rec<srId>/pypeline.log.
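If you need to locate this log programmatically, e.g. to collect logs across many subjects, the template above can be expressed as a small hypothetical helper (<srId> corresponds to the "sr-id" field of the configuration file):

```python
from pathlib import PurePosixPath

def log_path(output_dir, participant_label, sr_id):
    # Builds the log location from the path template in the
    # Debugging section above.
    return (PurePosixPath(output_dir) / "nipype"
            / f"sub-{participant_label}" / "anatomical_pipeline"
            / f"rec{sr_id}" / "pypeline.log")

if __name__ == "__main__":
    print(log_path("/media/localadmin/data/ds001/derivatives", "01", 1))
```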
Support, bugs and new feature requests¶
All bugs, concerns and enhancement requests for this software are managed on GitHub and can be submitted at https://github.com/Medical-Image-Analysis-Laboratory/mialsuperresolutiontoolkit/issues.
Not running on a local machine? - Data transfer¶
If you intend to run the MIALSRTK BIDS App on a remote system, you will need to make your data available within that system first. Comprehensive solutions such as DataLad handle data transfers with the appropriate settings and commands. DataLad also performs version control over your data.