CLI aggregate_deep

peerannot aggregate-deep --help

peerannot aggregate-deep

Crowdsourcing strategy using deep learning models

peerannot aggregate-deep [OPTIONS] [DATASET]
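
For instance, a minimal invocation might look like the following; the dataset path, output name, strategy, model, and numeric values are illustrative assumptions, not values prescribed by this page:

peerannot aggregate-deep ./my_dataset -o deep_results -K 10 \
    -s CrowdLayer --model resnet18 \
    --answers ./my_dataset/answers.json --img-size 224 \
    --n-epochs 100 --lr 0.1 --pretrained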

Options

-K, --n-classes <n_classes>

Number of classes to separate the tasks into

-o, --output-name <output_name>

Name of the generated results file

-s, --strategy <strategy>

Deep learning strategy

--model <model>

Neural network to train on

--answers <answers>

Crowdsourced labels in a JSON file

--img-size <img_size>

Size of the (square) input images

--pretrained

Use the pretrained weights available in torch to initialize the network

Default: False

--n-epochs <n_epochs>

Number of training epochs

--lr <lr>

Learning rate

--momentum <momentum>

Momentum for the optimizer

--decay <decay>

Weight decay for the optimizer

--scheduler <scheduler>

Learning rate scheduler. A multi-step scheduler is used by default; use the keyword 'cosine' for cosine annealing instead (see the example at the end of this page).

Default: 'multistep'

-m, --milestones <milestones>

Milestones for the learning rate decay scheduler

--n-params <n_params>

Number of parameters (only used when the model is a logistic regression)

--lr-decay <lr_decay>

Learning rate decay for the scheduler

--num-workers <num_workers>

Number of workers

--batch-size <batch_size>

Batch size

-optim, --optimizer <optimizer>

Optimizer for the neural network

--data-augmentation

Perform data augmentation on the training set with a random choice between RandomAffine(shear=15), RandomHorizontalFlip(0.5) and RandomResizedCrop

Default: False

--path-remove <path_remove>

Path to a file of indices to prune from the training set

--metadata_path <metadata_path>

Path to the dataset metadata if different from the default

--freeze

Freeze all layers of the network except for the last one

Default: False

--seed <seed>

Random state (seed)

Arguments

DATASET

Optional argument
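
Example

A sketch combining the optimizer and scheduler options above; every value, as well as the strategy and model names, is an illustrative assumption (and -m is repeated on the assumption that the option accepts several milestone epochs):

peerannot aggregate-deep ./my_dataset -K 10 -s CrowdLayer --model resnet18 \
    --n-epochs 150 --lr 0.1 --optimizer SGD --momentum 0.9 --decay 5e-4 \
    --scheduler multistep -m 50 -m 100 --lr-decay 0.1 \
    --data-augmentation --pretrained --freeze --seed 42

To switch to cosine annealing of the learning rate, pass --scheduler cosine instead.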