
# Medimage Insight Finetune Pipeline

`medimage_insight_ft_pipeline`

## Overview

Pipeline component to fine-tune the MedImageInsight model.

**Version:** 0.0.2

**View in Studio:** https://ml.azure.com/registries/azureml/components/medimage_insight_ft_pipeline/version/0.0.2

## Inputs

| Name | Description | Type | Default | Optional | Enum |
| ---- | ----------- | ---- | ------- | -------- | ---- |
| mlflow_embedding_model_path | Path to the MLflow model to be imported. | uri_folder | | False | |
| eval_image_tsv | Path to the evaluation image TSV file. | uri_file | | False | |
| eval_text_tsv | Path to the evaluation text TSV file. | uri_file | | False | |
| image_tsv | Path to the image TSV file. | uri_file | | False | |
| text_tsv | Path to the text TSV file. | uri_file | | False | |
| label_file | Path to the label file. | uri_file | | False | |
| conf_files | Path to the configuration files. | uri_file | | False | |
| instance_type_finetune | Instance type used by the finetune component when serverless compute is selected, e.g. standard_nc24rs_v3. The parameter compute_finetune must be set to 'serverless' for this value to take effect. | string | Standard_nc24rs_v3 | True | |
| compute_finetune | Compute to use for finetuning, e.g. provide 'FT-Cluster' if your compute is named 'FT-Cluster'. Special characters such as \ and ' are invalid in the parameter value. If a compute cluster name is provided, instance_type_finetune is ignored and the named cluster is used. | string | serverless | True | |
| process_count_per_instance | Number of processes to run per instance. This sets the number of GPUs used for training. | integer | 1 | True | |
| instance_count | Number of instances to use for training. | integer | 1 | True | |
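The inputs above map directly to keyword arguments when the component is consumed from the `azureml` registry with the Azure ML Python SDK v2. The following is a minimal sketch, not an official sample: the subscription, workspace, data paths, and the `medimage_ft_pipeline` wrapper function are placeholders to replace with your own values.

```python
from azure.ai.ml import Input, MLClient
from azure.ai.ml.dsl import pipeline
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()

# Client for your own workspace (placeholders below are hypothetical).
ml_client = MLClient(
    credential,
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Client for the shared "azureml" registry that hosts this component.
registry_client = MLClient(credential, registry_name="azureml")
ft_component = registry_client.components.get(
    name="medimage_insight_ft_pipeline", version="0.0.2"
)

@pipeline()
def medimage_ft_pipeline():
    # All azureml:// paths below are illustrative placeholders.
    ft_step = ft_component(
        mlflow_embedding_model_path=Input(type="uri_folder", path="azureml://<path-to-mlflow-model>"),
        image_tsv=Input(type="uri_file", path="azureml://<path-to-train-images.tsv>"),
        text_tsv=Input(type="uri_file", path="azureml://<path-to-train-text.tsv>"),
        eval_image_tsv=Input(type="uri_file", path="azureml://<path-to-eval-images.tsv>"),
        eval_text_tsv=Input(type="uri_file", path="azureml://<path-to-eval-text.tsv>"),
        label_file=Input(type="uri_file", path="azureml://<path-to-labels>"),
        conf_files=Input(type="uri_file", path="azureml://<path-to-conf-file>"),
        # Optional compute settings: with compute_finetune="serverless",
        # instance_type_finetune selects the serverless SKU.
        compute_finetune="serverless",
        instance_type_finetune="Standard_nc24rs_v3",
        process_count_per_instance=1,
        instance_count=1,
    )
    return {
        "embedding_mlflow_model": ft_step.outputs.embedding_mlflow_model,
        "classification_mlflow_model": ft_step.outputs.classification_mlflow_model,
    }

pipeline_job = medimage_ft_pipeline()
submitted = ml_client.jobs.create_or_update(
    pipeline_job, experiment_name="medimage-insight-finetune"
)
print(submitted.studio_url)
```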

## Outputs

| Name | Description | Type |
| ---- | ----------- | ---- |
| save_dir | Directory to save the model and checkpoints, used for the pipeline's internal operations. | uri_folder |
| embedding_mlflow_model | Directory to save the MLflow model. | mlflow_model |
| classification_mlflow_model | Path to save the output model configured with labels. | mlflow_model |
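Once the pipeline job completes, the two MLflow model outputs can be registered in the workspace for later deployment. This is a sketch under assumptions: it reuses `ml_client` and `submitted` from the snippet above, the model names are placeholders, and the `azureml://jobs/...` URI form assumes the named-output convention for pipeline jobs (adjust if your SDK version expects a different path).

```python
from azure.ai.ml.constants import AssetTypes
from azure.ai.ml.entities import Model

# Register the fine-tuned embedding model (placeholder name).
embedding_model = Model(
    name="medimageinsight-embedding-ft",
    type=AssetTypes.MLFLOW_MODEL,
    path=f"azureml://jobs/{submitted.name}/outputs/embedding_mlflow_model",
    description="Fine-tuned MedImageInsight embedding model.",
)
ml_client.models.create_or_update(embedding_model)

# Register the label-configured classification model (placeholder name).
classification_model = Model(
    name="medimageinsight-classification-ft",
    type=AssetTypes.MLFLOW_MODEL,
    path=f"azureml://jobs/{submitted.name}/outputs/classification_mlflow_model",
    description="Fine-tuned MedImageInsight model configured with labels.",
)
ml_client.models.create_or_update(classification_model)
```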