components batch_score_llm - Azure/azureml-assets GitHub Wiki

Batch Score Large Language Models

batch_score_llm

Overview

Version: 1.1.9

View in Studio: https://ml.azure.com/registries/azureml/components/batch_score_llm/version/1.1.9

Inputs

Predefined arguments for parallel job:

https://learn.microsoft.com/en-us/azure/machine-learning/reference-yaml-job-parallel?source=recommendations#predefined-arguments-for-parallel-job

| Name | Description | Type | Default | Optional | Enum |
| ---- | ----------- | ---- | ------- | -------- | ---- |
| resume_from | The pipeline run ID to resume from. | string | | True | |

PRS preview feature

| Name | Description | Type | Default | Optional | Enum |
| ---- | ----------- | ---- | ------- | -------- | ---- |
| async_mode | Whether to use the PRS mini-batch streaming feature, which allows each PRS processor to process multiple mini-batches at a time. | boolean | False | True | |

Custom arguments

| Name | Description | Type | Default | Optional | Enum |
| ---- | ----------- | ---- | ------- | -------- | ---- |
| configuration_file | Configures the behavior of batch scoring. | uri_file | | False | |
| data_input_table | The data to be split and scored in parallel. | mltable | | False | |
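As a rough sketch of how these inputs could be wired into a pipeline, the YAML fragment below shows a parallel-job step consuming this component. The job name, compute target, and datastore paths are hypothetical placeholders, not values defined by the component:

```yaml
# Hypothetical pipeline step referencing the batch_score_llm component.
# Compute name and datastore paths below are illustrative assumptions.
jobs:
  batch_score:
    type: parallel
    component: azureml://registries/azureml/components/batch_score_llm/versions/1.1.9
    compute: azureml:my-cluster  # assumed compute target
    inputs:
      configuration_file:
        type: uri_file
        path: azureml://datastores/workspaceblobstore/paths/configs/batch_score_config.json  # assumed path
      data_input_table:
        type: mltable
        path: azureml://datastores/workspaceblobstore/paths/input_data/  # assumed path
      # async_mode: true  # optional; enables PRS mini-batch streaming (preview)
    outputs:
      job_output_path:
        type: uri_file
      mini_batch_results_output_directory:
        type: uri_folder
```

The two outputs map to the component's declared outputs; binding them to named pipeline outputs or downstream steps follows the standard Azure ML parallel-job YAML conventions.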

Outputs

| Name | Description | Type |
| ---- | ----------- | ---- |
| job_output_path | | uri_file |
| mini_batch_results_output_directory | | uri_folder |