
mlflow-py312-inference

Overview

AzureML MLflow / Ubuntu 22.04 / Python 3.12 CPU inference environment.

Version: 2

Tags

OS: Ubuntu22.04, Inferencing, Preview

View in Studio: https://ml.azure.com/registries/azureml/environments/mlflow-py312-inference/version/2

Docker image: mcr.microsoft.com/azureml/curated/mlflow-py312-inference:2
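
To use this environment in a managed online deployment, reference it by its registry URI. The following is a minimal sketch using the azure-ai-ml Python SDK v2; the workspace details, model path, endpoint name, and instance type are placeholders, not values from this repository.

# Minimal sketch (assumptions: azure-ai-ml SDK v2 installed, placeholder workspace
# details, and a local MLflow model folder at ./model).
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ManagedOnlineDeployment, ManagedOnlineEndpoint, Model
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",     # placeholder
    resource_group_name="<resource-group>",  # placeholder
    workspace_name="<workspace>",            # placeholder
)

endpoint = ManagedOnlineEndpoint(name="my-mlflow-endpoint", auth_mode="key")
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name="my-mlflow-endpoint",
    model=Model(path="./model", type="mlflow_model"),  # local MLflow model folder
    # Reference this curated environment from the azureml registry by version.
    environment="azureml://registries/azureml/environments/mlflow-py312-inference/versions/2",
    instance_type="Standard_DS3_v2",
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()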

Docker build context

Dockerfile

FROM mcr.microsoft.com/azureml/inference-base-2204:20241111.v1

WORKDIR /
ENV AZUREML_CONDA_ENVIRONMENT_PATH=/azureml-envs/minimal
ENV AZUREML_CONDA_DEFAULT_ENVIRONMENT=$AZUREML_CONDA_ENVIRONMENT_PATH

# Prepend path to AzureML conda environment
ENV PATH=$AZUREML_CONDA_ENVIRONMENT_PATH/bin:$PATH
ENV LD_LIBRARY_PATH=$AZUREML_CONDA_ENVIRONMENT_PATH/lib:$LD_LIBRARY_PATH

# Set MLFlow environment variables
ENV AML_APP_ROOT="/var/mlflow_resources"
ENV AZUREML_ENTRY_SCRIPT="mlflow_score_script.py"

USER root
# Copying of mlmonitoring will be added once testing is completed.
# COPY mlmonitoring /var/mlflow_resources/mlmonitoring

# We'll copy the HF scripts as well to enable better handling for v2 packaging. This will not require changes to the
# packages installed in the image, as the expectation is that these will all be brought along with the model.
COPY mlflow_score_script.py /var/mlflow_resources/mlflow_score_script.py
COPY mlflow_hf_score_cpu.py /var/mlflow_resources/mlflow_hf_score_cpu.py
COPY mlflow_hf_score_gpu.py /var/mlflow_resources/mlflow_hf_score_gpu.py

# Create conda environment
COPY conda_dependencies.yaml .
RUN conda env create -p $AZUREML_CONDA_ENVIRONMENT_PATH -f conda_dependencies.yaml -q && \
    rm conda_dependencies.yaml && \
    conda run -p $AZUREML_CONDA_ENVIRONMENT_PATH pip cache purge && \
    conda clean -a -y   
USER dockeruser

CMD [ "runsvdir", "/var/runit" ]
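
Requests sent to a deployment built on this image are handled by mlflow_score_script.py (selected via AZUREML_ENTRY_SCRIPT). Below is a hedged invocation sketch that reuses the ml_client and names from the deployment example above; the "input_data" payload shape is an assumption for a tabular MLflow model and depends on the model's signature.

import json
import tempfile

# Sample request for a hypothetical tabular MLflow model; adjust columns/data
# to match the model's signature.
sample_request = {
    "input_data": {
        "columns": ["feature_1", "feature_2"],
        "data": [[1.0, 2.0]],
    }
}

# The SDK's invoke call takes a request file, so write the payload to disk first.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(sample_request, f)
    request_file = f.name

response = ml_client.online_endpoints.invoke(
    endpoint_name="my-mlflow-endpoint",
    deployment_name="blue",
    request_file=request_file,
)
print(response)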