
Research & Development

These notes represent the origin point that led directly to me creating this git repository! :-)

This page is being recovered after it was accidentally deleted. My brain still holds a fresh copy, so I am rebuilding it as time and memory recall allow. Excuse the mess in the meantime!

Signal Chat

IMPORTANT(JEFF): This is a copy-and-paste of a conversation that took place on 2025-04-16 between a friend and me. It is more or less verbatim, in the order it took place.

  • it's because CUDA on the 50-series builds is based on sm_120; the newest (Windows) PyTorch only supports up to sm_86 at its core

  • So the nightly builds and CUDA 12.8 - via the new NVIDIA CUDA Docker containers - prioritized matching the PyTorch work that had been done

  • The NVIDIA containers - I've got a link somewhere. NVIDIA press-released instructions and links to each container and each git repo necessary for the code

  • They rolled out a giant new updated driver for the GPUs today

  • Fixes for the entire toolkit and CUDA (for reference, PyTorch got their fix posted, I think, like a month ago?)

  • This is my planned instruction guide - I believe it's the most cohesive and doesn't start randomly giving commands while jumping between Debian, Ubuntu, and Arch

  • The key point, coupled with this info: CUDA container image tags have a lifetime. The tags will be deleted six months after the last supported "Tesla Recommended Driver" has gone end-of-life OR a newer update release has been made for the same CUDA version. The CUDA image part was from an NVIDIA press release a month ago

  • Multi-arch image manifests are now LIVE for all supported CUDA container image versions

  • It is now possible to build CUDA container images for all supported architectures using Docker Buildkit in one step.
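
For illustration, here is a minimal sketch of the one-step multi-arch build described in the last bullet, using Docker Buildx (the BuildKit front end). The builder name, platforms, and image tag are assumptions for the example, not values from the conversation above.

```sh
# Create (once) and select a buildx builder backed by containerized BuildKit.
docker buildx create --name multiarch --use

# Build the Dockerfile in the current directory for two architectures in one step
# and push the resulting multi-arch manifest to a registry.
# "myregistry/cuda-app:latest" is a placeholder tag.
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  --tag myregistry/cuda-app:latest \
  --push \
  .
```

Pushing is used here because a multi-platform manifest generally cannot be loaded into the local image store with `--load`, which expects a single platform.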

Overview of Images

Three flavors of images are provided:

  • base: Includes the CUDA runtime (cudart)

  • runtime: Builds on the base image and includes the CUDA math libraries and NCCL. A runtime image that also includes cuDNN is available.

  • devel: Builds on the runtime image and includes the headers and development tools for building CUDA images. These images are particularly useful for multi-stage builds (see the sketch below).

The Dockerfiles for the images are open-source and licensed under the 3-clause BSD license. For more information, see the Supported Tags section of the upstream CUDA image documentation.
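
To make the multi-stage idea concrete, here is a hedged sketch: compile a small CUDA program against a devel image, then copy only the resulting binary on top of the slimmer runtime image. The specific tags (12.8.0, ubuntu22.04) and the file name hello.cu are assumptions chosen for illustration; pick tags that match your driver and distribution.

```dockerfile
# Stage 1: build with the full CUDA toolchain (nvcc, headers).
FROM nvidia/cuda:12.8.0-devel-ubuntu22.04 AS build
WORKDIR /src
COPY hello.cu .
RUN nvcc -O2 -o hello hello.cu

# Stage 2: ship only the compiled binary on the smaller runtime image.
FROM nvidia/cuda:12.8.0-runtime-ubuntu22.04
COPY --from=build /src/hello /usr/local/bin/hello
CMD ["hello"]
```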

NVIDIA Container Toolkit

The NVIDIA Container Toolkit for Docker is required to run CUDA images.

For CUDA 10.0, nvidia-docker2 v2.1.0 or greater is recommended. It is also recommended to use Docker 19.03.

Initial Setup Guide

  1. container-toolkit: install guide
  2. container-toolkit: sample workload
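
As a quick sanity check once the toolkit is installed, the usual sample workload is to run nvidia-smi inside a CUDA base image. This is a sketch; the exact image tag is an assumption, and nvidia-smi itself is injected into the container by the toolkit from the host driver.

```sh
# Verify that containers can see the GPU through the NVIDIA Container Toolkit.
docker run --rm --gpus all nvidia/cuda:12.8.0-base-ubuntu22.04 nvidia-smi
```

If the GPU name, driver version, and CUDA version print out, the container side of the stack is working and frameworks like PyTorch can be layered on top.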

Related Projects

Reference Documents

Footnotes

Reference Hardware Specs

Jeff (myself)

I, Jeff, run a virtualized environment and use the same GPU on both "workstations" (VMs) by booting into one environment or the other, depending on my needs at the moment. Sadly, as I write this, my GT 710 really is the best option for trying this locally :-( My GPU belongs to one of the earliest generations of the CUDA compute platform still kicking around (compute capability 3.5!) and my expectations are rather low here, but this should be interesting to say the least.

GPUs

Linux

Manjaro Linux, KDE Edition (Arch Linux-derived)

| Make | Model | Chipset | VRAM | Driver | CUDA | Notes |
| ---- | ----- | ------- | ---- | ------ | ---- | ----- |
| MSI | GT 710 | Kepler XXX | 2GB | 470.256.02 | 11.4 | ... |

TODO(JEFF): Verify the chipset number; Kepler ...
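
One possible way to fill in the TODO rows is to query the card directly with nvidia-smi. The basic fields below are long-standing; the compute_cap field only exists on newer driver branches, so treat that part as an assumption on a 470-era driver. The same queries work from the Windows host or inside WSL, which should help verify the Windows table as well.

```sh
# Report the marketing name, driver version, and VRAM for the tables above.
nvidia-smi --query-gpu=name,driver_version,memory.total --format=csv

# On newer drivers, the compute capability can be queried as well.
nvidia-smi --query-gpu=compute_cap --format=csv
```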

Windows

Windows 11 "Home" Edition

```
# wsl --version
WSL version: 2.4.13.0
Kernel version: 5.15.167.4-1
WSLg version: 1.0.65
MSRDC version: 1.2.5716
Direct3D version: 1.611.1-81528511
DXCore version: 10.0.26100.1-240331-1435.ge-release
Windows version: 10.0.22631.5189
```

| Make | Model | Chipset | VRAM | Driver | CUDA | Notes |
| ---- | ----- | ------- | ---- | ------ | ---- | ----- |
| MSI | GT 710 | Kepler GK110 | 2GB | 570.x | 11.x | 3.5 compute |

TODO(JEFF): Verify Driver & CUDA revisions

Michael

Michael potentially has more than one GPU to test with and, more importantly, modern hardware to choose from in any case. His environment is simpler than my own and, most importantly, it boots into Windows without any virtualization-layer hoops.

GPUs

Windows

Windows 11 "Pro" Edition

| Make | Model | Chipset | VRAM | Driver | CUDA | Notes |
| ---- | ----- | ------- | ---- | ------ | ---- | ----- |
| PNY | RTX 4080-S XLR8 | | 16GB | XX.XX | XX.XX | .. |
| Gigabyte | RTX 5070 TI Aero Ice OC SFF | | | | | |