TPU - bobbae/gcp GitHub Wiki
Tensor Processing Units (TPUs) are Google’s custom-developed application-specific integrated circuits (ASICs) used to accelerate machine learning workloads.
https://cloud.google.com/blog/products/compute/cloud-tpu-vms-are-generally-available
Cloud TPUs
https://cloud.google.com/tpu/docs/tpus
TPU Architecture
https://cloud.google.com/tpu/docs/system-architecture-tpu-vm
TPU VMs
https://cloud.google.com/blog/products/compute/introducing-cloud-tpu-vms
https://cloud.google.com/tpu/docs/system-architecture-tpu-vm
TPU ML hub
TPU types and topologies
https://cloud.google.com/tpu/docs/types-topologies
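The accelerator types on that page encode a TPU version plus a core or chip count (for example v3-8), and larger v4 slices are also described by an AxBxC topology string. As a small illustration of how such a topology string maps to a chip count, here is a sketch; the `chips_in_topology` helper is hypothetical, not part of any Google Cloud API:

```python
from functools import reduce

def chips_in_topology(topology: str) -> int:
    """Multiply the dimensions of a TPU topology string like '2x2x4'.

    Hypothetical helper for illustration only; not a Google Cloud API.
    """
    return reduce(lambda a, b: a * b, (int(d) for d in topology.split("x")))

print(chips_in_topology("2x2x4"))  # 16 chips in a slice with this topology
print(chips_in_topology("4x4x4"))  # 64
```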
TPU regions and zones
https://cloud.google.com/tpu/docs/regions-zones
Cloud TPU v4 pods
TPU optimized models
https://github.com/tensorflow/tpu/tree/master/models/official
Cloud TPU storage options
https://cloud.google.com/tpu/docs/storage-options
Deep Learning VM Image
https://cloud.google.com/deep-learning-vm
Cloud TPU Quickstarts
https://cloud.google.com/tpu/docs/quick-starts
TPU vs GPU performance
https://cloud.google.com/blog/products/ai-machine-learning/google-wins-mlperf-benchmarks-with-tpu-v4
Jax on Cloud TPU
https://cloud.google.com/tpu/docs/run-calculation-jax
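A minimal sketch of running a JIT-compiled calculation with JAX, assuming `jax` is installed (on a Cloud TPU VM that is typically `pip install 'jax[tpu]'`); on a TPU VM `jax.devices()` reports TPU devices, and the same code falls back to CPU or GPU elsewhere:

```python
import jax
import jax.numpy as jnp

# On a Cloud TPU VM this lists TPU devices; elsewhere JAX falls back to CPU/GPU.
print(jax.devices())

@jax.jit  # compile with XLA for whichever backend JAX selected
def dot(x, y):
    return jnp.dot(x, y)

x = jnp.arange(1.0, 5.0)   # [1., 2., 3., 4.]
result = dot(x, x)         # 1 + 4 + 9 + 16
print(result)              # 30.0
```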
AI Accelerators
https://medium.com/@adi.fu7/ai-accelerators-part-iv-the-very-rich-landscape-17481be80917
Managing TPUs
https://cloud.google.com/tpu/docs/managing-tpus-tpu-vm
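The management guide above covers the TPU VM lifecycle with `gcloud`. A hedged sketch of the typical create/list/ssh/delete flow follows; the name, zone, accelerator type, and runtime version below are placeholders, so check the docs for currently supported values:

```shell
# Create a TPU VM (all values are placeholders; see the docs for valid ones)
gcloud compute tpus tpu-vm create my-tpu \
  --zone=us-central1-b \
  --accelerator-type=v3-8 \
  --version=tpu-vm-tf-2.16.1

# List TPU VMs in a zone
gcloud compute tpus tpu-vm list --zone=us-central1-b

# SSH into the TPU VM to run workloads
gcloud compute tpus tpu-vm ssh my-tpu --zone=us-central1-b

# Delete it when done (TPU VMs accrue charges while they exist)
gcloud compute tpus tpu-vm delete my-tpu --zone=us-central1-b
```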