Category: Hardware & Infrastructure
Definition
CUDA (Compute Unified Device Architecture) is NVIDIA's parallel computing platform and programming model that lets software run general-purpose computations, including AI workloads, on NVIDIA GPUs.
How It Works
GPUs were built for graphics rendering, but CUDA lets programmers use their highly parallel hardware for general-purpose computing. AI frameworks such as TensorFlow and PyTorch rely on CUDA to run models on NVIDIA GPUs.
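To make that concrete, here is a minimal sketch of what framework-level GPU use looks like from PyTorch; the matrix sizes and the fallback-to-CPU check are illustrative choices, not details from this entry.

```python
import torch

# Use the CUDA device when an NVIDIA GPU (and a CUDA-enabled PyTorch build)
# is available; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Allocate two matrices directly on the chosen device.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

# On a CUDA device, this matrix multiply is dispatched to GPU kernels
# (via NVIDIA's cuBLAS library) instead of running on the CPU.
c = a @ b
print(c.device)
```

The application code never talks to the GPU hardware directly; the framework translates the tensor operation into CUDA kernel launches behind the scenes.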
CUDA handles the complex work of distributing calculations across thousands of GPU cores.
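The sketch below shows that distribution one level down, at the kernel level. It uses Numba's CUDA JIT compiler for Python rather than CUDA C++, and the array size, block size, and the `scale_kernel` name are illustrative assumptions.

```python
import numpy as np
from numba import cuda

@cuda.jit
def scale_kernel(out, x, factor):
    # Each GPU thread computes its own global index and handles one element.
    i = cuda.grid(1)
    if i < x.size:
        out[i] = x[i] * factor

n = 1_000_000
x_host = np.arange(n, dtype=np.float32)

# Copy the input to GPU memory and allocate space for the result there.
x_dev = cuda.to_device(x_host)
out_dev = cuda.device_array_like(x_dev)

# Launch enough blocks of 256 threads to cover all n elements; the CUDA
# runtime spreads those threads across the GPU's cores.
threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
scale_kernel[blocks, threads_per_block](out_dev, x_dev, 2.0)

print(out_dev.copy_to_host()[:5])  # [0. 2. 4. 6. 8.]
```

In CUDA C++ the same idea is written as a `__global__` function launched with the `<<<blocks, threads>>>` syntax; either way, the programmer describes the work per thread and CUDA maps the blocks of threads onto the GPU's streaming multiprocessors.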
Why It Matters
CUDA helped make NVIDIA GPUs the de facto standard for AI training and inference by giving developers comparatively easy access to massive parallel computing power.
Most AI development happens on NVIDIA GPUs using CUDA, making it essential infrastructure for the AI industry.