Category: Hardware & Infrastructure
Definition
GPUs (graphics processing units) are computer chips originally designed for rendering graphics; they now power most AI training and inference.
How It Works
GPUs excel at performing many simple calculations simultaneously. AI workloads are dominated by large matrix operations, which map directly onto that strength.
While a CPU typically has a handful to a few dozen powerful cores, a GPU has thousands of smaller cores working in parallel.
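As a rough illustration of that parallelism (a minimal sketch, not tied to any particular AI framework), the CUDA example below launches about a million threads, each performing one simple multiply-add. The kernel name and values are illustrative only; the point is that the GPU schedules the work across its many cores at once, which is the same pattern behind the matrix math in neural networks.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread computes one output element, so thousands of
// multiply-adds execute at the same time.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        y[i] = a * x[i] + y[i];  // one simple calculation per thread
    }
}

int main() {
    const int n = 1 << 20;  // ~1 million elements
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Launch enough thread blocks to cover all n elements;
    // the GPU spreads them across its thousands of cores.
    int threadsPerBlock = 256;
    int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    saxpy<<<blocks, threadsPerBlock>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f (expected 5.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

A CPU would step through those million elements largely one after another; the GPU handles them in large parallel batches, which is why the same hardware trick scales up to the matrix multiplications at the heart of model training.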
Why It Matters
GPUs made modern AI possible. Without them, training large models would take decades instead of weeks. They're why AI advanced so rapidly in the 2010s.
Most AI applications rely on GPUs to reach practical performance.