Artificial intelligence and machine learning are often described as some of the most demanding tasks in modern computing. In IB Computer Science, students are expected to understand why GPUs are preferred over CPUs for many AI and machine learning applications — not just that they are “faster”.
The reason lies in how AI algorithms work and how GPUs are designed.
What Machine Learning Workloads Look Like
At a basic level, many machine learning tasks involve:
- Large datasets
- Repeated calculations
- Mathematical operations on arrays or matrices
For example, training a model may require:
- Performing the same calculation millions of times
- Applying identical operations to different data points
- Processing data in parallel
These characteristics strongly influence hardware choice.
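The bullet points above can be made concrete with a short sketch. The example below (shapes and values are purely illustrative, not from any real model) shows how a single matrix operation applies the same weighted-sum calculation to every data point in a batch at once:

```python
import numpy as np

rng = np.random.default_rng(0)

X = rng.standard_normal((1000, 64))   # 1000 data points, 64 features each
W = rng.standard_normal((64, 32))     # hypothetical weight matrix
b = np.zeros(32)                      # bias vector

# One matrix multiply performs the SAME calculation for all
# 1000 data points — identical operations on different data.
activations = np.maximum(X @ W + b, 0.0)   # ReLU activation
print(activations.shape)  # (1000, 32)
```

Every row of `activations` is produced by the same arithmetic, which is exactly the kind of work that can be distributed across many processing units in parallel.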
Why CPUs Are Not Ideal for Machine Learning
CPUs are designed for:
- Sequential processing
- Complex control flow
- Branching and decision-making
While CPUs are very powerful, they:
- Have relatively few cores (typically 4 to 16 in consumer machines)
- Are optimised for low latency on individual tasks rather than raw throughput
This makes CPUs inefficient for tasks where the same operation must be repeated across large datasets.
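The inefficiency can be felt even without a GPU. The sketch below contrasts a one-element-at-a-time loop (the sequential style CPUs are built around) with a single vectorised operation over the whole array (the style of work that parallel hardware spreads across many cores); the exact timings will vary by machine:

```python
import time
import numpy as np

data = np.arange(1_000_000, dtype=np.float64)

# Sequential style: one element at a time.
start = time.perf_counter()
squared_loop = [x * x for x in data]
loop_time = time.perf_counter() - start

# Vectorised style: one operation applied across the whole array.
start = time.perf_counter()
squared_vec = data * data
vec_time = time.perf_counter() - start

print(f"loop: {loop_time:.4f}s  vectorised: {vec_time:.4f}s")
```

Both versions compute identical results; the difference is purely in how the repeated work is organised, which is the property GPUs exploit at a much larger scale.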
Why GPUs Are Well-Suited to AI
GPUs are designed to handle massive parallelism.
Key features that make GPUs ideal for AI include:
- Thousands of small cores
