I think the difference comes down to units and aggregation. CloudWatch reports container CPU in CPU units (or as a utilization percentage, depending on the metric) and aggregates each datapoint over a period, typically one minute, so a short spike gets averaged into the rest of the window and largely disappears. Datadog's agent samples much more frequently (every 15 seconds or so by default) and, as I understand it, reports container CPU relative to a single vCPU, so a workload using more than one core can legitimately show values like 223% (roughly 2.23 vCPUs). That combination of coarser sampling and different normalization is why the CloudWatch numbers usually look lower and smoother, while Datadog reflects a more accurate view of what the CPU is actually doing moment to moment.
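
If you want to see how much the averaging is hiding on the CloudWatch side, one thing you can try is pulling the Maximum statistic next to the Average for the same metric. Here is a minimal sketch with boto3, assuming the metric in question is ECS CPUUtilization; the cluster and service names are placeholders, not from the original setup:

```python
import datetime

import boto3

cloudwatch = boto3.client("cloudwatch")

end = datetime.datetime.now(datetime.timezone.utc)
start = end - datetime.timedelta(hours=1)

# Pull both Average and Maximum over 60-second periods. The Maximum series
# surfaces spikes that the per-minute Average flattens out.
response = cloudwatch.get_metric_statistics(
    Namespace="AWS/ECS",
    MetricName="CPUUtilization",
    Dimensions=[
        {"Name": "ClusterName", "Value": "my-cluster"},   # placeholder
        {"Name": "ServiceName", "Value": "my-service"},   # placeholder
    ],
    StartTime=start,
    EndTime=end,
    Period=60,  # 60s is the smallest period for standard-resolution metrics
    Statistics=["Average", "Maximum"],
)

for point in sorted(response["Datapoints"], key=lambda p: p["Timestamp"]):
    print(
        point["Timestamp"].isoformat(),
        f"avg={point['Average']:.1f}%",
        f"max={point['Maximum']:.1f}%",
    )
```

If the Maximum values sit well above the Averages, that gap is the smoothing effect; it still won't match Datadog exactly because the normalization differs, but it makes the spikes visible on the CloudWatch side too.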