DailyTech ran a somewhat confusing story the other day, which was later clarified.
Intel has the best products right now, but they tend to capitalize on that and the consumer ends up paying a premium for it. That said, Intel tends to rate its TDPs close to reality: if they say 65W or 130W, their CPUs tend to consume about that much in real-world applications. AMD, on the other hand, states the maximum theoretical TDP of the chip, and I have no idea what kind of load they put their CPUs under to come up with that kind of power consumption.
So AMD says a CPU has a TDP of 125W, and nominally it does; measured the way Intel's TDP works out, it's more like 110W. That is measured from only what comes through the 12V rail, of course.
Their 65W CPUs, likewise, tend to consume less than 60W.
That 110W figure was reached with four instances of Prime95 running at the same time; those are not conditions that even enthusiasts often put their computers under, although some do while stress testing.
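Just to make that measurement concrete, here is a minimal sketch of the arithmetic involved, assuming the current on the CPU's EPS 12V connector is logged (for example with a clamp meter) while Prime95 is running; the sample readings below are made-up placeholders, not real data:

```python
# Minimal sketch of the kind of 12V-rail measurement described above.
# The current samples are hypothetical placeholders; in practice they would
# come from a clamp meter on the EPS 12V connector while four instances of
# Prime95 load the CPU.

EPS_RAIL_VOLTAGE = 12.0   # volts on the CPU power connector
RATED_TDP_W = 125.0       # AMD's stated TDP for the chip in question

# hypothetical current readings (amps) taken during the stress test
current_samples = [9.3, 9.1, 9.2, 9.4, 9.2]

avg_current = sum(current_samples) / len(current_samples)
measured_power = EPS_RAIL_VOLTAGE * avg_current  # P = V * I on the 12V rail

print(f"Measured 12V-rail draw: {measured_power:.1f} W")
print(f"Rated TDP:              {RATED_TDP_W:.1f} W")
print(f"Headroom vs. rated TDP: {RATED_TDP_W - measured_power:.1f} W")
```

With readings like these the rail draw works out to roughly 110W, well under the 125W on the box.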
Intel, on the other hand, is right on target: its 130W TDP CPU is rated exactly right, and, shock, they are actually understating the E6850's TDP.
They are currently overstating the QX9650's TDP, as they did for a while with the Q6600 revision G0. They're improving, so let them; we all know how the Prescotts behaved...
AMD, on the other hand, has always overstated its TDP.
So AMD came up with ACP, much as they came up with the CPU rating scheme that Intel has, in a way, since adopted as its own CPU numbering scheme. This way one can look at the ACP and actually compare it to Intel's TDP. Sure, they're still not being perfectly accurate, but stating a 105W ACP is not far off target, and it is a lot closer to the real power consumption than the TDP ever was. And although they are not leading in the performance-per-watt figures, they do make less power-hungry CPUs and are now publicly saying so, making future customer comparisons of power consumption fairer.
It's something they should have done a long time ago, like the rating scheme thing back in the Athlon XP days.
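To put the numbers from this post side by side, here is a quick sketch; the "measured" figure is the rough 12V-rail draw under four Prime95 instances mentioned above, not an official spec:

```python
# Rough comparison of the figures mentioned in this post for a 125W-class AMD chip.
# "measured" is the approximate 12V-rail draw under four Prime95 instances.

figures = {
    "AMD rated TDP": 125.0,
    "AMD stated ACP": 105.0,
    "Measured 12V-rail draw": 110.0,
}

measured = figures["Measured 12V-rail draw"]
for label, watts in figures.items():
    offset = watts - measured
    print(f"{label:<24} {watts:6.1f} W  ({offset:+.1f} W vs. measured)")
```

The ACP figure lands within about 5W of the measured draw, while the TDP overshoots it by 15W, which is exactly why ACP is the more useful number to hold up against Intel's TDP.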