For the Gaming suite of the PCMark Vantage benchmark, the Intel HD Graphics 4000 put up stellar scores and pulled ahead of the NVIDIA GeForce GT 520 by a marginal amount. Against the older DX10-class GeForce GT 220, it was just a hair behind with a 4% differential, while the more powerful GeForce GT 440 beat it by a margin of about 23%. Compared to the Intel HD Graphics 3000, the new Intel HD Graphics 4000 pulled ahead by roughly 15%. Take note that PCMark's gaming score doesn't really tax the graphics engine much and is still a primarily CPU-dominant benchmark; the other graphics tests should prove more indicative of the GPU's capabilities.
Moving on to a more intensive DX10 benchmark, 3DMark Vantage, the Intel HD Graphics 4000 managed to beat the old NVIDIA GeForce GT 220 by a margin of about 15% and nearly doubled the performance of the Intel HD Graphics 3000! Even the discrete NVIDIA GeForce GT 520 trailed by a shameful 50%. Needless to say, the GeForce GT 440 overpowered the rest, but at least it's good to see the Intel HD Graphics 4000 faring reasonably well and matching low-end discrete graphics capabilities.
The Intel HD Graphics 4000 even bested the Fusion-based AMD A8-3850's results, but that's only because 3DMark Vantage also factors the CPU's capabilities into the overall score. Given that the Intel Core i7-3770K has much better general compute performance, the 3DMark Vantage score for the Intel camp came out a little ahead of the AMD processor's.
Having seen the theoretical graphics performance numbers in 3DMark Vantage, it's time to follow up with a DX10 gaming title, Far Cry 2, to find out how that really translates in an actual game. In this test, the frame rates churned out by the on-die Intel HD Graphics 4000 were fairly impressive, as it finally broke past the absolute minimum average of 30fps and delivered an average frame rate of 37.27fps! That works out to 11% better than the discrete GeForce GT 520 graphics card and a 91% improvement over the last-generation Intel HD Graphics 3000!
The AMD A8-3850's integrated Radeon HD 6550D managed just over 40fps and was thus only a tad better than the Intel HD Graphics 4000. Not bad at all for integrated graphics coming from Intel.
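For readers keeping score, the percentage margins quoted throughout these comparisons follow the usual relative-difference arithmetic. A minimal sketch (the helper name and the sample frame rates are illustrative, not figures from this review):

```python
def percent_lead(score_a: float, score_b: float) -> float:
    """Return how far score_a leads score_b, as a percentage.

    E.g. a result of 25.0 means score_a is 25% higher than score_b.
    """
    return (score_a / score_b - 1.0) * 100.0

# Illustrative numbers only: a 30fps result against a 24fps baseline
print(round(percent_lead(30.0, 24.0), 2))  # → 25.0
```

The same formula explains why the gaps look different depending on which card is used as the baseline: a 50% lead for A over B is not the same as B trailing A by 50%.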
So DirectX 10 performance is looking decent for Intel's third-generation HD Graphics engine. What about DirectX 11? Since the new Intel HD Graphics 4000 failed to run 3DMark 11, we jumped directly to a game supporting the API - Battlefield: Bad Company 2. Take note that the results for this game should be taken with care: there's a mix of DX10- and DX11-capable GPUs, and the latter run the game using DX11 routines, so the scores aren't exactly comparable.
The Intel HD Graphics 4000 was outpaced by the NVIDIA GeForce GT 520 card, trailing by a margin of about 22%. Needless to say, it was also beaten by the GeForce GT 220 (though that card ran in DX10 mode) and the GeForce GT 440. Compared to the previous-generation Intel HD Graphics 3000, it led by about 20%, but the gap should actually be wider since the new IGP runs the DX11 code path while the old one is limited to DX10. Lastly and most importantly, the HD Graphics 4000 was thoroughly outpaced by the AMD A8-3850's Radeon HD 6550D, which generated nearly double the frame rates!
This implies that the DirectX 11 performance of the new Intel IGP has a long way to go before it can compete effectively. We can only hope that Intel's software engineers can squeeze more performance out of it with better drivers in the future, but the margin against the competition is very wide.
In terms of CPU utilization during our Blu-ray playback tests, the Intel HD Graphics 4000 improved slightly over the Intel HD Graphics 3000, most noticeably (about 25% better) when handling our more complex, higher-bitrate H.264 encoded movie title. While it couldn't best the discrete graphics solutions, at these levels of utilization there's not much to be concerned about in normal usage. The AMD A8-3850's integrated GPU did manage much lower CPU utilization (up to 75% better than Intel's) that can very well be classified as true discrete-level video decoding performance.
The Intel HD Graphics 4000 is an improvement in terms of its new features, which include support for DirectX 11, DirectCompute and hardware tessellation. It also sports 16 EUs, four more than the Intel HD Graphics 3000. In terms of performance scores, the new IGP is a marked improvement over the previous generation - so much so that it managed to double its scores in DirectX 10 based titles, catapulting it close to AMD's A8-3850 Llano APU. That Intel's integrated graphics can come close to AMD's own integrated graphics solution is a really big leap for the Intel graphics core.
Despite these improvements, the Intel HD Graphics 4000 is still underwhelming for most of the latest gaming titles, especially if you want to turn up the quality settings to really immerse yourself in the graphical environment the game designers intended to recreate. With frame rates under 40fps in our DX10 gaming benchmark at rather undemanding settings, it leaves few options for gamers who want a better gaming experience. Furthermore, DirectX 11 performance leaves much to be desired compared to AMD's Fusion platform. The inability to run Futuremark's 3DMark 11 could also be a sign of things to come, where certain games may still refuse to run - as has been the case in the past, when game compatibility was a big concern due to inadequate hardware capabilities and the state of Intel's drivers. As such, a discrete graphics card remains a must for playing more demanding games at high quality settings and resolutions, and for full compatibility.