AMD A8-3850 - Integrated Graphics the AMD Way

Promising the best integrated graphics performance ever, AMD's Llano has now officially arrived on the desktop. Do these new A-Series APUs live up to the billing? Will AMD's promise of Fusion computing finally kick off in the mainstream? To answer that, we check out the AMD A8-3850 APU.

Introducing the AMD A-Series APU  

You may have heard rumblings about Fusion APUs from AMD in the past few months, with cryptic codenames like Brazos and Llano. Brazos was the first APU, or Accelerated Processing Unit, from AMD to ship this year, and it was mostly targeted at the entry-level mobile segment. Llano, meanwhile, is the mainstream offering, and the mobile variants appeared earlier this month, which we covered in a preview. Our conclusion then was that Llano is a serious contender to mainstream Sandy Bridge mobile processors with its competitive graphics performance and battery life.

Today, the desktop Llano APUs are unveiled. Known likewise as AMD A-Series APUs, these new desktop models share the naming scheme of the mobile variants (analogous to Intel's Core i3, i5 and i7), though currently there are just two series at launch, the A6 and A8, and both feature quad-core processors.

So what exactly is an AMD A-Series APU? In AMD's own words, it's like taking a Northbridge chip, with its memory controller functionality, combining it with an x86 CPU based on a tweaked version of AMD's K10 architecture, and then adding a DirectX 11 graphics processor with up to 400 Radeon cores. The whole package is built on AMD's 32nm HKMG process and comes in at a die size of 228mm².

What is in an AMD A-Series APU? A visual illustration of what AMD is trying to do with this Fusion APU.


The four 'Stars' CPU cores are based on the Phenom II architecture, but updated with improvements that deliver up to a 6% increase in instructions processed per clock (IPC). Each core also gets 1MB of L2 cache, which compares favorably with the 512KB per core on the Athlon II X4. The presence of the graphics cores means that there's no space left for an L3 cache, and as a result, we believe that AMD's Phenom II processors will be faster than these new APUs in terms of CPU horsepower.

One major change for the A-Series APU is the addition of AMD Turbo Core. This technology is similar in concept to Intel's Turbo Boost - when fewer processor cores are utilized, it allows the remaining, running cores to scale up in clock speed. Turbo Core has been available on Phenom II processors before, but AMD has now brought the feature to certain A-Series APUs and it should work similarly.

Although an integrated, dual-channel DDR3 memory controller is hardly surprising in a modern CPU, the controller on the AMD A-Series APU actually supports up to DDR3-1866, the highest we have seen so far on a shipping CPU. As you'll see later, memory frequency has a direct impact on the APU's graphics performance, so users will be glad the ceiling goes as high as 1866MHz.

AMD is launching two A-Series APUs for the desktop today. The A8-3850, which we received for testing and review, is the top model with a maximum clock speed of 2.9GHz. It does not have AMD Turbo Core, but makes up for it with the highest clock speed and the best integrated graphics core, the Radeon HD 6550D. It's priced at US$135. The other model is the 2.6GHz A6-3650, which has the slower Radeon HD 6530D graphics and it, too, has no Turbo Core. It's priced at US$115.

AMD has also revealed other models in its presentation. All the models have a similar amount of L2 cache and will use the new socket FM1. You can get the key specs for the rest of the bunch below: 

AMD revealed four models today, all quad-core APUs that use the new socket FM1. The fastest, the A8-3850 runs at 2.9GHz, but does not come with AMD Turbo Core. The slowest, the 65W A6-3600 starts at 2.1GHz, but it can scale up to 2.4GHz with Turbo Core.


As the block diagram showed, the graphics array in the APU takes up almost 50% of the die, which shows that AMD is taking the graphics aspect very seriously. Included within is AMD's latest UVD3 video decode block, as found on its Radeon HD graphics cards, which provides hardware acceleration for HD content and results in lower CPU utilization. As for the integrated graphics core, there are two variants in the A-Series desktop APUs launched today, the Radeon HD 6550D and the 6530D. They differ primarily in the number of Radeon cores and the clock speeds:

Like Intel's HD Graphics 2000 and 3000, the graphics core within the A-Series APU differs based on the model, with the A8 getting the full 400 cores and 600MHz clock speed.


And finally, the new AMD APUs can only be identified properly with the latest version of CPU-Z, 1.58. This version has only just been released, and when we first started testing our review A8-3850 APU, we had to make do with the previous version, which barely gave any useful information.


 

 

No processor runs by itself in isolation; there's always a supporting platform and for the A-Series desktop APU, that is the Lynx platform (the mobile Llano equivalent is Sabine). More on this on the following page.

The Lynx Platform

The Lynx platform consists of AMD's new A75 or A55 Fusion Controller Hub (FCH), formerly known as the Southbridge. We have A75 boards from ASRock, ASUS, Gigabyte and MSI, and besides the new FM1 socket, what these new A75 motherboards have over their Intel counterparts is this - native USB 3.0 support. While Intel boards nowadays come with third-party USB 3.0 controllers anyway, AMD claims that its native solution has a slight edge in real-world performance. We haven't tested this aspect yet, but it's always nice to see progress made here. Do note that native USB 3.0 support is only present on the A75 FCH and not the A55.

The other advantage that AMD holds over Intel is the presence of six SATA 6Gbps ports. That's four more than the two SATA 6Gbps ports on Intel's chipsets, though it's something that AMD has had on its chipsets since the 800-series. Again, the A55 is the cost-effective option and will only have SATA 3Gbps support.

Those who are thinking of setting up dual graphics cards on an A75 motherboard despite its mainstream nature will find the FCH quite accommodating. A single PCIe 2.0 x16 link is available, and we have already seen x8/x8 configurations on A75 boards. Beyond this, the other features you can expect on the A75 FCH are the usual - HD audio and up to 12 USB 1.1/2.0 ports.

 

The A75 Fusion Controller Hub diagram.


 

Dual Graphics

AMD's Dual Graphics allows users to pair a suitable Radeon HD graphics card with the APU in a CrossFireX configuration that gives a boost to graphics performance. One could say it's just another form of ATI's Hybrid Graphics technology, as the idea is similar. AMD has already listed the recommended graphics card to pair with each APU, and for our A8-3850, that is the Radeon HD 6670. The general rule is that the performance discrepancy between the integrated and discrete graphics should be minimized as much as possible to get the best out of this technology.

In our mobile Llano preview, we talked a bit about this technology. For the desktop version, AMD recommends connecting the display to the integrated graphics output, and not to the paired discrete graphics card. This is because the Catalyst drivers that enable Dual Graphics only work at the operating system level, so users who connect the display to the discrete card's outputs will not get the bootup screen at all - the display stays black until the OS has loaded. AMD tells us that it is looking into updating the drivers or BIOS to improve this situation, but currently, that's how things are.

From our experience with two A75 motherboards, each motherboard vendor may have its own BIOS implementation for enabling this functionality, so we highly recommend that users read through the manual for the proper sequence. If done correctly, users will be able to tick the 'Enable CrossFireX' check box in the Catalyst Control Panel.

Test Setup

AMD's A-Series processors are aimed at the mainstream, and with the top model, the A8-3850, going for US$135, we scrambled to find suitable comparisons among existing AMD and Intel processors. AMD itself says that the A8-3850 will compete directly against Intel's Core i3-2100, a Sandy Bridge dual-core processor running at 3.1GHz with HyperThreading support. It has Intel HD Graphics 2000, along with a lower price of US$125 and a lower 65W TDP. Unfortunately, it's not a model that we have in our lab, and hence we had to make do with the Core i5-2400, a quad-core 3.1GHz processor going for US$190.

Obviously, it's not the best of comparisons, which is why we also included older processors like the previous-generation Clarkdale Core i3-550 and i3-530, which cost around US$125 and US$113 respectively. Since the A8-3850 is based on the Phenom II architecture, we also included a couple of Athlon II X4 processors, similar quad-core processors in around the same price range - the Athlon II X4 645 costs US$105. Just think of these Athlon II X4 processors as Llano without the integrated graphics.

Check the lists below for the actual test configurations for each architecture. As usual, we'll test all processors with the same discrete graphics card to get a gauge of its general CPU capability, but we will also test the integrated graphics separately. It's the reason why we included an Intel H67 system.


AMD A8-3850 Test Configuration

  • AMD A8-3850 APU
  • ASUS F1A75-V PRO
  • 2 x 1GB Kingston HyperX DDR3-1333 (CAS 7-7-7-20)
  • Zotac GeForce GTX 260 O.C (ForceWare 197.45) for discrete graphics tests
  • AMD Radeon HD 6550D (Catalyst driver 8.86) for integrated graphics test, 256MB frame buffer
  • Western Digital Caviar Black 1TB SATA 6Gbps (one single NTFS partition)
  • AMD SATA AHCI driver: 1.2.0.164
  • Microsoft Windows 7 Ultimate (64-bit)


Intel Core i5-2400 (Sandy Bridge) Test Configuration

  • Intel Core i5-2400
  • ASUS P8P67 Deluxe (BIOS: 0602)
  • 2 x 1GB Kingston HyperX DDR3-1333 (CAS 7-7-7-20)
  • Zotac GeForce GTX 260 OC (ForceWare 197.45)
  • WD Caviar Black 1TB, SATA 6G (Intel 6G)
  • Windows 7 Ultimate (64-bit)
  • Intel INF 9.2.0.1015


Intel Core i5/i3 (Clarkdale) Test Configuration

  • Intel Core i3-550 and Core i3-530
  • MSI P55-GD85 (BIOS 1.37)
  • 2 x 1GB Kingston HyperX DDR3-1333 (CAS 7-7-7-20)
  • Zotac GeForce GTX 260 O.C (ForceWare 197.45)
  • Western Digital Caviar Black 1TB SATA 6Gbps (one single NTFS partition)
  • Intel INF 9.1.0.1025
  • Microsoft Windows 7 Ultimate (64-bit)


AMD Athlon II X4 Test Configuration

  • AMD Athlon II X4 645 and 635
  • ASUS Crosshair IV Formula (AMD 890FX + SB850, 0702 BIOS)
  • 2 x 1GB Kingston HyperX DDR3-1333 (7-7-7-20)
  • Zotac GeForce GTX 260 O.C (ForceWare 197.45)
  • AMD Chipset driver
  • Western Digital Caviar Black 1TB SATA 6Gbps (one single NTFS partition)
  • Microsoft Windows 7 Ultimate (64-bit)


Intel H67 Test Configuration (for Intel HD Graphics)

  • Intel Core i5-2400 and Core i7-2600K
  • Gigabyte H67A-UD3H
  • 2 x 1GB Kingston HyperX DDR3-1333 (CAS 7-7-7-20)
  • Intel HD Graphics (256MB frame buffer)
  • WD Caviar Black 1TB, SATA 6G (Intel 6G)
  • Windows 7 Ultimate (64-bit)
  • Intel INF 9.2.0.1015, Intel Graphics Driver - 8.15.10.2266


Benchmarks

The following benchmarks were used to test the CPU and the integrated graphics:

CPU Benchmarks

  • BAPCo SYSmark 2007 Preview (ver 1.05)
  • Futuremark PCMark Vantage (ver 1.03.1, 64-bit)
  • Lightwave 3D 9.0 (64-bit)
  • 3ds Max 8 (SP2)
  • Cinebench 11.5 (64-bit)
  • Handbrake 0.9.4
  • Futuremark 3DMark Vantage (ver 1.03.1)
  • Far Cry 2
  • Battlefield Bad Company 2


Integrated Graphics Benchmarks

  • Far Cry 2
  • Battlefield Bad Company 2
  • 3DMark Vantage
  • Blu-ray Playback Testing (Black Snake Moan, Superman Returns) using PowerDVD 10 (ver 2308)

 

Results - SYSmark 2007 Preview

AMD has resigned from BAPCo, the makers of SYSmark, citing differences over the direction of the latest SYSmark benchmark, SYSmark 2012. AMD felt that the weighting of the benchmark was not indicative of current computing trends, where the GPU is increasingly important, and since the company was unable to change SYSmark 2012 from inside the organization, it decided to resign.

In the past, companies have tried to manipulate benchmark scores via compiler tricks and other unofficial optimizations, but AMD's resignation, which was followed by NVIDIA and VIA (though these two companies resigned without making an official statement), does put the spotlight back on SYSmark. We haven't yet transitioned to the newer version, but from our experience, SYSmark tends to be fairly conservative in the applications it tests. Its long gestation and relatively slow updates also mean these applications may be outdated even before the benchmark is launched.

Of course, we are not privy to the internal weighting that determines the final score but even without looking at SYSmark scores, there's no doubt that Intel has held the upper hand when it comes to pure CPU performance since the Core 2 generation. Llano, with its focus on heterogeneous computing, is not the APU to change this perception. We continue to use SYSmark 2007 for the time being as the benchmark tries to capture performance based on some of the common tasks and usage - including wait times to depict actual user pauses and disruptions that are common in everyday use. This is one aspect we've not seen other benchmarks mimic. There are other tests in our suite which will show good use of the APU's graphics engine, but not for this test.

In our SYSmark 2007 test and many of the benchmarks that follow, we used a discrete graphics card in the form of a Zotac GeForce GTX 260 to keep all parameters level, so that we can ascertain the actual influence of the various CPUs. The AMD A8-3850 performed better overall than the Athlon II X4 processors, which are quad-core chips running at higher clock frequencies. The scores help to illustrate AMD's point about IPC improvements in its new APU, but they weren't sufficient for AMD to compete against Intel here. Looking at the breakdown, the A8-3850 put up a fight in the Video Creation and Productivity categories against the Core i3-500 series, but fell short in 3D Manipulation and E-Learning.

Results - Futuremark PCMark Vantage

While SYSmark 2007 showed the A8-3850 behind the Core i3-500 processors, the newer system suite, PCMark Vantage, told a different story. The overall score for the A8-3850 was comfortably ahead of the Core i3-500 series. Although the Sandy Bridge Core i5-2400 was the dominant leader (a given, since it's a much more capable CPU), the scores do imply that SYSmark 2007 may not quite cut it nowadays. In both the Memories and Productivity suites, the A8-3850 was faster than all the other processors bar the Core i5-2400. It shows some potential, especially as a mainstream offering. As with the previous benchmark, note that all platforms used a discrete graphics card to remove the influence of the graphics subsystem. This helps us see just how capable the various CPUs are and how much they've progressed.

 

 

 

 

Results - Lightwave 3D 9.0

The A8-3850 was around the level of an Athlon II X4 in Lightwave 3D 9.0. Given its slower clock speed, it's a decent showing, and while it started off slower than the Core i3-500 series, it was competitive once we got to 8 threads. In short, the A8-3850 isn't a giant leap over the previous generation of AMD processors, but it's still competitive enough given its mainstream nature. Again, all platforms used discrete graphics in this benchmark, but that shouldn't matter since this is primarily a CPU-oriented test.

Results - Cinebench 11.5 & Handbrake 0.9.4

Cinebench 11.5 confirmed what we saw in Lightwave - the score of the AMD A8-3850 fell between those of the Athlon II X4 645 and 635. All three are quad-core processors, but the slightly slower clock speed on the A8-3850 probably contributed to its position. The dual-core Core i3 processors may have relatively high clocks, but their lack of cores was evident here. Handbrake too followed the trend seen so far, with the A8-3850 marginally better than the Athlon II X4 635. In both benchmarks, Intel had a clear lead with the Core i5-2400, but it could have been much closer had we had a Core i3-2100 to compare with.

Results - 3ds Max 8 (SP2)

3ds Max 8 presented two different impressions of the A8-3850. The APU was merely on par with the Athlon II X4 635 when the Light Tracer plugin was used, but that changed when Radiosity was selected - there, it was only around 12% slower than the Core i5-2400. Given that Radiosity is the much more intensive of the two rendering options, hats off to the new APU for tackling it so elegantly. Again, these results reflect the CPU aspect only, as all platforms were using discrete graphics to level the playing field. Graphics-oriented tests using the integrated graphics are just pages away.

Results - Futuremark 3DMark Vantage

With all the processors using the same graphics card, any differences in the overall scores were very minor. The CPU breakdown did show the A8-3850 slightly ahead of the Core i3-500 processors. However, the Athlon II X4 processors were in turn marginally ahead of the A8-3850.

Results - Far Cry 2 & Battlefield Bad Company 2

Moving on to two games, Far Cry 2 and Bad Company 2, both DirectX 10 capable. On this page, all platforms are still using a discrete graphics card. The A8-3850 managed to stay ahead of the Athlon II X4 processors, but the Core i3-500 processors were significantly faster in Bad Company 2, while the Core i3-550 performed slightly better in Far Cry 2. It shows that one needs a decent processor to get the best out of a graphics card, though AMD's point is that a mainstream product like the A8-3850 is less likely to be paired with a mid-range or higher graphics card. Integrated graphics performance is next.

Integrated Graphics Performance

The main story of the AMD A-Series APU is its integrated graphics performance, and it's here that this new APU rose to the occasion. The one thing you need to know about the integrated graphics on the AMD APU is that memory frequency matters. That's system memory we're talking about, which is of course conscripted by the graphics cores in the APU for their own use. For our integrated graphics testing, we manually set the frame buffer to 256MB, but users with more system memory can always allocate more for better performance.

As we found out, there's quite a decent amount of scaling going from DDR3-1066 to 1333 and on to 1600. The architecture currently supports up to DDR3-1866, so there's even more performance uplift on the table if you use faster memory. AMD also states that memory latency plays a part. Due to time constraints, we haven't tested this aspect, but we expect the differences to be minor compared to using higher-frequency memory.
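To put those speed grades in perspective, the theoretical peak bandwidth of a dual-channel DDR3 interface scales linearly with the transfer rate. A quick back-of-the-envelope sketch (the helper name is our own):

```python
# Theoretical peak bandwidth of a dual-channel DDR3 interface:
# transfer rate (MT/s) x 8 bytes per 64-bit channel x 2 channels.
def ddr3_dual_channel_gbps(mt_per_s: int) -> float:
    """Peak theoretical bandwidth in GB/s for two 64-bit DDR3 channels."""
    return mt_per_s * 8 * 2 / 1000

# The speed grades discussed in this review:
for speed in (1066, 1333, 1600, 1866):
    print(f"DDR3-{speed}: {ddr3_dual_channel_gbps(speed):.1f} GB/s")
```

Going from DDR3-1066 to DDR3-1866 lifts the theoretical peak from roughly 17GB/s to roughly 30GB/s, and since the Radeon cores share that bus with the CPU, it's little wonder graphics performance scales so visibly with memory speed.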

The results were pretty impressive, for integrated graphics at least. In the three benchmarks we tested, the A8-3850 with DDR3-1600 memory was about as capable as an entry-level discrete graphics card like the NVIDIA GT 220 with 512MB of RAM - and that card was paired with a vastly faster CPU, the top Sandy Bridge Core i7-2600K processor. By itself, the performance of the integrated graphics on the Core i7-2600K was at best two-thirds that of the A8-3850 with DDR3-1066.

It was no contest at all. If one takes the lower-end Intel HD Graphics 2000, like the one on the Core i5-2400, instead of the higher-clocked HD Graphics 3000 (Core i7-2600K), the differences were even greater - the A8-3850 at its slowest was almost three times as fast as the Intel HD Graphics 2000. More importantly, these are extremely playable frame rates at settings one would expect mainstream machines to run at. In the past, one could argue that it was futile to choose between integrated graphics from AMD or Intel because neither solution was good enough.

This is clearly no longer the case. Even if you push the resolution to 1280 x 1024 pixels, we bet that the AMD A8-3850 with DDR3-1600 would handle the two DX10 games, Far Cry 2 and Bad Company 2, decently. Integrated graphics is finally good enough.

Next up, we tried out the new integrated Universal Video Decoder 3 (UVD3) on the APU. Just as with a discrete graphics card from AMD or NVIDIA, Blu-ray playback on the A8-3850 was issue-free, and we recorded CPU utilization numbers on the APU similar to those of an NVIDIA GT 220. Although the current generation of Intel Core processors is also quite capable of playing HD content without a hitch, its CPU utilization was somewhat higher, which suggests the AMD A8-3850 is the more efficient at these tasks.

Dual Graphics Performance

Now that we have seen the performance of the integrated graphics on the A8-3850, what about AMD's Dual Graphics? Does this 'new' form of Hybrid CrossFire improve on previous implementations and actually make it worth the time?

To find out, we paired the A8-3850 (with DDR3-1333) with a Radeon HD 6670 carrying 1GB of GDDR5 memory. This pairing is suggested by AMD's own Dual Graphics configuration page, and the company calls the combination the Radeon HD 6690D2. AMD also states that Dual Graphics will only work with DirectX 10 games and above, so that's another consideration for users.



All the tested DirectX 10 benchmarks appeared to take a step backwards with Dual Graphics enabled. The test system (with the A8-3850 APU) with only the Radeon HD 6670 installed had the best scores in all three benchmarks, while the Dual Graphics enabled system, or Radeon HD 6690D2, had significantly lower performance. Given the early nature of the hardware and software, we'll give AMD the benefit of the doubt for now and return to this topic once the platform is more mature. It wasn't much rosier on the mobile Llano platform either, so it's clearly something AMD has room to improve on.

Power Consumption

First up, let's compare power consumption with a discrete graphics card installed on all the compared platforms. The A8-3850 is rated at a 100W TDP compared to 95W for the Intel Core i5-2400, and from what we saw in our power consumption test, AMD has mostly kept parity with its Sandy Bridge competitor here. At idle, the A8-3850 drew slightly less than the Core i5-2400, and it wasn't much different at peak. The older Core i3 processors, however, were more energy efficient, and considering that the Core i5-2400 has superior CPU performance over the A8-3850, it's not exactly a clear-cut win for AMD. The good news is that the process shrink has ensured that the AMD A8-3850 beats the Athlon II X4 processors in power draw.

Not forgetting the integrated graphics aspect: if one takes out the discrete graphics card and goes with the APU's own graphics, power consumption at idle drops to 65W. The maximum power draw we saw was 140W in SPECviewperf 10. That's quite a saving for those considering this route.

 


Overclocking

Take a look at the CPU-Z screen below. Yes, it's no Photoshop trick - the AMD A8-3850 reached a maximum overclock of 4.7GHz. And all we did was push the multiplier up, without any increase in voltage, until it hit the maximum allowed of 47. Unfortunately, while the increase in clock speed was impressive, it's not real.

You see, we found that despite the impressive jump in reported clock speed, there was barely any performance gain - a negligible improvement of between 1 and 2%. What exactly was the problem here?

Well, it turns out that the APUs on the market now come with locked multipliers. The only way to overclock them is to push the base clock instead, which of course also affects other derived clocks, such as the memory frequency. In short, it's not as easy as pushing the multiplier up.
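To illustrate why base-clock overclocking is messier, here's a sketch of how the derived clocks move together. The ratios are taken from the A8-3850's stock settings (29 x 100MHz, with DDR3-1333); the 10% bump is a hypothetical example, not our actual overclocking settings.

```python
# On a locked APU, every derived clock is a multiple of the base clock,
# so raising the base clock lifts the CPU and memory clocks together.
STOCK_BASE_MHZ = 100       # FM1 reference clock
CPU_MULTIPLIER = 29        # A8-3850 stock: 29 x 100MHz = 2900MHz
MEM_RATIO = 1333 / 100     # DDR3-1333 at the stock base clock

def derived_clocks(base_mhz: float) -> tuple[float, float]:
    """Return (cpu_mhz, memory_mt_per_s) for a given base clock."""
    return base_mhz * CPU_MULTIPLIER, base_mhz * MEM_RATIO

# Hypothetical 10% base-clock bump for illustration:
cpu, mem = derived_clocks(STOCK_BASE_MHZ * 1.10)
print(f"CPU: {cpu:.0f}MHz, memory: DDR3-{mem:.0f}")
```

This coupling is why a base-clock overclock may also require rebalancing the memory divider to keep the DDR3 within spec, unlike a simple multiplier push.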

Also, if the APU is locked, why were we able to change the multiplier at all? Beyond the usual early-BIOS teething issues, we can speculate that AMD is planning 'Black Edition' APUs with unlocked multipliers for the future, and hence our review motherboards exposed multiplier adjustment. We can only assume that retail boards will ship with a more polished BIOS that disables multiplier changes for locked APUs. As it stands now, it can be quite confusing for users.

As we found out about this overclocking quirk rather late, we didn't get as much time pushing the APU as we would have liked. Our final overclock was 3.2GHz, achieved with the default multiplier of 29 and a base clock of 115MHz. While modest, it had slightly more impact on our benchmarks than the '4.7GHz' of our initial attempt, though the Far Cry 2 scores remained static despite the overclocks.

 

 

Conclusion

Ever since Intel started integrating graphics into its processors, we have envisioned a time when a single processor could do everything. Unfortunately, Intel's promise of HD graphics was only good for viewing HD content - most mainstream 3D games remained unplayable. The second-generation Intel Core processors brought improvements to graphics performance, but again fell short of our gaming standards.

It has taken AMD roughly three years since Fusion was announced to make good on its vision of merging the CPU and the GPU. We saw a hint of Fusion's potential earlier this year with Brazos, but the real deal is finally here. The mobile Llano proved a strong competitor to Intel's mainstream Sandy Bridge mobile processors, and the new desktop APUs launched today continue this trend.

The almost 50-50 split within Llano between the CPU and the GPU elements sums up AMD's Fusion efforts. Any less and it's likely that Llano would not have such promising performance in 3D games. The DirectX 11 support is an added bonus along with the expected hardware accelerated HD video playback (including stereoscopic 3D Blu-ray support).

It's not all roses for AMD, though. CPU performance on the A-Series APU is arguably last-generation, and even the tweaks and process shrink have been unable to paper over the limitations of AMD's Phenom II architecture. Intel's Sandy Bridge processors are clearly superior in raw CPU power, and if your applications are heavily slanted towards the CPU, there's little reason to consider Llano - even Intel's dual-core processors will be more than a match.

Instead, it's mainstream and budget users who stand to benefit from AMD's 'balanced' approach. When the budget is tight enough that a consumer can only afford an entry-level discrete graphics card, it's time to consider an A-Series APU. At its best, one gets graphics performance to rival the low-end discrete solutions from NVIDIA and AMD themselves. Then there are the other benefits of going integrated - the possibility of smaller form factors and lower power consumption.

At US$135 for the top A8-3850 and a target market of US$400 to US$700 PCs, AMD knows where these APUs belong. They won't be challenging Intel's best - that's a job for AMD's upcoming Bulldozer - but there are plenty of fish in that segment.

The AMD A-Series APU brings integrated graphics performance to its highest level yet.

