XFX Brings Back Blower Style R9 390X

When AMD first launched its R9 290 and 290X GPUs back in 2013, many had mixed feelings about the blower-style cooler. While it was one of AMD's best efforts yet, it was not enough for the hot Hawaii chips, leading to high temperatures, throttling and noisy operation. In the end, many opted for custom open-air coolers, which did a better job of keeping the cards cool. Two years later, it looks like XFX is planning on releasing 390/390X series cards equipped with what appears to be the original 290X cooler.

Using the Grenada core, the R9 390X is fundamentally the same as the 290X, with perhaps better binning and process improvements to differentiate them. XFX is also using the older cooler rather than the revamped one AMD introduced alongside the R9 390X; that newer 390X blower takes its design cues from the Fury X and Nano. Given XFX's choice of the 2013 cooler over the 2015 model, either XFX has a lot of stock left or there is little difference between the two. You can check out the 2015 model below.

There is undoubtedly a market for blower-style GPUs, as they exhaust more of the GPU's heat out of the case. This is especially important for small form factor systems and builds with poor case airflow. If the cooler is still lacking, though, there won't be many users willing to pick it up. The biggest advantage is that with a reference board, water-cooling blocks will be easier to source. It will be interesting to see how well the blower card does, both in performance and in sales.

AMD Catalyst 15.7 WHQL Driver Adds Cross Generation Crossfire Support

Something AMD has been falling behind on lately is WHQL drivers, and drivers in general. Beta drivers are released every few months, but a certified WHQL driver has taken over 200 days to reach us. Let's not dwell on the past: we have one here, we've tested it and it works perfectly fine. Better still, AMD has returned to form and opened up cross-generation Crossfire again. Over at VideoCardz.com, Crossfire has been tested between the new R9 390X and an R9 290X.

The cards used weren't a matched pair; the R9 390X was only available as an 8GB model, so it was paired with an R9 290X 4GB. This limits the R9 390X to just 4GB, as Crossfire uses the lowest VRAM quantity in the array. Scores are around where we previously measured two R9 290X 8GB cards, so there is little performance penalty for mixing in the previous generation.

We will be confirming this new feature for ourselves by testing the R9 390 with an R9 290 and an R9 380 with an R9 285. If it works across most of the new generation, it could prove a nice upgrade for those who already own the 200 series equivalent.

With the Crossfire options opened up, would you be willing to purchase one of the newer cards to Crossfire with, or even buy an older card to bridge the gap until a 300 series card becomes cheaper? Let us know in the comments.

AMD Officially Announce Details of High Bandwidth Memory

We've been waiting for details on AMD's new memory architecture for a while now, ever since we heard the possible specifications and performance of the new R9 390X, all thanks to the High Bandwidth Memory (HBM) that will be utilised on that graphics card.

Last week, we had a chat with Joe Macri, Corporate Vice President at AMD. He is a firm believer in HBM and has been behind it since the original product proposal. As a little background, HBM has been in development for around 7 years and was the idea of an AMD engineer who was new to the company at the time. They knew, even 7 years ago, that GDDR5 was not going to be an everlasting architecture and something else needed to be devised.

The basis of HBM is to use stacked memory modules to save footprint and to integrate them into the same package as the CPU/GPU itself. This way, the communication distance within a stack of modules is vastly reduced, and the distance between the stack and the CPU/GPU core is reduced again. With the reduced distances, bandwidth is increased and the required power drops.

When you look at graphics cards such as the R9 290X with 8GB of RAM, the GPU core and surrounding memory modules can take up a footprint roughly the size of a typical SSD, and then you also need all of the other components such as voltage regulators. This requires a huge card length to accommodate everything, and the communication distances are large.

The design goal, in theory, is very simple: decrease the size of the RAM footprint and get it as close to the CPU/GPU as possible. Take a single stack of HBM: each stack is currently only 1GB in capacity and only four DRAM dies high. What makes this better than a conventional DRAM layout is the distance between the stack and the CPU/GPU die.

With the reduced distance, the bandwidth is greatly increased and also power is reduced as there is less distance to send information and fewer circuits to keep powered.

So what about performance figures? The raw clock speed isn't amazing, just 1Gbps per pin compared to GDDR5, but the extremely wide interface shows just how powerful and refined the stacks are in comparison. Over three times the bandwidth at a lower voltage; it's ticking all the right boxes.
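As a rough back-of-the-envelope illustration of where that headline figure comes from (a sketch using the commonly quoted HBM1 and GDDR5 numbers, not anything confirmed in our briefing), the gain comes from interface width rather than clock speed:

```python
# Rough bandwidth comparison: one HBM1 stack vs one typical GDDR5 chip.
# Figures are the commonly quoted ones, not specifics confirmed in AMD's briefing.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = bus width (bits) * per-pin data rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

hbm_stack = bandwidth_gb_s(1024, 1.0)  # 1024-bit interface at ~1Gbps per pin
gddr5_chip = bandwidth_gb_s(32, 7.0)   # 32-bit interface at up to 7Gbps per pin

print(f"HBM1 stack: {hbm_stack:.0f} GB/s")    # ~128 GB/s
print(f"GDDR5 chip: {gddr5_chip:.0f} GB/s")   # ~28 GB/s
print(f"Ratio: {hbm_stack / gddr5_chip:.1f}x per device")
```

Four such stacks is where the roughly 512 GB/s figure often quoted for a first-generation HBM card comes from, despite the much lower per-pin clock.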

There was an opportunity to ask a few questions towards the end, though sadly only regarding HBM itself, so no GPU specifications were confirmed.

Will HBM be limited to only 4GB due to having just 4 stacks (1GB per stack)?

  • HBM v1 will be limited to just 4GB, but more stacks can be added.

Will HBM be added into APUs and CPUs?

  • There are thoughts on integrating HBM into AMD APUs and CPUs, but the current focus is on graphics cards.

With the current limitation being 4GB, will we see a performance hit in highly demanding games such as GTA V at 4K that require more than 4GB?

  • Current GDDR5 memory usage is wasteful, so despite the lower capacity, it should perform like higher-capacity DRAM.

Could we see a mix of HBM and GDDR5, sort of like how an SSD and HDD work together?

  • Mixed memory subsystems may become a reality, but there is nothing yet; the main goal for now is graphics cards.

I'm liking the sound of this memory type; if it really delivers the performance stated, we could see some extremely powerful GPUs enter the market very soon. What are your thoughts on HBM? Do you think that this will be the new memory format, or will GDDR5 reign supreme? Let us know in the comments.

Sapphire R9 290X Tri-X 8GB CrossFireX Review

Introduction


Here at eTeknix, we strive to give the consumer the best possible advice in every aspect of technology. Today is no different, as we have a pair of Sapphire's amazing R9 290X 8GB Tri-X edition graphics cards to combine for some CrossFireX action. The dedicated review of this graphics card can be found here. When striving for the best results, it is favourable to test two of the same model to eliminate any variation in clock speeds or integrated components, so today we should see some excellent results.

In the dedicated review, this graphics card showed more than enough power to play most games at 4K resolution at 60FPS, faltering slightly in the more demanding Metro: Last Light.

We installed both graphics cards into our Core i7 5820K and X99-based test system, ensuring adequate spacing for optimum cooling and that both have access to sufficient PCI-e bandwidth for CrossFire operation.

The typical ‘hot spot’ in a CrossFire or SLI configuration is the graphics card closest to the processor; as both of these cards are equipped with the Tri-X cooler, positioning isn't an issue.

As these graphics cards have been subject to Sapphire's treatment, they have slightly higher clock speeds than a reference model, but as both are the same card, there should be little to no variation in clock speeds between them; this should result in maximum gains during testing.

Sapphire Radeon R9 290X Vapor-X OC 8GB Graphics Card Review

Introduction and A Closer Look


The R9 290X has led AMD's single GPU offerings for what seems like quite a long time now. Released in October 2013, the R9 290X is only a year old, but its ageing process has been accelerated by successive faster Nvidia graphics cards: the GTX 780 Ti and, recently, the GTX 980. To ensure competitiveness in the marketplace AMD has maintained the R9 290X at an attractive price point, although Nvidia's GTX 980, being a generation ahead in architectural terms, has thrown a spanner in the works. Many AMD partners have taken it upon themselves to issue price cuts on the R9 290X, independent of AMD's official pricing guidance. On the wave of the R9 290X price cuts, today we are assessing Sapphire's newest launch: the R9 290X Vapor-X 8GB graphics card. It uses an identical cooling solution to the 4GB Sapphire R9 290X Vapor-X and it's also visually similar to the Tri-X cooling solution equipped on slightly cheaper Sapphire cards, except with a different colour scheme. The obvious flagship feature of this new card is the doubling of VRAM, aimed at gamers tackling the newest video-memory-intensive titles like Middle-earth: Shadow of Mordor.

Packaging and Accessories

The packaging and accessory bundle is much better equipped than most R9 290Xs on the market, as it includes a free mouse mat, an HDMI cable and dual power adapters.

A Closer Look

The card itself is a sheer monster: triple fan, triple slot and backplate equipped.

At the bottom we find a rather dense vapour-chamber style heatsink.

At the end of the card we get a glimpse of the five heat-pipes being used. There's a trio of 8mm heat-pipes at the centre of the contact plate and a couple of 6mm pipes at the edges of the GPU.

The card takes up nearly three slots in thickness, which isn't surprising given how hot the R9 290X can run: you need a lot of heatsink to tame Hawaii. Along the top we find a pair of 8-pin connectors for power delivery and a BIOS switch for toggling between UEFI and legacy BIOS operation modes.

On the rear of the card we find a nice looking backplate and individual heatsinks for the VRM phases: pretty cool!

The I/O offers a pair of DVI ports, an HDMI port and a DisplayPort. That's enough connectivity to power six displays with the help of a DisplayPort MST hub.

Examining AMD’s Driver Progress Since Launch Drivers: R9 290X & HD 7970

Introduction


AMD and Nvidia both talk fairly big when it comes to driver updates. With every driver iteration that is released we hear the usual technical (or should that be marketing?) talk about improved performance in this, that and the other. After a lot of thinking I decided I wanted to investigate further. Wouldn't it be interesting to see how much progress AMD and Nvidia actually make with their drivers over the duration of a product's life cycle? We'll be starting this two-piece series with AMD, and in particular I want to look at the flagship single GPU of each of the last two generations. I'll be putting the XFX AMD HD 7970 Double Dissipation 3GB graphics card on the test bench along with the XFX AMD R9 290X Double Dissipation 4GB graphics card: the flagship single GPUs of the HD 7000 and R9 2xx series. I will be benchmarking both graphics cards on an identical test system at stock clocks under two different scenarios. Scenario 1 uses the AMD driver package each card launched with and scenario 2 uses the most recent AMD driver package available. In this way we are able to see the driver progress that AMD's HD 7970 and R9 290X have made since they were launched.

AMD’s HD 7970

 

AMD's HD 7970 was launched on December 22nd 2011 and used AMD Catalyst driver package version 11.12 RC11; this was a special beta driver release for the HD 7970, as official support wasn't added until Catalyst 12.2 WHQL was released. AMD's R9 290X launched on October 24th 2013 and used AMD Catalyst driver package version 13.11 Beta 6. The most recent driver package release from AMD (at the time of writing this article) is Catalyst 14.7 RC1. Of course AMD's HD 7970 has had significantly more time on the market, nearly 3 years, whereas the R9 290X has had less than 1 year. It is also worth noting that both the R9 290X and HD 7970 are built on the virtually identical 28nm GCN architecture, so many of the largest optimisations had already been made for GCN before the R9 290X was even released. That's a long-winded way of saying we should expect dramatically more progress with the HD 7970 than the R9 290X. Either way it will be really interesting to see what the results show, so let's get on with some testing!

AMD’s R9 290X

AMD To Update GCN Series With Iceland, Tonga and Hawaii XTX

The next-generation GPU wars, based on new graphics architectures and 20/16nm process nodes, are not set to begin until early 2015. In the meantime AMD and Nvidia are both having to make do with product refreshes and tweaks based on their existing 28nm GCN and Kepler designs respectively. The latest refreshes come from the AMD camp, which is looking to roll out three new GPUs based on its 28nm GCN design: Iceland, Tonga and Hawaii XTX.

Iceland is set to be a mid-range GPU for the mobile market; it will also launch as a desktop card at a later date and will replace AMD's Cape Verde GPUs, which form the HD 7750, HD 7770, R7 250X and some R7 250 models. As a result Iceland will be competing with Nvidia's GM107 Maxwell parts: the GTX 750 Ti and GTX 750.

Next up is Tonga, which we have already heard a lot about. According to VideoCardz the Tonga GPU will be based on the Tahiti design, which powers the R9 280X, R9 280, HD 7970 (GHz) and HD 7950 (Boost). However, it will be a slightly slimmed down version with a 256-bit memory bus versus the current 384-bit. Given the reduced memory bandwidth it seems likely that it will slot in between the R9 270X and R9 280. The Tonga GPU will exist alongside current Tahiti products until the 300 series is launched by AMD next year.

Finally we have Hawaii XTX. We have heard about this GPU before, and the basic rumour was that the Hawaii GPU was not fully utilised, so a Hawaii XTX model would come out with extra stream processors and offer a faster single GPU than the R9 290X. It turns out that the speculated R9 295X did not exist, at least not with 3072 GCN cores. Now it appears Hawaii XTX will merely be a revision of the Hawaii XT GPU; think of it as the R9 290X GHz Edition. Of course the R9 290X already runs at 1GHz, but the point I'm trying to get across is that it will be similar to what AMD did when the HD 7970 became the HD 7970 GHz Edition. They will be increasing the core clock speeds and using a more refined second revision of the GPU that should clock higher and be more stable when overclocking.

Source: VideoCardz

Image courtesy of VideoCardz

6400×1080: Testing Mixed-Resolution AMD Eyefinity

Introduction


Around a month ago we announced that AMD was releasing version 3.0 of its Eyefinity software package. One of the key new additions in Eyefinity 3.0 is support for mixed-resolution monitors in an Eyefinity configuration through a variety of methods. This exciting addition means you can mix and match a variety of displays, so there's no reason to scrap any mismatched monitors if you want to do multi-screen gaming.

Mixed resolution support will be implemented in three different ways: fill mode, expand mode and fit mode. The first of these, fill mode, utilises the full resolution of all displays, but as a result the combined desktop is not rectangular. The next mode, expand mode, creates a rectangular resolution based on the vertical pixel count of the highest resolution display and the combined horizontal width of all displays. The area that the smaller displays cannot show is designated as “unavailable area” because those displays lack the resolution to project it.

The final method is fit mode, in which the vertical pixel count is kept constant across all three displays, using the maximum common vertical height. In the example below you can see that you'd end up with three 1080-pixel-high screens, but the middle one becomes 2560 x 1080. This differs from previous Eyefinity versions, which would simply set the middle display to 1920 x 1080; now you can use the extra width by sacrificing the height, which is preferable to sacrificing both.
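To make the fit mode arithmetic concrete, here's a small sketch (our own illustration, not AMD's Eyefinity code) that works out the combined desktop from a list of panel resolutions:

```python
# Illustrative sketch of how fit mode arrives at a combined resolution.
# Our own arithmetic for clarity, not AMD's Eyefinity implementation.

def fit_mode_resolution(panels):
    """panels: list of (width, height) tuples, one per display, left to right."""
    common_height = min(h for _, h in panels)  # maximum common vertical height
    total_width = sum(w for w, _ in panels)    # horizontal widths simply add up
    return total_width, common_height

# The configuration tested in this article: two 1920 x 1080 NEC panels
# flanking the 2560 x 1080 AOC panel.
panels = [(1920, 1080), (2560, 1080), (1920, 1080)]
print(fit_mode_resolution(panels))  # (6400, 1080)
```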

To test out this new Eyefinity technology, AMD sent us the monitors required to run a 6400 x 1080 configuration. The first is the central 2560 x 1080 display, for which AMD sent us the AOC Q2963PM 29-inch panel. Despite being a 29-inch panel, its vertical height is the same as a 23-inch 1920 x 1080 panel; remember that panel size is measured diagonally, so the extra width doesn't make this display any taller. To go either side of the AOC Q2963PM, AMD also sent us two NEC MultiSync E231W displays. Together these three panels form a 6400 x 1080 resolution with a uniform vertical height that makes it look like a natural combination. For testing this high-resolution setup we are using XFX's R9 290X Double Dissipation 4GB graphics card on our usual high-end graphics card test system. We were tempted to use CrossFire 290Xs but decided to opt for one card to keep the results as realistic and accessible as possible.

AMD R9 295X Could Be Incoming, R9 290X Hawaii XT Is Not A Fully Utilised Chip

Some interesting news emerging from the Overclockers UK forums suggests that we could have an AMD R9 290X successor coming in the near future, possibly called the AMD R9 295X. The information was revealed by professional overclocker 8 Pack, who stated that the R9 290X is “not full fat”, meaning that the Hawaii chip the R9 290X is based on is not being fully utilised. There is therefore scope for a higher-end part which fully utilises the Hawaii GPU, offering 48 compute units and 3072 stream processors, which compares favourably to the current R9 290X's 44 compute units and 2816 stream processors.

AMD's R9 295X will probably arrive in the very near future; the suggestion that there is already an NDA in place means preparations for a launch must be fairly advanced. The new R9 295X will be geared to take on Nvidia's GTX 780 Ti, which the R9 290X already does a great job of competing with when you look at custom cooled versions. As 8 Pack duly notes, if the R9 295X is a well-executed release it could force Nvidia to lower its prices.

Source: Overclockers UK, Via: WCCFTech

Image #1 courtesy of AMD, Images #2-4 courtesy of Overclockers UK, Image #5 courtesy of WCCFTech

XFX Radeon R9 290X Double Dissipation 4GB GDDR5 Graphics Card Review

Introduction


AMD's R9 290X is back in business when it comes to competing with Nvidia's equivalents. The custom R9 290Xs easily beat out equivalently priced, or sometimes even more expensive, custom GTX 780s, making them a solid proposition for any gamer on a budget who still wants maximum performance. Today we are looking at a highly competitive R9 290X from XFX, which has some of the most aggressive pricing on the market and is currently the cheapest R9 290X available in most markets by quite a significant margin, particularly in the USA where it can be had for just $450 at the time of writing, $50 cheaper than the next cheapest R9 290X option. This means custom R9 290Xs are now $100 cheaper than the MSRP of $550, and XFX's option looks almost too good to resist. While XFX have lowered the price of their Double Dissipation SKU, they have also recently added a “cherry picked” DD Black Edition SKU with an additional 50MHz overclock and more overclocking potential; this will probably fetch slightly more than the model we are testing today, but broadly speaking the cards are identical. We've already tested three R9 290Xs, so I've got a pretty good idea about what makes a good one and what makes a bad one, with the reference design being the textbook definition of a bad one. Let's have a look at XFX's offering and see whether it sacrifices anything to be the cheapest R9 290X currently on the market!

 

Packaging and Bundle

The packaging is fairly simple and offers up a coupon code for Battlefield 4 as well as pointing out UEFI BIOS support and unlocked voltage.

The back of the box details a few more features, but there isn't really much to see.

Included is a variety of documentation and a driver CD.

Accessories include two power adapters: one dual 6-pin to 8-pin and the other dual Molex to 6-pin. You probably won't need these if you're buying this type of card.

Powercolor Planning 8GB “Dual Core” R9 290X Devil 13 Graphics Card

Powercolor have recently made it known they are working on their own dual-GPU R9 290X graphics card. From what we can see, Powercolor are opting for the “Dual Core R9 290X” moniker instead of “R9 295X2”, probably because AMD have quite specific requirements for what an R9 295X2 card must be like – maybe it requires water cooling as standard? Powercolor's variant gets the Devil 13 treatment, which means it will use Powercolor's own air cooling solution featuring their patented double-bladed fans and their Turbo Timer device, which keeps the fans spinning for a short period after the card shuts down for better cooling.

The two fully fledged R9 290X GPUs will be powered by a 15 phase power delivery system based on PowerIRstages, Super Caps and Ferrite Core Chokes. The cooling solution has three double bladed fans blowing down onto a dense aluminium heatsink with 10 heat pipes and a huge triple slot width. Powercolor have added red LED backlighting for the Devil 13 logo as well as a dual BIOS system and four PCIe 8 pin connectors instead of the two PCIe 8 pin connectors used on the AMD R9 295X2 reference design card. Powercolor are also sprucing up the bundle package offering a Razer Ouroboros gaming mouse with every graphics card sold.

From what we can see this Powercolor Dual Core R9 290X Devil 13 graphics card is an air-cooled R9 295X2 with slightly lower clock speeds of 1000MHz instead of 1018MHz – the memory remains untouched. It also has the advantage of having four PCIe 8 pins. Expect pricing to be similar to the R9 295X2 in that $1500 region, especially considering Powercolor bundle a $150 gaming mouse with it.

More details will be released by Powercolor during Computex 2014.

Source: Powercolor

Images courtesy of Powercolor

8Pack Smashes 3DMark World Record With 4 MSI R9 290X Lightnings: 1300MHz Each!

When it comes to overclocking there is no bigger name than the MSI Lightning series. Time and time again it has proven itself the GPU series of choice for extreme overclockers smashing through world records, and, you guessed it, today it's happened again. The professional overclocker in this case is the very talented 8Pack of Overclockers UK, while the graphics card in question is MSI's mighty fine R9 290X Lightning – and not just one of them, a whopping four hit 8Pack's test bench. The world record is an impressive 35,018 marks in 3DMark Fire Strike for four-way R9 290X CrossFireX. That score was achieved with a healthy 1300MHz on the core and 1625MHz on the memory with some custom EK water cooling and a load of extra fans to keep the monstrous VRM solution running cool and stable. I'm honestly quite impressed the cards only needed water cooling to reach such an impressive speed; maybe there is even more headroom to be tapped with the help of some LN2 in future!

The rest of the system includes 16GB of Corsair Dominator Platinum running at 1338 MHz (2676MHz effective), an Intel Core i7 4930K running at a whopping 6.15GHz with LN2 cooling, an ASUS Rampage IV Extreme Black Edition and an Antec High Current Pro 1200W power supply.

If you feel like having a crack at some world records yourself then you can buy the MSI R9 290X Lightning graphics card today for £529.99 at Overclockers UK, €589.00 at Caseking or $699.99 at Newegg.

Images courtesy of 8Pack (OCUK)

AMD Radeon R9 295X2 8GB Graphics Card Review

Introduction


The AMD Radeon R9 295X2 graphics card is finally upon us. I know a lot of people have been eagerly awaiting this graphics card for the last few months – I myself have been tracking its existence since AMD phased out the HD 7990, as it was inevitable there would be a replacement. Yet when we first got our hands on the R9 290X I wasn't so sure how feasible a dual Hawaii GPU graphics card was going to be – the R9 290X was already an immensely hot graphics card with significant heat and noise problems – so how could AMD make a graphics card with two of these GPUs work? The inspiration for the AMD Radeon R9 295X2 appears to have come from ASUS' Ares II graphics card, which made use of a hybrid cooling solution on a similar dual GPU design. AMD clearly knew the weaknesses of the Hawaii core and have shaped the AMD Radeon R9 295X2 to correct those weaknesses. Thus AMD's first ever water-cooled reference graphics card has been born, and what a performance monster it looks set to be. With two fully enabled Hawaii GPUs the Radeon R9 295X2 boasts an impressive 12.4 billion transistors, 5632 stream processors (2 x 2816) and 11.5 TFLOPS of compute power.

The Radeon R9 295X2 gets a hefty 8GB of GDDR5 memory over dual 512-bit memory buses, but of course only 4GB is usable as the GPUs have to mirror each other. 4GB is still a heck of a lot of frame buffer, so this beast is really targeted at multi-panel gaming (5760 x 1080, 7680 x 1600 and so on) or 4K (3840 x 2160). Unlike the Radeon R9 290X and R9 290, the R9 295X2 easily has enough power to rip through 4K gaming at 60 FPS and upwards.

One of the keys to successfully crafting the dual Hawaii GPU card is the use of Asetek's AIO cooling system. The Radeon R9 295X2 makes use of two pump heads, one for each GPU, feeding into a 1.5x-thickness 120mm radiator. For such a monstrous 500W unit the cooling solution is relatively tame. The 120mm radiator is important because it allows compatibility with the maximum range of cases – you could easily mount it at the rear or on the side panel of most ATX cases.

AMD’s second main problem, power consumption, has been circumvented by pushing beyond the limits of the ATX power delivery specification. Typically we’d see graphics cards always follow the golden rule of power delivery – 75W through the PCIe bus, 75W through the 6 pin and 150W through the 8 pin. In effect this means AMD’s R9 295X2 should be a 375W card because it has dual 8 pins. However, this is a 500W card and AMD is relying on people having power supplies capable of supplying 28 Amps to each of the 8 pin connectors. I’m not sure why AMD didn’t just opt for three 8 pins as I’ve seen a lot of other high-end GPU solutions use this. The end result is that you will need at least a 750W power supply of a really high quality if you want to be able to use this graphics card. However, given the $1500 price tag I don’t think the expectation that the buyer should have a top-notch power supply is an unrealistic one.
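As a rough sanity check on those numbers (our own arithmetic, not anything from AMD's specification sheet), the connector maths at 12V looks like this:

```python
# Back-of-the-envelope power budget for the R9 295X2 (our own arithmetic).
PCIE_SLOT_W = 75        # allowance through the PCIe slot
EIGHT_PIN_SPEC_W = 150  # official ATX/PCIe allowance per 8-pin connector
BOARD_POWER_W = 500     # the card's quoted board power

spec_ceiling = PCIE_SLOT_W + 2 * EIGHT_PIN_SPEC_W   # the 375W "golden rule" limit
per_connector = (BOARD_POWER_W - PCIE_SLOT_W) / 2   # what each 8-pin actually has to carry
amps_at_12v = per_connector / 12

print(f"Spec ceiling: {spec_ceiling} W")                                       # 375 W
print(f"Load per 8-pin: {per_connector:.0f} W (~{amps_at_12v:.0f} A at 12V)")  # ~213 W, ~18 A
# Hence AMD asking for PSUs able to deliver 28A per 8-pin rail: well beyond the
# 150W / 12.5A the spec normally assumes, with headroom on top of the ~18A draw.
```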

Specifications Analysis

We've already established that the Radeon R9 295X2 is a seriously expensive product targeting the enthusiast, but how does it compare to other high-end offerings on the market? Clearly there are a few competing solutions: GTX 780 Ti SLI, GTX Titan Black SLI and R9 290X CFX. The last-gen HD 7990 and GTX 690 are both now EOL and out of stock at most retailers, while the yet-to-be-released GTX Titan Z will probably be a similarly fast product, though at $3000 you'd be almost nuts to consider buying one. That's not to say I think $1500 is good value for money either, but when sat next to the GTX Titan Z it certainly looks “reasonable”, shall we say?

Packaging and Bundle

Our R9 295X2 sample turned up in this rather secretive and snazzy looking briefcase.

Inside we simply found the card with its attached cooling solution; there were no accessories or bundled items.

4K Gaming Showdown – AMD R9 290X CrossFire Vs Nvidia GTX 780 Ti SLI

Introduction


When I wrote our first “4K Gaming Showdown” article, nearly 6 months ago, it proved very popular among our readers. However, one of the most common pieces of feedback I received was something to the tune of: “As 4K monitors are so expensive, most people buying one will probably also be going with SLI or CrossFire of high-end graphics cards, can you test that too?”. To a large extent I agree with that idea; of course there are cheaper 4K monitors out there, but most of the good quality 60Hz ones (which you really need for a fluid gaming experience) still cost thousands of dollars, and if you're willing to spend thousands of dollars on a monitor then you're likely to spend a similar amount on GPUs. I've therefore taken this as an opportunity to see what SLI and CrossFire bring to the table for (60Hz) 4K gaming – as requested by our readers. In this article we will be pitting Nvidia GTX 780 Ti SLI against AMD R9 290X CrossFire in a variety of games. Based on specified MSRPs the GTX 780 Ti SLI ($699 x 2) is a much more expensive option than the AMD R9 290Xs ($549 x 2), but with recent (mining-induced) inflationary pricing on AMD R9 290X graphics cards the pricing is actually a lot closer than you might think, which begs the question – which combination should you choose for the best 4K gaming experience?

As you can see above we have a rather complicated mix of cards – sadly we do not have two identical cards for each GPU. That said, we will be running all cards at reference clocks for their respective GPUs, and the solitary reference graphics card, the Nvidia GTX 780 Ti, will be set to maximum fan speed to prevent thermal throttling, as it is the only card of the four where cooling is a limiting factor. With those modifications made we have two reference-speed R9 290Xs and two reference-speed GTX 780 Tis, all able to perform at their maximum potential without any thermal throttling. Without any further ado let's get on with looking at the monitor, the performance and my thoughts on 4K with dual GPUs.

Anyone interested in the reviews of the above graphics cards can find those listed below in order of their appearance in the above picture (top to bottom):

  • Gigabyte GTX 780 Ti GHz Edition 3GB Graphics Card – read our review here.
  • Nvidia GTX 780 Ti 3GB Graphics Card – read our review here.
  • Powercolor R9 290X PCS+ 4GB Graphics Card – read our review here.
  • Gigabyte R9 290X WindForce OC 4GB Graphics Card – read our review here.

Powercolor R9 290X PCS+ 4GB Graphics Card Review

Introduction


The rationale for owning a graphics card like the R9 290X is that it is an absolute price-to-performance champion for driving high resolution and multi-monitor gaming: it offers smooth gameplay on Eyefinity setups, 4K panels, or 1440/1600p panels. Yet there are a few caveats to the R9 290X. If we first set aside the mining-inflated pricing issue, which is something AMD and its board partners can do little about, then the main issues with the R9 290X are the immense noise and heat. However, noise and heat should be a thing of the past on the R9 290X since board partners started releasing custom designs – and today we have with us one of those really high-end custom designs. We are taking a look at Powercolor's R9 290X PCS+ edition graphics card. It features a factory overclock on both the memory and the GPU core, a backplate, a custom metal shroud and a triple-fan cooling solution. This card really is everything you could hope for in an R9 290X, at least on paper. I've said this in the past about Powercolor graphics cards but I'll say it again – they deserve kudos for being one of the only vendors who bother to overclock the memory on their high-end graphics cards. It may not seem like much but it actually makes a surprising amount of difference in many games. The specifications of the graphics card can be seen below:

Our GPU-Z validation indeed reveals everything we would expect to see.

We get Powercolor's usual red-themed product packaging with a raised PCS+ logo.

On the back we have the usual specifications, features and so on. More of those can be found on the product page if that interests you.

Included with our sample was a driver and utility DVD, quick installation guide and 6 to 8 pin PCIe adapter.

AMD Radeon R9 Graphics Cards Facing Supply, not Demand, Issues

AMD's graphics cards have been in short supply for the past 3 months or so. The market has hoovered up the old stock from the HD 7000 series at knock-down prices and most of the stock from the high-end R9 series, including pretty much every R9 290X, R9 290 and R9 280X. While a lot of these graphics cards have been snapped up by crypto-currency miners looking to mine Scrypt algorithm coins, the main reason for the shortage, according to SweClockers, is not high demand but short supply.

Apparently a shortage of components is the primary reason for the availability issues, with most GPU manufacturers not having enough ASICs (GPUs), GDDR5 memory and other components. However, others are blaming AMD for not allocating enough production to TSMC; AMD was apparently cautious about oversupplying a declining desktop PC market and hence under-produced. Either way, whether it is a shortage of components, excessive demand or a combination of both, the message is clear – the supply shortage will be with us for a while longer before the issue is fixed.

AMD graphics card vendor VisionTek recently confirmed on Facebook that it is experiencing a shortage of components for its production. Other vendors and AMD have not commented.

Image courtesy of AMD

Gigabyte R9 290X WindForce OC 4GB Graphics Card Review

Introduction


When AMD released their R9 290X it was, for a short period of time, the fastest consumer gaming graphics card (with a single GPU) on the planet. Since its release it has been surpassed by Nvidia's GTX 780 Ti, but the R9 290X is still an immensely attractive option, costing around $150 less than the GTX 780 Ti and offering similar levels of performance. However, one major issue plagued the R9 290X and that was heat (a result of its high power consumption, which also led to excessive noise). Whether in “Quiet” mode or Uber mode, the R9 290X rapidly reaches its thermal limit of 95 degrees and starts to throttle its own clocks to keep temperatures under control, resulting in that 1GHz engine clock dropping as low as 650MHz in some applications. We have been waiting for non-reference designs to come along and fix that, and today we have the first of those from Gigabyte. Gigabyte have strapped their successful WindForce 450 VGA cooler to the R9 290X. Gigabyte are offering a modest 40MHz overclock, but the real win comes with their cooling solution, which should allow the R9 290X to actually run at its rated clocks, or at least a lot closer to them than the reference design allows. Gigabyte's card offers the same dual BIOS we've come to expect on the R9 290X, but Gigabyte are promising much quieter and much cooler operation, all with much higher consistent average clocks on both BIOS settings. All in all it's a small difference on paper that should make a big difference in reality.

The packaging highlights the stand-out feature, which is the WindForce 450 cooling solution capable of taming up to 450W – trust me, with the R9 290X this is going to be necessary!

The back of the box details more of the card's features; we encourage you to check out the product page if you want to learn more about these.

Our sample came direct from Gigabyte so it lacks the retail accessory pack; it was merely a card in a box.

Pics And Insights About GIGABYTE’s Radeon R9 290X And R9 290 WindForce 3X Released

Gigabyte apparently has two custom Radeon R9 290 series cards with the WindForce 3X 450W cooling system in the works, according to VideoCardz.

The new Radeon R9 290 and 290X graphics cards will be clocked at 1040 MHz, a 40 MHz gain for the 290X and a 93 MHz gain for the 290, with the memory on both running at an effective 5 GHz. They are also equipped with 4GB of GDDR5 memory across a 512-bit interface, with the R9 290X featuring a Hawaii XT GPU with 2816 stream processors while the R9 290 features a Hawaii PRO with 2560 shaders. Unfortunately, there is no detailed information available at this point as to how the graphics cards were modified or which features and improvements are included, but we do get a quick preview of how they are going to look, and the boxes they come in, below.

Gigabyte Radeon R9 290X OC – GV-R929XOC-4GD
 
 
 
Gigabyte Radeon R9 290 OC – GV-R929OC-4GD

It is also recommended by Gigabyte to have a 600W PSU for both cards, but this requirement probably won't increase since the graphics cards are still equipped with only 6+8-pin power connectors.

Thank you VideoCardz for providing us with this information
Images courtesy of VideoCardz

MSI Reveals Radeon R9 290X Lightning PCB



When it comes to Lightning, you know it's all about performance, and MSI's Lightning series has just got a new addition to the family. They have revealed the R9 290X Lightning PCB, and rumour has it that we can also expect to see the MSI R9 290X Lightning graphics cards on shelves early next year.



The card features a 12+3+2 phase power design, similar to the GTX 460 Lightning. Given that the GTX 460 Lightning dedicated a 6-pin power connector to the memory alone, and that the R9 290X Lightning requires 2 x 8-pin connectors, we can only assume we are looking at a power supply of 500+ watts to power up this monster.
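For a rough idea of where a 500+ watt figure comes from (our own back-of-the-envelope assumptions, not MSI's numbers), the connector configuration already caps the card's spec-compliant draw, and the rest of the system has to fit on top of that:

```python
# Rough PSU sizing for a 2 x 8-pin graphics card (our own assumptions, not MSI's).
PCIE_SLOT_W = 75   # power available through the PCIe slot
EIGHT_PIN_W = 150  # official allowance per 8-pin connector

card_ceiling = PCIE_SLOT_W + 2 * EIGHT_PIN_W  # 375W maximum spec-compliant board power
rest_of_system_w = 150                        # assumed allowance for CPU, drives and fans

print(f"Card ceiling per spec: {card_ceiling} W")                       # 375 W
print(f"Sensible PSU floor: {card_ceiling + rest_of_system_w} W plus")  # ~525 W and up
```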



Also, the MSI R9 290X Lightning PCB provides V-Check points to allow direct measurement of the GPU, memory and auxiliary line voltages. There are some rumours that the card might come with Military Class IV components as well, but there has been no official announcement so far.

In terms of display connectivity, MSI will equip the R9 290X Lightning with the default set of outputs, meaning we are going to see 2 x DVI outputs, an HDMI output and a DisplayPort output.

Thank you Videocardz for providing us with this information
Images courtesy of Videocardz

Battlefield 4 Graphics Performance Overview With Current Generation GPUs

Introduction


Battlefield 4 has been one of the biggest game releases so far this year for gamers on all gaming platforms. The FPS title from EA and DICE has got off to a relatively shaky start with numerous audio, graphical and gameplay problems across the various platforms it was released on. In fact for many Battlefield 4 owners the game is still in a dysfunctional or buggy state, but you can expect (or hope) that EA and DICE will begin to patch and fix the majority of the problems within the coming weeks as they have said they will. The shaky launch aside, what most PC owners/gamers want to know, if they haven’t already found out, is how do current generation GPUs perform in Battlefield 4 on the PC?

Today we put that question to the test with an extensive, albeit not entirely complete, range of current generation AMD and Nvidia GPUs. On the AMD side we have the R7 260X, R9 270, R9 270X, R9 280X, R9 290 and R9 290X, while on the Nvidia side we have a few more offerings with the GTX 650 Ti Boost, GTX 660, GTX 760, GTX 770, GTX 780, GTX 780 Ti and GTX Titan. All of the aforementioned graphics cards are current offerings, and sharp-minded readers will notice some graphics cards are missing. Mainly the current generation lower-end graphics cards from both AMD and Nvidia are absent, including the Nvidia GTX 650, GT 640 GDDR5, GT 640 DDR3 and the AMD R7 250 and R7 240. The main reason for not testing these graphics cards, other than that we didn't have most of them, is that they simply aren't capable of running such a high-end gaming title well. Of course that's not to say they can't run it at all, but given the resolutions we test (mainly 1080p or above) and the quality settings our readers like to see (very high or ultra), these GPUs simply aren't cut out for the test. Arguably they are more aimed at gamers with 1366 x 768 monitors tackling medium-high details, but I digress. The system requirements for Battlefield 4 reveal a similar picture: if you want a smooth gameplay experience then you need an AMD Radeon HD 7870 or Nvidia GTX 660 or better. However, those system requirements tell you very little about what you can expect at different resolutions. So without any further ado, let us show you our results and exactly how AMD and Nvidia's offerings stack up!

Emperor Chair 1510 – The Biggest And Meanest Workstation Or Gaming Gear Ever Made?

Now this is an interesting contraption indeed: the “Infinity Emperor”, billed as the ultimate in modern computing, offering the future of workstations and the pinnacle of high-end computing, here and now. Featuring three 24″ ASUS 144Hz gaming monitors, support for a full audio system, Radeon R9 290X CrossFire, LED lighting and a full custom watercooled setup, it is designed for the true enthusiast; the “Infinity Emperor” enables you to immerse yourself in the computing experience deeper than ever before. Constructed from premium-grade materials and equipped with the latest technology from Intel and AMD, the “Infinity Emperor” performs like no other.

The Emperor 1510 Chair could be the future for high-end workstations and home computing environments. The construction is styled after a scorpion and is very stable, providing the user with varying tilt options, an integrated audio system and LED lighting. It can also be used solely as a gaming seat. The Emperor 1510 offers various features that allow the user to experience a level of comfort never experienced before.

The brackets of the Emperor 1510 are handcrafted from moulded Canadian steel, a solid 4.7mm thick. All of the steel straps are finished with a high-quality powder coating and are supplied by the manufacturer with a 5-year warranty against structural defects. The seat itself is upholstered in a high-quality microfibre fabric.

The varied mounting options enable the user to equip the Emperor 1510 with either a single monitor up to 30 inches or three monitors up to 24 inches (VESA mount support is required). The upper part of the seat is height adjustable via the attached steel bars.

In addition to the chair on its own, a complete system, the “Infinity Emperor”, is offered at Overclockers UK. The Infinity Emperor system is armed with the Intel Core i7 4770K quad-core, 8-thread CPU overclocked to a minimum of 4.70GHz. Unchained, all four cores operate in unison at speeds never experienced before, bringing the computational experience up to warp speed. Thermal management is handled by a beautiful and intricate custom watercooling loop featuring Mayhems purple Pastel fluid for a wonderfully organic effect. The memory runs at a rapid 2133MHz throughout all 16GB with a tight CAS latency of 10; bandwidth is nigh on unlimited.

Storage is handled by both an SSD and an HDD: a Samsung 840 Evo Series 250GB is selected as the primary drive, enabling rapid OS boot times and split-second application and game load times. The secondary HDD is a Seagate Barracuda 2TB SATA III drive offering ample space for literally everything else. Tertiary drive options are available for those who wish to store libraries' worth of data on their Infinity system.

Decked out with a pair of AMD Radeon R9 290X 4096MB graphics cards configured in CrossFire for ultimate firepower in maximum attack mode, both cards are fully watercooled, which delivers a soothingly silent experience. There is even support for 4K display resolutions to go with the rocket-powered configuration, which comes as standard with three ASUS 24″ 144Hz LED monitors for a massive expanse of desktop space.

Powering up this massive beast is a Be Quiet! 1000W 80+ Bronze rated PSU, relentlessly giving your components the life force required to operate at 110% whilst maintaining world-class efficiency and unrivalled silence. Modular cabling allows for a neat installation, which is beneficial both aesthetically and for optimal airflow. All of this is installed within the Lian Li D600WB brushed aluminium cube case, showcasing the sophistication and empowerment underlined by the Infinity Emperor.

When it comes to prices, for the basic equipment you will need to part with £4999.99 inc VAT for the Emperor Chair 1510 on its own, and £9999.99 inc VAT for the complete “Infinity Emperor” system consisting of the Emperor Chair 1510, the high-end gaming system and three monitors as standard.

Thank you Guru3D for providing us with this information

AMD Releases Catalyst 13.11 Beta8, Fixes Battlefield 4 Crash

AMD has kept its word and delivered a driver which should fix game crashes in Battlefield 4. A number of users have been complaining about crashes in Battlefield 4, particularly on systems with AMD graphics cards and Windows 8. AMD had promised that it would fix the crashing and, as promised, it has delivered.

The new Catalyst 13.11 Beta8 driver should fix the crashes that users have been experiencing. As with most drivers, it contains all the extra goodies of the previous driver release.

The driver supports machines running Windows 7 and above, paired with HD 5000 series and above graphics cards, of course including the new R9 290X. It is available for download here.

Feature Highlights of The AMD Catalyst 13.11 Beta8 Driver for Windows:

  • Includes all Feature Highlights of the AMD Catalyst 13.11 Beta7
  • Resolves intermittent crashes experienced with Battlefield 4 on Windows 8 based systems

Thank you Fudzilla for providing us with this information

Non-Reference R9 290X Graphics Card Coming Late November

At the moment AMD have only released the reference variant of the R9 290X, and AMD partners are only allowed to sell those reference models – nothing customised in any way. If you're looking for a custom cooled R9 290X you need to buy a reference version and modify it with your own custom cooling solution, or get a retailer to do it for you – like the OCUK custom R9 290X (pictured above), which is sold pre-fitted with a Prolimatech MK-26 VGA cooler. SweClockers reports that this restriction is going to last through to late November, when AMD will lift the restrictions on partners, enabling them to launch their custom solutions.

Once the restrictions on custom PCB and cooler designs are lifted in late November, you can expect to see the usual variants of an AMD graphics card, such as the DirectCU II TOP (ASUS), Twin Frozr Gaming (MSI), Double/Triple Dissipation (XFX), WindForce (Gigabyte), Vapor-X (Sapphire) and more. There is no specific date for the restriction being lifted, but anyone after a custom PCB design is definitely going to have to wait until then. Of course the first (reference) R9 290X graphics cards aren't shipping until November 4th, so you'll only have to wait about 2-3 weeks “extra” for those custom designs.

Image courtesy of Overclockers UK

AMD Radeon R9 290X 4GB Graphics Card Review

Introduction


After months of rumours and media hype, AMD’s new flagship graphics card is here. We have a launch-day review for you guys to read and we’re putting this brand new GPU through its paces on the eTeknix graphics card test system. We’re very excited to bring you this review because everyone loves to see how a new flagship GPU performs and what it can deliver, even if you have no intention of ever buying it. The AMD R9 290X is a brand new GPU crafted from the same 28nm process as the HD 7000 series but using a revised architecture design (GCN 2.0 vs GCN) and a new GPU die codenamed “Hawaii”.

The AMD R9 290X brings a number of new features to the table which we will cover briefly, but first let's check a run-down of the specifications. AMD's R9 290X is a big step up from the R9 280X (HD 7970), featuring 768 more stream processors, 1.5 more TFLOPS of compute performance, 32GB/s more memory bandwidth, double the number of ROPs and much more. A default clock speed of up to 1GHz on the core and up to 5GHz (effective) on the memory is deployed; this will vary dynamically as it is adjusted by AMD's revised PowerTune technology, which leads us nicely on to the next part – the new features of the R9 290X.
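Those gaps are easy to verify from the raw numbers; the quick sketch below reproduces them, using the commonly quoted R9 280X figures (2048 stream processors, a 384-bit bus and 6Gbps memory) which aren't restated in this article:

```python
# Reproducing the quoted gaps between the R9 290X and the R9 280X (HD 7970 GHz).
# The 280X figures here are the commonly quoted specs, not taken from this article.

def tflops(stream_processors, core_ghz):
    # 2 FLOPs per stream processor per clock (fused multiply-add)
    return stream_processors * 2 * core_ghz / 1000

def bandwidth_gb_s(bus_bits, data_rate_gbps):
    return bus_bits * data_rate_gbps / 8

r9_290x = {"sp": 2816, "ghz": 1.0, "bus": 512, "mem_gbps": 5.0}
r9_280x = {"sp": 2048, "ghz": 1.0, "bus": 384, "mem_gbps": 6.0}

print(r9_290x["sp"] - r9_280x["sp"])  # 768 more stream processors
print(tflops(r9_290x["sp"], r9_290x["ghz"]) - tflops(r9_280x["sp"], r9_280x["ghz"]))  # ~1.5 TFLOPS
print(bandwidth_gb_s(r9_290x["bus"], r9_290x["mem_gbps"])
      - bandwidth_gb_s(r9_280x["bus"], r9_280x["mem_gbps"]))  # 32.0 GB/s
```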

Firstly, AMD has introduced a new version of its PowerTune technology. In essence the new PowerTune can be seen as AMD's answer to Nvidia's GPU Boost 2.0. AMD's new PowerTune balances clock speeds and fan speeds to maintain a steady temperature threshold. The card adjusts the clock speed dynamically along with the fan speed (which also has a ceiling) to maintain that steady temperature under load. By default there is a fixed 95 degree maximum temperature and a fixed 40% maximum fan speed, so the clock speed will be adjusted to keep the card within both of those parameters. Normally this means the clock speed is lowered when the card is about to exceed either the maximum temperature or fan speed, or raised when there is sufficient temperature or fan profile headroom to support it. What this creates is an extensive overclocking system where users can balance the temperature, fan speed, clock speed and also the power limit to get the most performance or the most silence. The OverDrive section of AMD's Catalyst Control Center will allow you to control all of these parameters, and you will also be able to use third-party overclocking tools to adjust them once those programs are fully updated.
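To illustrate that balancing act in principle, here is a heavily simplified sketch of the control loop described above. It is purely illustrative: the thresholds are the defaults quoted in this review, but the step sizes are invented and this is not AMD's actual PowerTune implementation.

```python
# Heavily simplified illustration of the PowerTune behaviour described above.
# Not AMD's implementation: thresholds match the quoted defaults, step sizes are made up.

MAX_TEMP_C = 95       # default temperature target
MAX_FAN_PCT = 40      # default "Quiet Mode" fan ceiling
MAX_CLOCK_MHZ = 1000  # the "up to 1GHz" core clock
CLOCK_STEP_MHZ = 13   # arbitrary adjustment step for this sketch

def adjust(clock_mhz, fan_pct, temp_c):
    """Return the new (clock, fan) pair for one control iteration."""
    if temp_c >= MAX_TEMP_C:
        if fan_pct < MAX_FAN_PCT:
            fan_pct += 1  # spend fan headroom first
        else:
            clock_mhz = max(300, clock_mhz - CLOCK_STEP_MHZ)  # then shed clock speed
    elif clock_mhz < MAX_CLOCK_MHZ:
        clock_mhz = min(MAX_CLOCK_MHZ, clock_mhz + CLOCK_STEP_MHZ)  # reclaim clocks when cool
    return clock_mhz, fan_pct

print(adjust(1000, 40, 96))  # (987, 40): at both limits, the card throttles
print(adjust(900, 40, 80))   # (913, 40): with thermal headroom, it clocks back up
```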

Linked to the new overclocking system is the dual BIOS system. By default the AMD R9 290X ships with two BIOSes and a switch to select between each one. The first and default switch is “Quiet Mode” and this limits the fan speed to 40% with the default 95 degree threshold. The second switch is for “Uber Mode” and this boosts the maximum fan speed to 55% and leaves the temperature threshold at 95 degrees. Uber mode allows the card to clock higher by having better cooling while Quiet mode allows it to remain quieter. Of course AIB partners will be able to benefit from dramatically better cooling than the reference design and you may not even need the dual BIOS switch to access the maximum performance. You can of course pick any custom combination of GPU temperature and maximum fan speed as shown above.

AMD has also worked hard to support 4K gaming in a “plug and play” fashion on its new flagship with default Eyefinity configurations for tiled 4K monitors such as the Sharp PN-K321 or ASUS PQ321Q monitors (and while on the topic of 4K be sure to check out our 4K gaming featured article here). As part of its push to 4K AMD is supporting the new industry standard for tiled displays in VESA – DisplayID v1.3. While AMD can support the very limited numbers of 4K monitors currently out on the market, in the future when 4K monitors proliferate, AMD (and other graphics providers) will need a unified standard to easily recognise stitched 4K panels.

A final noteworthy mention is AMD’s new method of CrossFire which enables the activation of CrossFire without the need for the physical CrossFire X ribbon. As far as AMD have suggested this will only be supported on the R9 290 series as it is part of the new architectural design of the Hawaii GPU. The Eyefinity and TrueAudio enhancements introduced by AMD with the other RX 2XX cards released a few weeks ago are also extended to the R9 290X, check out the details of those right here.

4K Gaming Showdown – AMD R9 290X & R9 280X Vs Nvidia GTX Titan & GTX 780

Introduction


***Please see our latest 4K Gaming article here which adds GTX 780 Ti, GTX 780 Ti SLI and R9 290X CFX to the mix***

With GPUs getting more and more powerful and 4K monitors becoming available for consumer purchase, we thought we'd use AMD's R9 290X launch as a springboard to look at the 4K gaming performance of AMD and Nvidia's top two single-GPU graphics cards. Of course, since writing this article Nvidia have revealed their intention to release a GTX 780 Ti graphics card, which is worth considering when looking at these benchmarks. AMD are also expected to reveal an R9 290 graphics card at some stage this year too. So this is by no means a comprehensive or complete look at 4K performance on high-end AMD and Nvidia GPUs, but we think it is an interesting place to start.

Firstly let’s recap the graphics cards we’re using, all four are pictured above and they are:

  • AMD R9 290X – read our review here.
  • Nvidia GTX Titan – read our review here.
  • Nvidia GTX 780 – read our review here.
  • Sapphire Vapor-X OC AMD R9 280X –  read our review here.

Next, we managed to get hold of a 4K monitor for this testing, as AMD were kind enough to provide us with the Sharp PN-K321.

The Sharp PN-K321 uses a 32 inch IGZO panel providing a resolution of 3840 x 2160. Being a first generation 4K panel it uses two 1920 x 2160 displays stitched together with an advanced display controller chip. The 4K monitor is able to stream 4K at up to 60Hz which is best done through DisplayPort.

We’ve used the usual selection of games that we’d normally do in our graphics card reviews so we’ve got a selection of 7 games and one synthetic benchmark to show you: Alien vs Predator, Bioshock Infinite, Hitman Absolution, Sleeping Dogs, Unigine Heaven 4, Tomb Raider, Dirt Showdown and Metro Last Light. Without any further ado let’s see exactly how these AMD and Nvidia GPUs got on at some 4K gaming.

Rumour: Nvidia’s GTX 780 Ti Will Cost $649 To Rival AMD’s R9 290X

According to Sweclockers the GTX 780 Ti will be priced at the same level as the GTX 780 currently is in order to compete more effectively with the AMD R9 290X. Apparently the GTX 780 Ti will cost $649 to compete with the AMD R9 290X which is expected to be priced in that same region. The GTX 780 will then be cut down from $649, probably to somewhere in the region of $550-600, to allow it to compete with the AMD R9 290 which will be priced a bit lower than the R9 290X.

Of course no one really knows what AMD's pricing strategy for the R9 290X and R9 290 is yet, and Nvidia cannot respond with price cuts until it does. If AMD opts for $649 for the R9 290X as widely speculated, then the R9 290 will likely be $100 cheaper, mirroring how AMD spaced the HD 7970 and HD 7950 out by $100 at launch in 2011, meaning $649 and $549 respectively. This would mean Nvidia's $649 GTX 780 Ti would compete with the R9 290X and the GTX 780 would compete with the R9 290.

Ultimately, we won’t know for sure until the R9 290X and R9 290 graphics cards are released. When Nvidia announce price cuts we’ll bring you the news as soon as possible so stay tuned for that.

Image courtesy of Nvidia

Radeon R9 290 And R9 290X Launch Dates Revealed

VideoCardz.com claims to have scored the embargo lift dates of the AMD Radeon R9 290 and R9 290X graphics cards. According to a forum post, which they sourced from Cardpu, AMD employee “Chris Li” has stated the NDA lift dates are as pictured above. If these rumoured dates are accurate we should then see the AMD R9 290 series graphics cards launch on those dates, with the R9 290X coming first. VideoCardz.com also claimed that their own source verified the launch date of the R9 290X as correct. Of course we cannot confirm or deny the launch dates as we simply do not know but if VideoCardz.com are to be believed then we will see the Radeon R9 290X and R9 290 very soon. The wait is almost over for AMD GPU fans.

Image courtesy of Cardpu