XFX Radeon Pro Duo Pictured & Priced!

In just under 10 days, users will finally be able to purchase their very own dual Fiji GPU. At launch, the Radeon Pro Duo will come in at a lofty $1,499 USD, but given exchange rates, those in Spain will have to shell out €1,696. In addition to some local pricing information, we’re also getting treated to some very nice pictures and more detailed physical specifications for the top-end Radeon.

First off, the clock speed of the dual Fiji GPUs has been confirmed as 1000MHz. This is slightly lower than the Fury X, which runs at 1050MHz, but the removal of PCIe latency should offset the difference. Memory stays the same at the standard 500MHz, though overclocking it shouldn’t be hard. Exact dimensions are 28.1 x 11.6 x 4.2 cm (length, width, height), with a 120mm radiator as well. No word yet on the length of the tubing.

The biggest surprise is the display output, which AMD told us would be 4 DisplayPorts. We’re now finding out that it’s actually 3x DisplayPort 1.2 and 1x HDMI 1.4a; perhaps AMD meant display outputs rather than DisplayPorts. Either way, it remains to be seen how well the card will sell given the hefty price tag and how close Pascal and Polaris are. Even with a strong showing from the next-gen cards, though, the Radeon Pro Duo may remain the fastest single-card solution.

BIOSTAR RACING H170GT3 (LGA1151) Motherboard Review

Introduction


BIOSTAR might not be the most recognizable motherboard brand in western markets but their pedigree for creating reliable products is worthy of praise. When compared to MSI, Gigabyte and ASUS, the company struggles to entice consumers with unique aesthetic designs. Furthermore, the BIOS and software package has been sorely lacking and in dire need of change. Thankfully, BIOSTAR have acknowledged these criticisms and decided to forge a brand new range based upon a racing theme to please petrolheads with an avid interest in enthusiast hardware. Each RACING motherboard sports a chequered flag PCB and stylish LED illumination while introducing a new BIOS layout. Clearly, this is a major departure from BIOSTAR’s previous products which evoked a fairly mundane appearance.

The BIOSTAR RACING H170GT3 is based on the mATX form factor and supports up to 64GB of DDR4 with a maximum speed of 2133MHz. Intel’s H170 chipset blocks multiplier overclocking, which means you have to make do with your processor’s default turbo frequency. Of course, there’s been some controversy surrounding BCLK overclocking on H170 and B150 motherboards to unofficially achieve boosts fairly close to traditional multiplier overclocking. Sadly, Intel has voiced their displeasure regarding this phenomenon and pressured manufacturers to disable BCLK overclocking via a BIOS update. As a result, we have to rely on stock figures to determine the motherboard’s performance. Previously, I’ve seen some astounding results when it comes to storage with BIOSTAR products, and I’m interested to see if this trend continues.

Specifications

Packaging and Accessories

Here we can see the absolutely stunning packaging, which features a carbon fibre-inspired cover and vibrant text. This coincides with the RACING focus and feels quite reminiscent of a luxury sports car’s interior.

On the opposite side, there’s a detailed diagram showing the motherboard’s layout and an explanation of its unique selling features. Once again, this is presented in a superb manner and makes you inquisitive about the product’s specification.

In terms of accessories, the motherboard includes a user’s manual, Vivid LED DJ instructions guide, SATA cables, driver disk, and I/O shield.

AMD Releases Crimson 16.3.1 Driver

AMD has really stepped up the pace of its driver releases since February, with new drivers dropping every one to two weeks. Today, we’ve been treated to Radeon Software Crimson Edition 16.3.1, the second release for March and the fourth since the beginning of February. Slow driver releases were one of the big complaints against AMD, but since the formation of RTG, things have changed.

For 16.3.1, the major headliners are improved support for Hitman and the newly launched Need for Speed. Hitman, in particular, got a Crossfire profile after first receiving support back in 16.3. We’ve listed some of the other release notes below, but the biggest change is that DX12 games are no longer locked to the refresh rate of the display panel.

  • Installed or played games sometimes do not show up in the Radeon Settings “Gaming” tab.
  • Installing via command line may not work for some users. As a workaround please use the default GUI installer.
  • Intermittent hang sometimes experienced on UE4 applications.
  • Black screen or possible hang after launching Oculus Video Application.
  • DirectX®12 application frame rates are no longer locked to the refresh rate of the display panel.

A host of known issues still remains though, most notably the slew of Crossfire problems. Both SLI and Crossfire have suffered considerably in recent months, and one can only hope DX12 and Vulkan will provide a fix. You can find the release notes here and the drivers here.

Gigabyte Z170-Gaming K3 (LGA1151) Motherboard Review

Introduction


Intel’s current generation of enthusiast processors offers impressive overclocking headroom but incurs a fairly hefty premium compared to the previous generation, especially if you’re opting for the i7-6700K. Unfortunately, the retail version of this CPU, which sports a 3-year warranty, still teeters around the £300 mark and falls into a similar budget to the 6-core 5820K. The real savings when selecting the Z170 chipset revolve around cheaper motherboards, which usually cater to the gaming demographic with LED illumination, unusual colour schemes and a comprehensive software suite. It’s astonishing to see the kind of performance and bundled features on products under £100. At this price, there’s fierce competition, and some manufacturers have struggled to outline the value of H170/B150 alternatives due to the narrow price difference to affordable Z170 options.

The latest motherboard from Gigabyte targets mainstream users running a single discrete graphics card and an overclocked Skylake processor. While it does technically support Crossfire, the lack of x8/x8 functionality might be a deal breaker for users wanting the absolute maximum bandwidth. There’s no support for SLI setups either, which may be a contentious issue. To be honest, I don’t see this as a huge problem because the motherboard retails for approximately £95 and dual-card configurations are fairly niche in today’s market. Despite the very low price, Gigabyte has still implemented 32Gb/s M.2 storage, a great audio solution and USB 3.1. From the outset, it seems Gigabyte managed to achieve this excellent specification on a budget by removing SLI support. I’m interested to see how the stock performance numbers compare to high-end solutions and to determine the motherboard’s credentials. Could this be the best value gaming Z170 motherboard ever released? Let’s find out!

Specifications

Packaging and Accessories

The Z170-Gaming K3 is packaged in a visually appealing box showcasing the attractive dual tone PCB design. This draws you into the product and evokes a sense of excitement prior to the unboxing process.

On the rear, there’s a huge description regarding the motherboard’s high-speed connectivity, premium audio hardware and networking with traffic prioritization for gaming purposes. This is presented in an easy to understand manner, and the clear pictures do a great job of relaying technical information without bamboozling the end-user.

In terms of accessories, the motherboard comes with an I/O shield, user’s guide, G Connector, case badge and SATA cables with a very stylish metallic finish. This is the first time I’ve seen cables in this colour, but I have to admit it’s a nice touch and looks fantastic. The G Connector is really useful when connecting those fiddly front panel connectors and improves the user experience when building a new system. Other additions include a rather fetching door hanger, driver disk, and a World of Warships content code.

AMD’s Raja Koduri Talks Future Developments – Capsaicin

Even though a lot of information was shared during the Capsaicin live stream, some details weren’t made known until the after-party. In an interview, Radeon Technologies Group head Raja Koduri spoke in more detail about the plans AMD has for the future and the direction they see gaming and hardware heading.

First up, of course, was the Radeon Pro Duo, AMD’s latest flagship device. Despite the hefty $1,499 price tag, AMD considers the card good value, something like a FirePro Lite, with enough power to both game and develop on: a card for creators who game and gamers who create. If AMD does tune the drivers to enhance professional software support, the Pro Duo will be well worth the cash considering how much real FirePro cards cost.

Koduri also sees the future of gaming in dual-GPU cards. With Crossfire and SLI, dual-GPU cards were abstracted away as a single GPU at the driver level. Because of this, performance varies widely from game to game and support requires more work on the driver side. With DX12 and Vulkan, developers can now choose to implement multi-GPU support themselves and build it into the game for much greater performance. While the transition won’t fully take place until 2017-2019, AMD wants developers to start getting used to the idea and getting ready.
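To give a rough idea of what that developer-side control looks like, here is a minimal sketch of explicit multi-adapter setup under Direct3D 12. It is purely illustrative (not AMD’s or any particular engine’s code): the application enumerates every physical GPU itself and creates an independent device for each, instead of relying on a Crossfire or SLI driver profile.

```cpp
// Minimal explicit multi-adapter sketch for D3D12 (illustrative only).
// Error handling is omitted; how work is split between GPUs is up to the engine.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicePerGpu()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP and other software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            // The game now owns one device (with its own queues, heaps and
            // synchronisation) per physical GPU, rather than the driver hiding
            // the second GPU behind a Crossfire/SLI profile.
            devices.push_back(device);
        }
    }
    return devices;
}
```

Vulkan exposes the same idea through its own physical-device enumeration, which is why the work-distribution policy moves from the driver into the game engine.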

This holds true for VR as well, as each GPU can render one eye independently, achieving a near 2x performance benefit. The benefits, though, are highly dependent on the game engine and how well it works with LiquidVR. Koduri notes that some engines need only a few hours’ work while others may take months. Roy Taylor, VP at AMD, was also excited about the prospect of the upcoming APIs and AMD’s forward-looking hardware finally getting more use and boosting performance. In some ways, the use of multi-GPU is similar to multi-core processors and the use of simultaneous multi-threading (SMT) to maximize performance.

Finally, we come to Polaris 10 and 11. AMD’s naming scheme is expected to change, with the numbers being chronologically based, so the next Polaris will carry a bigger number than 11 but won’t necessarily be a higher-performance chip. AMD is planning to use Polaris 10 and 11 to hit as many price/performance and performance/watt levels as possible, so we can expect multiple cards, probably three, to be based on each chip. This should help AMD harvest imperfect dies and help their bottom line. Last of all, Polaris may not feature HBM2, as AMD is planning to hold back until the economics make sense. That about wraps it up for Capsaicin!

Microsoft Promises to Fix Windows Store VSync Lock

Microsoft’s relationship with the PC gaming audience has been fairly turbulent due to the advent of Games for Windows Live and its focus on console development. Many users are still haunted by Microsoft’s atrocious form of DRM, which resulted in lost saves, regional restrictions and a very basic accounts system. Furthermore, the company never really showed any interest in the PC platform, and didn’t support it when there were concerns about its future. Thankfully, Valve stepped in and catapulted the popularity of PC games. There’s a certain irony though, because Microsoft could be in Valve’s position had they catered to the PC community properly.

Thankfully, times have changed and Microsoft is adopting a more unified approach to mobile, PC and console. This means the company is open to cross-platform releases and to giving the PC key franchises which remained console exclusives for some time. Admittedly, there’s still no Halo or Forza on the PC platform, but I cannot see an announcement being too far off. Instead of releasing their own games on Steam, Microsoft is pushing the Windows Store and trying to encourage people to upgrade to Windows 10. This makes sense because Valve takes a significant portion of a game’s revenue simply for allowing it to be sold on their service.

Sadly, the Windows Store has some major issues, and Microsoft has to address them before it can expect PC gamers to part with their hard-earned money. For example, a post on Reddit suggests there’s no SLI/Crossfire support, no modding support, no refund policy, and VSync is always on! Not only that, the game files are protected, monitoring software doesn’t work properly, and borderless full screen is your only option. Clearly, these are some very severe restrictions, which is concerning given Microsoft’s history with DRM.

Mike Ybarra, Partner Director of Program Management, Xbox and Windows Platform at Microsoft addressed some of these concerns and said:

Apparently, SLI/Crossfire is already functional, and depends on the developer. However, I encountered some problems with dual card setups on Rise of the Tomb Raider but only with the Windows Store version. I’m pleased to see the news that VSync will become optional because many people dislike the stutter it creates or use a G-Sync panel instead. Only time will tell if Microsoft creates an open PC platform, but it’s clear that the current store needs a lot of work.

AMD Driver 16.1.1 Hotfix Released – Fixes Fallout 4 Crossfire Issues

When I updated Fallout 4 to version 1.3, I was expecting things to run smoother, not worse. The issues started when AMD released their new drivers, which added a Crossfire profile for the game; however, that profile was built for Fallout 4 version 1.2, and the release of both updates on the same day created many more problems than they resolved. The big one was dire gaming performance in Crossfire despite having the profile enabled, as well as invisible textures, severe glitching, flickering and more.

A Steam user made a batch file which helped revert the game to version 1.2, thus allowing the use of Crossfire again, but that’s hardly what I would call a fix. Thankfully, AMD has now released their 16.1.1 Hotfix driver, which combats the flickering issue and should clear things up nicely.

Article Number: RN-WIN-RADEONCRIMSON16.1.1HOTFIX

This article provides information on the latest posting of the AMD Radeon Software Crimson Edition 16.1.1 Hotfix Driver

Package Contents

The AMD Radeon Software Crimson Edition 16.1.1 Hotfix Driver contains the following:

AMD Radeon Software Crimson Edition 16.1.1 Hotfix Driver Version 15.301.1801

The AMD Radeon Software Crimson Edition 16.1.1 Hotfix Driver can be downloaded from the following links:

AMD Radeon Software Crimson Edition 16.1.1 Hotfix Driver for Windows® 10, Windows 8.1 & Windows 7 64-bit

AMD Radeon Software Crimson Edition 16.1.1 Hotfix Driver for Windows® 10, Windows 8.1 & Windows 7 32-bit

[86690] Fallout 4 – Flickering may be experienced at various game locations with the v1.3 game update and with AMD Crossfire enabled. The AMD Radeon Software Crimson Edition 16.1.1 Hotfix driver has been updated. Please download this driver once again and reinstall it to address this issue.

If you’re going to update and you’ve got things working pretty well already, be sure to create a restore point so you can revert if it all goes to hell. The new Hotfix should fix the issue for you (it did for me), but there will always be that small percentage of users for whom it makes things worse, although that’s true of any update.

You can find all the information on the new drivers on the official AMD page right here.

Ubisoft Confirms The Division Will Have Multi GPU Support And More!

Ubisoft has historically struggled to produce optimized PC versions of their major game releases which scale properly across a wide range of hardware configurations. For example, Watch Dogs ran extremely poorly and didn’t really utilize SLI setups in an effective way. Additionally, it’s difficult to forget the complete mess that was Assassin’s Creed: Unity. As the PC gaming market continues to grow, it seems Ubisoft is finally starting to realize its potential and might just be investing the right resources to create a proper PC release. According to the publisher, The Division will not be a port, and will feature a huge array of PC-specific features:

  • Flexible user interface: move, scale and adjust the opacity of the HUD.
  • Intuitive controls: navigate easily through the interface, inventory panels and map designed specifically to be used with a mouse or a controller.
  • Full mouse & keyboard support: opt for the high precision mouse and keyboard experience and switch to a controller in the middle of any encounter without interruption.
  • Text chat: team up with more agents of The Division by using the in-game text chat.
  • Optimized graphic settings & customised GPU effects: adjust a vast variety of technology treats, from realistic lighting, shading, snow particles, local reflection, fog volumetric scattering, depth of field and much more…
  • Multi-GPU support: unleash the graphic power of the best computer set ups for jaw-dropping graphics powered by Massive Entertainment’s game engine Snowdrop.
  • Multi-screen support: play with up to three screens for the most immersive and stunning experience of The Division.
  • Multi-resolutions: opt for 1080p or 4k and automatically adapt the resolution to fit multi-screen configurations with FOV correction.
  • Unlocked framerate: let the most powerful computer reach the highest framerate.
  • HBAO+: enjoy the most realistic shadowing, lighting algorithm and ambient occlusion.

While I’m still fairly cautious given Ubisoft’s chequered past, it’s fantastic to see HBAO+ implementation, multi-GPU support, an unlocked frame rate, and native support for high-resolution monitors. I do wonder if the resolution options extend to 21:9 panels without requiring a UI fix. Clearly, the PC version offers a huge boost in graphical fidelity compared to the current crop of consoles, but there are always suggestions about the visuals being downgraded. Despite this, the screenshots so far look pretty good, and it will be interesting to see the performance on various setups. Thankfully, I’ve already got my beta access code, and should be able to write up a technical analysis once the beta officially begins.

AMD R9 380X 4GB Graphics Card CrossFire Review

Introduction


Here at eTeknix, we strive to give the consumer the best possible advice in every aspect of technology. Today is no different and we are excited to bring you the CrossFireX review of the newly released R9 380X graphics cards.

Based on the R9 380, which was itself based on the R9 285, the R9 380X was designed to fill the obvious gap between the R9 380 and R9 390. Priced at just under £200, it has sold strongly in its first weeks, and board partners have given their models the usual overclocking treatment, with an average clock speed of around 1030MHz, roughly 50MHz higher than the ‘reference’ design.

Through our testing of both the XFX DD and Sapphire Nitro models, it was evident that performance wasn’t as high as I had hoped and still left a gap under the R9 390. Looking back at the Rx 200 series lineup, the R9 285 was an extremely late arrival. It was based on architecture we were familiar with, but it introduced GCN 1.2, which is the foundation of the brand new R9 Fury range. To me, this leaves room for an R9 385 to be introduced to the market as the next step in the graphics card race for late 2016.

When we test in CrossFireX, we aim to use two identical graphics cards to ensure that everything is as similar as possible. When using the same cards, you can almost guarantee the same cooling capabilities, power draw, core clock and other variables. This then gives us the best possible outcome for maximum performance, as the computer does not need to compensate for any differences.

Just Cause 3 Engine Does Not Support Multi GPU Configurations

Avalanche Studios is a reputable PC developer renowned for creating optimized games, including Just Cause 2 and Mad Max. While Mad Max’s environment is fairly basic, you have to admire how well the game runs on modest hardware. Unfortunately, their latest project, Just Cause 3, appears to suffer from poor optimization and hefty system requirements. Granted, the game runs quite well on certain hardware, but there are a lot of reports from angry customers considering a refund. Furthermore, NVIDIA’s Andrew Burnes has confirmed that Just Cause 3’s engine is not compatible with any multi-GPU configurations, saying:

“The engine is incompatible with all multi-GPU solutions, so no SLI support at this time.”

This is bizarre, as Mad Max supported SLI-powered systems and scaled across multiple cards in an impressive manner. Clearly, Just Cause 3 is using a completely different engine and can only gain multi-GPU support through a patch. However, this involves modifying the game engine, and I cannot realistically see it being implemented anytime soon. Just Cause 3 is the latest example of terrible multi-GPU support in triple-A games, and I honestly wouldn’t recommend dual-GPU setups for this reason alone. Publishers see this as a very niche market and don’t want to invest the time to support such a small sector of players. That’s no excuse though, and I’d be very wary of purchasing an SLI or Crossfire setup if you want to utilize both cards at launch.

Hopefully, DirectX 12 functionality will make dual-GPU configurations a viable option, but for the time being, it seems developers have no interest in catering to this demographic.

Batman Arkham Knight Will Never Have Multi-GPU Support

It looks like fans and owners of Warner Bros’ Batman: Arkham Knight will be dealing with more unwelcome news. According to a statement given out via Steam, the developers have given up on ever implementing multi-GPU support such as Nvidia SLI or AMD Crossfire. This means PC gamers with the highest-end rigs will never be able to fully enjoy the game. This is especially true for those with multiple displays or 4K monitors, which require multiple GPUs to run games at high frame rates.

According to the developer, even if there were gains from implementing SLI or Crossfire support, there would be a high chance that adding it would cause more bugs. This really speaks to the nature of a game that launched buggy, stayed buggy, and only eventually became decent. With WB offering full refunds to customers, it looks like the developers have given up on implementing new features and are just focusing on bugs. The biggest hope is that the gaming industry as a whole will have learned a lesson about when not to ship games.

You can read the full statement below:

We’ve been working with our development and graphics driver partners over the last few months to investigate utilizing multi-GPU support within Batman: Arkham Knight. The result was that even the best case estimates for performance improvements turned out to be relatively small given the high risk of creating new issues for all players. As a result we’ve had to make the difficult decision to stop work on further multi-GPU support. We are disappointed that this was not practical and apologize to those who have been waiting for this feature.

MSI Launches Z170A SLI Plus Motherboard

MSI is one of the hottest names in the PC component market; having created some of the best graphics cards and motherboards of recent years, they’re a very popular choice for system builders. Their Pro series of motherboards gets its final addition today with the release of the all-black Z170A SLI PLUS, a board that takes many of the features we’ve seen in the stunning X99A SLI PLUS motherboard and brings them a little more up to date.

“Offering an enhanced experience and the best in productivity and reliability the Z170A SLI PLUS motherboard has been configured to satisfy even the most demanding user.” – MSI

Equipped with USB 3.1 Type-C, Audio Boost, SLI and CrossFire support, Steel Armor PCI-E slots, Military Class 4 components, DDR4 Boost, Turbo M.2, Click BIOS 5 and more, there’s no doubt that this is a feature packed board that will appeal to many.

There’s no official spec sheet just yet, but given MSI’s track record, I have high expectations for this board. Prices in the UK are expected to be around £109 and retailers should have stock any time now, so keep a look out for them if you want to get one of the first batch.

Would you put the Z170A SLI PLUS motherboard into your next system build?

AMD R9 Nano 4GB (HBMv1) CrossFire Review

Introduction


Here at eTeknix, we strive to give the consumer the best possible advice in every aspect of technology. Today is no different and we are excited to bring you the CrossFireX review of the highly anticipated R9 Nano 4GB graphics cards.

The R9 Nano is the third release in the Fiji GPU core range and the third official graphics card to utilise High Bandwidth Memory (HBMv1). We’ve been impressed with the performance of the Fiji range so far, with the fully unlocked R9 Fury X providing a good alternative to the NVIDIA GTX 980 Ti, the R9 Fury providing a good step up from the R9 390X and the GTX 980, and the R9 Nano being the perfect option for small form factor builds. A single R9 Nano provides the perfect balance of performance, power consumption and mobility, but will combining two still be a worthwhile option?

When we test in CrossFireX, we aim to use two identical graphics cards to ensure that everything is as similar as possible. When using the same cards, you can almost guarantee the same cooling capabilities, power draw, core clock and other variables. This then gives us the best possible outcome for maximum performance, as the computer does not need to compensate for any differences.

AMD Dual Fiji XT “R9 Gemini” Spotted

While we’ve long known that AMD was preparing a dual Fiji GPU, we’re now getting some hints that the card will be revealed and launched imminently. According to a shipping manifest, a “Fiji Gemini” has just left AMD’s Canada headquarters. AMD Canada has always been the site that handles more of the graphics work, since it used to be ATI, and with the Gemini headed off, it probably means the card has finished most of its testing and is ready to be launched soon.

Previous names for the card have revolved around R9 Fury X2 or some variation thereof, but R9 Gemini might now be a contender. The shipping manifest also shows an attached Cooler Master heatsink. Given the ongoing litigation between Cooler Master and Asetek, AMD either has a deal going with Asetek or they know something we don’t. The card is expected to pack a total of 8192 shader processors and 8GB (2x4GB) of HBM1. While 4GB of VRAM per GPU shouldn’t hold things back at 4K, the advent of unified memory with DX12 may help alleviate issues in the future.

With Nvidia also set to launch their own dual-GPU graphics card and having shown off their HBM2 Pascal card, AMD only has a narrow window in which to launch this card. Hopefully, we will be hearing more about Gemini in the days to come. The launch of R9 Gemini may also bring about better Crossfire performance and quality, something which has been lacking a bit.

Thank you WCCFTech for providing us with this information 

Nvidia Prepping Dual-GPU Graphics Card

Dual-GPU graphics cards have been common over the past few generations, and it looks like Nvidia is about to launch another one. Set to use the Maxwell architecture, the new dual-GPU card will feature 2 GM200-class GPUs, the same ones that power the GTX 980 Ti and Titan X.

Back in the Fermi generation, Nvidia had the GTX 590, which was followed up by the GTX 690 with Kepler. Both cards were relatively well received. Looking to make use of their Titan brand, Nvidia then pushed for the Titan Z, essentially two Titan Blacks. That card, unfortunately (or fortunately, depending on where you stand), flopped heavily due to an exorbitant $3,000 price tag. This time around, we will likely see a return to more sane pricing, with two GTX 980 Ti equivalents priced at about $1,500.

Given the tight time frames, a Maxwell-based dual-GPU flagship likely means Pascal won’t be dropping for a while. After all, Nvidia won’t want those who shelled out top dollar for the new card to feel burned when Pascal drops with a new architecture, memory interface, and better performance. This also suggests that Nvidia will probably launch most of Pascal with HBM2, with maybe a few select cards using HBM1, similar to what AMD has done.

It’s interesting to hear of this Nvidia card now, so close to AMD’s dual Fury GPU. The R9 Fury X2 is set to feature two of AMD’s top-line Fiji chips and would likely have dominated the market for single-board graphics cards. With this new card, Nvidia will be able to offer some stiff competition given Maxwell’s strength. It is important to note that Crossfire does scale a bit better than SLI, which helped AMD’s 7990 and 295X2 do quite well.

Thank you WCCFTech for providing us with this information

Sapphire Tri-X R9 390X 8GB CrossfireX Review

Introduction


Here at eTeknix, we strive to give the consumer the best possible advice in every aspect of technology. Today is no different and we are excited to bring you the CrossfireX review of the Sapphire Tri-X R9 390X graphics cards.

Based on the slightly aging Hawaii architecture, performance was expected to be fairly low; however, as we found in our standalone review, that really wasn’t the case. Alone, this card has the power to take on the GTX 980 directly and is poised to sit just below the brand new AMD R9 Fury range. At a price of £350, it is perfectly positioned to fill the gap between the R9 390 and R9 Fury.

When we test in CrossfireX, we aim to use two identical graphics cards to ensure that everything is as similar as possible. When using the same cards, you can almost guarantee the same cooling capabilities, power draw, core clock, boost clock and so on. This then gives us the best possible outcome for maximum performance, as the computer does not need to compensate for any differences.

AMD Catalyst 15.7 WHQL Driver Adds Cross Generation Crossfire Support

Something that AMD has been falling behind on lately is WHQL drivers; well, drivers in general. Beta drivers are released every few months, but a certified WHQL driver has taken over 200 days to reach us. Let’s not dwell on the past: we have one here, we’ve tested it, and it works perfectly fine. Better still, it seems AMD has returned to form and opened up cross-generation Crossfire again. Over at VideoCardz.com, Crossfire has been tested between the new R9 390X and an R9 290X.

The cards used weren’t matched: the R9 390X is only available in an 8GB variant, and it was tested with an R9 290X 4GB. This limits the R9 390X to using just 4GB, as Crossfire utilises the lowest VRAM quantity in the pair. Scores are around where we previously measured two R9 290X 8GB cards, so there is little performance penalty for using the previous generation.

We will be confirming this new feature for ourselves by testing the R9 390 with an R9 290 and an R9 380 with an R9 285. If it works across most of the new generation, it could prove a nice upgrade for those who already own the 200 series equivalent.

With the Crossfire options opened up, would you be willing to purchase one of the newer cards to Crossfire, or even buy an older card to bridge the gap until a 300 series card becomes cheaper? Let us know in the comments.

AMD R9 Fury X CrossfireX 12K Eyefinity Review

Introduction


Triple monitor configurations were massively useful a few years ago when the ‘new’ standard was 1080p and everyone wanted to have huge workspaces to process more information at once. While this was good back then, nowadays monitors can have up to 4x the resolution of 1080p in the form of 4K (2160p) and workers can fit a huge amount of information onto a single monitor.

How about when it comes to gaming? Surround monitors engulf you in a wealth of visual stimulation and even present some details which you cannot normally see in a typical single-monitor setup.

Last time we looked at our current top-end cards, they all fared reasonably well when stacked against the mighty triple 4K configuration, but what about when we pitch the R9 Fury X Crossfire duo against it? Let’s find out in today’s article.
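For context, the ‘12K’ in the title is shorthand for three 4K panels side by side; a quick back-of-the-envelope calculation (ours, not taken from the benchmarks themselves) shows where the figure comes from:

\[
3 \times (3840 \times 2160) = 11520 \times 2160 = 24{,}883{,}200 \text{ pixels} = 12 \times (1920 \times 1080),
\]

so the array is roughly ‘12K’ pixels wide and pushes exactly twelve times as many pixels as a single 1080p screen.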

AMD R9 Fury X 4GB Graphics Card Crossfire Review

Introduction


Here at eTeknix, we strive to give the consumer the best possible advice in every aspect of technology. Today is no different, as we are extremely excited to bring you the CrossFireX review of the recently released AMD Radeon R9 Fury X. As we all know, the R9 Fury X is AMD’s latest attempt to take the crown from NVIDIA in the top-end consumer GPU market. In some ways, AMD has succeeded, thanks to the introduction of a new GPU architecture and the innovative High Bandwidth Memory (HBM). With the use of HBM, it has been shown that the quantity of VRAM isn’t the issue; it is the quality of the connection and the bandwidth available for the VRAM to do its work, although more VRAM certainly couldn’t hurt.

On the test bench today, we have the XFX version of the AMD R9 Fury X 4GB featuring HBM. As we previously saw in the standalone review, the card had more than enough power to deliver 30FPS at 4K; however, 30FPS isn’t enough. Adding another card into the mix should give us a very good chance of witnessing 60FPS at 4K.

With the two cards on the test bench together, you can get a feel for their size compared to the Gigabyte G1 Gaming X99 motherboard. The attention to detail that has gone into every card is simply amazing; there isn’t a piece of cable sleeve or cable tie out of place. All of the screws are perfectly inserted and the metal is buffed up to a gorgeous shine.

A single card is a testament to AMD’s attention to detail. It’s a shame the heat shrink didn’t go all the way to the fan cowling, leaving about 1″ of coloured cables visible.

Out of the rig, the two cards in all their glory. If the comparison to the motherboard wasn’t enough, how about next to the 120mm radiators? Due to there being no metal heat sink inside the card, it weighs next to nothing compared to the radiators.

 

Up close to the cards, you can see that there isn’t a dimple on the cover plate out of place and there is no frayed cable sleeving protruding from the end of the cards.

 

We inserted both graphics cards into our Core i7 5820K and X99-based test system, ensuring adequate spacing for optimum cooling and that both have access to sufficient PCI-e bandwidth for CrossFire operation. These cards are the best possible option for configuring a CrossFire set-up: both are the reference design, from the same sub-vendor, with exactly the same clock speeds and the same TDP. All of this means that we can achieve the best possible scaling, with little to no variation due to mismatched graphics cards.

AMD Fury X Quadfire Results

There’s been a lot of negative press about the new AMD cards recently, but there is a diamond in the rough to be found. The poorer than expected, although still very impressive, performance of the Fury X didn’t stop AMD employee Matt Buck from pursuing ultimate performance with not two, not three, but four of these beasts in a CrossFireX configuration.

Now, one card alone is enough to handle 4K gaming with a few dips under 60FPS, and two should sort you out so you never see under 60FPS; four, on the other hand, is enough to run a decent 12K Eyefinity set-up.

https://twitter.com/amd_roy/status/614163217368150016

It’s all well and good buying and flaunting these graphics cards, but what we all want to see is how they fare in a benchmark. Thankfully, that has already been done in the form of 3DMark Fire Strike Ultra (4K), with a run for each incremental card. With the addition of each card, we see almost perfect scaling, losing only around 12% performance per card per addition.
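To put that figure in context, here is how the scaling claim can be read; the formula and the 3.64x figure below are our own illustration of the arithmetic, not scores taken from Matt’s post. If each additional card contributes roughly 88% of a single card’s score, the total comes out as

\[
\text{score}_n \approx \text{score}_1 \times \bigl(1 + 0.88\,(n - 1)\bigr),
\]

so a four-card setup lands at roughly \(1 + 0.88 \times 3 = 3.64\) times a single card, rather than the ideal 4x; still remarkable scaling for CrossFireX.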

Looking at that result has got me thinking: even with one of the most powerful GPUs on the planet, surely only 4GB of VRAM proves a hindrance? Or is this the start of HBM showing that increased bandwidth can move data quickly enough to lower the VRAM requirement compared to GDDR5? Let us know your thoughts in the comments section.

UPDATE: Looking through some of Matt’s previous posts on forums, it seems he has posted a 4K QuadFire Fury benchmark using Thief. When compared to the 4K Quad SLI Titan X benchmark, the Fury combination beats the NVIDIA option by almost 5 FPS.

4K Thief Benchmark: Quadfire AMD FuryX

4K Thief Benchmark: Quad SLI NVIDIA Titan X

AMD Improves Batman: Arkham Knight Performance With Catalyst 15.6 Beta

We’ve seen a lot of angry fans after Rocksteady Studios released Batman: Arkham Knight. The 30FPS lock really did a number on the title’s performance and the game’s overall optimization takes us way back to the release of Assassin’s Creed Unity. We all know what happened with the latter title and how Ubisoft now promises to make up for it with Assassin’s Creed Syndicate.

The real surprise here is that even NVIDIA owners suffered performance drops while playing the title. We’ve seen a lot of problems with AMD’s drivers, but NVIDIA? This means Rocksteady really did it this time. Nevertheless, AMD has finally released the updated Catalyst 15.6 beta driver, promising to make the Batman title playable for AMD users.

Also, AMD seems to be in talks with the developers to resolve the remaining Crossfire issues, so don’t expect to get a lot out of dual-GPU configurations at this point. A more detailed view of what the driver brings can be seen below:

Highlights of AMD Catalyst 15.6 Beta Windows Driver

  • Performance and Stability improvements for Batman: Arkham Knight

Important Note:

AMD Crossfire support is currently disabled for Batman: Arkham Knight while AMD works closely with Warner Bros. Interactive Entertainment to resolve the issue. An update for this issue will be released as soon as it is available

Known Issues:

  • [422129] Batman: Arkham Knight – The application may crash during in game benchmarking or while exiting the game
  • [422130] Batman: Arkham Knight – The screen may turn black or pink while changing resolution to 1680×1050.

Thank you WCCF for providing us with this information

DirectX 12 Adoption Rates Fastest Since DX9

DirectX 12 has seen the fastest adoption rate of any API package since the launch of DirectX 9, a Microsoft developer has revealed. Max McMullen, Microsoft’s Principal DirectX Development Lead, made the declaration at the recent Build Developer Conference.

McMullen said, “[DX12] has been the fastest adoption of the Direct3D API we’ve seen since Direct3D 9 […] It’s been the fastest adopted API in more than a decade.”

He went on to speak about the multi-adapter functionality of DirectX 12, explaining that DX12 will support both linked and unlinked GPUs, linked being the equivalent of SLI or CrossFire. The unlinked GPU support is notable as it will allow gamers to use multiple GPUs from different vendors, such as one NVIDIA and one AMD graphics card together within a single PC system.

“This feature exposes direct control of two linked GPUs as well. So you might have known as Crossfire or SLI from AMD and NVIDIA, respectively, but you’ll get direct control of both GPUs in DX12 if you so choose,” McMullen said.

DX12 will also support independent memory allocation, giving developers full use of the total amount of VRAM available across a CrossFire or SLI multi-GPU configuration.
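As a rough illustration of what that ‘direct control’ and independent memory allocation look like in practice, here is a minimal D3D12 sketch (our own example, not McMullen’s code): with a linked adapter, the bridged GPUs appear as nodes of a single device, and each resource can be placed in one specific GPU’s memory via node masks instead of being mirrored on both cards.

```cpp
// Illustrative D3D12 linked-node sketch: place a buffer in each GPU's local
// memory explicitly, rather than mirroring every allocation as classic AFR does.
// Error handling omitted for brevity.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void AllocatePerNodeBuffers(ID3D12Device* device)
{
    // On a linked adapter (e.g. two bridged GPUs), each GPU is one "node".
    const UINT nodeCount = device->GetNodeCount();

    for (UINT node = 0; node < nodeCount; ++node)
    {
        D3D12_HEAP_PROPERTIES heapProps = {};
        heapProps.Type             = D3D12_HEAP_TYPE_DEFAULT; // GPU-local VRAM
        heapProps.CreationNodeMask = 1u << node;              // allocate on this GPU
        heapProps.VisibleNodeMask  = 1u << node;              // visible only to this GPU

        D3D12_RESOURCE_DESC desc = {};
        desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
        desc.Width            = 64 * 1024 * 1024;             // 64 MB scratch buffer
        desc.Height           = 1;
        desc.DepthOrArraySize = 1;
        desc.MipLevels        = 1;
        desc.SampleDesc.Count = 1;
        desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

        ComPtr<ID3D12Resource> buffer;
        device->CreateCommittedResource(&heapProps, D3D12_HEAP_FLAG_NONE, &desc,
                                        D3D12_RESOURCE_STATE_COMMON, nullptr,
                                        IID_PPV_ARGS(&buffer));
        // Because allocations target specific nodes, the two cards' VRAM pools
        // add up instead of holding duplicate copies of every resource.
    }
}
```

Unlinked adapters work the same way in spirit, except each GPU is a separate ID3D12Device and the application copies data between them explicitly.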

Thank you Dark Side of Gaming for providing us with this information.

AMD FreeSync CrossFire Support Delayed

When AMD’s anti-screen tearing and adaptive framerate FreeSync technology was launched as a rival to NVIDIA’s G-Sync last March, it was limited to single-GPU support. AMD’s Red Team promised support for CrossFire – the company’s multi-GPU configuration – by April, but with the month now come and gone with no sign of multi-GPU support, AMD has now issued a statement on the matter:

AMD fans have been outspoken about their love for the smooth gameplay of AMD FreeSync technology, and have understandably been excited about the prospect of using compatible monitors with the incredible performance of an AMD CrossFire configuration.

After vigorous QA testing, however, it’s now clear to us that support for AMD FreeSync monitors on a multi-GPU system is not quite ready for release.

As it is our ultimate goal to give AMD customers an ideal experience when using our products, we must announce a delay of the AMD Catalyst driver that would offer this support. We will continue to develop and test this solution in accordance with our stringent quality standards, and we will provide another update when it is ready for release.

With no timeframe projected for CrossFire support, we could be waiting a while until we see it, but if it means a flawless, bug-free system, I think most gamers would gladly take the hit.

Thank you WCCF Tech for providing us with this information.

Sapphire R9 290X Tri-X 8GB CrossFireX Review

Introduction


Here at eTeknix, we strive to give the consumer the best possible advice in every aspect of technology. Today is no different, as we have a pair of Sapphire’s amazing R9 290X 8GB Tri-X edition graphics cards to combine for some CrossFireX action. The dedicated review for this graphics card can be found here. When striving for the best results, it is preferable to test two of the same model to eliminate variation in clock speeds or integrated components, so today we should see some excellent results.

In the dedicated review, this graphics card showed more than enough power to play most games at 4K resolution at 60FPS, faltering slightly in the more demanding Metro: Last Light.

We inserted both graphics cards into our Core i7 5820K and X99-based test system, ensuring adequate spacing for optimum cooling and that both have access to sufficient PCI-e bandwidth for CrossFire operation.

The typical ‘hot spot’ when arranging a CrossFire or SLI configuration is the graphics card closest to the processor; as both of these cards are equipped with the Tri-X cooler, positioning isn’t an issue.

As these graphics cards have been subject to Sapphire’s treatment, they have slightly higher clock speeds than a reference model, but as they are both the same card, there should be little to no variation in clock speeds; this will result in maximum gains during testing.

The Best Graphics Solution You Can Buy For Around £1000: Sapphire 295X2’s

Introduction and A Closer Look


Performance is one key consideration when buying a high-end graphics card like the AMD Radeon 295X2 or the Titan/Titan Black/Titan Z from Nvidia, but for those with a sensible head on their shoulders, pricing has to be factored in too, though as an impulse buy you may not want to. It’s always been a tight margin between AMD and Nvidia in the extreme segment of the market, as they both know that a premium can be asked of the consumer, and those wanting the best of the best will quite happily delve into their pockets to have it, and will most likely get it in the ear from their partners shortly after.

AMD have been quite generous with their Radeon 295X2, a mammoth, water-cooled, uber dual-GPU monster, as of late, with price cuts left, right and centre. While MSI and other brands have taken some money off to give the consumer a better deal, we’ve found one brand that has taken price cuts to the extreme. Sapphire are a market leader for a very good reason, and with the 295X2 selling at a staggeringly low £599 including VAT at Overclockers UK, we want to see exactly how much performance you get for this amazing price.

We’re not going to talk too much about the card’s aesthetics or cooling performance, as we covered all of that around 9 months ago; if you fancy a brush-up, you can check out our fully fledged review of the card on its own here. For now, we’re going to jump straight into how two of these Goliath cards operate in CrossFire and whether they really can offer extreme, unrivalled performance for an amazing price.

AMD Rolls Out Catalyst 15.3 Beta with FreeSync Support

AMD has just released its latest 15.3 Catalyst version with its AMD FreeSync Technology and additional Crossfire profiles for Project CARS, Battlefield Hardline, Total War: Attila and Ryse: Son of Rome.

The full features and resolved issues of the newest Catalyst driver can be viewed below:

AMD Freesync Support for single GPU product configurations

    • AMD Freesync technology support is now available for single GPU configurations. For more information on how Freesync works, FAQ’s and what products are currently supported, please visit the  Freesync Technology Page

Mixed Rotation Eyefinity now available on Radeon R9 285 series

    • AMD Radeon R9 285 now supports Mixed Rotation Eyefinity for “0” and “90” degrees rotations
    • More information on setup, configuration and support for this feature is available at the following link:

New Crossfire Profiles for:

    • Battlefield Hardline
    • Evolve
    • Far Cry 4
    • Lords of the Fallen
    • Project Cars
    • Total War: Attila

Crossfire Profile Updates for:

    • Alien Isolation
    • Assassin’s Creed Unity
    • Civilization – Beyond Earth
    • FIFA 2015
    • Grid Autosport
    • Ryse: Son of Rome
    • Talos Principle
    • The Crew

Important Note:

The AMD CrossFire profile for Dying Light is currently disabled in this driver while AMD works with Techland to investigate an issue where AMD CrossFire configurations are not performing as intended. An update is expected on this issue in the near future through an updated game patch or an AMD driver posting.

Resolved Issues:

    • [412702] Screens may blank out when enabling a 3×1 SLS with 3 HDMI monitors
    • [411847] Leadwerks : Project Manager crashes with a “Pure Virtual Function Call” error
    • [413076] Second Life : Rigged mesh objects are not rendered correctly when hardware skinning is enabled in the in game settings
    • [413392] Star Trek Online : Block corruption is experienced when MSAA is enabled in the in game settings
    • [410367] System hangs/BSOD upon resuming from S3/S4 sleep on AMD Radeon R9 285 configured in AMD CrossFire mode
    • [410293] With AMD CrossFire enabled, Timeout Detection Recovery (TDR) occurs during actual gameplay when YouTube Mix moves to the next song in Firefox
    • [407622] Screen tearing on enabling VSync with Alien: Isolation game
    • [407175] Catalyst Control Center Video Quality settings may not be available or retained if the “Enforce Smooth Video Playback” option is not selected on some AMD HD series GPU’s.
    • [410391] Primary display may not be retained after disabling Crossfire while in Eyefinity mode
    • [409705] Enabling or disabling Crossfire may lead to one side of the 4K MST display being shown as black
    • [410393] Minor stuttering may be seen in Dragon Age Inquisition on Single and Multi GPU configurations
    • [414660] Total War: Attila – The game may hang during in game cinematics on certain Kaveri platforms with a separate discrete GPU
    • [414120] The Elder Scrolls V – Skyrim : Fog / Clouds may flicker on Radeon HD 5800 series products

Download available from AMD.

AMD Bringing FreeSync Support to Catalyst Drivers on 19th March

AMD has announced that it is to release a new Catalyst graphics driver on 19th March, and that the driver will be FreeSync compatible. FreeSync is AMD’s variable refresh rate technology, preventing “screen tears” during gaming.

FreeSync-compatible monitors are now available in Europe, Africa, and the Middle East, but the tech is yet to make it to US shores. This latest driver release raises hopes that FreeSync will arrive in America soon.

It seems that drivers for AMD CrossFire setups are more difficult to develop, since they will not be available until April.

A statement from AMD reads:

“AMD is very excited that monitors compatible with AMD FreeSync technology are now available in select regions in EMEA (Europe, Middle East and Africa). We know gamers are excited to bring home an incredibly smooth and tearing-free PC gaming experience powered by AMD Radeon GPUs and AMD A-Series APUs. We’re pleased to announce that a compatible AMD Catalyst graphics driver to enable AMD FreeSync technology for single-GPU configurations will be publicly available on AMD.com starting March 19, 2015. Support for AMD CrossFire configurations will be available the following month in April 2015.”

Source: Legit Reviews