Ashes of the Singularity DirectX 12 Graphics Performance Analysis

Introduction


Ashes of the Singularity is a futuristic real-time strategy game offering frenetic contests on a large scale. The huge number of units scattered across varied environments creates an enthralling experience built around complex strategic decisions. Throughout the game, you will explore unique planets and engage in dramatic air battles. This bitter war between the human race and a masterful artificial intelligence revolves around an invaluable resource known as Turinium. If you’re into the RTS genre, Ashes of the Singularity should provide hours of entertainment. While the game itself is worthy of widespread media attention, the engine’s support for DirectX 12 and asynchronous compute has become a hot topic among hardware enthusiasts.

DirectX 12 is a low-level API with reduced CPU overhead and has the potential to revolutionise the way games are optimised for numerous hardware configurations. In contrast, DirectX 11 carries far more driver overhead, and many mainstream titles suffered from poor scaling which didn’t properly utilise the potential of current graphics technology. On another note, DirectX 12 allows users to pair GPUs from competing vendors and run multi-GPU solutions without relying on driver profiles. It’s theoretically possible to achieve widespread optimization and leverage extra performance using the latest version of the API.
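
To make the explicit multi-adapter idea more concrete, here is a minimal sketch (my own illustration, not code from any shipping engine) of how a DirectX 12 title can enumerate every GPU in the system and create a device on each one, regardless of vendor:

```cpp
// Enumerate all adapters via DXGI and create a D3D12 device on each one.
// Error handling trimmed for brevity; link against d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        ComPtr<ID3D12Device> device;
        // Succeeds on any adapter with a D3D12 driver -- AMD, NVIDIA or Intel.
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
```

Under DirectX 11, pairing mismatched cards relied on vendor driver profiles; here, the application itself decides how work is split between the devices it created.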

Of course, Vulkan is another alternative which works across various operating systems and adopts an open-source ideology. However, the focus will likely remain on DirectX 12 for the foreseeable future unless there’s a sudden reluctance from users to upgrade to Windows 10. Even though the adoption rate is impressive, a large number of PC gamers are still using Windows 7, 8 and 8.1. Therefore, it seems prudent for developers to continue with DirectX 11 and offer a DirectX 12 renderer as an optional extra. Arguably, the real gains from DirectX 12 will occur when its predecessor is disregarded completely. This will probably take a considerable amount of time, which suggests the first DirectX 12 games might show smaller performance benefits than later titles.

Asynchronous compute allows graphics cards to work on multiple workloads simultaneously and extract extra performance. AMD’s GCN architecture has extensive support for this technology. In contrast, there’s a heated debate questioning whether NVIDIA products can utilise asynchronous compute in an effective manner at all. Technically, AMD GCN graphics cards contain 2-8 Asynchronous Compute Engines (ACEs) with 8 queues per engine, depending on the model, providing single-cycle latencies. Maxwell revolves around two pipelines: one designed for high-priority workloads and another with 31 queues. Most importantly, NVIDIA cards can only “switch contexts at draw call boundaries”. This means the switching process is slower and gives AMD a major advantage. NVIDIA dismissed the early performance numbers from Ashes of the Singularity due to its development phase at the time. Now that the game has exited the beta stage, we can determine the performance numbers after optimizations were completed.
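
To illustrate where asynchronous compute enters the API, here is a hedged sketch (not Oxide’s actual code) showing how a DirectX 12 engine creates a dedicated compute queue alongside the normal graphics queue; whether the two streams genuinely overlap on the GPU is then down to the hardware and driver:

```cpp
// Create a graphics ("direct") queue and a separate compute queue.
// On hardware with proper async compute support, work submitted to the
// two queues can execute concurrently; the API only guarantees that the
// queues are independent submission streams.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};

    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics + compute + copy
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&graphicsQueue));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute + copy only
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

    // Cross-queue ordering is the application's responsibility, typically
    // handled with ID3D12Fence objects signalled on one queue and waited
    // on by the other.
}
```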

Do AMD Drivers Really Deserve Such a Hostile Reception?

Introduction


AMD has a serious image problem with their drivers which stems from buggy, unrefined updates and a slow release schedule. Even though this perception began many years ago, it’s still impacting the company’s sales and explains why their market share is so small. The Q4 2015 results from Jon Peddie Research suggest AMD reached a discrete GPU market share of 21.1% while NVIDIA reigned supreme with 78.8%. That said, the Q4 data is more promising because AMD accounted for a mere 18.8% during the previous quarter. On the other hand, respected industry journal DigiTimes reports that AMD is likely to reach its lowest ever market position for Q1 2016. Thankfully, the financial results will emerge on April 21st so we should know the full picture relatively soon. Of course, the situation should improve once Polaris and Zen reach retail channels. Most importantly, AMD’s share price has declined by more than 67% in five years, from $9 to under $3 as of March 28, 2016. The question is why?

Is the Hardware Competitive?


The current situation is rather baffling considering AMD’s extremely competitive product line-up in the graphics segment. For example, the R9 390 is a superb alternative to NVIDIA’s GTX 970 and features 8GB of VRAM, which provides extra headroom when using virtual reality equipment. The company’s strategy appears to revolve around minor differences in performance between the R9 390 and 390X. This also applied to the R9 290 and 290X due to both products utilizing the Hawaii core. NVIDIA employs a similar tactic with the GTX 970 and GTX 980, but there’s a marked price increase compared to their rivals.

NVIDIA’s ability to cater to the lower-tier demographic has been quite poor because competing AMD GPUs, such as the HD 7850 and R9 380X, provided a much better price to performance ratio. Not only that, NVIDIA’s decision to deploy ridiculously low video memory amounts on cards like the GTX 960 has the potential to cause headaches in the future. It’s important to remember that the GTX 960 can be acquired with either 2GB or 4GB of video memory. Honestly, they should have simplified the line-up and produced only the higher memory model, in a similar fashion to the R9 380X. Once again, AMD continues to offer a very generous amount of VRAM across various product tiers.

Part of the problem revolves around AMD’s sluggish release cycle and reliance on the Graphics Core Next (GCN) 1.1 architecture. This was first introduced way back in 2013 with the Radeon HD 7790. Despite its age, AMD deployed the GCN 1.1 architecture on their revised 390 series and didn’t do themselves any favours when denying accusations about the new line-up being a basic re-branding exercise. Of course, this proved to be the case and some users managed to flash their 290/290X to a 390/390X with a BIOS update. There’s nothing inherently wrong with product rebrands if they can remain competitive in the current market. It’s not exclusive to AMD, and NVIDIA have used similar business strategies on numerous occasions. However, I feel it’s up to AMD to push graphics technology forward and encourage their nearest rival to launch more powerful options.

Another criticism of AMD hardware which seems to plague everything they release is the perception that every GPU runs extremely hot. You only have to look at certain websites, social media and various forums to see this is the main source of people’s frustration. Some individuals even produce images showing AMD graphics cards ablaze. So is there any truth to these suggestions? Unfortunately, the answer is yes, and a pertinent example comes from the R9 290 range. The 290/290X reference models utilized one of the most inefficient cooler designs I’ve ever seen and struggled to keep the GPU core below 95°C under load.

Unbelievably, the core was designed to run at these high thermals, and AMD created a more progressive RPM curve to reduce noise. As a result, the GPU could take 10-15 minutes to reach idle temperature levels. The Hawaii temperatures really impacted the company’s reputation and forged a viewpoint among consumers which I highly doubt will ever disappear. It’s a shame because the upcoming Polaris architecture, built on the 14nm FinFET process, should exhibit significant efficiency gains and end the association between AMD products and high thermals. There’s also the idea that AMD GPUs have a noticeably higher TDP than their NVIDIA counterparts. For instance, the R9 390 has a TDP of 275 watts while the GTX 970 only consumes 145 watts. The gap narrows at the top end, though: the Fury X is rated at 275 watts compared to the GTX 980Ti’s 250 watts.

Eventually, AMD released a brand new range of graphics cards utilizing the first iteration of High Bandwidth Memory. Prior to its release, expectations were high and many people expected the Fury X to dethrone NVIDIA’s flagship graphics card. Unfortunately, this didn’t come to fruition and the Fury X fell behind in various benchmarks, although it fared better at high resolutions. The GPU also encountered supply problems, and early samples emitted a loud whine from the pump. Asetek even threatened to sue Cooler Master, who created the AIO design, a dispute which could force all Fury X products to be removed from sale.

The rankings alter rather dramatically when the DirectX 12 renderer is used, which suggests AMD products have a clear advantage. Asynchronous compute is the hot topic right now, as in theory it allows for greater GPU utilization in supported games. Ashes of the Singularity has implemented this for some time and makes for some very interesting findings. Currently, we’re working on a performance analysis for the game, but I can reveal that there is a huge boost for AMD cards when moving from DirectX 11 to DirectX 12. Furthermore, there are reports indicating that Pascal might not be able to use asynchronous shaders, which makes Polaris and Fiji products more appealing.

Do AMD GPUs Lack Essential Hardware Features?


When selecting graphics hardware, it’s not always about pure performance; some consumers take into account exclusive technologies such as TressFX hair before purchasing. At this time, AMD’s latest products incorporate LiquidVR, FreeSync, Vulkan support, HD3D, Frame Rate Target Control, TrueAudio, Virtual Super Resolution and more! This is a great selection of hardware features to create a thoroughly enjoyable user experience. NVIDIA adopts a more secretive attitude towards their own creations and often uses proprietary solutions. The Maxwell architecture has support for Voxel Global Illumination (VXGI), Multi-Frame Sampled Anti-Aliasing (MFAA), Dynamic Super Resolution (DSR), VR Direct and G-Sync. There’s a huge debate about the benefits of G-Sync compared to FreeSync, especially when you take into account the pricing difference when opting for a new monitor. Overall, I’d argue that the NVIDIA package is better, but there’s nothing really lacking from AMD in this department.

Have The Drivers Improved?


Historically, AMD drivers haven’t been anywhere close to NVIDIA’s in terms of stability and providing a pleasant user interface. Back in the old days, drivers from AMD, or ATI if we’re going way back, had the potential to cause system lock-ups, software errors and more. A few years ago, I had the misfortune of updating a 7850 to the latest driver, and after rebooting, the system’s boot order was corrupted. To be fair, this could be coincidental and have nothing to do with that particular update. On another note, the 290 series was plagued with bugs causing black screens and blue screens of death whilst watching Flash videos. To resolve this, you had to disable hardware acceleration and hope that the issues subsided.

The Catalyst Control Center always felt a bit primitive for my tastes although it did implement some neat features such as graphics card overclocking. While it’s easy enough to download a third-party program like MSI Afterburner, some users might prefer to install fewer programs and use the official driver instead.

Not so long ago, AMD appeared to have stalled in releasing drivers to properly optimize graphics hardware for the latest games. On the 9th December 2014, AMD unveiled the Catalyst 14.12 Omega WHQL driver and made it ready for download. In a move which still astounds me, the company then decided not to release another WHQL driver for six months! Granted, they were working on a huge driver redesign and still produced the odd beta update. I honestly believe this was very damaging and prevented high-end users from considering the 295X2 or a CrossFire configuration. It’s so important to have a consistent, solid software framework behind the hardware to allow for constant improvements. This is especially the case when using multiple cards, which require profiles to achieve proficient GPU scaling.

Crimson’s release was a major turning point for AMD due to the modernized interface and enhanced stability. According to AMD, the software package involves 25 percent more manual test cases and 100 percent more automated test cases compared to AMD Catalyst Omega. Also, the most-reported bugs were resolved, and they’re using community feedback to quickly apply new fixes. The company hired a dedicated team to reproduce errors, which is the first step to providing a more stable experience. Crimson apparently loads ten times faster than its predecessor and includes a new game manager to optimize settings to suit your hardware. It’s possible to set custom resolutions, including the refresh rate, which is handy when overclocking your monitor. The clean uninstall utility proactively works to remove any remaining elements of a previous installation, such as registry entries, audio files and much more. Honestly, this is such a revolutionary move forward and AMD deserves credit for tackling their weakest elements head on. If you’d like to learn more about Crimson’s functionality, please visit this page.

However, it’s far from perfect, and some users initially experienced worse performance with this update. Of course, there are going to be teething problems whenever a new release occurs, but it’s essential for AMD to do everything they can to forge a new reputation for their drivers. Some of you might remember the furore surrounding the Crimson fan bug which limited the GPU’s fans to 20 percent. Some users even reported that this caused their GPU to overheat and fail. Thankfully, AMD released a fix for this issue, but it shouldn’t have occurred in the first place. Once again, it’s hurting their reputation and ability to move on from old preconceptions.

Is GeForce Experience Significantly Better?


In recent times, NVIDIA drivers have been the source of some negative publicity. More specifically, users were advised to ignore the 364.47 WHQL driver and instructed to download the 364.51 beta instead. One user said:

“Driver crashed my windows and going into safe mode I was not able to uninstall and rolling back windows would not work either. I ended up wiping my system to a fresh install of windows. Not very happy here.”

NVIDIA’s Sean Pelletier released a statement at the time which reads:

“An installation issue was found within the 364.47 WHQL driver we posted Monday. That issue was resolved with a new driver (364.51) launched Tuesday. Since we were not able to get WHQL-certification right away, we posted the driver as a Beta.

GeForce Experience has an option to either show WHQL-only drivers or to show all drivers (including Beta). Since 364.51 is currently a Beta, gamers who have GeForce Experience configured to only show WHQL Game Ready drivers will not currently see 364.51

We are expecting the WHQL-certified package for the 364.51 Game Ready driver within the next 24hrs and will replace the Beta version with the WHQL version accordingly. As expected, the WHQL-certified version of 364.51 will show up for all gamers with GeForce Experience.”

As you can see, NVIDIA isn’t immune to driver delivery issues, and this was a fairly embarrassing situation. Despite this, it didn’t appear to have a serious effect on people’s confidence in the company or make them reconsider their views of AMD. While there are some disgruntled NVIDIA customers, they’re fairly loyal and distrustful of AMD’s ability to offer better drivers. The GeForce Experience software contains a wide range of fantastic inclusions such as ShadowPlay, GameStream, Game Optimization and more. After a driver update, the software can feel a bit unresponsive and takes some time to close. Furthermore, some people dislike the notion of Game Ready drivers being locked into the GeForce Experience software. If a report from PC World is correct, consumers might have to supply an e-mail address just to update their drivers through the application.

Before coming to a conclusion, I want to reiterate that my allegiances don’t lie with either company and the intention was to create a balanced viewpoint. I believe AMD’s previous failures are impacting the company’s current product range, and it’s extremely difficult to shift people’s perceptions of the company’s drivers. While Crimson is much better than CCC, it was also the source of a horrendous fan bug resulting in a PR disaster for AMD.

On balance, it’s clear AMD’s decision to separate the Radeon group and CPU line was the right thing to do. Also, with Polaris around the corner and more games utilizing DirectX 12, AMD could improve their market share substantially. Although, from my experience, many users are prepared to deal with slightly worse performance just to invest in an NVIDIA product. Therefore, AMD has to encourage long-term NVIDIA fans to switch with reliable driver updates on a consistent basis. AMD products are not lacking in features or power; it’s all about drivers! NVIDIA will always counteract AMD releases with products exhibiting similar performance numbers. In my personal opinion, AMD drivers are now on par with NVIDIA’s, and it’s a shame that they appear to be receiving unwarranted criticism. Don’t get me wrong, the fan bug is simply inexcusable and going to haunt AMD for some time. I predict that despite the company’s best efforts, the stereotypical view of AMD drivers will not subside. This is a crying shame because they are trying to improve things and release updates on a significantly lower budget than their rivals.

Far Cry Primal Graphics Card Performance Analysis

Introduction


The Far Cry franchise gained renown for its impeccable graphical fidelity and enthralling open-world environments. As a result, each release is incredibly useful for gauging the current state of graphics hardware and performance across various resolutions. However, Ubisoft’s reputation has suffered in recent years due to poor optimization on major titles such as Assassin’s Creed: Unity and Watch Dogs. This means it’s essential to analyze the PC version in a technical manner and see if it’s really worth supporting with your hard-earned cash!

Far Cry Primal utilizes the Dunia Engine 2, which was deployed on Far Cry 3 and Far Cry 4. Therefore, I’m not expecting anything revolutionary compared to the previous games. This isn’t necessarily a negative though, because the amount of detail is staggering and worthy of recognition. That said, Far Cry 4 was plagued by intermittent hitching and I really hope this has been resolved. Unlike Far Cry 3: Blood Dragon, the latest entry has a retail price of $60. According to Ubisoft, this is warranted due to the lengthy campaign and amount of content on offer. Given Ubisoft’s turbulent history with recent releases, it will be fascinating to see how each current-generation GPU fares and which brand the game favours at various resolutions.

“Far Cry Primal is an action-adventure video game developed and published by Ubisoft. It was released for the PlayStation 4 and Xbox One on February 23, 2016, and it was also released for Microsoft Windows on March 1, 2016. The game is set in the Stone Age, and revolves around the story of Takkar, who starts off as an unarmed hunter and rises to become the leader of a tribe.” From Wikipedia.

Inno3D GTX 980Ti iChill Black Graphics Card Review

Introduction


Closed-loop liquid coolers have become extremely popular in the CPU market due to the cleaner build and greater space around the CPU socket compared to traditional air cooling hardware. This means you can install an all-in-one liquid cooler without having to make concessions in terms of memory compatibility or worry too much about your motherboard’s PCI-E arrangement. As you might expect, all-in-one liquid coolers have progressively moved into the GPU sector to offer improved overclocking headroom and a lower noise output. There are some interesting parallels between CPU and GPU all-in-one liquid cooling, though, which need to be addressed.

Firstly, many air coolers like the Noctua NH-D15 can outperform Asetek units, while being much quieter. It’s a similar picture with graphics cards because proficient air cooling setups including the Gigabyte Windforce X3 and Sapphire Tri-X provide a superb noise to performance ratio. Liquid cooled graphics cards have a price premium and involve a more complicated installation process. It’s important to remember that Maxwell is a very mature and efficient architecture which allows vendors to enable a 0dB idle fan mode. Despite my own qualms about closed-loop liquid cooling, it’s fantastic to see products which cater to a different target market. There’s clearly a demand for pre-assembled liquid cooled graphics cards, and their appeal is bound to grow in the next few years.

Today, we’re taking a look at the Inno3D GTX 980Ti iChill Black which utilizes a very powerful hybrid cooling solution. The GPU incorporates a traditional fan which only switches on during heavy load, in addition to a 120mm fan/radiator combination. The Arctic Cooling radiator fan is constantly on but has a very low RPM curve to maintain silent running. This impeccable hardware allows for an impressive core clock of 1203MHz and default boost reaching 1304MHz. The memory has also been increased to 7280MHz. As you can see from the chart below, this isn’t the greatest configuration we’ve encountered from the factory, but it’s exceedingly fast and should be a top performer. It will be fascinating to contrast this graphics card with the marvellous Inno3D GTX 980Ti X3 Ultra DHS which opts for a hefty air cooling design.

Specifications:

Packaging and Accessories

The Inno3D GTX 980 Ti iChill Black comes in a huge box to properly house the closed loop cooler’s tubing and protect against leaks during shipping. Honestly, the picture doesn’t provide an accurate depiction of the packaging’s size. I have to commend Inno3D because they have taken the precautionary steps to reduce the possibility of damage occurring and utilized strong foam inserts as cushioning materials. The box itself features an attractive render of the GPU, and outlines its specification.

On the rear portion, there’s a brief synopsis of NVIDIA’s Maxwell architecture. I’m a bit surprised to see the back doesn’t contain any information about the liquid cooling solution and the acoustical benefits compared to NVIDIA’s reference cooler.

In terms of accessories, the graphics card is bundled with mounting screws, a molex to 6-pin PCI-E adapter, a case badge, a DVI-D to VGA adapter, an installation guide and a mouse mat. There’s also a driver disk which you should disregard, a copy of 3DMark, and other documentation. This is a great selection of items and provides everything you need to get started! The mouse mat is surprisingly high-quality and relatively thick.

Gigabyte GeForce GTX 980Ti Xtreme Gaming Graphics Card Review

Introduction


NVIDIA’s canny strategy of launching the Titan X at $999 and subsequently releasing the GTX 980Ti with similar performance at a significantly reduced price was a master stroke. This made the 980Ti compelling value and a great choice for high-end consumers wanting the best possible experience at demanding resolutions. Admittedly, there isn’t a GPU on the market capable of driving a 4K panel at maximum detail, but you can attain 60 frames per second with reduced settings. Evidently, the 980Ti has proved to be a popular choice, especially when you take into consideration that factory overclocked models can easily pull away from NVIDIA’s flagship graphics card. While there is some competition from the Fury X, it’s not enough to dethrone custom-cooled 980Ti models.

Some users might argue that the upcoming Pascal architecture, built on the 16nm manufacturing process and utilizing ultra-fast HBM2 memory, is reason enough to hold off buying a top-tier Maxwell solution. However, the current estimate suggests Pascal won’t launch until Q2 this year, and there’s no indication regarding pricing. As always, any new product carries a price premium, and I expect enthusiast Pascal cards to retail at a high price point. This means purchasing a Maxwell-based GPU right now isn’t a terrible option unless you require additional power for 4K gaming and have deep pockets. One of the best custom-designed GTX 980Ti cards on the market is the Gigabyte G1 Gaming. This particular GPU rapidly gained a reputation for its overclocking ability and superb Windforce triple-fan cooling hardware.

The latest addition to Gigabyte’s graphics range is the GTX 980Ti Xtreme Gaming, sporting a 1216MHz core clock, 1317MHz boost, and memory running at 7200MHz. One major improvement is the use of illuminated RGB rings behind the fans, which creates a very unusual and stylish appearance. Gigabyte’s GPU Gauntlet is a binning process which selects the best-performing chips with impressive overclocking headroom. Once discovered, the top chips are incorporated into the Xtreme Gaming and G1 Gaming series. By default, the Xtreme Gaming comes with a hefty overclock and should offer sensational performance, although I expect to see some further gains due to the excellent cooling and stringent binning procedure. Could this be the best 980Ti on the market thus far?

Specifications:

Packaging and Accessories

The product comes in a visually appealing box which outlines the extreme performance and gaming focus. I really like the sharp, dynamic logo with bright colours which draws you into the packaging.

On the rear side, there’s a brief description of the Windforce X3 cooling system, RGB illumination, GPU Gauntlet, and premium components. The clear pictures provide a great insight into the GPU’s main attributes and it’s presented in such a slick way.

In terms of accessories, the graphics card includes a driver disk, quick start guide, case badge, sweat band and PCI-E power adapter. It’s quite unusual to see a sweat band, but I’m sure it could come in handy during a trip to the gym or intense eSports contest.

Sapphire Nitro OC R9 Fury Graphics Card Review

Introduction


The initial unveiling of AMD’s Fury X was eagerly anticipated due to the advent of High Bandwidth Memory and its potential to revolutionize the size to performance ratio of modern graphics cards. This new form of stackable video RAM provided a glimpse into the future and a departure from the current GDDR5 standard. However, this isn’t going to happen overnight, as production costs and sourcing HBM on a mass scale have to be taken into consideration. On another note, JEDEC recently announced GDDR5X with memory speeds up to 14Gbps, which helps to enhance non-HBM GPUs while catering to the lower-to-mid-range market. The Fury X and Fury utilize the first iteration of High Bandwidth Memory, which features a maximum capacity of 4GB.

There’s some discussion regarding the effect of this limitation at high resolutions but I personally haven’t seen it cause a noticeable bottleneck. If anything, the Fury range is capable of outperforming the 980 Ti during 4K benchmarks while it tends to linger behind at lower resolutions. AMD’s flagship opts for a closed-loop liquid cooler to reduce temperatures and minimize operating noise. In theory, you can argue this level of cooling prowess was required to tame the GPU’s core. However, there are some air-cooled variants which allow us to directly compare between each form of heat dissipation.

Clearly, the Fury X’s water cooling apparatus adds a premium and isn’t suitable for certain chassis configurations. To be fair, most modern case layouts can accommodate a CLC graphics card without any problems, but there are also concerns regarding reliability and the possibility of leaks. That’s why air-cooled alternatives which drop the X branding offer great performance at a more enticing price point. For example, the Sapphire Nitro OC R9 Fury is around £60 cheaper than the XFX R9 Fury X. This particular card has a factory overclocked core of 1050MHz and an astounding cooling solution. The question is, how does it compare to the Fury X and GTX 980 Ti? Let’s find out!

Specifications:

Packaging and Accessories

The Sapphire Nitro OC R9 Fury comes in a visually appealing box which outlines the Tri-X cooling system, factory overclocked core, and extremely fast memory. I’m really fond of the striking robot front cover and small cut out which provides a sneak peek at the GPU’s colour scheme.

On the opposite side, there’s a detailed description of the R9 Fury range and award-winning Tri-X cooling. Furthermore, the packaging outlines information regarding LiquidVR, FreeSync, and other essential AMD features. This is displayed in an easy-to-read manner and helps inform the buyer about the graphics card’s functionality.

In terms of accessories, Sapphire includes a user’s guide, driver disk, Select Club registration code, and relatively thick HDMI cable.

Rise of the Tomb Raider Performance Analysis

Introduction


Rise of the Tomb Raider originally launched on November 10th and received widespread critical acclaim from various press outlets. Unfortunately, the game flew under the radar because Fallout 4 released on the same day. This was a strategic error which hindered the game’s sales and prevented consumers from giving it their undivided attention. It’s such a shame because Rise of the Tomb Raider is a technical marvel when you consider the Xbox One’s limited horsepower. Even though it’s not technically an exclusive, PC players had to wait until after the Christmas period to enjoy the latest exploits of everyone’s favourite heroine.

The PC version was created by Nixxes Software, who worked on the previous Tomb Raider reboot as well as a number of other graphically diverse PC games. The studio is renowned for creating highly polished and well-optimized PC versions featuring an astonishing level of graphical fidelity. Prior to release, NVIDIA recommended a GTX 970 for the optimal 1080p experience and a 980 Ti for 1440p. Since then, there have been performance patches from the developer and driver updates to help with scaling across various hardware configurations. This means it will be fascinating to see the performance numbers now that the game has matured and gone through a large number of post-release hotfixes.

“Rise of the Tomb Raider is an action-adventure video game developed by Crystal Dynamics and published by Square Enix. It is the sequel to the 2013 video game Tomb Raider, which was itself, the second reboot to its series. It was released for Xbox One and Xbox 360 in November 2015 and for Microsoft Windows in January 2016. It is set to release for PlayStation 4 in late 2016.

The game’s storyline follows Lara Croft as she ventures into Siberia in search of the legendary city of Kitezh, whilst battling a paramilitary organization that intends on beating her to the city’s promise of immortality. Presented from a third-person perspective, the game primarily focuses on survival and combat, while the player may also explore its landscape and various optional tombs. Camilla Luddington returns to voice and perform her role as Lara.” From Wikipedia.

So, let’s get to it and see how some of the latest graphics cards on the market hold up with the latest from Crystal Dynamics!

Inno3D GeForce GTX 980Ti X3 Ultra DHS Graphics Card Review

Introduction


NVIDIA’s GTX 980Ti has proved to be a very popular choice among hardware enthusiasts requiring extreme performance at demanding resolutions. Whether you’re opting for a 21:9 3440×1440 60Hz panel, a 4K display or a high refresh rate 1440p monitor, there are very few single-card configurations on the market capable of dealing with advanced AA, complex tessellation and other visually taxing effects while driving a large number of pixels. Technically, the Titan X is NVIDIA’s flagship product and its 12GB frame buffer initially appears to be an enticing proposition. However, the price to performance ratio is quite poor, especially when you consider the 980Ti is based on the same GM200 silicon and only exhibits a few cost-saving measures. Most notably, the video memory is reduced from 12GB to 6GB and the shader units have been slightly scaled back from 3072 to 2816.

Barring a few exceptions, the majority of Titan X models utilize a reference design which results in reduced overclocking headroom and higher temperatures. In contrast to this, custom cooled GTX 980Ti SKUs feature very impressive factory overclocks and enable users to access a higher power limit percentage when tackling manual core and memory boosts. As a result, it’s not uncommon for 980Ti GPUs to outperform the Titan X in certain scenarios despite costing £300-400 less. This means it is the perfect choice for the higher end demographic and also provides an improved price to performance ratio.

Today we’re looking at one of the fastest GTX 980 Ti models on the market incorporating a pre-overclocked core of 1216MHz and boost reaching 1317MHz. Additionally, the memory is set at 7280MHz compared to 7010MHz on the reference design. Given the impeccable 3-fan cooling solution, and impressive factory overclock, I expect the graphics card to perform superbly and pull away from the reference 980Ti by a noticeable margin.

Specifications:

Packaging and Accessories

The Inno3D 980Ti X3 Ultra DHS is packaged in a hefty box which does an excellent job of protecting the GPU, and bundled accessories. On another note, the box adopts a really striking design which emphasizes the extreme level of performance on offer.

The opposite side includes a brief synopsis of the GPU’s capabilities and outlines the modern features incorporated into this particular model such as High Dynamic Range (HDR).

In terms of accessories, the product comes with interchangeable cover plates, an installation guide, 3DMark digital code, power supply guidelines, driver disk, and the usual array of adapters. Please note, the 3DMark code is not pictured to prevent the serial from being used.

Another highlight is the extremely high quality elongated mouse pad. I love extended mouse pads because they allow you to neatly position your keyboard and mouse while opting for a clean, sophisticated appearance. Despite being a free addition, the mouse pad is remarkably thick and should last a long time without becoming too frayed.

AMD R9 380X 4GB Graphics Card CrossFire Review

Introduction


Here at eTeknix, we strive to give the consumer the best possible advice in every aspect of technology. Today is no different and we are excited to bring you the CrossFireX review of the newly released R9 380X graphics cards.

Based on the R9 380, which was in turn based on the R9 285, the R9 380X was designed to fill the obvious gap between the R9 380 and R9 390. Priced at just under £200, sales have proven strong in the first weeks, and board partners have given their models the usual overclocking treatment, with the average clock speed of around 1030MHz sitting roughly 50MHz higher than the ‘reference’ design.

Through our testing of both the XFX DD and Sapphire Nitro models, it was evident that performance wasn’t as high as I hoped and still left a gap to fill under the R9 390. Looking back at the R9 200 series line-up, the R9 285 was an extremely late arrival. It was based on architecture we were familiar with, but it introduced GCN 1.2, which is the foundation of the brand new R9 Fury range. To me, this leaves room for an R9 385 to be introduced to the market as the next step in the graphics card race for late 2016.

When we test in CrossFireX, we aim to use two identical graphics cards to ensure that everything is as similar as possible. When using the same cards, you can almost guarantee the same cooling capabilities, power draw, core clock and other variables. This then gives us the best possible outcome for maximum performance, as the computer does not need to compensate for any differences.

Radeon Settings: Crimson Edition Performance Analysis

Introduction


To coincide with the recent Radeon Settings: Crimson Edition release, here is our full look into the performance side using our base Windows 8.1 test system. As we all know by now, Radeon Settings is the official name of the replacement for the recently discontinued Catalyst Control Center software and comes from the newly formed Radeon Technologies graphics division.

Radeon Settings: Crimson Edition sets out to put AMD back on the map for drivers and customer support, something that has let them down in the past and seen them ridiculed on many forums and member bases. Moving on from sporadic releases of non-WHQL certified drivers, the aim is now to release six WHQL drivers a year with interim updates. That isn’t as many as the green team, but it’s a vast improvement on recent years and a huge step in the right direction.

With stability at the core of this software, four main pillars make up the bulk of the package: User Experience, Features, Performance and Efficiency.

From the first look, we saw a decent improvement in performance for the sample cards and tests, but now it’s time to test our entire catalogue of AMD graphics cards from the Fury and 300 series to see how this driver really stacks up.

Sapphire Nitro R9 380X 4GB Graphics Card Review with Back Plate!

Introduction


If you’ve been reading up on the latest R9 380X range, you may have seen stories, and possibly even reviews, suggesting that the performance wasn’t as high as many expected. I was one of those who was very disappointed with the performance compared to the R9 380 and R9 390 graphics cards, with the 380X being only around 10% faster than the R9 380. I really shouldn’t be complaining though: at a sub-£200 price point, the R9 380X is poised as a great 1440p graphics card, albeit with some settings lowered to medium or high from ultra.

Today in the test bench we have the Sapphire Nitro R9 380X. Fitting in nicely to the current Nitro range from Sapphire at just £199.99, this card features an identical cooling design to the R9 380 but with one huge visual improvement, which I will disclose later. The card features 16K capacitors for ultra-long life, and core and memory clocks increased to 1040MHz and 6000MHz respectively, so we should expect improved performance over what we have been witnessing so far.

Before anyone starts chanting “rebrand”, stop! As much as I agree that this is technically a rebrand, I’d actually re-word that to remanufacturing. A rebrand would be taking the R9 285, putting a new cooling design on there and calling it an R9 380 with no other changes. However, the R9 380 and R9 380X are remanufactured with a much more precise manufacturing procedure to squeeze as much performance as possible from the Antigua GPU core.

Packaging and Accessories

The Nitro box has changed for this card compared to the rest of the R9 Nitro range. We now have a simple portrait style box without a window to show off the contents.

I really think Sapphire want everyone to know that this card has a back plate fitted.

Accessories are simple: general paper-based material, a driver disk, a case sticker and a DVI to VGA adapter.

AMD Radeon Software: Crimson Edition First Look and Testing

Introduction


The day has finally come: AMD’s Catalyst Control Center has taken its last installation breath and the brand new AMD Radeon Software package has been released. We initially had teasers of this software a few weeks ago on November 2nd, and now we are pleased to announce that it is finally here for public download. We have been using Catalyst Control Center for around 13 years with very few graphical design changes in that time; it worked for the most part and that was about it. Personally, I found it too information-heavy and it needed either a redesigned interface or a simpler navigation menu.

Dubbed Radeon Software: Crimson Edition, it doesn’t really flow off the tongue like CCC or Catalyst 15.## did, but we are likely to see a similar numbered structure once Crimson hits full stride in January. The Radeon Software name is pretty self-explanatory, but Crimson takes a little more explaining to fully understand. Yes, the ‘AMD’ colour is red and that has been passed on to Radeon Technologies as a signature colour, but red covers a whole host of different shades such as Currant, Jam and Sangria. These are typical shades of red, though likely less known than Scarlet, Blood and Rose. It’s not fully clear whether we will be seeing software colour changes with every annual release to match the name, or if the same brushed metal background will be consistent for upcoming releases.

Here is a quick overview of the key points of Crimson Edition, which is built on a base of stability with four pillars: User Experience, Features, Performance and Efficiency. All of these together aim to produce the best overall consumer experience, whether you are gaming or just a general day-to-day user. A full explanation of these will be on the Third Page.

Without going into too much detail here, let’s get into what is aimed to make Radeon Software: Crimson Edition a new era in graphical software drivers.

Performance Overview: The Last AMD CCC 15.11.1 Beta

Introduction


An unforeseen turn of events has taken place over the last few months. AMD split up its processor and graphics divisions, and we recently heard of the demise of Catalyst Control Centre to make way for Radeon Software. I for one was not expecting to see another graphics driver before Radeon Software: Crimson Edition was released. Why? If AMD is struggling as much as turnover figures and rumours suggest, why would it waste effort on something that is being discontinued for a new version? That’s like announcing HBMv2 will be released in January but releasing an entire range of graphics cards on HBMv1 in December. I’m not saying this is a bad thing, far from it; I welcome AMD driver updates because it shows that the company is still in the running, and recent news suggests that more funding will be invested into graphics drivers in the future to level the playing field with NVIDIA. Early reports suggest that this new driver and the one just before it, 15.11, deliver very good performance improvements in newer games such as Star Wars Battlefront, Fallout 4, Assassin’s Creed, etc.

So today we take a look at the very last CCC driver, 15.11.1 Beta. If you are unaware, the naming is simply [Year].[Month]; an additional “.#” is appended if there are two or more updates within a month, numbered in chronological order. This makes it extremely easy to understand which driver is the latest, and troubleshooting is technically made easier if you can only remember approximately when you started having problems (if driver related).
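
As a purely illustrative aside, the scheme is simple enough to decode mechanically; this little sketch (hypothetical, not an AMD tool) splits a Catalyst version string into its parts:

```cpp
// Decode a Catalyst version string such as "15.11.1":
// year 2015, month 11, second release of that month.
#include <cstdio>

int main()
{
    const char* version = "15.11.1";
    int year = 0, month = 0, extra = 0;     // 'extra' stays 0 if ".#" is absent
    std::sscanf(version, "%d.%d.%d", &year, &month, &extra);
    std::printf("20%02d, month %02d, release #%d of that month\n",
                year, month, extra + 1);
    return 0;
}
```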

This final driver doesn’t bring anything new in terms of features, apart from an updated list of graphics cards eligible for higher Virtual Super Resolution modes, such as the R9 380 now supporting 3840×2160.

XFX DD R9 380X 4GB Graphics Card Review

Introduction


We’ve been reporting closely on the new AMD R9 300 series since launch, and with multiple reviews under our belt, we know a thing or two about them, such as the huge performance difference between the R9 380 and R9 390 graphics cards. Today we take a look at the brand new R9 380X provided by XFX. Based on the Antigua XT GPU with GCN 1.2, this is the newest architecture to come from AMD apart from Fiji, which features on the R9 Fury range.

When we tested the R9 380, it just didn’t have the grunt to push acceptable frame rates at 1440p, but the R9 390 was too expensive to justify those few extra FPS. This is where the R9 380X comes in: priced at just under £190 ($240), it sits almost slap bang in the middle of the two. As with most of the R9 300 series, this card has DirectX 12 support, FreeSync technology, Virtual Super Resolution (VSR) and Frame Rate Target Control, which effectively caps your total FPS to save power in non-demanding games such as Tomb Raider.

Generally at this point we would take a look at the packaging and accessories; however, this was a press sample which came in a boring brown box, and nobody wants pictures of that, do they?

12K (Triple 4K Monitors) SLI & Crossfire Graphics Card Review

Introduction


Following on from our highly popular ’12K’ (Triple 4K Monitor) upgrade article, we have new graphics cards with which we can update the results. Since the original article, things got a bit hectic and cards were coming and going extremely quickly. This meant that we didn’t have enough time in one sitting to correctly configure and run the tests, as the second (or even third) card needed to be sent on to another media outlet. We are now happy to bring you a long-awaited update featuring graphics cards such as the R9 Fury, R9 Nano and SLI GTX 980Tis. The list still isn’t complete, with gaps such as SLI Titan X and CrossFire R9 Fury, but once we get these cards in for long enough, we will carry out another update.

With 4K monitors becoming the norm in today’s enthusiast gaming set-up, thanks to the ever-decreasing price of these monitors and the increasing performance supplied by single cards, it’s not surprising that some users are combining multiple units. Some will have these monitors for the simply epic screen size and productivity potential; others will use them as an upgrade to the current surround gaming experience. Personally, I don’t like 4K resolution unless it’s on a large screen; anything under 32″ makes the pixels so small they are hard to see, and then you would just have to increase the font size, which defeats (some of) the object.

AMD R9 Nano 4GB (HBMv1) CrossFire Review

Introduction


Here at eTeknix, we strive to give the consumer the best possible advice in every aspect of technology. Today is no different and we are excited to bring you the CrossFireX review of the highly anticipated R9 Nano 4GB graphics cards.

The R9 Nano is the third release in the Fiji GPU core range and the third official graphics card to utilise High Bandwidth Memory (HBMv1). We’ve been impressed with the performance of the Fiji range so far: the fully unlocked R9 Fury X provides a good alternative to the NVIDIA GTX 980Ti, the R9 Fury is a good step up from the R9 390X and the GTX 980, and the R9 Nano is the perfect option for small form factor builds. A single R9 Nano provides the perfect balance of performance, power consumption and mobility, but will combining two still be a worthwhile option?

When we test in CrossFireX, we aim to use two identical graphics cards to ensure that everything is as similar as possible. When using the same cards, you can almost guarantee the same cooling capabilities, power draw, core clock and other variables. This then gives us the best possible outcome for maximum performance, as the computer does not need to compensate for any differences.

Batman: Arkham Knight October Re-Release Performance Update

Introduction


Yawn! Seriously, this is the third time we’ve returned to this game. Maybe it was my fault for being a bit hasty on the last large update, but that was the serious graphical update everyone on PC was waiting for, and it seemed to have worked for a large proportion of gamers. Some still found issues with that update, but because it wasn’t a re-release, it sort of got swept under the carpet.

The re-release was put back onto the Steam store at approximately 5pm on October 28th, and it hasn’t got off to the best of starts either. The performance is still not what people expected, and Warner Bros even makes the ridiculous claim that Windows 10 users require 12GB of RAM to get the best possible experience. I’m not one for giving up, but I would have gone home and cried by now if this was one of my games.

Again, we are setting ourselves the task of testing all of the latest graphics cards from the NVIDIA GTX 900 series and the AMD R9 300 and Fury ranges against the latest update. This time around, the results will be compared against the original launch figures, the re-test completed a few weeks ago, and the newest numbers, to see how far the performance and playability have come.

Let’s begin shall we?

Club3D R9 Nano 4GB HBMv1 Graphics Card Review

Introduction


eTeknix has fought hard over the last few months to be able to bring you the Fiji articles that we have; some may have been a little late, but we have managed to get them out to you one way or another. Stock levels of the Fiji core and HBM have been extremely limited, so AMD had to make the tough decision to allow only a severely limited number of media samples and funnel the rest to the consumer market.

So here it is, our R9 Nano article, supplied by Club3D. Right up until launch, we covered a lot about the card, and one thing we knew almost exactly was how it would look: an R9 Fury X copy with a fan instead of a water cooling solution. From there we took guesses at other specifications. Would it feature the full Fiji core or a cut-down version like in the R9 Fury? My money was on the latter due to the massively decreased size and single fan, so I was extremely surprised when I found out that it uses a full Fiji core as found in the R9 Fury X.

Let’s find out how this miniature monster performs in today’s review.

Packaging and accessories

I’m actually really disheartened by this box. If you are paying £500+ for a graphics card, you’d expect a slightly more premium quality box. It offers everything you could want from a box, but it just feels cheap.

The back of the box lists some key features, with some images to make it more appealing.

Club3D have really cut down on the accessories with this card: just a simple installation leaflet and a driver disc.

NVIDIA GTX 950 SLI Graphics Card Review

Introduction


Here at eTeknix, we strive to give the consumer the best possible advice in every aspect of technology. Today is no different, as we bring you the SLI review of the latest budget pleaser from NVIDIA, the GeForce GTX 950.

The NVIDIA GTX 950 is the newest card to be released based on the Maxwell architecture, aiming to bring solid 1080p performance for under £150. Board partners went completely nuts with this card and produced some simply great options that have put the squeeze on AMD in the budget gaming segment.

Now, we don’t usually consider low-end options for SLI for cost-effectiveness reasons, but some consumers may only have £150 now and £150 next month, and want to play straight away.

Because I had the choice of three different cards, the smart move was to test the two most powerful options: the Inno3D iChill GTX 950 and the MSI Gaming GTX 950. Both can be purchased for under £140 each through a wide range of retailers and work extremely well together. However, a matching set would give the best possible performance, as the computer wouldn’t need to throttle one card to match the other’s performance.

That’s enough of an introduction, let’s find out how well this compact duo performs in the real world.

How to Flash a BIOS: Graphics Card and Motherboard Edition

Introduction


Flashing a BIOS can seem a daunting process, and in some cases it really is. At any moment you have the potential to completely break the component thanks to any number of causes, from power cuts to accidental premature shutdown; although the latter normally only happens if your computer is connected to a plug socket and someone else wants it.

So why are we required to flash the BIOS? The first reason is system stability. In recent weeks we have seen the release of new graphics cards and motherboards, and while the out-of-the-box stability is great, things can only get better. In their testing facilities, manufacturers can only test so much. When testing motherboards, there are so many different types of RAM, processor, hard drive, graphics card and other options available that the manufacturer would face millions of different combinations, taking up much-needed time. If you think of game testing, there are in-house, Alpha and Beta tests; consider the early weeks after launch as the Beta testing period, where most of the issues are fixed but some may still remain.

Secondly, performance. We all want maximum performance, and while a product roughly lands within expectations when first released, after weeks or months of consumer testing there could be a new stable performance level which can be permanently locked in through the BIOS. This applies to both graphics cards and motherboards, so periodically checking for updates could unleash a decent amount of extra performance.

Updating your BIOS can bring good and bad experiences. If you are updating to a Beta BIOS, you may experience issues such as instability or even incompatibility with some external hardware, though that is extremely unlikely. The worst-case scenario is a power cut or early removal of the flash drive, corrupting the BIOS mid-write. On some motherboards and graphics cards, this can be rectified through a dual BIOS system that can repair the broken BIOS.
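
On the graphics card side, one cheap insurance policy is sanity-checking the ROM file before you flash it. The sketch below is my own illustration rather than part of any vendor’s flashing tool, and it assumes a plain legacy image: every legacy PCI expansion ROM starts with the 0x55 0xAA signature and its bytes must sum to zero modulo 256 over the length declared in the header, so a file failing either test is certainly corrupt.

```cpp
// romcheck.cpp -- minimal sketch, not a substitute for vendor tools.
#include <cstdint>
#include <cstdio>
#include <fstream>
#include <vector>

static bool LooksLikeValidRom(const char* path)
{
    std::ifstream file(path, std::ios::binary);
    std::vector<uint8_t> rom((std::istreambuf_iterator<char>(file)),
                             std::istreambuf_iterator<char>());

    // Legacy PCI expansion ROMs must start with the 0x55 0xAA signature.
    if (rom.size() < 3 || rom[0] != 0x55 || rom[1] != 0xAA)
        return false;

    // Byte at offset 2 holds the image length in 512-byte blocks.
    const size_t length = static_cast<size_t>(rom[2]) * 512;
    if (length == 0 || length > rom.size())
        return false;                        // truncated download

    // All bytes of the image must sum to zero modulo 256.
    uint8_t sum = 0;
    for (size_t i = 0; i < length; ++i) sum += rom[i];
    return sum == 0;
}

int main(int argc, char** argv)
{
    if (argc < 2) { std::puts("usage: romcheck <file.rom>"); return 1; }
    std::puts(LooksLikeValidRom(argv[1]) ? "ROM image passes basic checks"
                                         : "ROM image FAILS basic checks");
    return 0;
}
```

A pass doesn’t prove the image is right for your card, only that it isn’t obviously mangled; modern ROMs with a UEFI portion carry additional images this quick check ignores.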

Batman: Arkham Knight Performance Analysis Update

Introduction


It’s not very often that we return to a game, but in this case, we couldn’t say no. Batman AK was one of the most anticipated games of 2015 and the final entry in the Batman: Arkham series, so the hype surrounding it meant the game had to hit top marks from launch. Sadly, we all know that Batman AK had a very rough start to life, with PC performance being the major issue. Users complained about the lack of graphical settings and the 30FPS cap. While we brought you Batman AK performance figures in this state, they were never going to stay representative with the game undergoing major work.

The developers decided to pull the game from Steam and other outlets to prevent more consumers from purchasing a digital copy. This worked for a while, and they have since managed to implement a good number of updates to improve stability, graphics and overall player enjoyment.

With the overhaul to the graphics portion, we have decided to return to the game and re-test all of our current generation graphics cards on the newest drivers available. To maintain consistency with our previous Gaming performance articles, these graphics cards will be pitted against the game at the highest optimal settings across three of the most popular resolutions.

Let’s begin shall we?

PNY GTX 950 2GB Graphics Card Update Review

Introduction


Following on from our GTX 950 round-up article, PNY has sent a model which looks extremely promising. We have already seen that the GTX 950 range performs very well, and the price is attractive for a cheap gaming build.

The PNY GTX 950 is the cheapest GTX 950 that PNY offers and features a single-fan design. This is a step away from the usual dual-fan designs that PNY produces, and personally I really like the subtle styling of this one.

This card is essentially an unmodified NVIDIA version with an improved cooling design, as it features the same base and boost clock speeds as the reference model: 1024MHz base and 1188MHz boost. These speeds don’t sound that impressive, but the overclocking capability of Maxwell, and the GTX 950 range in particular, should see this card overclock well into the 1400MHz range.

Packaging and accessories

The outer box is simple and follows the design that everyone has come to expect from graphics card manufacturers by donning a ‘mascot’.

It’s refreshing not to be bombarded with information when you flip the box over. Instead, a few key features are outlined and a quick link to visit if you need more information.

Accessories are simple, with a quick installation guide, driver disc and DVI to VGA adapter.

Gainward Phoenix GTX 970 4GB Graphics Card Review

Introduction


The GTX 970 has been with us for around a year now, and in that time it has cemented itself as possibly one of the best Maxwell-based cards available. Obviously the Titan X is the most powerful, and the GTX 980Ti is the obvious option if you have £600 to spend, but the GTX 970 offers the right amount of power to sustain consumers into the next generation of cards without taking a massive hit at resale when Pascal and HBM v2 are released.

Today in the test bench is the Gainward Phoenix GTX 970. It’s nice to see manufacturers still pushing this lower line considering how popular the GTX 980Ti has been and how saturated the market already is with competing cards.

I can’t really go into a GTX 970 review without at least touching on the issues that were present with the VRAM and misadvertised specifications. When it was first released, the GTX 970 seemed like the perfect card, offering 980-level performance at a reduced price; then reviewers and consumers started to notice a drop in performance at high VRAM loads, even though usage was well within the card’s stated limit. NVIDIA utilised a segmented GDDR5 memory arrangement on this card which keeps the first 3.5GB at full speed but severely hinders the last 512MB. Along with that, the ROP count and L2 cache were advertised higher than they really were. All that being said, the GTX 970 is still a cracking card and one of my all-time favourites.

Packaging and accessories

When I opened the shipping box, I was surprised at how large the actual retail box was; the bright colours are certainly enticing.

The rear of the box is simple with key information listed. Some of the more important features regarding NVIDIA and the power of the GTX 970 are outlined with graphics.

The accessories are the usual lot: a Molex to PCIe 6-pin connector, DVI to VGA adapter, driver disc and installation manual.

PowerColor R9 390 PCS+ 8GB Graphics Card Review

Introduction


We all had mixed opinions on the R9 300 series at its release; the rebranded nature of the 200 series was seen by some as the fall of AMD and as short-changing consumers. However, while the cards are in fact rebrands, they are great cards: they provide an excellent performance boost over the previous generation and a great foundation for the Fiji range to build on.

Today in the test bench is the PowerColor R9 390 PCS+. This is the only version of the R9 390 that PowerColor offers, which is good: consumers aren’t left to choose between confusingly similar models. As with all other R9 390s, it features 8GB of VRAM, a 6000MHz effective memory clock and a core clock of over 1000MHz.
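Those memory numbers translate directly into theoretical bandwidth; here’s a minimal Python sketch, noting that the 512-bit bus width is the R9 390’s standard spec rather than something quoted on the box:

    # Theoretical bandwidth = effective clock x bus width / 8 bits per byte.
    # The 512-bit bus is the R9 390's standard spec, assumed rather than quoted.
    effective_mhz = 6000                 # "6000MHz" effective GDDR5 data rate
    bus_bits = 512

    bandwidth_gb_s = effective_mhz * 1e6 * bus_bits / 8 / 1e9
    print(f"{bandwidth_gb_s:.0f} GB/s")  # -> 384 GB/s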

This R9 390 PCS+ edition in particular features a triple-fan metal cooling shroud which hugs a large heatsink, ideal for 0dB operation at low load levels. The design of this card is deceiving: the shroud is wide at the top and tapers inwards, which makes the card look a lot larger than it actually is; it is 10mm shorter than the Gigabyte G1 Gaming and 7mm shorter than the Sapphire Tri-X cooler.

Packaging and accessories

The outer skin of the box is plain but attractive to the eye. The trio of colours and the simple design show that this is a no-fuss card, and the specifications along the bottom show that it means business.

The back of the box outlines some key features, with images to make it more appealing.

The accessories aren’t bursting from the seams with this card: PowerColor offers just the driver disc, installation manual and PCIe power adapter.

Gigabyte G1 Gaming GTX 980Ti 6GB Graphics Card Review

Introduction


The GTX 980Ti has been around for a few months now, and in that time manufacturers have had the chance not only to design and create their own versions but also to perfect the product. Gigabyte is one of the many manufacturers to give the 980Ti the magic treatment, with a revamped version of the G1 Gaming that we previously glimpsed at Computex 2015. It definitely looks the part, but until now we didn’t know the full specification, apart from a 7010MHz memory clock and a cooling capacity similar to the 600W rating of previous G1 Gaming cards.

Well, now I can finally and happily announce that we have the Gigabyte G1 Gaming GTX 980Ti on our test bench, and it is BIG; you forget just how big this card is thanks to the insane Windforce x3 cooling design. Like most of the gaming-series graphics cards released lately, this card comes with multiple clock speed presets, available only through Gigabyte’s own OC Guru II software: ECO mode at 1060MHz base and 1151MHz boost, Gaming mode at 1090MHz base and 1241MHz boost, and OC mode at 1190MHz base and 1279MHz boost. I don’t really understand why there are multiple presets, as most users will want not only the best possible performance out of the box but also to overclock the card to its maximum potential.
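For what it’s worth, the gaps between those presets are fairly small; here’s a quick Python sketch of the relative uplifts, using only the clocks quoted above:

    # Relative uplift of each OC Guru II preset over ECO mode, using the
    # clocks quoted in the review.
    presets = {                      # name: (base MHz, boost MHz)
        "ECO":    (1060, 1151),
        "Gaming": (1090, 1241),
        "OC":     (1190, 1279),
    }
    eco_base, eco_boost = presets["ECO"]
    for name, (base, boost) in presets.items():
        print(f"{name:>6}: base +{(base / eco_base - 1) * 100:4.1f}%, "
              f"boost +{(boost / eco_boost - 1) * 100:4.1f}%")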

The G1 Gaming cooling shroud has been given a new lick of paint and a small LED upgrade. As before, the Silent and Fan Stop LEDs illuminate when the fans have stopped.

The new feature is the ability to customise the LED through the bundled OC Guru II app.

Onto the specifications of this beast.

With those specifications and that price point, this card is poised to be one of the best GTX 980Ti options on the market; let’s find out, shall we?

Packaging and accessories

The outer box resembles the latest G1 Gaming designs: simple, yet with key features detailed to catch the potential customer’s eye.

The back of the box shows a breakdown of the card and goes into slightly more detail on what was shown on the front.

Gigabyte offers a ‘no-frills’ accessories package, comprising just a quick installation guide, driver disc and PCIe power cable adapter.

Palit Super JetStream GTX 980Ti 6GB Graphics Card Review

Introduction


The graphics card world experienced a shake-up when NVIDIA announced the Titan X, a 12GB VRAM monster, and then did it again just a few weeks later with the launch of the GTX 980Ti. The latter brought extreme performance to a much more affordable level: at just $600, it was around half the price of the Titan X and, in some tests, just 5% slower.
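Taking those figures at face value, the value proposition is easy to quantify; a minimal Python sketch, assuming the Titan X’s $999 launch price (which the text above doesn’t state):

    # Perf-per-dollar implied by the figures above; the $999 Titan X launch
    # price is an assumption, not something stated in this review.
    price_980ti, price_titanx = 600, 999
    perf_980ti, perf_titanx = 0.95, 1.00   # "just 5% slower" in some tests

    value = (perf_980ti / price_980ti) / (perf_titanx / price_titanx)
    print(f"GTX 980Ti: ~{value:.2f}x the performance per dollar")   # ~1.58x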

Today on the test bench we have the Palit Super JetStream GTX 980Ti; the Super JetStream branding is reserved solely for Palit’s top-end air-cooled graphics cards. Priced at just £539, it is only £10 more expensive than the reference design, yet it boasts figures that only some of the higher-priced models can achieve. The card features twin 10cm turbo fans which are designed to dissipate as much heat as possible and are translucent, to project the built-in LED lighting even further.

We know that the GTX 980Ti is the current top mainstream graphics card, and when it’s handed to a board partner it only gets better; let’s find out how this one performs in today’s review!

Packaging and accessories

The outer skin is plain, displaying only key information with the trademark black and gold theme.

The back of the box is filled with information in many different languages. The key information revolves around the display features, gaming features and new ways to game, such as VR.

This box has a front door with all of the information you could want about the graphics card itself. Not only does it give you a rendering of the card, but there is also a window into the box so you can see the GPU design directly.

Inside the box, you have an HDMI to DVI adapter, manual, driver disc, JetStream sticker and PCIe power adapter.

NVIDIA GTX 950 Round-Up Review: Three Cards Go Head to Head

Introduction


With all of the hype surrounding the GTX 900 series recently, it has been hard to imagine what the lower end of the graphics card market would hold for the refined Maxwell architecture. We originally saw Maxwell in the mighty GTX 750Ti, but it was only when the GTX 900 series was released that we received the Maxwell we know today. Our reviews and news have focused heavily on the GTX 980Ti and Titan X graphics cards, so information on the GTX 950 has been scarce, to say the least; that is about to change. In today’s review there is not one, not two, but three GTX 950s in for punishment: the ASUS STRIX GTX 950 2GB, the Inno3D iChill AIRBOSS ULTRA GTX 950 2GB and the MSI GAMING GTX 950 2GB.

The GTX 950 is hot off the manufacturing line and features modest but punchy specifications; knowing NVIDIA, less is more, and we should see a stormer of a graphics card here regardless. Most options will feature 2GB of VRAM given the product’s market positioning, but we will also see some 4GB models, which should make for a very capable SLI configuration for not a great deal of money. With a price tag of around £120 depending on the manufacturer, performance isn’t going to be outstanding compared to the bigger options; however, the expected performance at that price makes this an extremely attractive option for 1080p and online gamers. Personally, I feel that the GTX 950 will be the final piece of the puzzle for NVIDIA; it will then have a great graphics option at almost every price point.

Now, just because the GTX 950 is aimed at the lower-priced end of the market, do not assume that you are missing out on the full NVIDIA treatment. As with almost every NVIDIA GPU, you will get the following:

  • NVIDIA Surround
  • NVIDIA SLI
  • GPU Boost 2.0
  • NVIDIA G-Sync
  • DX12 Support

The GM206 GPU core makes its second appearance here, after featuring in the GTX 960; however, it has been trimmed down to bring performance in line with the lower price point. Will that be enough, or will we see another Titan X and GTX 980Ti scenario here? Let’s find out, shall we?