Ashes of the Singularity DirectX 12 Graphics Performance Analysis

Introduction


Ashes of the Singularity is a futuristic real-time strategy game offering frenetic contests on a large scale. The huge number of units scattered across varied environments creates an enthralling experience built around complex strategic decisions. Throughout the game, you will explore unique planets and engage in epic air battles. This bitter war between the human race and a masterful artificial intelligence revolves around an invaluable resource known as Turinium. If you’re into the RTS genre, Ashes of the Singularity should provide hours of entertainment. While the game itself is worthy of widespread media attention, the engine’s support for DirectX 12 and asynchronous compute has become a hot topic among hardware enthusiasts.

DirectX 12 is a low-level API with reduced CPU overhead which has the potential to revolutionise the way games are optimised for numerous hardware configurations. In contrast, DirectX 11 isn’t especially efficient, and many mainstream titles suffered from poor scaling which didn’t properly utilise the potential of current graphics technology. On another note, DirectX 12 allows users to pair GPUs from competing vendors and utilise multi-GPU solutions without relying on driver profiles. It’s theoretically possible to achieve widespread optimisation and leverage extra performance using the latest version of DirectX.
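
To appreciate what “explicit” means here, consider multi-GPU: under DirectX 11, pairing cards relied on vendor driver profiles, whereas DirectX 12 hands adapter enumeration and device creation directly to the application. Below is a minimal C++ sketch of that first step, assuming the standard Direct3D 12 and DXGI headers with d3d12.lib/dxgi.lib linkage; it illustrates the public API surface rather than any shipping engine’s code.

```cpp
// Enumerate every GPU in the system and create an independent D3D12
// device on each one. Under explicit multi-adapter, the application
// decides how work is split across these devices; no driver profile
// is involved.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicesForAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // Skip the software (WARP) adapter.

        ComPtr<ID3D12Device> device;
        // Any adapter supporting feature level 11_0 can expose a D3D12
        // device, regardless of which vendor made it.
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                        D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            devices.push_back(device);
        }
    }
    return devices;
}
```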

Of course, Vulkan is another alternative which works across various operating systems and adopts an open-source ideology. However, the focus will likely remain on DirectX 12 for the foreseeable future unless there’s a sudden reluctance from users to upgrade to Windows 10. Even though the adoption rate is impressive, there’s a large number of PC gamers still using Windows 7, 8 and 8.1. Therefore, it seems prudent for developers to continue with DirectX 11 and offer a DirectX 12 renderer as an optional extra. Arguably, the real gains from DirectX 12 will occur when its predecessor is disregarded completely. This will probably take a considerable amount of time, which suggests the first DirectX 12 games might show reduced performance benefits compared to later titles.

Asynchronous compute allows graphics cards to calculate multiple workloads simultaneously and extract extra performance. AMD’s GCN architecture has extensive support for this technology. In contrast, there’s a heated debate questioning whether NVIDIA products can even utilise asynchronous compute in an effective manner. Technically, AMD GCN graphics cards contain 2-8 asynchronous compute engines (ACEs), each managing up to 8 queues depending on the model, to provide single-cycle latencies. Maxwell revolves around two pipelines, one designed for high-priority workloads and another with 31 queues. Most importantly, NVIDIA cards can only “switch contexts at draw call boundaries”. This means the switching process is slower and gives AMD a major advantage. NVIDIA dismissed the early performance numbers from Ashes of the Singularity due to the game’s development phase at the time. Now that the game has exited beta, we can determine the performance numbers after optimisations were completed.
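
For readers wondering what asynchronous compute looks like from the developer’s side, here’s a minimal C++ sketch using the public Direct3D 12 API, assuming only the standard d3d12.h header. The queue setup is real API; whether the two streams genuinely overlap on the GPU is down to the hardware scheduler, which is precisely the AMD/NVIDIA debate described above.

```cpp
// Create a graphics queue plus a separate compute queue so compute
// work can be submitted independently of rendering.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // DIRECT queues accept graphics, compute and copy command lists.
    D3D12_COMMAND_QUEUE_DESC graphicsDesc = {};
    graphicsDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&graphicsDesc, IID_PPV_ARGS(&graphicsQueue));

    // COMPUTE queues accept only compute and copy command lists.
    // Submitting work here tells the driver it may execute
    // asynchronously alongside the graphics queue; GPUs that cannot
    // run both concurrently simply serialize the two streams.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));
}
```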

Do AMD Drivers Really Deserve Such a Hostile Reception?

Introduction


AMD has a serious image problem with their drivers which stems from buggy, unrefined updates and a slow release schedule. Even though this perception began many years ago, it’s still impacting the company’s sales and explains why their market share is so small. The Q4 2015 results from Jon Peddie Research suggest AMD reached a discrete GPU market share of 21.1% while NVIDIA reigned supreme with 78.8%. That said, the Q4 data is more promising because AMD accounted for a mere 18.8% during the previous quarter. On the other hand, respected industry journal DigiTimes reports that AMD is likely to reach its lowest ever market position for Q1 2016. Thankfully, the financial results will emerge on April 21st so we should know the full picture relatively soon. Of course, the situation should improve once Polaris and Zen reach retail channels. Most importantly, AMD’s share price has declined by more than 67% in five years, from $9 to under $3 as of March 28, 2016. The question is why?

Is the Hardware Competitive?


The current situation is rather baffling considering AMD’s extremely competitive product line-up in the graphics segment. For example, the R9 390 is a superb alternative to NVIDIA’s GTX 970 and features 8GB of VRAM, which provides extra headroom when using virtual reality equipment. The company’s strategy appears to revolve around minor differences in performance between the R9 390 and 390X. This also applied to the R9 290 and 290X due to both products utilizing the Hawaii core. NVIDIA employs a similar tactic with the GTX 970 and GTX 980, but there’s a marked price increase compared to their rivals.

NVIDIA’s ability to cater towards the lower-tier demographic has been quite poor because competing GPUs including the 7850 and R9 380X provided a much better price-to-performance ratio. Not only that, NVIDIA’s decision to deploy ridiculously small amounts of video memory on cards like the GTX 960 has the potential to cause headaches in the future. It’s important to remember that the GTX 960 can be acquired with either 2GB or 4GB of video memory. Honestly, they should have simplified the process and only produced the higher-memory model, in a similar fashion to the R9 380X. Once again, AMD continues to offer a very generous amount of VRAM across various product tiers.

Part of the problem revolves around AMD’s sluggish release cycle and reliance on the Graphics Core Next (GCN) 1.1 architecture, first introduced way back in 2013 with the Radeon HD 7790. Despite its age, AMD deployed the GCN 1.1 architecture on their revised 390 series and didn’t do themselves any favours by denying accusations that the new line-up was a basic re-branding exercise. Of course, this proved to be the case, and some users managed to flash their 290/290X to a 390/390X with a BIOS update. There’s nothing inherently wrong with product rebrands if they can remain competitive in the current market. It’s not exclusive to AMD, and NVIDIA has used similar business strategies on numerous occasions. However, I feel it’s up to AMD to push graphics technology forward and encourage their nearest rival to launch more powerful options.

Another criticism of AMD hardware, which seems to plague everything they release, is the perception that every GPU runs extremely hot. You only have to look at certain websites, social media and various forums to see this is the main source of people’s frustration. Some individuals are even known to produce images showing AMD graphics cards set ablaze. So is there any truth to these suggestions? Unfortunately, the answer is yes, and a pertinent example comes from the R9 290 range. The 290/290X reference models utilized one of the most inefficient cooler designs I’ve ever seen and struggled to keep the GPU core running below 95°C under load.

Unbelievably, the core was designed to run at these high thermals, and AMD created a more progressive RPM curve to reduce noise. As a result, the GPU could take 10-15 minutes to return to idle temperature levels. The Hawaii temperatures really impacted the company’s reputation and forged a viewpoint among consumers which I highly doubt will ever disappear. It’s a shame because the upcoming Polaris architecture, built on the 14nm FinFET process, should exhibit significant efficiency gains and end the concept of high thermals on AMD products. There’s also the idea that AMD GPUs have a noticeably higher TDP than their NVIDIA counterparts. For instance, the R9 390 has a TDP of 275 watts while the GTX 970 only consumes 145 watts, although the gap narrows at the very top end, where the Fury X is rated at 275 watts against the GTX 980 Ti’s 250 watts.

Eventually, AMD released a brand new range of graphics cards utilizing the first iteration of high bandwidth memory. Prior to its release, expectations were high and many people expected the Fury X to dethrone NVIDIA’s flagship graphics card. Unfortunately, this didn’t come to fruition and the Fury X fell behind in various benchmarks, although it fared better at high resolutions. The GPU also encountered supply problems and emitted a loud whine from the pump on early samples. Asetek even threatened to sue Cooler Master, who created the AIO design, which could have forced all Fury X products to be removed from sale.

The rankings alter rather dramatically when the DirectX 12 renderer is used, which suggests AMD products have a clear advantage. Asynchronous compute is the hot topic right now, which in theory allows for greater GPU utilization in supported games. Ashes of the Singularity has implemented this for some time and makes for some very interesting findings. Currently, we’re working on a performance analysis for the game, but I can reveal that there is a huge boost for AMD cards when moving from DirectX 11 to DirectX 12. Furthermore, there are reports indicating that Pascal might not be able to use asynchronous shaders, which makes Polaris and Fiji products more appealing.

Do AMD GPUs Lack Essential Hardware Features?


When selecting graphics hardware, it’s not always about pure performance, and some consumers take into account exclusive technologies such as TressFX hair before purchasing. At this time, AMD’s latest products incorporate LiquidVR, FreeSync, Vulkan support, HD3D, Frame Rate Target Control, TrueAudio, Virtual Super Resolution and more! This is a great selection of hardware features to create a thoroughly enjoyable user experience. NVIDIA adopts a more secretive attitude towards their own creations and often uses proprietary solutions. The Maxwell architecture has support for Voxel Global Illumination (VXGI), Multi-Frame Sampled Anti-Aliasing (MFAA), Dynamic Super Resolution (DSR), VR Direct and G-Sync. There’s a huge debate about the benefits of G-Sync compared to FreeSync, especially when you take into account the pricing difference when opting for a new monitor. Overall, I’d argue that the NVIDIA package is better, but there’s nothing really lacking from AMD in this department.

Have The Drivers Improved?


Historically, AMD drivers haven’t been anywhere close to NVIDIA’s in terms of stability and providing a pleasant user interface. Back in the old days, AMD (or ATI, if we’re going way back) had the potential to cause system lock-ups, software errors and more. A few years ago, I had the misfortune of updating a 7850 to the latest driver and, after rebooting, the system’s boot order was corrupt. To be fair, this could be coincidental and have nothing to do with that particular update. On another note, the 290 series was plagued with bugs causing black screens and blue screens of death whilst watching Flash videos. To resolve this, you had to disable hardware acceleration and hope that the issues subsided.

The Catalyst Control Center always felt a bit primitive for my tastes although it did implement some neat features such as graphics card overclocking. While it’s easy enough to download a third-party program like MSI Afterburner, some users might prefer to install fewer programs and use the official driver instead.

Not so long ago, AMD appeared to have stalled in releasing drivers to properly optimize graphics hardware for the latest games. On 9th December 2014, AMD unveiled the Catalyst 14.12 Omega WHQL driver and made it ready for download. In a move which still astounds me, the company then decided not to release another WHQL driver for six months! Granted, they were working on a huge driver redesign and still produced the odd beta update. I honestly believe this was very damaging and prevented high-end users from considering the 295X2 or a CrossFire configuration. It’s so important to have a consistent, solid software framework behind the hardware to allow for constant improvements. This is especially the case when using multiple cards, which require profiles to achieve proficient GPU scaling.

Crimson’s release was a major turning point for AMD due to the modernized interface and enhanced stability. According to AMD, the software package involves 25 percent more manual test cases and 100 percent more automated test cases compared to AMD Catalyst Omega. Also, the most frequently reported bugs were resolved, and they’re using community feedback to quickly apply new fixes. The company hired a dedicated team to reproduce errors, which is the first step to providing a more stable experience. Crimson apparently loads ten times faster than its predecessor and includes a new game manager to optimize settings to suit your hardware. It’s possible to set custom resolutions, including the refresh rate, which is handy when overclocking your monitor. The clean uninstall utility proactively works to remove any remaining elements of a previous installation, such as registry entries, audio files and much more. Honestly, this is such a revolutionary move forward and AMD deserves credit for tackling their weakest elements head on. If you’d like to learn more about Crimson’s functionality, please visit this page.

However, it’s far from perfect and some users initially experienced worse performance with this update. Of course, there are going to be teething problems whenever a new release occurs, but it’s essential for AMD to do everything they can to forge a new reputation for their drivers. Some of you might remember the furore surrounding the Crimson fan bug which limited the GPU’s fans to 20 percent. Some users even reported that this caused their GPU to overheat and fail. Thankfully, AMD released a fix for this issue, but it shouldn’t have occurred in the first place. Once again, it’s hurting their reputation and ability to move on from old preconceptions.

Is GeForce Experience Significantly Better?


In recent times, NVIDIA drivers have been the source of some negative publicity. More specifically, users were advised to ignore the 364.47 WHQL driver and instructed to download the 364.51 beta instead. One user said:

“Driver crashed my windows and going into safe mode I was not able to uninstall and rolling back windows would not work either. I ended up wiping my system to a fresh install of windows. Not very happy here.”

NVIDIA’s Sean Pelletier released a statement at the time which reads:

“An installation issue was found within the 364.47 WHQL driver we posted Monday. That issue was resolved with a new driver (364.51) launched Tuesday. Since we were not able to get WHQL-certification right away, we posted the driver as a Beta.

GeForce Experience has an option to either show WHQL-only drivers or to show all drivers (including Beta). Since 364.51 is currently a Beta, gamers who have GeForce Experience configured to only show WHQL Game Ready drivers will not currently see 364.51

We are expecting the WHQL-certified package for the 364.51 Game Ready driver within the next 24hrs and will replace the Beta version with the WHQL version accordingly. As expected, the WHQL-certified version of 364.51 will show up for all gamers with GeForce Experience.”

As you can see, NVIDIA isn’t immune to driver delivery issues and this was a fairly embarrassing situation. Despite this, it didn’t appear to have a serious effect on people’s confidence in the company or make them re-consider their views of AMD. While there are some disgruntled NVIDIA customers, they’re fairly loyal and distrustful of AMD’s ability to offer better drivers. The GeForce Experience software contains a wide range of fantastic inclusions such as ShadowPlay, GameStream, game optimization and more. After a driver update, the software can feel a bit unresponsive and take some time to close. Furthermore, some people dislike the notion of Game Ready drivers being locked into the GeForce Experience software. If a report from PC World is correct, consumers might have to supply an e-mail address just to update their drivers through the application.

Before coming to a conclusion, I want to reiterate that my allegiances don’t lie with either company and the intention was to create a balanced viewpoint. I believe AMD’s previous failures are impacting the company’s current product range and it’s extremely difficult to shift people’s perceptions about the company’s drivers. While Crimson is much better than CCC, it introduced a horrendous fan bug, resulting in a PR disaster for AMD.

On balance, it’s clear AMD’s decision to separate the Radeon group and CPU line was the right thing to do. Also, with Polaris around the corner and more games utilizing DirectX 12, AMD could improve their market share substantially. Although, from my experience, many users are prepared to deal with slightly worse performance just to invest in an NVIDIA product. Therefore, AMD has to encourage long-term NVIDIA fans to switch with reliable driver updates on a consistent basis. AMD products are not lacking in features or power; it’s all about drivers! NVIDIA will always counteract AMD releases with products exhibiting similar performance numbers. In my personal opinion, AMD drivers are now on par with NVIDIA’s and it’s a shame that they appear to be receiving unwarranted criticism. Don’t get me wrong, the fan bug is simply inexcusable and going to haunt AMD for some time. I predict that despite the company’s best efforts, the stereotypical view of AMD drivers will not subside. This is a crying shame because they are trying to improve things and release updates on a significantly lower budget than their rivals.

Hitman PC Patch 1.03 Enhances DirectX 12 Performance

The Hitman franchise quickly became a firm favourite with stealth aficionados due to its tense takedowns and open world environments. Even though Hitman Absolution felt a little out of place, the game’s core mechanics were solid and provided an excellent experience. Its sequel was originally intended to be a standard full-price release, but this was quickly changed to suit an episodic business model. This sudden alteration raised concerns about the game’s content and value proposition. However, the reception for the first episode was overwhelmingly positive, and it looks like the episodic approach helped improve the overall pacing.

Today, the latest Hitman patch has been released, which includes a number of performance enhancements for the DirectX 12 renderer. This is vital because the DirectX 12 option created performance issues on numerous setups. There are also new challenges and a host of other content in the 1.03 update. Here is the changelog in full:

General game improvements

  • The Vampire Magician Challenge Pack: 10 new challenges that were inspired by how our community have been playing the game.
  • Continued improvements to load times: The improvements will be most notable when loading The Showstopper mission in Paris and we’re already working on improving all loading times even more.
  • Improved responsiveness for in-game menus and image loading: We’ve implemented an image caching system that will improve responsiveness and loading times for all images in the game menu.
  • Fixed issues with scoring: Primarily, this fixes an issue that resulted in many players earning a “0 second” time bonus and an incorrect score of 210,000. A leaderboard reset will be implemented at a later date.
  • Continued improvements to connectivity: Server stability improvements.

PC-specific improvements

General gameplay

  • Prompts for dumping a body now appear, regardless of the positioning of the game camera, both in Showstopper and Final Test missions.
  • 47 no longer drops a body immediately after starting to drag it.
  • Unnoticed kills or subdue will no longer trigger a brief “compromised” state. This previously caused players to fail the ‘never spotted’ reward.
  • The ‘visibly armed’ warning is now always shown when 47 is carrying a large weapon on his back.
  • The light rig (Showstopper) and life raft (Freeform Training) can now be reliably dropped using explosives.
  • Opportunities in Showstopper and Final Test have been made more consistent.
  • Fixed an issue where the first few seconds of the game would appear to run at double speed.

Showstopper

  • Novikov will no longer get stuck in an infinite loop during the Rare Scoop opportunity. This would happen if the player gets Max Decker to call Novikov whilst he is on his way to the interview.
  • Novikov will no longer talk to Dalia on the phone when she has already been eliminated.
  • Fixed a rare issue that made the “In Seine” challenge impossible to complete. This would happen when the Private Meeting opportunity overlapped with Novikov on stage at the Fashion Show.
  • Fixed an issue where the Showstopper mission could not be completed if the player calls Dalia during the Fireworks show whilst disguised as Helmut Kruger.

Crash Fixes

  • Fixed various crash issues that could occur when doing the following: loading a new stage in Escalation Contracts, loading the game or after exploding a gas cylinder.
  • Fixed a crash that occurred when shooting at the target in Freeform training during the “Searching” state.
  • Fixed many other crash issues that were occurring during gameplay.

Options

  • Added an option to the game launcher that allows players to override the ‘default memory safeguards’ and allow them to use any available resolution and graphic quality settings.
  • Increased resolution cap for players with 3GB graphics cards from 1920×1200 to 2560×1600.

DX12

  • DX12 shader and pipeline cache now included in all Steam builds.
  • Fixed a graphical issue with transparent windows on DX12.
  • Improved vertex colours on DX12.

Misc.

  • Briefings for Featured Contracts can now be viewed at the planning stage.
  • The ‘Contract Assassin’ achievement can now be completed.
  • The mouse cursor is now always visible on the ‘Load Game’ menu.
  • Fixed an issue where the game would get stuck at 12% loading between the Prologue missions.
  • Fixed a rare issue where players were asked to (re)download the Showstopper mission.

Hardware

  • Fixed a rare issue where the game was switching between controller and keyboard controls, due to detecting specific hardware as a ‘game controller’.

First AMD Zen Benchmarks Were an April Fools Prank

UPDATE: We have now been informed by a kind reader that this was an April Fools joke, so please disregard the benchmark images below.

AMD has been lagging behind in the enthusiast CPU market and really struggling to compete with Intel’s flagship products. This isn’t a shocking revelation when you consider AMD is still using the aging AM3+ and FM2+ sockets to house its current processor line-up. Thankfully, Zen is upon us, bringing the first major socket change in a considerable amount of time. We’re all hoping that AMD can become competitive again and that Zen helps bring innovation back to the stagnant CPU market. AMD’s President and CEO, Lisa Su, provided a small insight into Zen’s performance numbers and suggested it will bring a 40 percent IPC boost over the current line-up. Up to this point, all performance benchmarks have been kept under wraps and any numbers revolved around pure speculation.

However, images provided by Bits&Chips appeared to illustrate the performance differences between an octa-core AMD Zen CPU and competing products. The CPU’s FP32 ray-trace score outperforms the i7-4930K but it’s not as impressive as the CPU hash results. This suggests the architecture might implement a weaker FMA unit.

On a more positive note, DDR4 bandwidth performance is impressive and competes with the i7-6700K. The CPU hash result is significantly better than the i7-5820K’s and even surpasses a 20-core Xeon. Only time will tell if AMD’s latest processors can offer similar performance to Intel products and instigate a pricing war. Currently, the i7-6700K is extremely expensive for a 4-core CPU and there needs to be some competition to drive innovation. I cannot wait to get my hands on AMD’s AM4 motherboards and finally see if they’ve come up with the goods. The basic data we have so far and information from AMD is promising, but nothing is certain until testing has been completed by independent sources.

Do you think AMD will be able to have a much stronger foothold in the CPU market once Zen releases?

MSI Z170A GAMING M5 (LGA1151) Motherboard Review

Introduction

Since the release of Intel’s Z170 chipset, MSI has unveiled a fantastic, feature-rich motherboard range which caters to contrasting user demands. For example, the Z170A GAMING PRO CARBON is an excellent choice for consumers wanting a stylish black colour scheme and great reliability at an affordable price point. In contrast, the MSI Z170A XPOWER GAMING TITANIUM Edition’s gorgeous aesthetic design makes it one of the most innovative LGA1151 motherboards on the market. Of course, the iconic dragon styling on many MSI products has become a popular choice among the core gaming demographic. This red and black theme complements mainstream hardware and retails at very competitive prices across various performance tiers.

The MSI Z170A GAMING M5 is a mid-range motherboard sporting an attractive design and impressive specification. More specifically, the product is capable of housing two PCI-E M.2 storage devices and has support for USB 3.1 Gen2 connectivity. Not only that, the motherboard includes a one-year premium XSplit license and Nahimic audio enhancements. As you might expect, many of MSI’s leading technologies are incorporated, such as DDR4 Boost, Game Boost and much more. Given this particular model’s astonishing software suite and Military Class components, I expect to see it rival higher-priced offerings rather well. Could this be the best value Z170 motherboard thus far for high-end users? Let’s find out!

Specifications

Packing and Accessories

MSI always does a phenomenal job when it comes to packaging design and the Z170A GAMING M5 is no different. The bold colours and stunning product snapshot contrast extremely well. This is one of the most eye-catching motherboard boxes I’ve seen and it showcases the motherboard’s beautiful appearance.

On the opposite side, there’s a brief synopsis of the motherboard’s key selling points, such as support for 3-way CrossFire, 2-way SLI and Audio Boost 3.0. This is presented in a slick manner and doesn’t alienate the end-user with technical jargon.

In terms of accessories, the motherboard comes with a user’s guide, driver disk, metal case badge, I/O shield, SLI bridge, registration details, basic installation guide and four SATA cables. Please note, the press sample I received was previously used by another media outlet, so there are only three SATA cables displayed in the photograph. Rest assured, the retail version will include four and be packaged without the need for an elastic band.

ASUS Sabertooth Z170 S (LGA1151) Motherboard Review

Introduction

ASUS has compiled a comprehensive Z170 motherboard range which caters to different sections of the consumer market. For example, the GAMING PRO line-up offers superb functionality and impeccable stability at an affordable price point. ROG products evoke a more premium feel and include a stunning software suite for power users. The Sabertooth brand revolves around a stringent testing procedure to ensure each motherboard exhibits unprecedented reliability. The extreme thermal testing and deployment of TUF components prioritize long-term durability. As a result, it’s a great option for consumers who demand a very high-end motherboard and have no intention of upgrading in the near future.

Typically, motherboards opt for a red and black colour scheme because it’s the most popular option among the core gaming audience. Some time ago, ASUS unleashed the limited edition Z97 Sabertooth Mark S, which utilizes an innovative white PCB and military camouflage. It’s quite rare to see a motherboard sporting a white theme, and while there is some competition from the MSI Krait series, ASUS is the only manufacturer to offer a pure white PCB.

The Sabertooth Z170 S is the spiritual successor to the Z97 Sabertooth Mark S and features a very unusual design philosophy. When adopting such a wacky colour scheme, it’s bound to have a polarizing reception and I’m fascinated to hear feedback on ASUS’ aesthetic choices. Looking beyond the visual aspects, I’m expecting to see some very impressive numbers given the premium electronics and DIGI+ Power Control.

Specifications

Packing and Accessories

The motherboard’s box is characterized by a white finish and contains camouflage highlights. This provides a great insight into the product’s unconventional styling and creates a web of intrigue. On the front, information regarding the 5 year warranty is displayed in a clear manner.

Moving onto the opposite side, there’s a detailed description of the product’s thermal radar monitoring made possible by the TUF ICe processor. On another note, the packaging outlines the rear I/O connectivity and basic motherboard layout.

Included in the package are a user’s guide, M.2 screws, driver disk, case badge, certificate of reliability, stickers and a gorgeous white I/O shield. As someone who loves the technical details of motherboards, it’s fantastic to read the reliability assessment document. Here, you can browse information regarding a huge array of tests such as moisture resistance, thermal shock, solder bath, salt spray and more!

There are also four SATA cables, a Q-Connector, CPU installation tool, rear I/O dust cover and SLI bridge. The Q-Connector is a really handy tool which eliminates the frustration factor when attaching front panel headers. Furthermore, the CPU installation tool is designed to minimize the contact time and pressure between your fingers and the CPU. While it’s not necessary for veteran builders, it could prevent beginners from causing damage during the build process.

Far Cry Primal Graphics Card Performance Analysis

Introduction


The Far Cry franchise gained renown for its impeccable graphical fidelity and enthralling open world environments. As a result, each release is incredibly useful for gauging the current state of graphics hardware and performance across various resolutions. However, Ubisoft’s reputation has suffered in recent years due to poor optimization on major titles such as Assassin’s Creed: Unity and Watch Dogs. This means it’s essential to analyze the PC version in a technical manner and see if it’s really worth supporting with your hard-earned cash!

Far Cry Primal utilizes the Dunia Engine 2 which was deployed on Far Cry 3 and Far Cry 4. Therefore, I’m not expecting anything revolutionary compared to the previous games. This isn’t necessarily a negative though, because the amount of detail is staggering and worthy of recognition. Saying that, Far Cry 4 was plagued by intermittent hitching and I really hope this has been resolved. Unlike Far Cry 3: Blood Dragon, the latest entry has a full retail price of $60. According to Ubisoft, this is warranted due to the lengthy campaign and amount of content on offer. Given Ubisoft’s turbulent history with recent releases, it will be fascinating to see how each GPU from this generation fares and which brand the game favours at numerous resolutions.

“Far Cry Primal is an action-adventure video game developed and published by Ubisoft. It was released for the PlayStation 4 and Xbox One on February 23, 2016, and it was also released for Microsoft Windows on March 1, 2016. The game is set in the Stone Age, and revolves around the story of Takkar, who starts off as an unarmed hunter and rises to become the leader of a tribe.” From Wikipedia.

Gigabyte Z170-Gaming K3 (LGA1151) Motherboard Review

Introduction


Intel’s current iteration of enthusiast processors offering impressive overclocking headroom incurs a fairly hefty premium compared to the previous generation, especially if you’re opting for the i7-6700K. Unfortunately, the retail version of this CPU sporting a 3-year warranty still teeters around the £300 mark and falls into a similar budget to the 6-core 5820K. The real savings when selecting the Z170 chipset revolve around cheaper motherboards, which usually cater towards the gaming demographic with LED illumination, unusual colour schemes and a comprehensive software suite. It’s astonishing to see the kind of performance and bundled list of features on products under £100. At this price, there’s fierce competition, and some manufacturers have struggled to outline the value of H170/B150 alternatives due to the narrow price difference to affordable Z170 options.

The latest motherboard from Gigabyte targets the mainstream audience utilizing a single discrete graphics card and an overclocked Skylake processor. While it does technically support CrossFire, the lack of x8/x8 functionality might be a deal breaker for users wanting the absolute maximum bandwidth. There’s also no support for SLI setups, which may be a contentious issue. To be honest, I don’t see this as a huge problem because the motherboard retails for approximately £95 and dual-card configurations are fairly niche in today’s market. Despite the very low price, Gigabyte has still implemented 32Gb/s M.2 storage (see the note on that figure below), a great audio solution and USB 3.1. From the outset, it seems Gigabyte managed to achieve this excellent specification on a budget by removing SLI support. I’m interested to see how the stock performance numbers compare to high-end solutions and determine the motherboard’s credentials. Could this be the best value gaming Z170 motherboard ever released? Let’s find out!
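
As a quick note on that 32Gb/s M.2 figure: it comes from the slot’s PCI-E 3.0 x4 link, i.e. 4 lanes × 8GT/s = 32GT/s raw, which after the interface’s 128b/130b encoding works out to roughly 3.9GB/s of usable bandwidth, far beyond the 6Gb/s ceiling of SATA-based storage.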

Specifications

Packing and Accessories

The Z170-Gaming K3 is packaged in a visually appealing box showcasing the attractive dual tone PCB design. This draws you into the product and evokes a sense of excitement prior to the unboxing process.

On the rear, there’s a detailed description of the motherboard’s high-speed connectivity, premium audio hardware and networking with traffic prioritization for gaming purposes. This is presented in an easy-to-understand manner, and the clear pictures do a great job of relaying technical information without bamboozling the end-user.

In terms of accessories, the motherboard comes with an I/O shield, user’s guide, G Connector, case badge and SATA cables opting for a very stylish metallic look. This is the first time I’ve seen a colour of this ilk, but I have to admit it’s a nice touch and looks fantastic. The G Connector is really useful when connecting those fiddly front panel connectors and improves the user experience when building a new system. Other additions include a rather fetching door hanger, driver disk, and World of Warships content code.

Inno3D GTX 980Ti iChill Black Graphics Card Review

Introduction


Closed-loop liquid coolers have become extremely popular in the CPU market due to the cleaner build and greater space around the CPU socket compared to traditional air cooling hardware. This means you can install an all-in-one liquid cooler without having to make concessions in terms of memory compatibility or worry too much about your motherboard’s PCI-E arrangement. As you might expect, all-in-one liquid coolers have progressively moved into the GPU sector to offer improved overclocking headroom and a lower noise output. There are some interesting parallels between CPU and GPU all-in-one liquid cooling, though, which need to be addressed.

Firstly, many air coolers like the Noctua NH-D15 can outperform Asetek units while being much quieter. It’s a similar picture with graphics cards because proficient air cooling setups, including the Gigabyte Windforce X3 and Sapphire Tri-X, provide a superb noise-to-performance ratio. Liquid-cooled graphics cards carry a price premium and involve a more complicated installation process. It’s important to remember that Maxwell is a very mature and efficient architecture which allows vendors to enable a 0dB idle fan mode. Despite my own qualms about closed-loop liquid cooling, it’s fantastic to see products which cater to a different target market. There’s clearly a demand for pre-assembled liquid-cooled graphics cards, and their appeal is bound to grow in the next few years.

Today, we’re taking a look at the Inno3D GTX 980Ti iChill Black which utilizes a very powerful hybrid cooling solution. The GPU incorporates a traditional fan, which only switches on during heavy load, in addition to a 120mm fan/radiator combination. The Arctic Cooling radiator fan is constantly on but has a very low RPM curve to maintain silent running. This impeccable hardware allows for an impressive core clock of 1203MHz and a default boost reaching 1304MHz. The memory has also been increased to 7280MHz. As you can see from the chart below, this isn’t the greatest factory configuration we’ve encountered, but it’s exceedingly fast and should be a top performer. It will be fascinating to contrast this graphics card with the marvellous Inno3D GTX 980Ti X3 Ultra DHS, which opts for a hefty air cooling design.

Specifications:

Packing and Accessories

The Inno3D GTX 980 Ti iChill Black comes in a huge box to properly house the closed loop cooler’s tubing and protect against leaks during shipping. Honestly, the picture doesn’t provide an accurate depiction of the packaging’s size. I have to commend Inno3D because they have taken the precautionary steps to reduce the possibility of damage occurring and utilized strong foam inserts as cushioning materials. The box itself features an attractive render of the GPU, and outlines its specification.

On the rear portion, there’s a brief synopsis of NVIDIA’s Maxwell architecture. I’m a bit surprised to see the back doesn’t contain any information about the liquid cooling solution and the acoustical benefits compared to NVIDIA’s reference cooler.

In terms of accessories, the graphics card is bundled with mounting screws, a 6-pin PCI-E to Molex adapter, case badge, DVI-D to VGA adapter, mouse mat and installation guide. There’s also a driver disk, which you should disregard in favour of the latest download, a copy of 3DMark, and other documentation. This is a great selection of items and provides everything you need to get started! The mouse mat is surprisingly high-quality and relatively thick.

Gigabyte GeForce GTX 980Ti Xtreme Gaming Graphics Card Review

Introduction


NVIDIA’s cogent strategy to launch the Titan X at $999 and subsequently release the GTX 980Ti with similar performance at a significantly reduced price was a master stroke. This made the 980Ti compelling value and a great choice for high-end consumers wanting the best possible experience at demanding resolutions. Admittedly, there isn’t a GPU on the market capable of driving a 4K panel at maximum detail, but you can attain 60 frames-per-second with reduced settings. Evidently, the 980Ti has proved to be a popular choice, especially when you take into consideration that factory overclocked models can easily pull away from NVIDIA’s flagship graphics card. While there is some competition from the Fury X, it’s not enough to dethrone custom-cooled 980Ti models.

Some users might argue that the upcoming Pascal architecture, built on the 16nm manufacturing process and utilizing ultra-fast HBM2 memory, is reason enough to hold off buying a top-tier Maxwell solution. However, the current estimate suggests Pascal won’t launch until Q2 this year, and there’s no indication regarding pricing. As always, any new product has a price premium and I expect enthusiast Pascal cards to retail at a high price point. This means purchasing a Maxwell-based GPU right now isn’t a terrible option unless you require additional power to enjoy 4K gaming and have deep pockets. One of the best custom-designed GTX 980Ti cards on the market is the Gigabyte G1 Gaming. This particular GPU rapidly gained a reputation for its overclocking ability and superb Windforce triple-fan cooling hardware.

The latest addition to Gigabyte’s graphics range is the GTX 980Ti Xtreme Gaming sporting a 1216MHz core clock, 1317MHz boost, and memory running at 7200MHz. One major improvement is the use of illuminated RGB rings behind the fans, which creates a very unusual and stylish appearance. Gigabyte’s GPU Gauntlet is a binning process which selects the best-performing chips with impressive overclocking headroom. Once discovered, the top chips are incorporated into the Xtreme Gaming and G1 Gaming series. By default, the Xtreme Gaming is bundled with a hefty overclock and should offer sensational performance, although I expect to see some further gains due to the excellent cooling and stringent binning procedure. Could this be the best 980Ti on the market thus far?

Specifications:

Packing and Accessories

The product comes in a visually appealing box which outlines the extreme performance and gaming focus. I really like the sharp, dynamic logo with bright colours which draws you into the packaging.

On the rear side, there’s a brief description of the Windforce X3 cooling system, RGB illumination, GPU Gauntlet, and premium components. The clear pictures provide a great insight into the GPU’s main attributes and it’s presented in such a slick way.

In terms of accessories, the graphics card includes a driver disk, quick start guide, case badge, sweatband and PCI-E power adapter. It’s quite unusual to see a sweatband, but I’m sure it could come in handy during a trip to the gym or an intense eSports contest.

Assassin’s Creed Syndicate Patch 1.4 Improves Performance

Prior to the release of Assassin’s Creed Syndicate, Ubisoft promised to learn from their previous mistakes and ensure the game was properly optimized across a wide range of hardware configurations. The previous title, Assassin’s Creed Unity, was a shambles and became a source of mockery due to its hilarious bugs and broken gameplay. Despite Ubisoft’s best assurances, Assassin’s Creed Syndicate didn’t run that well and had some fairly hefty system requirements for high resolutions. Furthermore, the game’s SLI support was terrible and exhibited poor scaling. Thankfully, this has now been resolved in a driver update from NVIDIA which helps users with dual-card setups to leverage extra performance.

On another note, Ubisoft has just released patch 1.4 with a raft of performance improvements and enhanced stability. Here is the changelog in full:

Assassin’s Creed: Syndicate – Patch 1.4 Changelog:

PC

  • Added DLC – The Last Maharaja support
  • Fixed “Jack’s message” puzzle issue
  • Fixed crash in World War I mission
  • Fixed crash on Title Screen when downloading Jack The Ripper
  • Fixed geometry corruption on Intel integrated GPU
  • Fixed TXAA shaking
  • Fixed several render issues
  • Fixed several UI issues
  • Fixed few online issues

Online

  • Fixed an issue where glitches might fail to award the player Helix rewards
  • Fixed an issue where the permanent XP Boost from the Season Pass would not be present in some cases

World/3D/Menu/HUD

  • Fixed a typo with the Military Chapel’s bombing description
  • Fixed an issue where the “Legendary Assassin kukri” would appear uncrafted and unusable after crafting it
  • Fixed an issue with some achievements unlocking when they should not

Mission

  • Fixed an issue in the “A Night to Remember” mission where the user could be stuck outside the vault in some rare cases, leaving no possibility to progress further
  • Fixed an issue, in the “Jack the Ripper” DLC, in the “Jack’s Lieutenants” mission where the objective would not update
  • Fixed an issue, in the “Jack the Ripper” DLC, in the “Letter of Intent” mission where it could fail without a desynch in some specific cases

Stability/Performance

  • Improved performance and stability

Does Assassin’s Creed Syndicate run well on your system?

Sapphire Nitro OC R9 Fury Graphics Card Review

Introduction


The initial unveiling of AMD’s Fury X was eagerly anticipated due to the advent of high bandwidth memory and its potential to revolutionize the size-to-performance ratio of modern graphics cards. This new form of stackable video RAM provided a glimpse into the future and a departure from the current GDDR5 standard. Of course, this isn’t going to happen overnight, as production costs and sourcing HBM on a mass scale have to be taken into consideration. On another note, JEDEC recently announced GDDR5X with memory speeds up to 14Gbps, which helps to enhance non-HBM GPUs while catering to the lower-mid range market. The Fury X and Fury utilize the first iteration of high bandwidth memory, which features a maximum capacity of 4GB.
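
To put the trade-off in perspective, a quick back-of-the-envelope calculation based on Fiji’s published figures: four HBM stacks each on a 1024-bit interface give a 4096-bit bus, and at a 500MHz clock with double data rate that works out to 4096 × 500MHz × 2 ÷ 8 = 512GB/s of bandwidth, comfortably ahead of the GTX 980 Ti’s 336GB/s from 384-bit GDDR5 at 7Gbps, despite HBM’s far lower clock speed.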

There’s some discussion regarding the effect of this limitation at high resolutions, but I personally haven’t seen it cause a noticeable bottleneck. If anything, the Fury range is capable of outperforming the 980 Ti in 4K benchmarks while it tends to linger behind at lower resolutions. AMD’s flagship opts for a closed-loop liquid cooler to reduce temperatures and minimize operating noise. In theory, you could argue this level of cooling prowess was required to tame the GPU’s core. However, there are some air-cooled variants which allow us to directly compare the two forms of heat dissipation.

Clearly, the Fury X’s water cooling apparatus adds a premium and isn’t suitable for certain chassis configurations. To be fair, most modern case layouts can accommodate a CLC graphics card without any problems, but there are also concerns regarding reliability and the possibility of leaks. That’s why air-cooled alternatives which drop the X branding offer great performance at a more enticing price point. For example, the Sapphire Nitro OC R9 Fury is around £60 cheaper than the XFX R9 Fury X. This particular card has a factory overclocked core of 1050MHz and an astounding cooling solution. The question is, how does it compare to the Fury X and GTX 980 Ti? Let’s find out!

Specifications:

Packing and Accessories

The Sapphire Nitro OC R9 Fury comes in a visually appealing box which outlines the Tri-X cooling system, factory overclocked core, and extremely fast memory. I’m really fond of the striking robot front cover and small cut out which provides a sneak peek at the GPU’s colour scheme.

On the opposite side, there’s a detailed description of the R9 Fury range and award-winning Tri-X cooling. Furthermore, the packaging outlines information regarding LiquidVR, FreeSync, and other essential AMD features. This is displayed in an easy-to-read manner and helps inform the buyer about the graphics card’s functionality.

In terms of accessories, Sapphire includes a user’s guide, driver disk, Select Club registration code, and relatively thick HDMI cable.

Rise of the Tomb Raider Performance Analysis

Introduction


Rise of the Tomb Raider originally launched on November 10th and received widespread critical acclaim from various press outlets. Unfortunately, the game went under the radar because Fallout 4 released on the same day. This was a strategic error which hindered the game’s sales and prevented consumers from giving it their undivided attention. It’s such a shame because Rise of the Tomb Raider is a technical marvel when you consider the Xbox One’s limited horsepower. Even though it’s not technically an exclusive, PC players had to wait until after the Christmas period to enjoy the latest exploits of everyone’s favourite heroine.

The PC version was created by Nixxes Software, who worked on the previous Tomb Raider reboot as well as a number of other graphically diverse PC games. The studio is renowned for creating highly polished and well-optimized PC versions featuring an astonishing level of graphical fidelity. Prior to release, NVIDIA recommended a GTX 970 for the optimal 1080p experience and a 980 Ti for 1440p. Since then, there have been performance patches from the developer and driver updates to help with scaling across various hardware configurations. This means it will be fascinating to see the performance numbers now that the game has matured and gone through a large number of post-release hotfixes.

“Rise of the Tomb Raider is an action-adventure video game developed by Crystal Dynamics and published by Square Enix. It is the sequel to the 2013 video game Tomb Raider, which was itself the second reboot of its series. It was released for Xbox One and Xbox 360 in November 2015 and for Microsoft Windows in January 2016. It is set to release for PlayStation 4 in late 2016.

The game’s storyline follows Lara Croft as she ventures into Siberia in search of the legendary city of Kitezh, whilst battling a paramilitary organization that intends on beating her to the city’s promise of immortality. Presented from a third-person perspective, the game primarily focuses on survival and combat, while the player may also explore its landscape and various optional tombs. Camilla Luddington returns to voice and perform her role as Lara.” From Wikipedia.

So, let’s get to it and see how some of the latest graphics cards on the market hold up with the latest from Crystal Dynamics!

Rise of The Tomb Raider Patch Boosts Performance

Rise of the Tomb Raider originally launched as an Xbox One exclusive and looks phenomenal given the technical limitations of Microsoft’s latest console. The PC version was spearheaded by Nixxes Software, a studio renowned for its optimization skills which has created a wide range of very competent PC ports. Thankfully, the team handled Rise of the Tomb Raider and implemented various enhancements including HBAO+, enhanced lighting effects, and support for high resolutions. While the game is quite demanding, this is justified by the visual fidelity on offer.

Recently, the publisher released a patch which contains a number of performance improvements for GPU-bound scenarios. Here is the changelog in full:

Rise of the Tomb Raider – PC Update 1.0.616.4 Changelog:

  • Fixed Map sometimes not showing or showing the wrong region.
  • Fixed ALT-TAB in combination with Exclusive Fullscreen occasionally hanging the game or entire system. (Steam Only)
  • Fixed graphics glitches on NPC clothing on NVIDIA 6×0 and 7×0 hardware.
  • Fixed rare crashes with a “DX11 Internal Heap” error.
  • Added separate mouse sensitivity control for X and Y axis, allowing users to equalize sensitivity.
  • Added option to reduce in-game camera shake, for users that prefer this.
  • Added audio-cue for finding secrets in the relic viewer.
  • Fixed game changing system screen saver settings for some users. (Steam Only)
  • Added additional error handling and messaging in case the GPU driver is crashing or unresponsive.
  • Various performance improvements for GPU-bound situations. CPU bound scenarios are not impacted.
  • A variety of other smaller optimizations, bug-fixes, and tweaks.

It will be fascinating to see how this impacts the frame rate across various tiers of GPUs. In the coming days, we should have a performance analysis using the latest patch and I cannot wait to see the results. Nixxes Software always works extremely hard to refine the overall level of optimization and continues to provide excellent post-launch support.

MSI Z170A GAMING PRO CARBON (LGA1151) Motherboard Review

Introduction


MSI has rapidly established itself as one of the most reputable motherboard manufacturers and constantly strives to enhance the user experience through an intuitive BIOS interface and marvellous reliability. Additionally, the company offers a huge range of products to suit various colour schemes and often creates unique designs, as demonstrated by the gorgeous Z170A XPOWER GAMING TITANIUM motherboard. Some time ago, MSI released the Z170A GAMING PRO which adopted a fantastic red and black theme to please the core gaming demographic. However, it’s difficult to stand out when using this colour scheme because manufacturers tend to fixate on a safe, popular design. That’s not to say there’s anything particularly wrong with utilizing these colours, but I’d prefer to see more vendors breaking the mould through truly unusual aesthetic choices.

Since the Z170A GAMING PRO’s release, MSI has listened intently to user feedback and decided to construct a brand new model entitled the Z170A GAMING PRO CARBON. At first glance, the only difference appears to be the new carbon fibre skin. However, this isn’t the case because MSI has made a raft of changes to enhance the motherboard’s connectivity. More specifically, the Z170A GAMING PRO CARBON features two USB 3.1 Gen 2 ports, one being Type-A while the other is Type-C. Furthermore, the redundant PCI slot has been dropped in favour of a fourth PCI-E x1 slot. Thankfully, the 180-degree SATA ports have been removed and replaced with a more suitable arrangement using right-angled connectors. Finally, the USB 3.1 Gen 1 layout features two ports on the rear and four via an internal header, while USB 2.0 now totals eight ports, four via front headers and four on the rear.

With a recommended retail price of £119.99, the Z170A GAMING PRO CARBON is destined to compete alongside the ASUS Z170 PRO GAMING. As a result, it will be fascinating to see how the different products compare and I expect the Z170A GAMING PRO CARBON to remain very competitive in synthetic testing.

Specifications

Packaging and Accessories

The motherboard comes in a visually appealing box which outlines the RGB functionality and gaming focus. I particularly like the neon design of the background vehicle, which corresponds with the sort of lighting embedded onto the motherboard’s PCB.

The rear portion is packed full of information regarding the motherboard’s layout, impressive software package and premium-grade hardware. This is presented in a really clean, and concise manner with statistics to help quantify the importance of each unique feature.

In terms of accessories, there’s a detailed user’s guide, product registration card, cable labels, CPU installation guide and driver disk. It’s great to see the inclusion of cable labels because they help with diagnostics if you have multiple drives in a RAID configuration and struggle to determine which is the boot device.

Here we can see the bundled I/O shield, SLI bridge and SATA cables. The I/O shield’s red lettering and MSI dragon logo evoke a luxury feel and emphasize the motherboard’s target audience.

SuperMicro C7Z170-OCE (LGA1151) Motherboard Review

Introduction



SuperMicro is one of the most respected names in the server industry and synonymous with unparalleled reliability. Whether you’re after a rackmount, blade server system or workstation motherboard, there’s nothing on the market which manages to evoke such a sense of rock-solid stability. Recently, the company has taken their server roots and impeccable reputation into enthusiast consumer motherboards. This allows them to retain the server framework while offering more ostentatious motherboard designs. Additionally, SuperMicro’s highly refined production line results in a low RMA rate and each product evokes a premium feel. While their previous attempts have been a little rough around the edges, it’s clear to see the rapid progress in terms of motherboard layout and visual exuberance. Typically, motherboard manufacturers opt for the gaming-themed red and black colour scheme because it caters towards the core demographic and maximizes sales.

However, SuperMicro has adopted a very different approach and launched the C7Z170-OCE, which utilizes a very striking green colour scheme. Not only that, the motherboard incorporates an impressive array of overclocking buttons for making manual tweaks without entering the BIOS. There’s also a high-quality speaker and an LED POST readout to help with system diagnostics. The C7Z170-OCE’s electrical circuitry is designed with extreme overclocking in mind and is able to push any Skylake CPU to its absolute limit. Yes, that means BCLK overclocking is possible even on a locked-multiplier CPU, although, given the Z170 chipset, this wouldn’t be a sensible choice.

Another key selling point is the embedded PLX PEX8747 chip capable of supporting 3-way SLI in a x16/x8/x8 configuration. If you require a 2-way setup, then the motherboard can easily accommodate this via a x16/x0/x16 arrangement. Other notable features include a Realtek ALC1150 audio solution with dedicated PCB isolation, USB 3.1 Type-C connectivity, an ample supply of fan headers and much more! As you can see, the motherboard sports an incredible specification and I expect it to perform superbly across CPU-intensive tasks. In the past, I’ve experienced a few issues with our DDR4-2666MHz bench memory kit on SuperMicro products, so it will be interesting to see if compatibility has improved on this latest model.
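
For anyone curious about the lane arithmetic behind those configurations, the PEX8747 is a 48-lane PCI-E 3.0 switch: 16 lanes run upstream to the CPU while 32 fan out downstream to the slots, which is how the board offers 16 + 8 + 8 = 32 lanes for 3-way SLI or 16 + 16 = 32 lanes for two cards, despite Skylake CPUs only providing 16 lanes themselves.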

Specifications

Packaging and Accessories

The motherboard comes in SuperMicro’s iconic box design which looks fantastic and creates a sense of luxury.

On the rear, there’s a detailed description of SuperMicro’s philosophy to bring server quality to the gaming market. Furthermore, this section outlines the key specification in an easy-to-understand manner.

In terms of accessories, the motherboard is packaged with a driver disk, I/O shield, M.2 screws and a quick reference guide. Unlike the majority of other vendors, SuperMicro’s guide is fairly brief and provides a technical rundown of the motherboard layout. While I find the included diagram quite useful, some users might prefer a more comprehensive set of instructions to help with troubleshooting. For example, the user’s guide directs you to a URL to find the meaning of each BIOS debug code instead of printing them. This matters because when the system fails to POST, you cannot access the online documentation from that machine. Although, I guess it’s easy enough to check the meaning of each error code on a phone or tablet.

There are also six SATA connectors in a red finish. Ideally, I’d like to see three of these with a right-angled end, and the red colour doesn’t really match the motherboard’s theme. Perhaps swapping the red for green, or a jet black tone, would enhance the overall level of synergy.

Patriot Viper 4 DDR4 3200MHz 16GB (2x8GB) Dual Channel Memory Kit Review

Introduction


DDR4 memory kits are steadily superseding DDR3 DIMMs due to competitive pricing and the advent of Intel’s LGA1151 platform, which supports speeds in excess of 3200MHz. Furthermore, DDR4 modules require less voltage to remain stable despite the typical increase in memory bandwidth. Recently, professional overclocker Shamino set an astounding world record and overclocked the G.Skill Ripjaws 4 to 4255MHz using a mere 1.3 volts. Clearly, this is an extreme case, and the majority of DDR4 kits available to consumers range between 2400MHz and 4000MHz. Plus, the performance difference in gaming tasks primarily revolves around your system’s graphics card and CPU. Nevertheless, it’s still important to select high-quality DIMMs to keep your PC perfectly stable and complement the other components.
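
As a rough illustration of what those speed grades mean in practice, peak theoretical bandwidth scales linearly with the transfer rate; a minimal sketch, assuming the standard 64-bit DDR4 channel width:

# Theoretical peak bandwidth of a dual channel DDR4 kit.
# A "3200MHz" kit actually runs at 3200 MT/s (mega-transfers per second),
# and each DDR4 channel moves 8 bytes per transfer.
def peak_bandwidth_gb_s(mt_s, channels=2, bytes_per_transfer=8):
    return mt_s * channels * bytes_per_transfer / 1000

for speed in (2400, 3200, 4000):
    print(f"DDR4-{speed}: {peak_bandwidth_gb_s(speed):.1f} GB/s theoretical peak")
# DDR4-3200 works out to 51.2 GB/s across two channels.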

The Patriot Viper series is synonymous with excellent memory speeds at an affordable price point. Here’s a brief description of the product directly from the manufacturer:

“Patriot Memory’s Viper 4 Series memory modules are designed with true performance in mind. Built for the latest Intel® Skylake processor utilizing the 100 series platform, the Viper 4 series provides the best performance and stability for the most demanding computer environments.

The Viper 4 series utilizes a custom designed high performance heat shield for superior heat dissipation to ensure rock solid performance even when using the most taxing applications. Built from the highest quality Build of Materials, Patriot Memory’s Viper 4 Series memory modules are hand tested and validated for system compatibility.

Available in dual kits, 8GB, 16GB and 32GB kits, Patriot Memory’s Viper 4 Series will be offered at speeds from 2400MHz up to 3600MHz and XMP 2.0 ready. Hand tested for quality assurance the Viper 4 series is backed by a lifetime warranty.”

As you can see, the latest version of the Viper range comes in a variety of capacities and memory speeds to suit a wide range of user requirements. Given the impressive 3200MHz speed, 16-16-16 timings and respectable voltage, I expect to see some superb numbers which legitimately rival the best dual channel kits we’ve tested!
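
For the curious, those 16-16-16 timings translate into absolute latency like so; a small sketch using the usual conversion (the memory’s I/O clock runs at half the transfer rate):

# Convert CAS latency from clock cycles to nanoseconds.
# First-word latency (ns) = CL / (MT/s / 2) * 1000 = 2000 * CL / MT/s.
def cas_ns(cl, mt_s):
    return 2000 * cl / mt_s

print(f"This kit (3200 MT/s, CL16): {cas_ns(16, 3200):.1f} ns")  # 10.0 ns
print(f"Typical DDR4-2400 CL15:     {cas_ns(15, 2400):.1f} ns")  # 12.5 ns
# Higher clocks with proportionally tighter timings keep latency low.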

Specifications

Packaging and Accessories

Patriot have adopted a clean, bold design for the memory’s packaging which makes it easy to read the key specifications while admiring the DIMMs’ colour scheme. Here we can see a visual rundown of the memory’s speed, capacity, XMP version and other essential statistics. Many kits on the market utilize fairly plain blister packs which don’t convey a luxury feel. In this case, the packaging draws you in and leaves a very positive initial impression.

On the rear section, there’s information about Patriot’s lifetime warranty, a brief synopsis of the product, and links to the company’s presence across various social media platforms.

A Closer Look

From an aesthetic standpoint, the DIMMs have a rather understated look and target the mainstream gaming audience. Any red and black heatspreader combination is going to be a popular choice, and the different shades combine quite nicely. Another striking touch is the contrast between the textured black finish and the matte section towards the PCB. I’m also quite fond of the sophisticated Viper logo and the small gap in the main heatspreader, which creates an impressive visual effect. Sadly, the green PCB is difficult to overlook and detracts from the attractive design. If a black PCB had been used instead, the memory would be the perfect choice for a high-end build. Despite these qualms, once the RAM is installed in an enclosed chassis, you’re not going to notice the PCB colour.

Gaming Sites Demonstrate Lack of Knowledge for PC Gaming Graphics

Oh Gamespot, how you tickle me so. I must admit, I’ve nothing against the people at Gamespot; I visit their site every now and then for a bit of casual gaming news, as I do many parts of the web. However, this week they and a few others out there may have gone a little too casual by demonstrating something I find rather frustrating. They seem to think the graphics settings of the PC and console versions of The Division have more in common than they actually do.

Now, I’m all for additional graphics settings, even on console games. However, for a console-focused site, brainwashing console fans with this garbage does nothing to help the whole “peasant” and “PC master race” debate, as we’ll have people saying the game has “PC graphics” all over again when it simply does not.

So here’s what we’ve got in the graphics settings. First, chromatic aberration, something I personally always turn off in a PC game anyway, as it provides no pleasant visual benefit as far as I’m concerned; it just blurs colours to simulate a camera lens. Then we’ve got Sharpen Image which, as many of you will know, makes things look either softer or more jagged depending on how it’s set. Sharpening has little benefit for most people, though it can help on some poor-quality displays, more often than not older TVs, so it’s not the worst thing to have; it’s hardly “PC-like visual settings” though. I’ve seen some sites claiming this lets you adjust the AA setting, but it does not; it’s a much simpler upscale/downscale visual effect and even causes halo artefacts on nearby objects when maximised. I don’t remember my graphics getting worse when I maxed AA.
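
If you’re wondering where those halos come from, a minimal unsharp-mask sketch (with made-up pixel values) demonstrates the overshoot at a hard edge:

import numpy as np

# A 1D brightness edge: dark pixels (50) meeting bright pixels (200).
row = np.array([50, 50, 50, 50, 200, 200, 200, 200], dtype=float)

# 3-tap box blur with clamped edges, then a classic unsharp mask:
# sharpened = original + amount * (original - blurred).
blurred = np.array([row[max(0, i - 1):i + 2].mean() for i in range(len(row))])
sharpened = row + 1.5 * (row - blurred)

print(sharpened)  # note the -25 and 275 either side of the edge
# The undershoot (darker than any source pixel) and overshoot (brighter
# than any source pixel) render as a dark/bright ring -- the halo artefact.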

So what should some PC-like graphics settings look like? Full antialiasing, particle detail, wind-affected snow, volumetric fog, reflection quality, sub-surface scattering, anisotropic filtering, and that’s just the tip of the iceberg. Just look at the graphics tweaking guide released today for Rise of the Tomb Raider for another example.

I’m not ripping on consoles, I honestly am not, but I would like to see a gaming community that is better educated about the options they’re actually being sold. Do you think we’ll ever see real PC-like graphics tweaks on consoles, or do you think that’s a realm that will forever stay with PC gaming?

The Division is looking great on consoles and PC already, and it’s certainly a lot of fun. Have you been playing it this week? Let us know what you think in the comments section below.

Images via 1, 2

Nvidia Release Rise of the Tomb Raider Performance Guide

Having trouble getting the best performance or graphics out of the latest Tomb Raider game? If you’re running an Nvidia graphics card, you’ll be happy to hear that Nvidia has released a GPU performance guide to help you get the most out of the game. Of course, much of this will be common knowledge to a lot of PC gamers, but not everyone out there is the graphics settings guru you or I may be.

Crystal Dynamics have done a great job on the graphics engine, and there’s little doubt that it’s the best looking Tomb Raider game to date.

Crystal Dynamics’ Foundation Engine returns for the sequel with upgrades galore. Physically Based Rendering gives materials a natural look under all conditions, HDR and adaptive tone mapping create stunningly bright zone transitions and highlights, deferred lighting with localized Global Illumination increases the realism of lighting, volumetric lighting adds God Rays and other shafts of light, dynamic color grading gives artists control over the appearance of individual areas, reactive water enables dynamic water ripples game-wide, physically correct forward lighting enables translucencies to be accurately lit, particle lighting enables particles to be dynamically lit by light from their surroundings, and Subsurface Scattering and Backscattering increases the quality of lighting on characters’ skin.

The visual effects in this game are pretty breathtaking, so long as you’ve got the settings right.

The guide is pretty extensive, so reproducing all of it here would be pretty futile, to say the least. Nvidia has worked hard to bring you performance graphs for each setting, as well as side-by-side image comparisons to demonstrate what each effect is, how it looks and what kind of performance impact you can expect on some of their most popular cards.

This is really handy for those of you who aren’t sure what’s what in the graphics settings, and while updated drivers have already been released, performance will likely improve a little more with further driver tweaks over the coming weeks.

Are you enjoying Rise of the Tomb Raider? Let us know in the comments section below.

Check out the full Nvidia guide right here.

Inno3D GeForce GTX 980Ti X3 Ultra DHS Graphics Card Review

Introduction


NVIDIA’s GTX 980Ti has proved to be a very popular choice among hardware enthusiasts requiring extreme performance at demanding resolutions. Whether you’re opting for a 21:9 3440×1440 60Hz panel, a 4K display or a high refresh rate 1440P monitor, there are very few single-card configurations on the market capable of dealing with advanced AA, complex tessellation and other visually taxing effects while driving a large number of pixels. Technically, the Titan X is NVIDIA’s flagship product, and its 12GB frame buffer initially appears an enticing proposition. However, the price to performance ratio is quite poor, especially when you consider the 980Ti is based on the same GM200 silicon and only exhibits a few cost-saving measures. Most notably, the video memory is reduced from 12GB to 6GB and the shader units have been slightly scaled back from 3072 to 2816.
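
The cut-down is less drastic than the price gap suggests; a quick sketch of the spec difference:

# How much of the Titan X does the GTX 980Ti retain?
titan_x  = {"shader units": 3072, "VRAM (GB)": 12}
gtx980ti = {"shader units": 2816, "VRAM (GB)": 6}

for spec in titan_x:
    kept = gtx980ti[spec] / titan_x[spec]
    print(f"{spec}: {gtx980ti[spec]} of {titan_x[spec]} ({kept:.0%} retained)")
# ~92% of the shaders at a fraction of the price; only the frame buffer halves.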

Barring a few exceptions, the majority of Titan X models utilize a reference design, which results in reduced overclocking headroom and higher temperatures. In contrast, custom-cooled GTX 980Ti SKUs feature very impressive factory overclocks and enable users to access a higher power limit percentage when tackling manual core and memory boosts. As a result, it’s not uncommon for 980Ti GPUs to outperform the Titan X in certain scenarios despite costing £300-400 less. This makes the 980Ti the perfect choice for the high-end demographic while also providing an improved price to performance ratio.

Today we’re looking at one of the fastest GTX 980 Ti models on the market, incorporating a pre-overclocked core of 1216MHz with a boost clock reaching 1317MHz. Additionally, the memory is set at 7280MHz compared to 7010MHz on the reference design. Given the impeccable 3-fan cooling solution and impressive factory overclock, I expect the graphics card to perform superbly and pull away from the reference 980Ti by a noticeable margin.
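
For reference, here’s how that factory overclock compares against NVIDIA’s reference clocks (1000MHz base, 1075MHz boost, 7010MHz memory); a small sketch of the percentage gains:

# Inno3D X3 Ultra DHS factory overclock vs the reference GTX 980Ti.
reference = {"base": 1000, "boost": 1075, "memory": 7010}
x3_ultra  = {"base": 1216, "boost": 1317, "memory": 7280}

for clock in reference:
    gain = (x3_ultra[clock] / reference[clock] - 1) * 100
    print(f"{clock}: {reference[clock]} -> {x3_ultra[clock]} MHz (+{gain:.1f}%)")
# Roughly +21.6% base, +22.5% boost and +3.9% memory out of the box.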

Specifications:

Packaging and Accessories

The Inno3D 980Ti X3 Ultra DHS is packaged in a hefty box which does an excellent job of protecting the GPU and bundled accessories. On another note, the box adopts a really striking design which emphasizes the extreme level of performance on offer.

The opposite side includes a brief synopsis of the GPU’s capabilities and outlines the modern features incorporated into this particular model such as High Dynamic Range (HDR).

In terms of accessories, the product comes with interchangeable cover plates, an installation guide, 3DMark digital code, power supply guidelines, driver disk, and the usual array of adapters. Please note, the 3DMark code is not pictured to prevent the serial from being used.

Another highlight is the extremely high-quality extended mouse pad. I love extended mouse pads because they allow you to neatly position your keyboard and mouse while opting for a clean, sophisticated appearance. Despite being a free addition, the mouse pad is remarkably thick and should last a long time without becoming too frayed.

Rise of The Tomb Raider Recommended NVIDIA GPUs Revealed

The original Tomb Raider reboot scaled remarkably well across a wide range of hardware configurations and still provides a good indication of GPU performance. Nixxes Software, who worked on the PC version, has gained a great deal of respect for being one of the best in the industry when it comes to optimization. Thankfully, the studio was given the contract for Rise of the Tomb Raider, and I cannot wait to see how it performs on various setups. This time, the game is supported by NVIDIA and included when you purchase a GTX 970 or above from participating stores. Furthermore, the offer also applies if you buy a gaming notebook with a GTX 970M or above. NVIDIA has also revealed the recommended specification to attain 60 frames-per-second at 1080P and 1440P. Here is a brief description of the testing methodology:

“With further testing, our technical marketing team concluded that a 60 frames per second average during this particularly demanding scene, on the High-detail preset, delivered the best balance between graphical fidelity, input responsiveness, and performance in all of Rise of the Tomb Raider’s gameplay locations and cutscenes.”

“In our Rise of the Tomb Raider test there are only a few momentary spikes, and none above 25 milliseconds, there are no periods of spiking between low and high frametimes, almost all of the benchmark is below 20 milliseconds, and much of it is at, around or below 16.6 millisecond, the 60 FPS sweet spot. In other words, the GeForce GTX 970 not only delivers a High level of graphical fidelity at over 60 frames per second, it’s also super smooth with no stuttering or stalls, giving you a fluid, responsive gaming experience.”
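
NVIDIA’s frametime framing is easy to sanity-check for yourself; here’s a minimal sketch (with hypothetical log values) of how frametimes map onto the averages and spikes described above:

# Frametimes vs FPS: 16.6 ms per frame is the 60 FPS sweet spot.
frametimes_ms = [15.8, 16.4, 16.9, 15.2, 24.1, 16.6, 16.1, 17.0]  # hypothetical

avg_ms = sum(frametimes_ms) / len(frametimes_ms)
print(f"Average: {avg_ms:.1f} ms -> {1000 / avg_ms:.1f} FPS")
print(f"Worst spike: {max(frametimes_ms)} ms "
      f"({1000 / max(frametimes_ms):.0f} FPS momentarily)")
print(f"Frames over the 60 FPS budget: "
      f"{sum(t > 16.6 for t in frametimes_ms)} of {len(frametimes_ms)}")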

As you can see from the image below, NVIDIA recommends a GTX 970 to maintain 60 FPS at 1920×1080. Please note, this is on “high settings” and we’re currently unaware whether this is the highest available preset. On another note, 1440P gamers should be utilizing a GTX 980 Ti to attain 60 frames-per-second at that particular resolution. Sadly, there’s no information regarding 4K or 21:9 setups.

Rise of the Tomb Raider looks phenomenal so far and has some fairly hefty system requirements. Hopefully, the game supports SLI on launch and doesn’t encounter any major issues on AMD graphics cards. Once the game is launched, we should be conducting a thorough performance analysis at various settings and resolutions.

AMD Showcases Polaris’ Incredibly Low Wattage Demands @ CES 2016

CES 2016: AMD’s upcoming GPU range, codenamed Polaris, is built on a 14nm FinFET manufacturing process and promises massive improvements in performance per watt compared to the competition. To demonstrate this, AMD compared two runs of Star Wars Battlefront: one with an NVIDIA GTX 950 and another using an unannounced Polaris chip. As you can see, Polaris delivers twice the performance per watt and maintained a solid 60 frames-per-second at medium details on a 1920×1080 display.
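
AMD hasn’t published the exact power readings, but the headline metric is straightforward to reproduce; a sketch with illustrative wattages (both systems were capped at the same framerate, so performance per watt reduces to a power ratio):

# Performance per watt when both systems hold the same 60 FPS cap.
# Wattages below are illustrative stand-ins, not AMD's official figures.
FPS_TARGET = 60
polaris_watts = 86
gtx950_watts = 140

polaris_ppw = FPS_TARGET / polaris_watts
gtx950_ppw = FPS_TARGET / gtx950_watts
print(f"Polaris: {polaris_ppw:.2f} FPS/W, GTX 950 system: {gtx950_ppw:.2f} FPS/W")
print(f"Advantage: {polaris_ppw / gtx950_ppw:.1f}x")  # ~1.6x with these inputs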

It’s impressive to see an actual chip from AMD being demoed before its release to showcase the benefits of a refined manufacturing process. In contrast, details about Pascal’s performance, features and architecture are still largely unknown. You have to commend AMD for adopting such an open approach, and the graphics market really needs some competition to rebalance the overall market share. So far, Polaris looks really promising; it’s working and performing superbly when you take into account the low wattage under load. Theoretically, this means core temperatures should be moderate, leaving enough overclocking headroom providing the cards are not voltage locked.

Synology DiskStation DS416 4-bay High-Performance NAS Review

Introduction


I’ve taken a look at quite a few 2-bay NAS units lately, and while they already allow for an impressive 16TB of raw storage, that might not be enough for everyone. Today I’m taking Synology’s DiskStation DS416, a 4-bay feature-rich and high-performance NAS server, for a spin on my test bench. The Synology DiskStation DS416 is built around an Annapurna Labs Alpine AL-212, a 32-bit dual-core CPU running at 1.4GHz with a floating point unit and hardware encryption engine. The CPU is backed by 1GB of DDR3 memory that sadly isn’t upgradable.

The DS416 is a full-fledged NAS that comes with everything you’d want from a performance system. It features dual Gigabit Ethernet ports that support failover and link aggregation, allowing you to keep your system connected under heavier load or when one connection fails. The DS416 also features three USB 3.0 ports, one of which is conveniently placed on the front of the NAS for easy access.

With powerful hardware like that, the DS416 is able to deliver average read and write speeds of 220 MB/s and 140 MB/s respectively, while encrypted file transfers come in at an equally impressive 146 MB/s reading and 65 MB/s writing. Those figures naturally depend on which hard disk drives you use as well as the method of data access and your network setup.
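
To put those figures into perspective, here’s a quick sketch of what they mean for real transfers, assuming the quoted sequential speeds are sustained:

# Transfer time at the DS416's quoted sequential speeds.
def minutes(size_gb, mb_per_s):
    return size_gb * 1000 / mb_per_s / 60

for label, speed in (("read", 220), ("write", 140), ("encrypted write", 65)):
    print(f"100 GB {label} at {speed} MB/s: ~{minutes(100, speed):.0f} minutes")
# Roughly 8, 12 and 26 minutes respectively.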

The drive bays feature convenient tool-free installation for 3.5-inch drives, whereas you still need screws to mount 2.5-inch drives. The latter is a rare use case anyway and shouldn’t bother anyone. The drive bays are covered by a removable front plate that gives the NAS a cleaner overall look by hiding away the drives. The drive bays aren’t the only serviceable part of this NAS; the fans are redundant too. The DS416 incorporates a passive cooling design on the actual components and only uses two 92mm fans on the rear to keep everything at an optimal temperature. When the system detects a fan failure, the built-in redundancy mechanism ensures continuous operation until a replacement fan arrives.

The DS416 is naturally powered by DiskStation Manager (DSM), the award-winning operating system from Synology. It has a beautiful and functional user interface and all the features you will need at home or in a small to medium-sized business. It offers all the network protocols you’d want, including SMB2, AFP, NFS, WebDAV, and FTP. Server admins don’t need to worry about maintaining two sets of user credentials either, as the DS416 comes with both Windows AD and LDAP integration. An added convenience factor is the network recycle bin that is featured on AFP, CIFS, File Station, and WebDAV. All files deleted in a shared folder will automatically be moved into the Recycle Bin instead of being destroyed.

The DS416 isn’t just an all-in-one server when it comes to features; it also has the power to handle them. You can easily turn your DS416 into a mail server with a webmail interface, set up a VPN server or connect as a VPN client, create a print server for USB printers and much more, with Radius server, proxy server, and Syslog server features also available.

Creating your own personal yet comprehensive cloud solution is no problem either with the DS416 thanks to the Cloud Station package. It is the perfect package to sync files across multiple devices, perform offline edits that are synced later, and keep up to 32 historical versions for easy restoration. You also get all the normal backup features that you’ll want, such as 2-way sync between different DiskStations as well as support for commercial cloud services such as Glacier Backup and HiDrive Backup.

The Synology DiskStation DS416 has a comprehensive security package, starting with AES 256-bit encryption but also covering more advanced setups like 2-step verification. It also comes with AppArmor, which can block malicious programs from accessing unauthorized system resources, and you can customize the trust level in the Package Center to only accept apps from certain publishers. The built-in firewall is another useful feature, just like the Denial of Service (DoS) protection. Antivirus software can also be downloaded and installed for that extra level of security. Overall, it’s everything you need to keep your data safe from malicious code and people.

Synology also offers a range of mobile apps for easy connection to and usage of your NAS from mobile devices such as tablets and smartphones. The apps include DS note, DS audio, DS video, DS photo+, DS cloud, DS file, DS download and DS cam.

Feature Highlights

  • Dual-core CPU with hardware encryption engine
  • Dual LAN with failover and link aggregation support
  • Over 221.05MB/s reading, 139.51MB/s writing
  • Effective backup solution for all desktop and mobile devices
  • Front USB 3.0 for fast easy access
  • Hot-swappable & tool-less hard drive tray design
  • Running on Synology DiskStation Manager (DSM)

Packaging and Accessories

The Synology DS416 comes in a simple brown package with a sticker denoting the model inside. The sticker presents all the relevant features as well as an image of the unit contained within.

The rear of the box has a second sticker with some of the hardware highlights.

Inside the box, next to the NAS itself, you’ll find an AC adapter with a power cable for the region where you purchased it, two RJ45 LAN cables, screws for use with 2.5-inch drives, and a Welcome and Quick Installation Guide. That’s everything you need to get it set up, and I like that Synology included two LAN cables so users can get port trunking up and running right away.

Raven’s Cry Developer Claims “Anything Above 30FPS Does Not Matter”

PC gaming has always been about choice and the freedom to select between extremely high image quality, a smooth framerate, or a balance of both with the appropriate hardware. As a result, PC players are accustomed to 60 frames-per-second gameplay and some users even play games at framerates in excess of 120FPS. In contrast, the majority of console games only run at 30 frames-per-second due to performance constraints and prioritize resolution over framerate. Evidently, 60 frames-per-second is superior and produces much smoother gameplay, despite outlandish claims by Ubisoft that 30FPS creates a more “cinematic experience.”
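
The gap between the two targets is simple to quantify; each frame at 30FPS lingers on screen twice as long, as this trivial sketch shows:

# Frame budget: how long each rendered frame stays on screen.
for fps in (30, 60, 120):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
# 30 FPS -> 33.3 ms, 60 FPS -> 16.7 ms, 120 FPS -> 8.3 ms. At 30 FPS,
# every frame (and every stutter) is visible for twice as long.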

According to a Steam post found by The Dark Side of Gaming, the developer of Raven’s Cry made a number of bizarre statements regarding the lack of optimization, and suggested SLI support wasn’t incorporated due to development costs:

“On the one hand it is quite understandable, that a customer who spent 600+ bucks on a high end card (e.g 980Ti) expects that a game runs faster than on a 2 years old (e.g.780 GTX) – on the other hand … if you put two or four engines in a car it will be probably not run faster than with one powerful engine. It will for sure be more heavy and bigger, consume more fuel and be very expensive. Please notice, that the developers of the well known Benchmarks receive any support, money and any hardware they want from the graphic card manufacturers … we received the last free graphic card 5 years ago, never ever any money and have to purchase graphic cards like an end user without any discount in retail.”

Not only that, he also went on to suggest that 30 frames-per-second creates the optimal experience:

“And I may add something: it was not our intention to create a graphic benchmark. VCR is a complex RPG with unique characters and story driven. Our focus is on interesting quest chains (which are very difficult to create) and not on frames per second. Beside this I thing that anything above 30 FPS does not matter for the gaming experience. And on my PC i7, GF 780GTX the game runs with 4K never below 30 FPS. No idea what it does on a GF980ti – I heard it is sometimes slower.”

The last sentence is frankly unbelievable and demonstrates a reckless, unprofessional attitude towards game development. You cannot just overlook the requirements of the high-end market and dismiss them so easily. Furthermore, if the game runs worse on better hardware, then there’s something wrong with the level of optimization and the developer’s coding prowess. This entire line of thinking is laughable, as 60 frames-per-second clearly makes a substantial difference to the core experience, barring a few casual titles which rely on text or a basic art style.

Batman: Arkham Knight Receives PC Performance Patch

Batman: Arkham Knight launched in a horrendous state on PC and suffered from frame hitching, poor optimization and instability. Unbelievably, Warner Brothers decided to outsource the PC version to Iron Galaxy, a studio with a fairly sketchy record. For example, the relatively young studio, founded in 2008, was responsible for the clunky Borderlands 2 port on the PlayStation Vita. Batman: Arkham Knight took things to a completely new level, and customers had to resort to the Steam refund policy because the game was completely broken. Warner Brothers admitted this was a massive error on their part and promised to make the PC version a priority. Despite this, the PC edition is still quite buggy and doesn’t run that well on some setups. Additionally, there’s no support for SLI at all, and the publisher has said there is zero possibility of it being integrated.

Yesterday, Batman: Arkham Knight received a new patch and changelog on the official Steam page which reads:

“Hi Everyone,

The latest patch for Batman: Arkham Knight was released today and contains the following updates:

  • Fixed some issues with stars being awarded or lost incorrectly in specific AR Challenges
  • Improved target prioritization during combat
  • Restored heavier rain during the opening section of the game
  • Fixed missing rain effects on a few remaining player character skins
  • Miscellaneous gameplay fixes and stability improvements
  • Fixes to some keyboard and mouse prompts after being rebound
  • Made frame times more consistent for 60hz monitors running at 30fps with VSync enabled
  • Previously equipped gadgets can be selected with keyboard and mouse again after restarting an AR Challenge
  • Fixed the default key binding for Harley’s Snare gadget
  • Batgirl’s Remote Hacking Device can now be properly selected with keyboard & mouse in AR Challenges
  • Minor performance optimizations for certain combinations of hardware
  • Fixed keyboard & mouse controls that did not function in DLC AR Challenges when no previous save data existed
  • Quick Photo Mode can now be triggered with keyboard and mouse controls when using the Batmobile
  • Special Combo Takedowns can now be performed with Quickfire Gadget binds if they were rebound
  • Fixed graphical corruption that may occur after Alt-Tabbing
  • Improvements and corrections to some localized text
  • Added new Classic Harley Quinn skin for use in AR Challenges & the Harley Quinn Story Pack
  • Added Arkham Knight as a playable character for AR Challenges & the Red Hood Story Pack
  • Added support for December DLC content”

How does Batman: Arkham Knight run on your PC?

Avalanche Studios Says Just Cause 3 PC Patch Will Take “a Little Bit of Time”

Avalanche Studios is a PC-focused developer synonymous with creating highly polished games which scale superbly across a wide range of hardware. Sadly, it appears Just Cause 3 isn’t their finest hour from a technical standpoint, as it suffers from some optimization issues. Additionally, the engine doesn’t support multi-GPU setups, which is a major disappointment considering Mad Max utilized SLI configurations extremely well. Despite the rocky launch, I’m fairly confident in the developer’s ability to rectify these problems. Yesterday, Avalanche Studios released a statement on Steam addressing people’s concerns, which reads:

“We know that some of you are encountering some technical issues—we’re looking into them all and we’re fully committed to providing you the best possible experience.

We know you’re going to want specific information on when a patch will land and what will be fixed—we would love to give you that information, and we will as soon as we have it. But right now, a little over one day since we launched, we have huge numbers of players in our enormous game world and we’re monitoring all the data coming in.

We need a little bit of time to recreate some of these issues and build fixes. Rest assured though—we are fully committed to making Just Cause 3 as awesome as possible. We already know loads of people are having a blast with the game but we’ll do all we can to make sure everyone is laughing and smiling as they play.”

It’s impossible to defend Avalanche Studios though, as the game should have been delayed instead of being released unfinished. This is just another example of a major release exhibiting poor performance and terrible multi-GPU support. Not only that, it’s inexcusable to charge for something which doesn’t even work properly. Here we can see an example from Robert Allen at Tech-Gaming.com:

AMD Crimson Performance Gains on Linux are Disappointing

AMD recently overhauled the Catalyst Control Center software suite and created a more visually appealing design entitled “Crimson”. Furthermore, the latest driver includes a whole host of new features and optimization enhancements. If you’d like to know more, feel free to check out our full review here. While Crimson has received an overwhelmingly positive reception from Windows users, it appears the performance gains on Linux are minimal. The highly revered Linux-based site Phoronix decided to test the driver’s performance using a number of GPUs. As you can see from the image, Linux users still have to use the outdated user interface, but this was expected:

The original press slides from AMD proclaimed there would be “Linux performance improvements” from “112% to 155%”. However, Phoronix’s testing shows a complete lack of progress, and in some cases the updated driver actually performs worse. Here we can see the performance differences between the 15.9 and 15.11 drivers. Honestly, the results are within the margin of error and nowhere near the gains AMD promised. Obviously, this can improve via future driver revisions, but it doesn’t look promising for Linux users on AMD hardware. Please note, this is just one example, and Phoronix’s benchmarks show a similar pattern throughout various games.
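
One caveat worth adding: AMD’s slide figures read as relative performance rather than raw gains, so “112% to 155%” would translate to uplifts of roughly 12-55%; a quick sketch with hypothetical framerates:

# Relative performance: new driver FPS as a percentage of old driver FPS.
def relative_perf(new_fps, old_fps):
    return new_fps / old_fps * 100

print(f"{relative_perf(44.8, 40.0):.0f}%")  # 112% -> a 12% uplift
print(f"{relative_perf(62.0, 40.0):.0f}%")  # 155% -> a 55% uplift
# Phoronix's 15.9 vs 15.11 results hover around 100%, i.e. no real change.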

I hope this is just an isolated incident due to AMD’s completely reworked driver package. Whatever the case, it seems you’re not going to see huge FPS boosts at the current time.

Have you upgraded to the Crimson Radeon software yet? If so, let us know what you think of it.