Ashes of the Singularity DirectX 12 Graphics Performance Analysis

Introduction


Ashes of the Singularity is a futuristic real-time strategy game offering frenetic contests on a large scale. The huge number of units scattered across a range of varied environments creates an enthralling experience built around complex strategic decisions. Throughout the game, you will explore unique planets and engage in gripping air battles. This bitter war between the human race and a masterful artificial intelligence revolves around an invaluable resource known as Turinium. If you’re into the RTS genre, Ashes of the Singularity should provide hours of entertainment. While the game itself is worthy of widespread media attention, the engine’s support for DirectX 12 and asynchronous compute has become a hot topic among hardware enthusiasts.

DirectX 12 is a low-level API with reduced CPU overhead which has the potential to revolutionise the way games are optimised for numerous hardware configurations. In contrast, DirectX 11 is far less efficient, and many mainstream titles suffered from poor scaling which didn’t properly utilise the potential of current graphics technology. On another note, DirectX 12 allows users to pair GPUs from competing vendors and utilise multi-GPU solutions without relying on driver profiles. It’s theoretically possible to achieve widespread optimization and leverage extra performance using the latest version of DirectX 12.
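To put the “explicit” nature of this control into perspective, multi-adapter support in DirectX 12 means the application, rather than the driver, discovers and addresses each GPU. Below is a minimal, purely illustrative C++ sketch (the function name and structure are my own, not taken from any shipping engine) of how a DirectX 12 title can enumerate every adapter in the system and create a device on each one, regardless of vendor:

    #include <d3d12.h>
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <vector>

    using Microsoft::WRL::ComPtr;

    // Enumerate every hardware adapter in the system and create a D3D12 device
    // on each one. GPUs from different vendors can appear side by side here.
    std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
    {
        std::vector<ComPtr<ID3D12Device>> devices;

        ComPtr<IDXGIFactory4> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return devices;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC1 desc = {};
            adapter->GetDesc1(&desc);
            if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                continue; // skip the WARP software rasteriser

            ComPtr<ID3D12Device> device;
            if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                            IID_PPV_ARGS(&device))))
                devices.push_back(device); // one explicitly managed device per physical GPU
        }
        return devices;
    }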

Of course, Vulkan is another alternative which works on various operating systems and adopts an open-source ideology. However, the focus will likely remain on DirectX 12 for the foreseeable future unless there’s a sudden reluctance from users to upgrade to Windows 10. Even though the adoption rate is impressive, there’s a large number of PC gamers currently using Windows 7, 8 and 8.1. Therefore, it seems prudent for developers to continue with DirectX 11 and offer a DirectX 12 renderer as an optional extra. Arguably, the real gains from DirectX 12 will occur when its predecessor is disregarded completely. This will probably take a considerable amount of time, which suggests the first DirectX 12 games might have reduced performance benefits compared to later titles.

Asynchronous compute allows graphics cards to calculate multiple workloads simultaneously and extract extra performance. AMD’s GCN architecture has extensive support for this technology. In contrast, there’s a heated debate questioning whether NVIDIA products can utilise asynchronous compute in an effective manner. Technically, AMD GCN graphics cards contain between two and eight asynchronous compute engines with eight queues per engine, depending on the model, to provide single-cycle latencies. Maxwell revolves around two pipelines, one designed for high-priority workloads and another with 31 queues. Most importantly, NVIDIA cards can only “switch contexts at draw call boundaries”. This means the switching process is slower and gives AMD a major advantage. NVIDIA dismissed the early performance numbers from Ashes of the Singularity because the game was still in development at the time. Now that it has exited the beta stage, we can finally determine the performance numbers with optimizations complete.
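To make the terminology a little more concrete, asynchronous compute in DirectX 12 begins with the application creating a compute-only command queue alongside its graphics queue and submitting work to both; whether the GPU genuinely overlaps the two streams is then down to the hardware scheduling described above. Here is a minimal, illustrative sketch (the struct and function names are my own, not taken from Oxide’s engine):

    #include <d3d12.h>
    #include <wrl/client.h>

    using Microsoft::WRL::ComPtr;

    struct Queues
    {
        ComPtr<ID3D12CommandQueue> graphics;
        ComPtr<ID3D12CommandQueue> compute;
    };

    // Create a graphics queue and a separate compute-only queue on the same device.
    // Command lists recorded for each queue are submitted independently, and the
    // two streams are synchronised with ID3D12Fence objects where they need to meet.
    Queues CreateQueues(ID3D12Device* device)
    {
        Queues q;

        D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
        gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;      // graphics, compute and copy work
        device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&q.graphics));

        D3D12_COMMAND_QUEUE_DESC computeDesc = {};
        computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute and copy work only
        device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&q.compute));

        return q;
    }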

ASUS MG24UQ and MG28UQ Monitor Review

Introduction


ASUS has rapidly become a dominant force in the gaming monitor sector due to the popularity of its ROG range. According to Digitimes, the company saw a staggering growth rate of 30 percent throughout Europe in 2015. Models such as the PG278Q, commonly referred to as the ROG Swift, beautifully catered to users requiring a high refresh rate monitor and G-Sync functionality. This combination proved to be extraordinarily successful and inspired future products such as the PG279Q. The advent of responsive IPS panels featuring enhanced viewing angles and colour reproduction compared to their TN counterparts forged a new audience for the ROG series.

Today, we’re taking a look at the company’s latest gaming monitors which opt for FreeSync technology instead of NVIDIA’s proprietary alternative. As a result, these products provide a fluid user experience at a more digestible price point. The MG24UQ utilizes a 4K IPS panel with a 4ms response time and 178-degree viewing angles. This is targeted towards consumers who enjoy stunning image quality and yearn for a high pixels-per-inch display. In contrast, the MG28UQ is based on TN technology and has a 1ms response time. While the colour gamut isn’t as impressive on TN monitors, they have a quicker response time and are better suited to professional gamers. Of course, it’s all about your individual priorities and whether you care more about image quality or responsiveness.

Packaging and Features

ASUS MG24UQ

Firstly, it’s important to note that the press samples I received carry the MG24U and MG28U branding. After performing extensive research, I discovered these are now sold under the UQ designation in retail channels, so please disregard the box’s naming scheme. As you can see, the packaging adopts a really bold, stylish design and lists the monitor’s key features.

The monitor’s base and cables are housed within the top section to allow for easy access. On another note, the hardened polystyrene inserts offer superb protection and ensure the display arrives in perfect condition.

ASUS gaming monitors incorporate a number of unique features including:

Rather bizarrely, the information here is incorrect and the PPI rate should read 185.
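For reference, pixel density is simply the diagonal resolution in pixels divided by the screen diagonal in inches: √(3840² + 2160²) ≈ 4406 diagonal pixels, which for a diagonal in the region of 24 inches works out to the mid-180s PPI rather than the figure printed on the box.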


ASUS MG28UQ

The 28-inch model utilises identical packaging apart from the 1ms response time icon.

Here we can see the larger variant has a lower PPI, a 1ms response time and a USB 3.0 hub.

Do AMD Drivers Really Deserve Such a Hostile Reception?

Introduction


AMD has a serious image problem with their drivers which stems from buggy, unrefined updates and a slow release schedule. Even though this perception began many years ago, it’s still impacting the company’s sales and explains why their market share is so small. The Q4 2015 results from Jon Peddie Research suggest AMD reached a market share of 21.1% while NVIDIA reigned supreme with 78.8%. That said, the Q4 data is more promising because AMD accounted for a mere 18.8% during the previous quarter. On the other hand, respected industry journal DigiTimes reports that AMD is likely to reach its lowest ever market position for Q1 2016. Thankfully, the financial results will emerge on April 21st so we should know the full picture relatively soon. Of course, the situation should improve once Polaris and Zen reach retail channels. To compound matters, AMD’s share price has declined by more than 67% in five years, from $9 to under $3 as of March 28, 2016. The question is why?

Is the Hardware Competitive?


The current situation is rather baffling considering AMD’s extremely competitive product line-up in the graphics segment. For example, the R9 390 is a superb alternative to NVIDIA’s GTX 970 and features 8GB of VRAM which provides extra headroom when using virtual reality equipment. The company’s strategy appears to revolve around minor differences in performance between the R9 390 and 390X. This also applied to the R9 290 and 290X due to both products utilizing the Hawaii core. NVIDIA employs a similar tactic with the GTX 970 and GTX 980, but there’s a marked price increase compared to their rivals.

NVIDIA’s ability to cater to the lower-tier demographic has been quite poor because competing GPUs including the 7850 and R9 380X provided a much better price-to-performance ratio. Not only that, NVIDIA’s decision to deploy ridiculously low amounts of video memory on cards like the GTX 960 has the potential to cause headaches in the future. It’s important to remember that the GTX 960 can be acquired with either 2GB or 4GB of video memory. Honestly, they should have simplified the process and only produced the higher-memory model, in a similar fashion to the R9 380X. Once again, AMD continues to offer a very generous amount of VRAM across various product tiers.

Part of the problem revolves around AMD’s sluggish release cycle and reliance on the Graphics Core Next (GCN) 1.1 architecture, which was first introduced way back in 2013 with the Radeon HD 7790. Despite its age, the GCN 1.1 architecture was deployed on the revised 390 series, and AMD didn’t do themselves any favours when denying accusations that the new line-up was a basic re-branding exercise. Of course, this proved to be the case and some users managed to flash their 290/290X to a 390/390X with a BIOS update. There’s nothing inherently wrong with product rebrands if they can remain competitive in the current market. It’s not exclusive to AMD, and NVIDIA has used similar business strategies on numerous occasions. However, I feel it’s up to AMD to push graphics technology forward and encourage their nearest rival to launch more powerful options.

Another criticism of AMD hardware which seems to plague everything they release is the perception that every GPU runs extremely hot. You only have to look at certain websites, social media and various forums to see this is the main source of people’s frustration. Some individuals have even produced images showing AMD graphics cards set ablaze. So is there any truth to these suggestions? Unfortunately, the answer is yes, and a pertinent example comes from the R9 290 range. The 290/290X reference models utilized one of the most inefficient cooler designs I’ve ever seen and struggled to keep the GPU core running below 95°C under load.

Unbelievably, the core was designed to run at these high thermals, and AMD created a more progressive RPM curve to reduce noise. As a result, the GPU could take 10-15 minutes to return to idle temperature levels. The Hawaii temperatures really impacted the company’s reputation and forged a viewpoint among consumers which I highly doubt will ever disappear. It’s a shame because the upcoming Polaris architecture built on the 14nm FinFET process should exhibit significant efficiency gains and end the notion of high thermals on AMD products. There’s also the idea that AMD GPUs have a noticeably higher TDP than their NVIDIA counterparts. For instance, the R9 390 has a TDP of 275 watts while the GTX 970 only consumes 145 watts. The gap narrows at the top end, where the Fury X is rated at 275 watts compared to the GTX 980 Ti’s 250 watts.

Eventually, AMD released a brand new range of graphics cards utilizing the first iteration of High Bandwidth Memory. Prior to its release, expectations were high and many people expected the Fury X to dethrone NVIDIA’s flagship graphics card. Unfortunately, this didn’t come to fruition and the Fury X fell behind in various benchmarks, although it fared better at high resolutions. The GPU also encountered supply problems and emitted a loud whine from the pump on early samples. Asetek even threatened to sue Cooler Master, who created the AIO design, which could have forced all Fury X products to be removed from sale.

The rankings alter rather dramatically when the DirectX 12 renderer is used, which suggests AMD products have a clear advantage. Asynchronous compute is the hot topic right now, which in theory allows for greater GPU utilization in supported games. Ashes of the Singularity has implemented this for some time and makes for some very interesting findings. Currently, we’re working on a performance analysis for the game, but I can reveal that there is a huge boost for AMD cards when moving from DirectX 11 to DirectX 12. Furthermore, there are reports indicating that Pascal might not be able to use asynchronous shaders, which makes Polaris and Fiji products more appealing.

Do AMD GPUs Lack Essential Hardware Features?


When selecting graphics hardware, it’s not always about pure performance, and some consumers take into account exclusive technologies such as TressFX hair before purchasing. At this time, AMD’s latest products incorporate LiquidVR, FreeSync, Vulkan support, HD3D, Frame Rate Target Control, TrueAudio, Virtual Super Resolution and more! This is a great selection of hardware features to create a thoroughly enjoyable user experience. NVIDIA adopts a more secretive attitude towards their own creations and often uses proprietary solutions. The Maxwell architecture has support for Voxel Global Illumination (VXGI), Multi-Frame Sampled Anti-Aliasing (MFAA), Dynamic Super Resolution (DSR), VR Direct and G-Sync. There’s a huge debate about the benefits of G-Sync compared to FreeSync, especially when you take into account the pricing difference when opting for a new monitor. Overall, I’d argue that the NVIDIA package is better, but there’s nothing really lacking from AMD in this department.

Have The Drivers Improved?


Historically, AMD drivers haven’t been anywhere close to NVIDIA’s in terms of stability and providing a pleasant user interface. Back in the old days, AMD drivers, or ATI ones if we’re going way back, had the potential to cause system lock-ups, software errors and more. A few years ago, I had the misfortune of updating a 7850 to the latest driver and, after rebooting, the system’s boot order was corrupt. To be fair, this could be coincidental and have nothing to do with that particular update. On another note, the 290 series was plagued with hardware bugs causing black screens and blue screens of death whilst watching Flash videos. To resolve this, you had to disable hardware acceleration and hope that the issues subsided.

The Catalyst Control Center always felt a bit primitive for my tastes although it did implement some neat features such as graphics card overclocking. While it’s easy enough to download a third-party program like MSI Afterburner, some users might prefer to install fewer programs and use the official driver instead.

Not so long ago, AMD appeared to have stalled in releasing drivers for the latest games to properly optimize graphics hardware. On 9th December 2014, AMD unveiled the Catalyst 14.12 Omega WHQL driver and made it available for download. In a move which still astounds me, the company decided not to release another WHQL driver for six months! Granted, they were working on a huge driver redesign and still produced the odd beta update. I honestly believe this was very damaging and prevented high-end users from considering the 295X2 or a CrossFire configuration. It’s so important to have a consistent, solid software framework behind the hardware to allow for constant improvements. This is especially the case when using multiple cards which require profiles to achieve proficient GPU scaling.

Crimson’s release was a major turning point for AMD due to the modernized interface and enhanced stability. According to AMD, the software package involves 25 percent more manual test cases and 100 percent more automated test cases compared to AMD Catalyst Omega. Also, the most requested bug fixes were implemented, and they’re using community feedback to quickly apply new fixes. The company hired a dedicated team to reproduce errors, which is the first step to providing a more stable experience. Crimson apparently loads ten times faster than its predecessor and includes a new game manager to optimize settings to suit your hardware. It’s possible to set custom resolutions, including the refresh rate, which is handy when overclocking your monitor. The clean uninstall utility proactively works to remove any remaining elements of a previous installation such as registry entries, audio files and much more. Honestly, this is such a revolutionary move forward and AMD deserves credit for tackling their weakest elements head on. If you’d like to learn more about Crimson’s functionality, please visit this page.

However, it’s far from perfect and some users initially experienced worse performance with this update. Of course, there are going to be teething problems whenever a new release occurs, but it’s essential for AMD to do everything they can to forge a new reputation for their drivers. Some of you might remember the furore surrounding the Crimson fan bug which limited the GPU’s fans to 20 percent. Some users even reported that this caused their GPU to overheat and fail. Thankfully, AMD released a fix for this issue, but it shouldn’t have occurred in the first place. Once again, it’s hurting their reputation and ability to move on from old preconceptions.

Is GeForce Experience Significantly Better?


In recent times, NVIDIA drivers have been the source of some negative publicity. More specifically, users were advised to ignore the 364.47 WHQL driver and instructed to download the 364.51 beta instead. One user said:

“Driver crashed my windows and going into safe mode I was not able to uninstall and rolling back windows would not work either. I ended up wiping my system to a fresh install of windows. Not very happy here.”

NVIDIA’s Sean Pelletier released a statement at the time which reads:

“An installation issue was found within the 364.47 WHQL driver we posted Monday. That issue was resolved with a new driver (364.51) launched Tuesday. Since we were not able to get WHQL-certification right away, we posted the driver as a Beta.

GeForce Experience has an option to either show WHQL-only drivers or to show all drivers (including Beta). Since 364.51 is currently a Beta, gamers who have GeForce Experience configured to only show WHQL Game Ready drivers will not currently see 364.51

We are expecting the WHQL-certified package for the 364.51 Game Ready driver within the next 24hrs and will replace the Beta version with the WHQL version accordingly. As expected, the WHQL-certified version of 364.51 will show up for all gamers with GeForce Experience.”

As you can see, NVIDIA isn’t immune to driver delivery issues and this was a fairly embarrassing situation. Despite this, it didn’t appear to have a serious effect on people’s confidence in the company or make them re-consider their views of AMD. While there are some disgruntled NVIDIA customers, they’re fairly loyal and distrustful of AMD’s ability to offer better drivers. The GeForce Experience software contains a wide range of fantastic inclusions such as ShadowPlay, GameStream, Game Optimization and more. After a driver update, the software can feel a bit unresponsive and takes some time to close. Furthermore, some people dislike the notion of Game Ready drivers being locked into the GeForce Experience software. If a report from PC World is correct, consumers might have to supply an e-mail address just to update their drivers through the application.

Before coming to a conclusion, I want to reiterate that my allegiances don’t lie with either company and the intention was to create a balanced viewpoint. I believe AMD’s previous failures are impacting the company’s current product range, and it’s extremely difficult to shift people’s perceptions about the company’s drivers. While Crimson is much better than CCC, it was the cause of a horrendous fan bug, resulting in a PR disaster for AMD.

On balance, it’s clear AMD’s decision to separate the Radeon group from the CPU line was the right thing to do. Also, with Polaris around the corner and more games utilizing DirectX 12, AMD could improve their market share substantially. Although, from my experience, many users are prepared to deal with slightly worse performance just to invest in an NVIDIA product. Therefore, AMD has to encourage long-term NVIDIA fans to switch with reliable driver updates on a consistent basis. AMD products are not lacking in features or power; it’s all about drivers! NVIDIA will always counteract AMD releases with products exhibiting similar performance numbers. In my personal opinion, AMD drivers are now on par with NVIDIA’s, and it’s a shame that they appear to be receiving unwarranted criticism. Don’t get me wrong, the fan bug is simply inexcusable and is going to haunt AMD for some time. I predict that despite the company’s best efforts, the stereotypical view of AMD drivers will not subside. This is a crying shame because they are trying to improve things and release updates on a significantly lower budget than their rivals.

Gladiator Computers BATTALION 800 Gaming PC Review

Introduction


Gladiator Computers is the name given to Aria’s custom PC division, which provides consumers with a wide range of options to suit various budgets. Just in case you’re unfamiliar with Aria, they’re one of the leading PC hardware stores and have an excellent reputation among customers. Currently, the company’s TrustPilot rating stands at nine out of ten, which evokes a sense of confidence when investing in a pre-configured PC. Of course, you can customize each model and select between various cases, memory configurations, CPU coolers and lots more! As a result, it’s incredibly easy to make savings on various components if you’re not overly concerned about colour coordination. On the other hand, consumers who demand a visually appealing system can add LED lighting or other extravagant extras.

Today, we’re taking a detailed look at the BATTALION 800 featuring an Intel i5-6500 processor, 16GB of DDR4 2133MHz memory, a Gigabyte Z170-Gaming K3 motherboard, a 120GB Samsung 850 Evo boot drive and the Zotac GTX 970 Gaming Edition graphics card. Furthermore, Gladiator Computers have employed a very reputable air cooler to find a great balance between thermal dissipation and noise output. There’s also a quality non-modular power supply with an 80+ White efficiency rating. I’m interested to see how this will impact cable management, especially given the budget chassis in the basic bundle. Priced at £889.99, the system is targeted towards mainstream consumers utilizing a single 1920×1080 display. Let’s see how it performs compared to other machines sporting a similar specification.

Specifications

  • Name: Gladiator Computers BATTALION 800
  • Case: Game Max Destroyer Windowed
  • Motherboard: Gigabyte Z170-Gaming K3
  • Processor: Intel i5 6500 3.20GHz Base, 3.60GHz Turbo Quad Core CPU
  • Processor Cooler: Raijintek Aidos Direct Contact CPU Cooler
  • System Memory: Corsair 16GB DDR4 Vengeance LPX 2133MHz
  • Main Boot Drive: 120GB Samsung 850 EVO Series Solid State Drive
  • Additional Storage Drive(s): 1TB Seagate Barracuda Hard Drive 3.5″ SATA III
  • Graphics card: Zotac GeForce® GTX 970 Gaming Edition 4GB
  • Power Supply: Corsair VS550 550 Watt 80+ White Rated ATX
  • Peripherals: N/A
  • Monitor: N/A
  • Optical Drive: 24x LiteOn Internal DVD-RW Drive
  • Wireless: N/A
  • OS: N/A
  • Warranty: 4 Year Standard Warranty (2 Month Collect/Returns, 1 Year Parts, 4 Year Labour)
  • Price: £889.99

Packing and Accessories

The system is dispatched in an extremely large outer box which offers superb protection against damage during delivery. On the top, fragile tape has been used to alert the courier to the item’s delicate nature. This should reduce the possibility of the delivery driver throwing the package around. I do think there need to be side handles because the box’s large surface area makes it difficult to lift from an angle.

Inside the package is a huge collection of packing peanuts to prevent the chassis box from moving around. While these inserts can be irritating if they manage to scatter all over the floor, this is a small price to pay for the superb level of protection.

The chassis box utilizes thick cardboard which feels pretty sturdy and provides an additional layer of cushioning.

Despite the case’s budget focus, a great deal of attention has been paid to the packaging, including durable foam supports. The top cover also ensures that there’s very little chance of cosmetic damage occurring during the unboxing process.

Gladiator Computers have positioned a sticker over the power supply’s AC connector to prevent you from booting up the system with the foam pack still installed.

The foam insert is absolutely essential because it prevents each component from becoming dislodged. Furthermore, the cushioning should allow fan headers and other cables to remain in their optimal position. When it comes to packaging, foam packs are possibly the most important safety aspect and it’s great to see them used in this custom configuration.

In terms of accessories, the system is bundled with a thank you note, installation guide, driver/software disks, a funky door hanger and loads of documentation.

Other notable mentions include a power adapter, retail component packaging, CPU cover (required for warranty purposes), front bay cover where the optical drive is positioned, tasty Haribo sweets, various adapters and an assortment of fittings.

 

CPU-Z

GPU-Z

Computer Paints ‘New Rembrandt’ From Data Analysis

Rembrandt Harmenszoon van Rijn is regarded as one of the most imaginative and talented European artists of the Baroque era. As with any iconic artist, it’s always important to showcase their finest work in art galleries to inspire people to take up a creative hobby. The incredible advancements in modern technology allow us to take this one step further and employ data analysis to create new pieces. Recently, a team of engineers working alongside Microsoft managed to create a 3D-printed painting in the style of Rembrandt. The end result is absolutely breathtaking and replicates the texture of an authentic oil painting. Emmanuel Flores, director of technology on the project, told the BBC:

“We really wanted to understand what makes a face look like a Rembrandt,”

Information about Rembrandt’s previous works was compiled, and computers discovered key patterns to gauge his artistic style. For example, the software recognized how he would shape a subject’s eyes or other facial features. Machine learning algorithms were developed to create a new piece which accurately mimicked Rembrandt’s signature brush strokes. Flores also added:

“We found that with certain variations in the algorithm, for example, the hair might be distributed in different ways,”

“Our goal was to make a machine that works like Rembrandt,”

“We will understand better what makes a masterpiece a masterpiece.”

To limit the number of possibilities, the computational equations revolved around a portrait of a Caucasian male between the ages of 30 and 40 sporting a fashionable beard. Furthermore, details about the individual’s clothing ensured that the final result could be narrowed down using strict parameters. After the data was verified with digital tagging, humans selected algorithms based on their efficiency and allowed the computer to create the final piece. Once this was complete, a 3D texture was applied to correspond with the height and depth of paint used on typical Rembrandt works.

For more information about this intriguing project, please visit “The Next Rembrandt” website.

BIOSTAR RACING H170GT3 (LGA1151) Motherboard Review

Introduction


BIOSTAR might not be the most recognizable motherboard brand in western markets, but their pedigree for creating reliable products is worthy of praise. When compared to MSI, Gigabyte and ASUS, the company struggles to entice consumers with unique aesthetic designs. Furthermore, the BIOS and software package have been sorely lacking and in dire need of change. Thankfully, BIOSTAR have acknowledged these criticisms and decided to forge a brand new range based upon a racing theme to please petrolheads with an avid interest in enthusiast hardware. Each RACING motherboard sports a chequered-flag PCB and stylish LED illumination while introducing a new BIOS layout. Clearly, this is a major departure from BIOSTAR’s previous products, which evoked a fairly mundane appearance.

The BIOSTAR RACING H170GT3 is based on the mATX form factor and supports up to 64GB of DDR4 with a maximum speed of 2133MHz. Intel’s H170 chipset blocks multiplier overclocking, which means you have to rely on your processor’s default turbo frequency. Of course, there’s been some controversy surrounding BCLK overclocking on H170 and B150 motherboards to unofficially achieve boosts fairly close to traditional multiplier overclocking. Sadly, Intel has voiced their displeasure regarding this phenomenon and pressured manufacturers to disable BCLK overclocking via a BIOS update. As a result, we have to rely on stock figures to determine the motherboard’s performance. Previously, I’ve seen some astounding results when it comes to storage with BIOSTAR products, and I’m interested to see if this trend continues.

Specifications

Packing and Accessories

Here we can see the absolutely stunning packaging which contains a carbon fibre inspired cover and vibrant text. This coincides with the RACING focus and feels quite reminiscent of a luxury sports car’s interior.

On the opposite side, there’s a detailed diagram showing the motherboard’s layout and an explanation of its unique selling features. Once again, this is presented in a superb manner and makes you inquisitive about the product’s specification.

In terms of accessories, the motherboard includes a user’s manual, Vivid LED DJ instructions guide, SATA cables, driver disk, and I/O shield.

MSI Z170A GAMING M5 (LGA1151) Motherboard Review

Introduction

Since the release of Intel’s Z170 chipset, MSI has unveiled a fantastic, feature-rich motherboard range which caters to contrasting user demands. For example, the Z170A GAMING PRO CARBON is an excellent choice for consumers wanting a stylish black colour scheme and great reliability at an affordable price point. In contrast, the MSI Z170A XPOWER GAMING TITANIUM Edition’s gorgeous aesthetic design makes it one of the most innovative LGA1151 motherboards on the market. Of course, the iconic dragon styling on many MSI products has become a popular choice among the core gaming demographic. This red and black theme complements mainstream hardware and retails at very competitive prices across various performance tiers.

The MSI Z170A GAMING M5 is a mid-range motherboard sporting an attractive design and impressive specification. More specifically, the product is capable of housing two PCI-E M.2 storage devices and has support for USB 3.1 Gen2 connectivity. Not only that, the motherboard includes a one-year premium XSplit license and Nahimic audio enhancements. As you might expect, many of MSI’s leading technologies are incorporated, such as DDR4 Boost, Game Boost and much more. Given this particular model’s astonishing software suite and military-class components, I expect to see it rival higher-priced offerings rather well. Could this be the best value Z170 motherboard thus far for high-end users? Let’s find out!

Specifications

Packing and Accessories

MSI always does a phenomenal job when it comes to packaging design and the Z170A GAMING M5 is no different. The bold colours and stunning product snapshot contrast extremely well. This is one of the most eye-catching motherboard boxes I’ve seen and it showcases the motherboard’s beautiful appearance.

On the opposite side, there’s a brief synopsis of the motherboard’s key selling points such as support for 3-way crossfire, 2-way SLI and Audio Boost 3.0. This is presented in a slick manner and doesn’t alienate the end-user with technical jargon.

In terms of accessories, the motherboard comes with a user’s guide, driver disk, metal case badge, I/O shield, SLI bridge, registration details, basic installation guide and four SATA cables. Please note, the press sample I received was previously used by another media outlet, so there are only three SATA cables displayed in the photograph. Rest assured, the retail version will include four and be packaged without the need for an elastic band.

ASUS Sabertooth Z170 S (LGA1151) Motherboard Review

Introduction

ASUS have compiled a comprehensive Z170 motherboard range which caters to different sections of the consumer market. For example, the GAMING PRO line-up offers superb functionality and impeccable stability at an affordable price point. ROG products evoke a more premium feel and include a stunning software suite for power users. The Sabertooth brand revolves around a stringent testing procedure to ensure each motherboard exhibits unprecedented reliability. The extreme thermal testing and deployment of TUF components prioritize long-term durability. As a result, it’s a great option for consumers who demand a very high-end motherboard and have no intention of upgrading in the near future.

Typically, motherboards opt for a red and black colour scheme because it’s the most popular option among the core gaming audience. Some time ago, ASUS unleashed the limited edition Z97 Sabertooth Mark S, which utilizes an innovative white PCB and military camouflage. It’s quite rare to see a motherboard sporting a white theme, and while there is some competition from the MSI Krait series, ASUS is the only manufacturer to offer a pure white PCB.

The Sabertooth Z170 S is the spiritual successor to the Z97 Sabertooth Mark S and features a very unusual design philosophy. When adopting such a wacky colour scheme, it’s bound to have a polarizing reception and I’m fascinated to hear feedback on ASUS’ aesthetic choices. Looking beyond the visual aspects, I’m expecting to see some very impressive numbers given the premium electronics and DIGI+ Power Control.

Specifications

Packing and Accessories

The motherboard’s box is characterized by a white finish and contains camouflage highlights. This provides a great insight into the product’s unconventional styling and creates a web of intrigue. On the front, information regarding the 5 year warranty is displayed in a clear manner.

Moving onto the opposite side, there’s a detailed description of the product’s thermal radar monitoring made possible by the TUF ICe processor. On another note, the packaging outlines the rear I/O connectivity and basic motherboard layout.

Included in the package is a user’s guide, M.2 screws, driver’s disk, case badge, certificate of reliability, stickers and a gorgeous white I/O shield. As someone who loves the technical details of motherboards, it’s fantastic to read the reliability assessment document. Here, you can browse information regarding a huge array of tests such as moisture resistance, thermal shock, solder bath, salt spray and more!

There are also four SATA cables, a Q-Connector, CPU installation tool, rear I/O dust cover and SLI bridge. The Q-Connector is a really handy tool which eliminates the frustration factor when attaching front panel headers. Furthermore, the CPU installation tool is designed to minimize the contact time and pressure between your fingers and the CPU. While it’s not necessary for veteran builders, it could prevent beginners from causing damage during the build process.

Gigabyte P35W v5 Gaming Laptop Review

Introduction


Consumers typically purchase gaming laptops over their desktop counterparts due to portability and the need for hefty processing power on the move. Saying that, it’s exceedingly difficult to offer adequate thermal dissipation in a slim form factor, which limits the convenience factor of many flagship gaming laptops. These tend to be rather bulky and difficult to carry around on public transportation where space is quite restricted. Thankfully, efficiency improvements on mobile graphics chipsets and CPUs have enabled manufacturers to strike a better balance between performance and size. Granted, the top-tier options with dual GPUs still feel heavy, but less extreme alternatives can be surprisingly portable. For example, the Gigabyte P34W v3 provides a superb gaming experience and weighs a mere 1.81kg. Back when I reviewed it, the performance-to-size ratio astounded me, although the system’s load temperatures were higher than I hoped and felt like a concession too far.

The latest gaming laptop from Gigabyte to prioritize a thin design is the P35W v5, sporting an Intel i7-6700HQ, an ultra-fast 128GB NVMe boot drive, and a GTX 970M. Unlike the P34W v3, Gigabyte has opted for the 6GB variant of this graphics chip, but I can’t see the increased video memory making a substantial difference. On the other hand, some games with high memory utilization might fare better with an improved minimum frame rate. Another key benefit is the inclusion of DDR4 memory, and a greatly improved battery. As always, you can customize the specification to suit your needs, and the standard package utilizes a 1920×1080 display. If this seems a little underwhelming, you can select a 4K panel for an additional fee, but this has some drawbacks when it comes to performance. Given the P35W v5’s marvellously thin design, I’m interested to see the thermals under stress and determine if the cooling hardware is up to scratch.

Specifications

  • Name: Gigabyte P35W v5
  • Processor: Intel Core i7-6700HQ (2.6GHz base frequency, 3.5GHz turbo)
  • System Memory: 16GB Dual Channel DDR4 2133MHz
  • Main Boot Drive: Samsung NVMe MZVPV128 M.2 128GB SSD
  • Additional Storage Drive(s): HGST 1TB 7200RPM HDD
  • Graphics card: NVIDIA GeForce GTX 970M 6GB
  • Peripherals: N/A
  • Display: 15.6-inch 3840×2160 IPS LCD
  • Optical Drive: MATSHITA DVD-RAM UJ8G2
  • Wireless: Intel Dual Band Wireless AC 8260
  • Battery: Li-Polymer 11.1V, 75.81Wh
  • Weight: 2.3Kg with Battery
  • Dimensions: 385(W) x 270(D) x 20.9(H) mm
  • OS: Windows 10 Home
  • Warranty: 2 Year
  • Price: £1399

Packing and Accessories

Gigabyte has adopted a fairly understated theme for the packaging which showcases the beautiful display and professional aesthetic design. Furthermore, there’s a brief description of the laptop’s unique selling features, but I have to say the translation is confusing and doesn’t make a lot of sense. Perhaps this is because the press sample originates from the factory, and I’m sure Gigabyte will update the message for western markets.

The opposite side is almost identical barring another stunning snapshot of the product’s thin profile. This level of uniformity works well and evokes a premium feel. The packaging’s durable cardboard shell and soft inserts protect the item during transit meaning you shouldn’t encounter any cosmetic imperfections.

Included with the laptop is a power adapter, user’s guide, warranty card, driver disk, PowerDVD 12 software, swappable storage bay and light stickers. The swappable storage bay is an ingenious extra which allows you to remove the optical drive and fit an internal 9.5mm SSD instead. This is a great idea because many people use flash storage devices instead of optical media and the ability to easily house a traditional SATA SSD greatly enhances the laptop’s flexibility.

The 4K model contains a removable orange sticker near the lid which can be replaced with either a green or turquoise colour. Gigabyte even provides a complimentary pair of tweezers to obtain a neat finish and customize the theme to your own personal taste. Small touches like this create the perception that every customer’s needs have been attended to.

CPU-Z


GPU-Z

Overclockers UK Titan Dark Zone Gaming PC Review

Introduction


Overclockers UK is one of the leading stockists of PC hardware and their engineering team produces an impressive range of custom rigs to suit contrasting tastes. Whether you’re looking for a silent air-cooled build, or extreme overclocked PC with premium water cooling parts, there’s something designed for your specific requirements. Often, whenever a new game is released which sells remarkably well, consumers like to pay homage with a system based around its theme. This can be a challenge especially if the game in question doesn’t have a distinctive colour scheme. The Division is an open world third-person shooter set in a bleak vision of New York City ravaged by a smallpox pandemic. This intriguing setting and captivating multiplayer confrontations have proved to be incredibly popular! As a result, The Division became Ubisoft’s fastest selling game on record and attracted a very passionate community.

This success story has given Overclockers UK inspiration for their latest gaming PC, entitled the Titan Dark Zone. The system opts for orange braided PSU extension cables and vibrant LED lighting which creates a stunning aesthetic design. Combining the orange tones with jet black components is quite unusual and a reference to The Division’s box art. Therefore, the Titan Dark Zone is a dream come true for fans of this particular title and features a very potent specification capable of powering VR devices without any concessions. The Intel i7-6700K is professionally overclocked to 4.5GHz using the Alpenfohn Brocken 2 cooler. As a result, I expect to see an impeccable performance-to-noise ratio which surpasses many closed-loop coolers. On another note, the 16GB of 2400MHz DDR4 memory, factory-overclocked GTX 980Ti and Samsung 250GB boot drive should be able to provide a sensational gaming experience even on high-resolution monitors. Rather surprisingly, Overclockers UK have decided to use a non-modular power supply, which complicates cable management. On the other hand, the PSU has received a great deal of critical acclaim and showcases the careful decision-making process when designing a system’s specification.

Specifications

  • Name: Overclockers UK Titan Dark Zone
  • Case: Phanteks Enthoo Evolv ATX
  • Motherboard: MSI Z170A-SLI Plus
  • Processor: Intel Core i7-6700K Overclocked to 4.5GHz
  • Processor Cooler: Alpenfohn Brocken 2
  • System Memory: Team Group Elite 16GB (2x8GB) 2400MHz CL16 RAM
  • Main Boot Drive: Samsung 250GB 850 Evo SSD
  • Additional Storage Drive(s): Seagate 2TB 7200RPM HDD
  • Graphics card: MSI GeForce GTX 980ti Armor X2 6GB
  • Power Supply: XFX TS 750W 80 Plus Gold
  • Peripherals: N/A
  • Monitor: N/A
  • Optical Drive: N/A 
  • Wireless: N/A
  • OS: Microsoft Windows 10 Home 64 Bit
  • Warranty: Three Year (24 Month Collect and Return plus 12 Month labour) Mainland UK and Ireland Only
  • Price: £1549.99

Packing and Accessories

The system arrived in an extremely large box which couldn’t fit on my photography backdrop, which is why I’ve taken a snapshot in the hallway to emphasize the package’s mammoth size. Overclockers UK always adopt an attentive approach to packaging and employ durable materials which enhance the level of protection substantially. It’s clear that the company has considered the strain delicate PCs are put under during transit and taken every necessary step to dramatically reduce the probability of damage occurring.

Once the top cover has been removed, we can see an ample supply of durable cardboard inserts which hold the system firmly in position.

The Titan Dark Zone is placed in the original chassis box and secured with strong tape. Honestly, I’d be extremely surprised if you received the system with even cosmetic imperfections considering multiple layers were used for protective purposes.

There’s additional support inside the chassis box via two strong polystyrene blocks.

The system’s internal components are surrounded by three Instapak foam pieces. These are essential additions which protect the CPU mounting and prevent the graphics card from applying too much pressure on the PCI-E slot during delivery.

In terms of accessories, OCUK included a Welcome Pack and a Windows 10 Home OEM DVD containing the product code. The Welcome Pack outlines the system’s specification and warranty terms in an easy-to-understand manner. Personally, I love the overall presentation and the solder-joint design on the front cover.

CPU-Z


GPU-Z

Far Cry Primal Graphics Card Performance Analysis

Introduction


The Far Cry franchise gained renown for its impeccable graphical fidelity and enthralling open-world environments. As a result, each release is incredibly useful for gauging the current state of graphics hardware and performance across various resolutions. However, Ubisoft’s reputation has suffered in recent years due to poor optimization on major titles such as Assassin’s Creed: Unity and Watch Dogs. This means it’s essential to analyze the PC version in a technical manner and see if it’s really worth supporting with your hard-earned cash!

Far Cry Primal utilizes the Dunia Engine 2 which was deployed on Far Cry 3 and Far Cry 4. Therefore, I’m not expecting anything revolutionary compared to the previous games. This isn’t necessarily a negative though, because the amount of detail is staggering and worthy of recognition. Saying that, Far Cry 4 was plagued by intermittent hitching and I really hope this has been resolved. Unlike Far Cry 3: Blood Dragon, the latest entry has a retail price of $60. According to Ubisoft, this is warranted due to the lengthy campaign and the amount of content on offer. Given Ubisoft’s turbulent history with recent releases, it will be fascinating to see how each GPU from this generation fares and which brand the game favours at numerous resolutions.

“Far Cry Primal is an action-adventure video game developed and published by Ubisoft. It was released for the PlayStation 4 and Xbox One on February 23, 2016, and it was also released for Microsoft Windows on March 1, 2016. The game is set in the Stone Age, and revolves around the story of Takkar, who starts off as an unarmed hunter and rises to become the leader of a tribe.” From Wikipedia.

Gigabyte Z170-Gaming K3 (LGA1151) Motherboard Review

Introduction


Intel’s current iteration of enthusiast processors offering impressive overclocking headroom incurs a fairly hefty premium compared to the previous generation, especially if you’re opting for the i7-6700K. Unfortunately, the retail version of this CPU sporting a 3-year warranty still teeters around the £300 mark and falls into a similar budget to the six-core 5820K. The real savings when selecting the Z170 chipset revolve around cheaper motherboards which usually cater towards the gaming demographic with LED illumination, unusual colour schemes and a comprehensive software suite. It’s astonishing to see the kind of performance and bundled list of features on products under £100. At this price, there’s fierce competition and some manufacturers have struggled to outline the value of H170/B150 alternatives due to the narrow price difference compared to affordable Z170 options.

The latest motherboard from Gigabyte targets the mainstream audience utilizing a single discrete graphics card and an overclocked Skylake processor. While it does technically support CrossFire, the lack of x8/x8 functionality might be a deal breaker for users wanting the absolute maximum bandwidth. There’s no support for SLI setups either, which may be a contentious issue. To be honest, I don’t see this as a huge problem because the motherboard retails for approximately £95 and dual-card configurations are fairly niche in today’s market. Despite the very low price, Gigabyte has still implemented 32Gb/s M.2 storage, a great audio solution and USB 3.1. From the outset, it seems Gigabyte managed to achieve this excellent specification on a budget by removing SLI support. I’m interested to see the stock performance numbers compared to high-end solutions, though, to determine the motherboard’s credentials. Could this be the best value gaming Z170 motherboard ever released? Let’s find out!

Specifications

Packing and Accessories

The Z170-Gaming K3 is packaged in a visually appealing box showcasing the attractive dual tone PCB design. This draws you into the product and evokes a sense of excitement prior to the unboxing process.

On the rear, there’s a huge description regarding the motherboard’s high-speed connectivity, premium audio hardware and networking with traffic prioritization for gaming purposes. This is presented in an easy to understand manner, and the clear pictures do a great job of relaying technical information without bamboozling the end-user.

In terms of accessories, the motherboard comes with an I/O shield, user’s guide, G Connector, case badge and SATA cables opting for a very stylish metallic look. This is the first time I’ve seen a colour of this ilk, but I have to admit it’s a nice touch and looks fantastic. The G Connector is really useful when connecting those fiddly front panel connectors and improves the user experience when building a new system. Other additions include a rather fetching door hanger, driver disk, and World of Warships content code.

Inno3D GTX 980Ti iChill Black Graphics Card Review

Introduction


Closed-loop liquid coolers have become extremely popular in the CPU market due to the cleaner build and greater space around the CPU socket compared to traditional air cooling hardware. This means you can install an all-in-one liquid cooler without having to make concessions in terms of memory compatibility or worry too much about your motherboard’s PCI-E arrangement. As you might expect, all-in-one liquid coolers have progressively moved into the GPU sector to offer improved overclocking headroom and a lower noise output. There are some interesting parallels between CPU and GPU all-in-one liquid cooling though, which need to be addressed.

Firstly, many air coolers like the Noctua NH-D15 can outperform Asetek units, while being much quieter. It’s a similar picture with graphics cards because proficient air cooling setups including the Gigabyte Windforce X3 and Sapphire Tri-X provide a superb noise to performance ratio. Liquid cooled graphics cards have a price premium and involve a more complicated installation process. It’s important to remember that Maxwell is a very mature and efficient architecture which allows vendors to enable a 0dB idle fan mode. Despite my own qualms about closed-loop liquid cooling, it’s fantastic to see products which cater to a different target market. There’s clearly a demand for pre-assembled liquid cooled graphics cards, and their appeal is bound to grow in the next few years.

Today, we’re taking a look at the Inno3D GTX 980Ti iChill Black which utilizes a very powerful hybrid cooling solution. The GPU incorporates a traditional fan which only switches on during heavy load, in addition to a 120mm fan/radiator combination. The Arctic Cooling radiator fan is constantly on but has a very low RPM curve to maintain silent running. This impeccable hardware allows for an impressive core clock of 1203MHz and default boost reaching 1304MHz. The memory has also been increased to 7280MHz. As you can see from the chart below, this isn’t the greatest configuration we’ve encountered from the factory, but it’s exceedingly fast and should be a top performer. It will be fascinating to contrast this graphics card with the marvellous Inno3D GTX 980Ti X3 Ultra DHS which opts for a hefty air cooling design.

Specifications:

Packing and Accessories

The Inno3D GTX 980 Ti iChill Black comes in a huge box to properly house the closed loop cooler’s tubing and protect against leaks during shipping. Honestly, the picture doesn’t provide an accurate depiction of the packaging’s size. I have to commend Inno3D because they have taken the precautionary steps to reduce the possibility of damage occurring and utilized strong foam inserts as cushioning materials. The box itself features an attractive render of the GPU, and outlines its specification.

On the rear portion, there’s a brief synopsis of NVIDIA’s Maxwell architecture. I’m a bit surprised to see the back doesn’t contain any information about the liquid cooling solution and the acoustical benefits compared to NVIDIA’s reference cooler.

In terms of accessories, the graphics card is bundled with mounting screws, 6-pin PCI-E to molex adapter, case badge, DVI-D to VGA adapter and installation guide. There’s also a driver’s disk which you should disregard, a copy of 3DMark, and other documentation. This is a great selection of items and provides everything you need to get started! The mouse mat is surprisingly high-quality and relatively thick.

Gigabyte GeForce GTX 980Ti Xtreme Gaming Graphics Card Review

Introduction


NVIDIA’s cogent strategy to launch the Titan X at $999 and subsequently release the GTX 980Ti with similar performance at a significantly reduced price was a master stroke. This made the 980Ti compelling value and a great choice for high-end consumers wanting the best possible experience at demanding resolutions. Admittedly, there isn’t a GPU on the market capable of driving a 4K panel at maximum detail, but you can attain 60 frames per second with reduced settings. Evidently, the 980Ti has proved to be a popular choice, especially when you take into consideration that factory-overclocked models can easily pull away from NVIDIA’s flagship graphics card. While there is some competition from the Fury X, it’s not enough to dethrone custom-cooled 980Ti models.

Some users might argue that the upcoming Pascal architecture, built on the 16nm manufacturing process and utilizing ultra-fast HBM2 memory, is reason enough to hold off buying a top-tier Maxwell solution. However, the current estimate suggests Pascal won’t launch until Q2 this year, and there’s no indication regarding pricing. As always, any new product has a price premium and I expect enthusiast Pascal cards to retail at a high price point. This means purchasing a Maxwell-based GPU right now isn’t a terrible option unless you require additional power to enjoy 4K gaming and have deep pockets. One of the best custom-designed GTX 980Ti cards on the market is the Gigabyte G1 Gaming. This particular GPU rapidly gained a reputation for its overclocking ability and superb Windforce triple-fan cooling hardware.

The latest addition to Gigabyte’s graphics range is the GTX 980Ti Xtreme Gaming, sporting a 1216MHz core clock, 1317MHz boost, and memory running at 7200MHz. One major improvement is the use of illuminated RGB rings behind the fans which creates a very unusual and stylish appearance. Gigabyte’s GPU Gauntlet is a binning process which selects the best performing chips with impressive overclocking headroom. Once discovered, the top chips are incorporated into the Xtreme Gaming and G1 Gaming series. By default, the Xtreme Gaming is bundled with a hefty overclock and should offer sensational performance, although I expect to see some further gains due to the excellent cooling and stringent binning procedure. Could this be the best 980Ti on the market thus far?

Specifications:

Packing and Accessories

The product comes in a visually appealing box which outlines the extreme performance and gaming focus. I really like the sharp, dynamic logo with bright colours which draws you into the packaging.

On the rear side, there’s a brief description of the Windforce X3 cooling system, RGB illumination, GPU Gauntlet, and premium components. The clear pictures provide a great insight into the GPU’s main attributes and it’s presented in such a slick way.

In terms of accessories, the graphics card includes a driver disk, quick start guide, case badge, sweat band and PCI-E power adapter. It’s quite unusual to see a sweat band, but I’m sure it could come in handy during a trip to the gym or intense eSports contest.

Mushkin Blackline Ridgeback DDR4 2400MHz 16GB Memory Kit Review

Introduction


The latest memory kit to arrive for review purposes is part of Mushkin’s Blackline Ridgeback range. The sample we received opts for a 16GB capacity, a 2400MHz speed and timings of 15-15-15-35. This is achieved with a very respectable voltage rating of 1.2V and showcases the modules’ efficiency. Mushkin is a pioneering memory manufacturer formed in 1994 and continues to release new DIMMs sporting unique designs at a very competitive price point. Unlike their competitors, Mushkin assembles and hand-tests each memory kit in the USA while strictly monitoring their production line.

This results in exceptional reliability and minimizes the potential for customer returns. The company also selects low-latency modules to find a suitable balance between raw frequency and operating latency. On another note, every Mushkin memory kit is backed by a lifetime warranty and an approachable customer service team. Given their reputation in the industry, I expect the memory kit to perform admirably at stock values and have some good overclocking headroom. Let’s see how it compares to other dual channel alternatives!

Specifications

Packaging and Accessories

The product comes in a traditional blister pack, and adopts the slogan, “Gamers take notice, rivals take notes.” I quite like the background design which looks rather striking and adds a dash of colour.

On the rear, there’s a very detailed installation guide containing extremely clear diagrams. Inserting the modules into position is a very simple process, but it’s always good to include instructions for newcomers without any previous system building experience.

A Closer Look

The DIMMs convey a professional look via the neutral colour scheme and should suit a wide range of system builds. Additionally, the angled heat spreaders provide a distinctive appearance without straying too far from the sophisticated, understated design. I’m also really keen on the silver accents, which add some visual flair. Sadly, the green PCB spoils the black theme somewhat and ruins the overall level of synergy, although once the modules are installed you shouldn’t notice it from a distance. Thankfully, there are some SKUs which pair the same heat spreader design with a black PCB to complement it perfectly. Overall, Mushkin has done a commendable job with the product’s aesthetics and ensured it appeals to the core gaming demographic.

MSI Z170A GAMING PRO CARBON (LGA1151) Motherboard Review

Introduction


MSI has rapidly established itself as one of the most reputable motherboard manufacturers and constantly strives to enhance the user experience through an intuitive BIOS interface and marvellous reliability. Additionally, the company offers a huge range of products to suit various colour schemes and often creates truly distinctive designs, as demonstrated by the gorgeous Z170A XPOWER GAMING TITANIUM motherboard. Some time ago, MSI released the Z170A GAMING PRO, which adopted a fantastic red and black theme to please the core gaming demographic. However, it’s difficult to stand out when using this colour scheme because manufacturers tend to fixate on a safe, popular design. That’s not to say there’s anything particularly wrong with utilizing these colours, but I’d prefer to see more vendors breaking the mould through truly unusual aesthetic choices.

Since the Z170A GAMING PRO’s release, MSI has listened intently to user feedback and decided to construct a brand new model, the Z170A GAMING PRO CARBON. At first glance, the only difference appears to be the new carbon fibre skin. However, this isn’t the case because MSI has made a raft of changes to enhance the motherboard’s connectivity. More specifically, the Z170A GAMING PRO CARBON features two USB 3.1 Gen 2 ports, one being Type-A while the other is Type-C. Furthermore, the redundant PCI slot has been dropped in favour of a fourth PCI-E x1 slot. Thankfully, the 180-degree SATA ports have been removed and replaced with a more suitable arrangement using right-angled connectors. Finally, the USB 3.1 Gen 1 layout offers two ports on the rear and four via an internal header, while USB 2.0 now totals eight ports in a four front, four rear setup.

With a recommended retail price of £119.99, the Z170A GAMING PRO CARBON is destined to compete alongside the ASUS Z170 PRO GAMING. As a result, it will be fascinating to see how the different products compare and I expect the Z170A GAMING PRO CARBON to remain very competitive in synthetic testing.

Specifications

Packaging and Accessories

The motherboard comes in a visually appealing box which outlines the RGB functionality and gaming focus. I particularly like the neon design of the background vehicle, which corresponds with the sort of lighting embedded onto the motherboard’s PCB.

The rear portion is packed full of information regarding the motherboard’s layout, impressive software package and premium-grade hardware. This is presented in a really clean and concise manner with statistics to help quantify the importance of each unique feature.

In terms of accessories, there’s a detailed user’s guide, product registration card, cable labels, CPU installation guide and driver disk. It’s great to see the inclusion of cable labels because they help with diagnostics if you have multiple drives in a RAID configuration and struggle to determine which is the boot device.

Here we can see the bundled I/O shield, SLI bridge and SATA cables. The I/O shield’s red lettering and MSI dragon logo evoke a luxury feel and emphasize the motherboard’s target audience.

Patriot Viper 4 DDR4 3200MHz 16GB (2x8GB) Dual Channel Memory Kit Review

Introduction


DDR4 memory kits are steadily superseding DDR3 DIMMs due to competitive pricing and the advent of Intel’s LGA1151 platform, which supports speeds in excess of 3200MHz. Furthermore, DDR4 modules require less voltage to remain stable despite the typical increase in memory bandwidth. Recently, professional overclocker Shamino set an astounding world record and overclocked the G.Skill Ripjaws 4 to 4255MHz using a mere 1.3 volts. Clearly, this is an extreme case, and the majority of DDR4 kits available to consumers range between 2400MHz and 4000MHz. Plus, the performance difference in gaming tasks primarily revolves around your system’s graphics card and CPU. Nevertheless, it’s still important to select high-quality DIMMs to keep your PC perfectly stable and complement the other components.

The Patriot Viper series is synonymous with excellent memory speeds at an affordable price point. Here’s a brief description of the product directly from the manufacturer:

“Patriot Memory’s Viper 4 Series memory modules are designed with true performance in mind. Built for the latest Intel® Skylake processor utilizing the 100 series platform, the Viper 4 series provides the best performance and stability for the most demanding computer environments.

The Viper 4 series utilizes a custom designed high performance heat shield for superior heat dissipation to ensure rock solid performance even when using the most taxing applications. Built from the highest quality Build of Materials, Patriot Memory’s Viper 4 Series memory modules are hand tested and validated for system compatibility.

Available in dual kits, 8GB, 16GB and 32GB kits, Patriot Memory’s Viper 4 Series will be offered at speeds from 2400MHz up to 3600MHz and XMP 2.0 ready. Hand tested for quality assurance the Viper 4 series is backed by a lifetime warranty.”

As you can see, the latest version of the Viper range comes in a variety of capacities and memory speeds to suit a wide range of user requirements. Given the impressive 3200MHz speed, 16-16-16 timings and respectable voltage, I expect to see some superb numbers which legitimately rival the best dual channel kits we’ve tested!

Specifications

Packaging and Accessories

Patriot has adopted a clean, bold design for the memory’s packaging which makes it easy to read the key specifications while admiring the DIMMs’ colour scheme. Here we can see a visual rundown of the memory’s speed, capacity, XMP version and other essential statistics. Many kits on the market utilize pretty plain blister packs which don’t exude a luxury feel. In this case, the packaging draws you in and leaves a very positive initial impression.

On the rear section, there’s information about Patriot’s lifetime warranty, a brief synopsis of the product, and links to the company’s presence across various social media platforms.

A Closer Look

From an aesthetics standpoint, the DIMMs have a rather understated look and target the mainstream gaming audience. Any red and black heatspreader combination is going to become a popular choice, and the different shades combine quite nicely. Another striking touch is the contrast between the textured black finish and the matte section towards the PCB. I’m also quite fond of the sophisticated Viper logo and the small gap in the main heatspreader, which creates an impressive visual effect. Sadly, the green PCB is difficult to overlook and detracts from the attractive design. If a black PCB was used instead, the memory would be the perfect choice for a high-end build. Despite these qualms, once the RAM is installed, you’re not going to notice the PCB colour in an enclosed chassis.

Smart Me Up Demos Real-Time Face Recognition Analysis @ CES 2016

CES 2016: Smart Me Up is a French company which has created a highly advanced software suite to monitor a person’s age, gender, head pose and other essential statistics. This all happens in real time and constantly updates as the software learns more about your facial characteristics. During a hands-on demo, the software initially misread my age by a decent margin, but as time progressed, the age estimate became almost perfect. Apparently, the face recognition is designed to be an integral part of smart technology in the home and provide a way to personalize various devices. It could also be used in robotics, medicine and other vital industries.

It’s still fairly unclear when the facial recognition software will be implemented and how wide its appeal will be, although it certainly captured people’s imaginations during CES.

Would you like to see this kind of facial analysis become a part of modern homes?

Performance Overview: The Last AMD CCC 15.11.1 Beta

Introduction


An unforeseen turn of events has taken place over the last few months. AMD split up its processor and graphics divisions, and we recently heard of the demise of Catalyst Control Centre to make way for Radeon Software. I for one was not expecting to see a graphics driver before Radeon Software: Crimson Edition was released. Why do I think this? If AMD is struggling as much as turnover figures and rumours suggest, why would it waste effort on something that is being discontinued for a new version? That’s like announcing HBMv2 will be released in January but releasing an entire range of graphics cards on HBMv1 in December. I’m not saying this is a bad thing, far from it; I welcome AMD driver updates because it shows that the company is still in the running, and recent news suggests that more funding will be invested into the graphics drivers in the future to level the playing field with NVIDIA. Early reports suggest that this new driver and the one just before it, 15.11, are very good performance-enhancing releases for newer games such as Star Wars Battlefront, Fallout 4, Assassin’s Creed, etc.

So today we take a look at the very last CCC driver, 15.11.1 Beta. If you are unaware, the naming is simply [Year].[Month]; the additional “.#” is appended if there are two or more updates within a month, and those are then named in chronological order. This makes it extremely easy to work out which is the latest driver, and troubleshooting becomes easier if you can only remember approximately when you started having problems (if they’re driver related).
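As a purely illustrative sketch of how that scheme reads (this is not an AMD tool, just a few lines to show the ordering), the version string splits into year, month and an optional in-month revision, which can then be compared chronologically:

```python
# Hypothetical helper to illustrate the [Year].[Month](.revision) Catalyst
# naming scheme; not part of any AMD software, just a reading aid.
def parse_catalyst_version(version: str) -> tuple:
    parts = [int(p) for p in version.replace(" Beta", "").split(".")]
    year, month = parts[0], parts[1]
    revision = parts[2] if len(parts) > 2 else 0  # second+ release in a month
    return (year, month, revision)

# 15.11.1 Beta is newer than 15.11 because of the higher in-month revision,
# and both are newer than 15.7.1 from earlier in the year.
assert parse_catalyst_version("15.11.1 Beta") > parse_catalyst_version("15.11")
assert parse_catalyst_version("15.11") > parse_catalyst_version("15.7.1")
```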

This new driver doesn’t really bring anything new in terms of features, apart from an updated list of graphics cards that are eligible for higher Virtual Super Resolution modes, such as the R9 380 now being able to support 3840×2160.

Indie Dev Will no Longer Offer Free Press Copies

Paul Stephen-Davis, the CEO of Retro Army Limited, has announced the company will no longer provide review codes to press outlets. This decision comes after a wave of scammers and key resellers exploited the developer while pretending to represent legitimate websites. Indie developers struggle to deal with the PR side given their limited budgets, and the line between press and consumer has become quite vague in recent years. Now, each developer has to judge whether a Twitch streamer, YouTuber or member of the print press is eligible for a press copy, and the number of requests has risen exponentially.

As a result, I greatly sympathize with his position and feel too many “press” are being allowed access to pre-release games. Currently, many reviews come from individuals with another job who can’t devote themselves to this career path full-time. I believe press keys should only be provided to “professional” reviewers who do this as their sole means of living. The developer argues:

“Personally I think it’s unfair to players that buy the game when others(reviewers) are getting it for free.”

“Our main policy is to protect and take care of our players first.”

This is where I disagree with the comments made, and in quite a strong fashion. While being a member of the full-time professional press is possible, it’s very unlikely in 2015. Most reviewers are unpaid, on a pitiful wage or rely on Patreon funding. It’s absurd to ask reviewers to fund the cost of reviews themselves considering they are usually giving their time and expertise for free. Most job listings for a gaming website describe a review position as voluntary and argue compensation is provided through game codes and access to press events.

Consumers might be irked that the press receive games for free, but the reality is they are on a much higher wage than 99.9% of the gaming press. Writing is a demanding, seven-day-a-week job without 9-5 office hours. Unfortunately, the reputation of gaming journalists is atrocious, and some of the perceptions are valid. Members of the press can lash out on social media or have potentially biased relationships without disclosing any information prior to publishing.

To reiterate, I understand how frustrating this must be for indies, but they must realize how difficult it is to forge a career in the gaming media. Ironically, low- to mid-tier press have virtually no money and work on a smaller budget than a tiny studio. In an ideal world, I would like the most talented and insightful journalists to be paid a good living wage. However, with shrinking ad revenue, community media and Adblock, it seems this could be a dying art form.

Do you think games critics should receive titles for free?

Super Trench Attack 2, a turn-based squad survival game set in a fictional world war, is currently available from Steam for £4.99.

Batman: Arkham Knight Performance Analysis

Introduction


Finally, Batman: Arkham Knight has been released to the world after an agonising wait. The next instalment of the franchise has been two years in the making and is set to top all of the previous games by adding amazing detail and unparalleled performance.

Following on from our previous game performance analysis articles, we will take nine of the most up-to-date graphics cards available and pit them against the game at three of the most popular resolutions.

As with all games at release, there are issues and setbacks; however, this particular title had more than a rough road to release, and the developers have decided to pull it from sale until it is fully optimised. The reasons behind this are still unclear; maybe it was a poor optimisation effort or maybe the drivers didn’t support the game as expected, but we believe our copy and testing procedure demonstrate what the developers had originally hoped for.

Let’s begin shall we?

GTA V GPU Performance Review

Introduction


Grand Theft Auto V was released on Tuesday and, along with it, NVIDIA has released yet another Game Ready driver. This time, AMD has decided to release one too, with both offering optimised drivers for most of their graphics card ranges.

Normally, we would only test a single driver and graphics card manufacturer, but because this is oriented towards GTA V performance, things are going to get a little more condensed.

Today we are going to be looking at seven of the top graphics cards from NVIDIA and AMD, pitting them against each other to see which performs the best in GTA V, under our own choice of settings of course.

NVIDIA has released the GTA V optimised driver, GeForce Game Ready 350.12, and AMD has released a beta driver, Catalyst 15.4. Either can be downloaded by clicking the associated link.

Let’s begin shall we?

3D Robotics Launch DroneKit Open Source API for App Development

It looks like 3D Robotics has launched the DroneKit API for drone app development as free, open-source software. The API can be used to develop apps for drones or onboard drone software, and it is designed to be completely flexible and multi-platform.

“Unlike other APIs for drones, there are no levels of access to DroneKit; it’s completely flexible and open,” said Brandon Basso, VP of software engineering for 3DR. “The platform works on laptops as well as mobile devices. Best of all, once an app is created, the app automatically works on any computing platform – the interface is always the same.”

The company is said to have released the API to the community so that people interested in drones are able to customise how they use them in the field. The DroneKit API is said to allow you to set waypoint flight paths and follow GPS targets, while also allowing developers to view flight playbacks and log analysis.

The features mentioned above are just an example of what the API brings to developers, as it comes with a variety of features which were previously unavailable to drone enthusiasts.
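To give a rough idea of what that looks like in practice, here is a minimal sketch using the DroneKit-Python bindings; the connection string, coordinates and altitude below are placeholders of my own rather than anything from 3DR’s announcement:

```python
# Minimal DroneKit-Python sketch: connect to a vehicle, switch to guided mode
# and fly towards a single GPS waypoint. All values here are illustrative only.
from dronekit import connect, VehicleMode, LocationGlobalRelative

# Placeholder connection string; a real setup might use a serial port or
# telemetry radio instead of a local UDP endpoint.
vehicle = connect("udp:127.0.0.1:14550", wait_ready=True)

vehicle.mode = VehicleMode("GUIDED")   # accept programmatic waypoints
vehicle.armed = True                   # assumes pre-arm checks have passed

# Send the vehicle towards an arbitrary waypoint at 20 m relative altitude
waypoint = LocationGlobalRelative(51.5074, -0.1278, 20)
vehicle.simple_goto(waypoint)

# Telemetry such as position and battery state is exposed as attributes
print(vehicle.location.global_relative_frame)
print(vehicle.battery)

vehicle.close()
```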

Thank you TweakTown for providing us with this information

NVIDIA Game Ready 347.52 Driver Analysis

Introduction


Driver updates: those seemingly pointless notifications at the bottom of your screen that always seem to pop up just as you start a movie or game. Some? Completely pointless. Others? Performance-enhancing. Today we’re going to take a look at the brand new NVIDIA Game Ready Driver 347.52 and compare it to its most recent comparable driver update, Game Ready Driver 347.25. Using our brand new test bench, we’ll be retesting the GeForce GTX 980, GTX 970 and GTX 960, as these three graphics cards were outlined in the release notes. Almost all of the previous generations are also included within this driver update. All information regarding the 347.52 driver can be found here. The chart below, taken from NVIDIA’s driver site, shows NVIDIA’s projected gains among the GTX 900 series.

NVIDIA released the GeForce GTX 980 and GTX 970 during September 2014 and held out until the end of January 2015 to release its storming, mid-range GeForce GTX 960. All testing on our new test bench was completed using the most recent Game Ready Driver, 347.25. This came to us as a day-one driver with the GeForce GTX 960 and proved to be a very stable update. During our testing, we will only analyse games; benchmark software generally isn’t affected by driver updates unless specified.

New Google Technology Can Automatically Describe Images

Google has unveiled an interesting new technology that can automatically analyse images and provide, in some cases, accurate descriptions of what appears in them.

The “intelligent computer vision software” can scan a collection of images and provide general descriptions of what can be seen. Those descriptions can then be tagged to the images and used to make a Google search more accurate or to, according to Google, “eventually help visually impaired people understand pictures” and “provide alternate text for images in parts of the world where mobile connections are slow”.

While the technology isn’t yet completely accurate for every single image, it does, however, produce rather amazing results for some images:

Source: The Next Web

Why You Should Never Buy A Knockoff iPad Charger

Blog writer Ken Shirriff has done a very thorough teardown of the official iPad charger and of one of the many cheaper counterfeit ones that are abundant on the market. The temptation to pick up the cheaper models is high, given that official offerings are close to $20 and the cheaper ones can be as little as $3-4. They may look identical on the exterior, but the same cannot be said for the internal hardware.

“From the outside, the real charger (left) and counterfeit charger (right) are almost identical. If you look very closely, you can spot a few differences in the text: The counterfeit removed “Designed by Apple in California. Assembled in China” and the manufacturer “Foxlink”[1], probably for legal reasons. (But strangely, the counterfeit still says “TM and © 2010 Apple Inc.”) The counterfeit charger displays a bunch of certifications (such as UL) that it doesn’t actually have. As you will see below, there is no way it could pass safety testing.”

Aside from the many manufacturing differences on the interior, which ranged from poor soldering, to cheaper and poorly installed components, his testing showed that the counterfeit doesn’t even put out as much power as it should, with a lot of noise on the power delivery. If this were a PSU like the one in your computer, your PC would likely be dead by now and the same is likely to happen to your device before long, and that’s if the charger doesn’t catch on fire first.

For more information, I suggest you check out the full article on Ken’s blog here. One thing is for certain: it’s a clear reminder of why it’s not always a great idea to try to save a few bucks with a cheaper product, as it could prove damaging to your device, or worse, yourself.

 

Images courtesy of Ken Shirriff.

Analysis: Were EA Right About SimCity? The Evidence Says That They Were!

It’s been two months now since the release of the latest Sim City, and it sure has been an adventure in terms of PR for consumers and EA alike. Yet two months is a long time in the gaming world, and people are quick to move on to the next farce and start yelling at that, forgetting whatever was wrong with Sim City in the first place.

For those of you who don’t know, when Sim City launched it caused a huge uproar from the public, most notably because people who had gone out and bought the game with their hard-earned cash couldn’t play it due to what is believed to be DRM.

EA’s defence was that it didn’t have enough bandwidth on its servers, which prevented people from playing because the game is so inherently dependent on online access for its social features. EA says it’s just like an MMO: it needs the internet to live and breathe as intended, because this was the company’s vision of how the game would operate and it couldn’t do so without an online connection. This has been widely debated to be a cover story for DRM; personally, I don’t think it is DRM, but I do believe it’s a system that acts like DRM, either directly or indirectly.

Yet with all that fuss, the rage of the consumer seems to have dwindled, and what was once a riot has been reduced to a dull roar; even EA’s Facebook page is no longer endlessly trolled with hundreds of comments along the lines of “you should fix your f***ing game” on every status it posts.

Yet was it worth it? After all this fuss about DRM, piracy, always-online gaming and gamers who can’t play their game due to server issues, has the system settled into place, and does it work? Apparently, yes! Or should that be annoyingly yes? I’ll let you decide that one.

A quick search on Google shows that there are seemingly illegal downloads for the new Sim City, but on closer inspection, all of them appear to be fake or a virus disguised as an “offline play” patch. Some players have hacked the game to play offline, but not without drawbacks, and it seems some extensive re-coding would be required to fully achieve such a system; I doubt EA will rush to do that anytime soon.

Next stop: torrent websites. If you’ve never heard of them, you’ve had your head stuck in a box for the last few years! The Pirate Bay and countless others provide a source for illegal downloads, and just about any game you can think of ends up there pretty quickly… except Sim City, or at least not the new one. A quick search of several of the big sites turns up nothing; I did find a few false positives that had already been voted as fake, but that’s it, I couldn’t see any legitimate torrents for this game.

So whatever EA has done, it seems to be fixing things; there are fewer and fewer, or maybe even no, reports that people can’t play the game anymore, and while some may complain it’s not that great a game compared to other Sim City titles, that isn’t the subject I want to discuss. DRM or no DRM, it doesn’t matter: EA has pretty much stamped out piracy of this game, and if I’m honest, I’m not sure if this is a sign of good things to come or a sign of a dark future for PC gaming.

EA may have been successful in stopping piracy for this game, but that doesn’t mean the game has been successful. The game would likely have done better without the online features or DRM, and overall I think this is a scar that will take a long time to heal in gamers’ hearts, if it ever heals at all. Stopping piracy is one thing, but it could have more negative side effects than positive ones, most likely in the form of end-user feedback or, worse, a drop in sales due to protest.

It’s been suggested that no one has cracked the game yet because no one likes it, and while I’m sure there are plenty of haters out there (ok, a LOT of haters), there will be a lot of fans too. I expect that someone somewhere will crack this game soon enough and it will be pirated, but for now, EA’s security is holding; the question is, for how long? Either way, stripping the DRM-like features of Sim City is likely too little, too late to save this game.

Are you still playing Sim City? Or have you chosen never to play it at all? Sound off in the comments section below, as I’d love to know how you feel about the game now.

Of course, there is still a cure for those affected by Sim City 2013 and you can find it here.