Ashes of the Singularity is a futuristic real-time strategy game offering frenetic contests on a large scale. The huge number of units scattered across varied environments creates an enthralling experience built around complex strategic decisions. Throughout the game, you will explore unique planets and engage in spectacular air battles. This bitter war between the human race and a masterful artificial intelligence revolves around an invaluable resource known as Turinium. If you’re into the RTS genre, Ashes of the Singularity should provide hours of entertainment. While the game itself is worthy of widespread media attention, the engine’s support for DirectX 12 and asynchronous compute has become a hot topic among hardware enthusiasts.
DirectX 12 is a low-level API with reduced CPU overhead, and it has the potential to revolutionise the way games are optimised for numerous hardware configurations. By contrast, DirectX 11 is far less efficient, and many mainstream titles suffered from poor scaling that failed to properly utilise the potential of current graphics hardware. On another note, DirectX 12 allows users to pair GPUs from competing vendors and run multi-GPU setups without relying on driver profiles. It’s theoretically possible to achieve widespread optimisation and leverage extra performance using the latest version of DirectX.
Of course, Vulkan is another alternative, which works across various operating systems and adopts an open-source ethos. However, the focus will likely remain on DirectX 12 for the foreseeable future unless users suddenly become reluctant to upgrade to Windows 10. Even though the adoption rate is impressive, a large number of PC gamers are still using Windows 7, 8, and 8.1. Therefore, it seems prudent for developers to continue with DirectX 11 and offer a DirectX 12 renderer as an optional extra. Arguably, the real gains from DirectX 12 will only occur when its predecessor is disregarded completely. This will probably take a considerable amount of time, which suggests the first DirectX 12 games might show reduced performance benefits compared to later titles.
Asynchronous compute allows graphics cards to process multiple workloads simultaneously and extract extra performance. AMD’s GCN architecture has extensive support for this technology. In contrast, there’s a heated debate over whether NVIDIA products can utilise asynchronous compute in an effective manner at all. Technically, AMD GCN graphics cards contain between two and eight asynchronous compute engines (ACEs), each with eight queues, depending on the model, providing single-cycle latencies. Maxwell revolves around two pipelines: one designed for high-priority workloads and another with 31 queues. Most importantly, NVIDIA cards can only “switch contexts at draw call boundaries”. This means the switching process is slower, which gives AMD a major advantage. NVIDIA dismissed the early performance numbers from Ashes of the Singularity because the game was still in development. Now that the release has exited the beta stage, we can examine the performance numbers with optimizations complete.
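To see why switch granularity matters, here is a deliberately crude toy model in Python (the numbers and the frame structure are invented for illustration; this is not how a real GPU scheduler works). A fine-grained scheduler can hide compute work in every idle gap on the graphics queue, while one that can only switch at draw-call boundaries can use the boundary gaps alone:

```python
# Toy model: compute work that cannot be hidden extends the frame time.
def leftover(compute_units, usable_idle_units):
    """Units of compute work left over after filling the usable idle time."""
    return max(compute_units - usable_idle_units, 0)

# Each draw call: (idle units inside the call, idle units at its boundary).
frame = [(2, 1), (3, 0), (1, 2)]
mid_call_idle = sum(mid for mid, _ in frame)    # 6 units of mid-call stalls
boundary_idle = sum(edge for _, edge in frame)  # 3 units between draw calls

compute = 7  # units of compute work queued alongside the frame
print(leftover(compute, mid_call_idle + boundary_idle))  # fine-grained: 0
print(leftover(compute, boundary_idle))                  # boundary-only: 4
```

In this sketch, the fine-grained scheduler hides all seven compute units in the frame’s idle time, while the boundary-only scheduler is left with four units that must run serially and stretch the frame.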
The Far Cry franchise earned a reputation for its impressive graphical fidelity and enthralling open-world environments. As a result, each release is incredibly useful for gauging the current state of graphics hardware and performance across various resolutions. However, Ubisoft’s reputation has suffered in recent years due to poor optimization on major titles such as Assassin’s Creed: Unity and Watch Dogs. This means it’s essential to analyze the PC version in a technical manner and see if it’s really worth supporting with your hard-earned cash!
Far Cry Primal utilizes the Dunia Engine 2, which was deployed on Far Cry 3 and Far Cry 4. Therefore, I’m not expecting anything revolutionary compared to the previous games. This isn’t necessarily a negative though, because the amount of detail is staggering and worthy of recognition. That said, Far Cry 4 was plagued by intermittent hitching, and I really hope this has been resolved. Unlike Far Cry 3: Blood Dragon, the latest entry has a retail price of $60. According to Ubisoft, this is warranted due to the lengthy campaign and the amount of content on offer. Given Ubisoft’s turbulent history with recent releases, it will be fascinating to see how each GPU from this generation fares and which brand the game favours at various resolutions.
“Far Cry Primal is an action-adventure video game developed and published by Ubisoft. It was released for the PlayStation 4 and Xbox One on February 23, 2016, and it was also released for Microsoft Windows on March 1, 2016. The game is set in the Stone Age, and revolves around the story of Takkar, who starts off as an unarmed hunter and rises to become the leader of a tribe.” From Wikipedia.
The initial unveiling of AMD’s Fury X was eagerly anticipated due to the advent of high bandwidth memory and its potential to revolutionize the size-to-performance ratio of modern graphics cards. This new form of stackable video RAM provided a glimpse into the future and a departure from the current GDDR5 standard. However, this transition isn’t going to happen overnight, as production costs and the challenge of sourcing HBM on a mass scale have to be taken into consideration. On another note, JEDEC recently announced GDDR5X with memory speeds up to 14 Gbps, which helps to enhance non-HBM GPUs while catering to the lower-to-mid-range market. The Fury X and Fury utilize the first iteration of high bandwidth memory, which features a maximum capacity of 4GB.
There’s some discussion regarding the effect of this limitation at high resolutions but I personally haven’t seen it cause a noticeable bottleneck. If anything, the Fury range is capable of outperforming the 980 Ti during 4K benchmarks while it tends to linger behind at lower resolutions. AMD’s flagship opts for a closed-loop liquid cooler to reduce temperatures and minimize operating noise. In theory, you can argue this level of cooling prowess was required to tame the GPU’s core. However, there are some air-cooled variants which allow us to directly compare between each form of heat dissipation.
Clearly, the Fury X’s water cooling apparatus adds a premium and isn’t suitable for certain chassis configurations. To be fair, most modern case layouts can accommodate a CLC graphics card without any problems, but there are also concerns regarding reliability and the possibility of leaks. That’s why air-cooled alternatives which drop the X branding offer great performance at a more enticing price point. For example, the Sapphire Nitro OC R9 Fury is around £60 cheaper than the XFX R9 Fury X. This particular card has a factory overclocked core of 1050MHz and an astounding cooling solution. The question is, how does it compare to the Fury X and GTX 980 Ti? Let’s find out!
Packing and Accessories
The Sapphire Nitro OC R9 Fury comes in a visually appealing box which outlines the Tri-X cooling system, factory overclocked core, and extremely fast memory. I’m really fond of the striking robot front cover and small cut out which provides a sneak peek at the GPU’s colour scheme.
On the opposite side, there’s a detailed description of the R9 Fury range and award-winning Tri-X cooling. Furthermore, the packaging outlines information regarding LiquidVR, FreeSync, and other essential AMD features. This is displayed in an easy-to-read manner and helps inform the buyer about the graphics card’s functionality.
In terms of accessories, Sapphire includes a user’s guide, driver disk, Select Club registration code, and relatively thick HDMI cable.
Rise of the Tomb Raider originally launched on November 10th and received widespread critical acclaim from various press outlets. Unfortunately, the game flew under the radar because Fallout 4 released on the same day. This was a strategic error which hindered the game’s sales and prevented consumers from giving it their undivided attention. It’s such a shame, because Rise of the Tomb Raider is a technical marvel when you consider the Xbox One’s limited horsepower. Even though it’s not technically an exclusive, PC players had to wait until after the Christmas period to enjoy the latest exploits of everyone’s favourite heroine.
The PC version was created by Nixxes Software, who worked on the previous Tomb Raider reboot as well as a number of other graphically diverse PC games. The studio is renowned for creating highly polished and well-optimized PC versions featuring an astonishing level of graphical fidelity. Prior to release, NVIDIA recommended a GTX 970 for the optimal 1080p experience and a 980 Ti for 1440p. Since then, there have been performance patches from the developer and driver updates to help with scaling across various hardware configurations. This means it will be fascinating to see the performance numbers now that the game has matured and gone through a large number of post-release hotfixes.
“Rise of the Tomb Raider is an action-adventure video game developed by Crystal Dynamics and published by Square Enix. It is the sequel to the 2013 video game Tomb Raider, which was itself the second reboot to its series. It was released for Xbox One and Xbox 360 in November 2015 and for Microsoft Windows in January 2016. It is set to release for PlayStation 4 in late 2016.
The game’s storyline follows Lara Croft as she ventures into Siberia in search of the legendary city of Kitezh, whilst battling a paramilitary organization that intends on beating her to the city’s promise of immortality. Presented from a third-person perspective, the game primarily focuses on survival and combat, while the player may also explore its landscape and various optional tombs. Camilla Luddington returns to voice and perform her role as Lara.” From Wikipedia.
So, let’s get to it and see how some of the latest graphics cards on the market hold up with the latest from Crystal Dynamics!
For a long time, Gigabyte has held the top spot for the most outrageous yet powerful graphics cards on the market. Just look at the simply huge 300mm+ G1 Gaming series, which has posted some of the best benchmark results we’ve seen and can then be overclocked to an entirely different level.
Today Gigabyte has officially announced the full line-up to follow the G1 Gaming: “XTREME GAMING”. On the face of it, the cards just look like a rebranded G1 Gaming line-up, using a similar Windforce-style cooling solution, but these cards have been redesigned from the PCB out to give enthusiasts the very best chance of attaining the best performance. This follows the extremely well-received GTX 950 XTREME GAMING edition released previously.
The most notable addition to the range is the custom cooling design for the NVIDIA Titan X, but the most interesting features belong to the GTX 980 Ti model, which gains an additional 6-pin power connector (yes, on top of the pre-existing dual 8-pin) and an LN2 BIOS for the ultimate overclocking experience.
The entire range up to the GTX 980 Ti will feature RGB lighting on the WINDFORCE logo and an additional ring around the fans. Currently, the only images of the range we can source are for the XTREME GAMING GTX 970.
The XTREME GAMING series graphics cards are forged with top-notch GPU cores selected through GIGABYTE’s own GPU Gauntlet sorting technology, which guarantees exceptional overclocking proficiency. Along with a substantial factory overclock compared to the reference design, the memory is also overclocked to deliver the sharpest and smoothest gameplay.
Built for ultimate overclocking, the GTX 980 Ti XTREME GAMING is further equipped with LN2 BIOS and an extra 6-pin PCI-E power connector, giving overclockers the maximum freedom to tweak the performance to the extremes with the simple press of a button.
Each air-cooled XTREME GAMING graphics card features the WINDFORCE 3X triple-fan system with composite copper heat-pipes, special fin architecture, a unique blade fan design, and GIGABYTE ‘Triangle Cool’ technology, together keeping the graphics card cool even at peak load.
The WINDFORCE 3X triple-fan system also features the 3D Active Fan, the first 0dB semi-passive fan design, introduced by GIGABYTE back in 2007. This technology allows gamers to enjoy gameplay in absolute silence during light gaming or idle, without any distraction from noise. Users can monitor the fan status via the LED indicator on the top of the card while gaming.
Both the TITAN X and the GTX 980 Ti XTREME GAMING WINDFORCE edition further adopt the alternate spinning fan design, as the middle fan spins in the reverse direction to optimize air flow to dissipate heat more effectively. This innovative design significantly enhances the thermal efficiency of the available cooling area, delivering a heat dissipation capacity up to 700W for accomplishing higher performance at a lower temperature.
The GTX 980 Ti XTREME GAMING is also offered in a WATERFORCE edition for those who favour a liquid cooling solution. The exclusive all-in-one, closed-loop water cooling system features a full-coverage cooling module covering not only the GPU but also the heat-generating VRAM and MOSFETs. Unlike other designs in its class, it requires no additional fan, resulting in far superior acoustic performance.
The robust FEP tubes effectively prevent leaks and offer a lower coolant evaporation rate. Coupled with the 120mm silent fan and the low-noise pump, the graphics card is able to run up to 38.8% cooler than the reference cooler in operating temperature, for ultra-stable gaming in near silence.
Armoured with a metal back plate, each XTREME GAMING graphics card is treated with a breathable coating to shield against moisture, dust, and corrosion for complete protection. This aerospace-grade coating is ideal for users who live in regions with high humidity, salt-air pollution, or extreme temperatures. Hardcore gamers who practise water-cooling or even LN2 overclocking can also benefit from this unique feature, which reduces the risk of damage caused by coolant residue or leaks.
The XTREME GAMING graphics cards are also fitted with a smart power LED indicator next to the power connectors (available on selected models). When experiencing any power abnormality, the indicator will alert gamers by flashing. Abnormal power-related events are also recorded in the system through the OC GURU utility software as a useful reference for troubleshooting.
In addition to the stylish metal back plate, the stealth aesthetics of the XTREME GAMING series graphics cards are now backlit with brand-new RGB LED illumination for a premium gaming look.
The exclusive LED ‘angel eyes’, a first for graphics cards, light up the cooling fans along with the WINDFORCE emblem (available on selected models). Gamers are able to customize their builds to best suit their gaming ambience with multiple colour options and light effects using OC GURU.
Compared to the reference design, the XTREME GAMING series graphics cards are equipped with extra power phases to keep the MOSFETs working at lower temperatures whilst providing more stable voltage output. The over-temperature protection and load balancing applied to each MOSFET also effectively extend the card’s life.
The XTREME GAMING graphics cards are all reinforced with top-quality components for extreme durability. Using the highest-grade chokes and capacitors, these graphics cards integrate thermal and electrical characteristics, digital signalling, and power circuitry for enhanced results and a longer system lifespan.
XTREME User Friendliness
The XTREME GAMING graphics cards support GIGABYTE OC GURU to provide gamers with extensive OC capability and full control through its intuitive interface. Users can easily and precisely control the graphics card, including core clock adjustments, fan speed, power or temperature targets, and LED illumination. These features not only increase overclocking ability but also let gamers enjoy an all-around gaming experience.
Built with a fully automated manufacturing process that eliminates human intervention, the XTREME GAMING graphics cards have all sharp edges removed from the solder connectors and pins, providing a smooth PCB surface for a more user-friendly DIY experience when building or modding gaming PCs.”
For more information on the XTREME GAMING range, check out the product pages here.
Dual-GPU graphics cards have been common over the past few generations, and it looks like Nvidia is about to launch another one. Set to use the Maxwell architecture, the new dual-GPU card will feature two GM200-class GPUs, the same ones that power the GTX 980 Ti and Titan X.
Back in the Fermi generation, Nvidia had the GTX 590, which was followed up by the GTX 690 with Kepler. Both cards were relatively well received. Looking to make use of their Titan brand, Nvidia then pushed for the Titan Z, essentially two Titan Blacks. That card, unfortunately (or fortunately, depending on where you stand), flopped heavily due to an exorbitant $3000 price tag. This time around, we will likely see a return to more sane pricing, with two GTX 980 Ti equivalents priced at about $1500.
Given the tight time frames, a Maxwell-based dual-GPU flagship likely means Pascal won’t be dropping for a while. After all, Nvidia won’t want those who shelled out top dollar for the new card to feel burned when Pascal arrives with a new architecture, a new memory interface, and better performance. This also suggests that Nvidia will probably launch most of Pascal with HBM2, with maybe a few select cards using HBM1, similar to what AMD has done.
It’s interesting to hear of this from Nvidia so close to AMD’s dual-Fiji GPU. The R9 Fury X2 is set to feature two of AMD’s top-line Fiji chips and would likely have dominated the market for single-board graphics cards. With this new card, Nvidia will be able to offer some stiff competition given Maxwell’s strength. It is important to note that CrossFire does scale a bit better than SLI, which helped AMD’s 7990 and 295X2 do quite well.
Thank you WCCFTech for providing us with this information
1080p, 1440p, 1600p, 2160p: to some people these are just random numbers with a ‘p’ after them, but to gamers they mean a whole world of display-quality goodness. For the last few years, 1080p (1920 x 1080 pixels) monitors have been the standard for a ‘decent’ gaming setup and what most graphics cards are tested at. Then we started moving up to higher resolutions such as 2560 x 1440 and 2560 x 1600.
For some, this wasn’t enough; despite pixel densities growing, as humans we wanted even more pixels. This led to users buying multiple monitors, connecting them side by side, and activating AMD ‘Eyefinity’ or NVIDIA ‘Surround’ for an almost 180° viewing range. Even though the vertical pixel count didn’t change, monitor set-ups were hitting 5760 pixels wide by using three 1920 x 1080 monitors.
Fast forward to today: 1080p and 1440p have been surpassed by what has now become the new ‘standard’ of gaming, 2160p, or 4K. At this resolution, even the most powerful graphics cards can struggle to churn out the 60 FPS we have come to accept as standard. So what happens when you put three 4K monitors next to each other and ask for 11520 x 2160 of pixelated goodness (or 6480 x 3840 if you prefer your monitors in portrait mode)?
Before we go rushing into things, there are some issues regarding our particular test system. The provided AOC monitors (U2868PQU) have known issues with AMD graphics cards at a 60Hz refresh rate. Symptoms range from minor screen flickering to a complete system freeze. This was made worse when trying to display at 11520 x 2160; however, after multiple tests, we found the issue was mitigated by putting the monitors into portrait mode. This isn’t the ideal gaming set-up, but in the interest of bringing you the information, I endured the pain of a 2″-thick bezel between each monitor.
To get to the technical nitty-gritty, a typical 4K monitor at a 60Hz refresh rate can present around 497 million pixels per second, so this set-up can present almost 1.5 billion pixels per second; yes, 1.5 BILLION. To put that into perspective, the Samsung Galaxy S6 Edge has a resolution of 1440 x 2560 and a refresh rate of 60Hz, which works out to a mere 221 million pixels per second in comparison.
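The arithmetic behind those figures is simple: width times height times refresh rate. A quick Python sanity check of the numbers quoted above:

```python
# Sanity-checking the pixel-throughput figures quoted above.
def pixels_per_second(width, height, refresh_hz):
    """Raw pixels a display refreshes each second."""
    return width * height * refresh_hz

single_4k = pixels_per_second(3840, 2160, 60)  # one UHD monitor at 60Hz
triple_4k = 3 * single_4k                      # 11520 x 2160 spanned
s6_edge = pixels_per_second(1440, 2560, 60)    # Galaxy S6 Edge panel

print(f"{single_4k:,}")  # 497,664,000 (~497 million)
print(f"{triple_4k:,}")  # 1,492,992,000 (~1.5 billion)
print(f"{s6_edge:,}")    # 221,184,000 (~221 million)
```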
The AMD Radeon Fury X looks set to be a truly amazing graphics card, and details just keep cropping up out of the blue. We first saw it on 3DMark Fire Strike and everyone went crazy. Soon after, the rumoured GPU was officially confirmed, and we even came across some amazing pics of AMD’s monstrous Fury X, the NVIDIA Titan X’s true competitor. But something interesting caught our eye when we took a closer look at them. Can you spot it?
I’ll give you a hint: it’s near the humongous radiator. Still having a hard time figuring it out? OK then, we’ll tell you. The key element we thought was odd is the 3-pin connector, which can be seen in the pics below. We’ve even highlighted it for you this time.
So what could this mean? Well, our first thought, and the best bet, is that it comes with a built-in pump. And guess what? We’ve found out which fan is pictured too!
It’s a Scythe 120mm fan, just 25mm thick. This means there is plenty of room for a pump in the middle of the radiator and lots of space for a reservoir at the base. Clever, but we’ve seen something similar in the past: Antec had a comparable All-in-One (AIO) series, as shown in the pic above, so this approach may well be what we’re seeing on the Fury X radiator. This means we are looking at a solid water cooling solution here, making the Fury X a true GPU beast.
Computex 2015 – While at the ZOTAC stand, a certain graphics card caught our eye. If you remember back to this article, it includes the render of a new hybrid cooling solution for ZOTAC called the ArcticStorm. This featured a typical heatsink-and-fan cooling design like the AMP! graphics card, but also had an integrated water cooling block. Well, I think I can safely say that it is now real and looks awesome. That article concerned a custom design fitted to a Titan X graphics card; this model is a GTX 980 Ti, but it shows that the design is real and could make its way to the Titan X if the ban has indeed been lifted.
We will keep you updated with any news and events from the rest of Computex 2015.
With Zotac teasing a custom hybrid Titan X yesterday, it seems that the ban on custom cooling solutions for the high-end graphics card has been lifted.
The cooler will be an all-in-one unit requiring little maintenance; you may have to clean dust from the fans and radiator every once in a while. It will feature a radiator similar to the AMD R9 295X2’s, equipped with a 120mm fan and wired into the graphics card itself rather than a motherboard header.
The actual cooling solution itself looks very similar to that of other reference-design EVGA cards: a snug fit to the PCB length and a typical reference blower-style fan.
The Titan X, as we know, is currently the most powerful single-GPU graphics card on the market; however, heat was a slight hindrance. This cooler aims to shave around 30°C off the standard core temperatures thanks to its hybrid liquid-and-air cooling design.
EVGA will offer this graphics card off the shelf, but will also offer the cooling solution separately for those who have already purchased a Titan X.
It’s about time NVIDIA lifted the ban on offering custom cooling designs. Would you be interested in buying this new cooling solution for your Titan X or would this now push you to purchase one? Let us know in the comments.
Thank you to VideoCardz for providing us with this information.
CyberPowerPC have made a sizeable impact on the UK PC market in recent years since they came across the water and settled in the sunny North of England. They contacted us offering not just a glimpse at one of their flagship systems, but a completely built and ready-to-go review sample which we were encouraged to “test”, and by test I mean game on.
Naturally, we were on the scene faster than you can shake a stick at, and soon we were pulling up outside CyberPower HQ, eager to get our mitts on this mystery machine. We were not disappointed when we were warned that it would take two men to lift it, and that I would need to put my back seats down as the box would take up half of the back of my car. As we returned to eTeknix towers, we could see from the box which chassis we had, but it wasn’t until we got it unpacked that we saw what we had been given and investigated the invoice properly. What they had in fact given us was a top-end X99 system with not one but TWO GTX Titan Xs in SLI, a bespoke water cooling setup, and a price tag of £6400, and that’s just the beginning. Ladies and gentlemen, hold onto your hats, I’m going in!
Nvidia is offering The Witcher 3, a story-driven action role-playing game, for free to all owners of their flagship graphics card, the Titan X, who are willing to use their “Nvidia GeForce Experience” application. The Titan cards hadn’t seen a free bundled game since launch, in spite of their price tag (over $1000), but things have changed a bit now. Nvidia has been bundling games with its high-end graphics card offerings for a long time; buyers of the GTX 980, GTX 970, and GTX 960 have enjoyed freebies like The Witcher 3 and Batman: Arkham City. This time, however, they have changed the way these games are delivered to the end customer.
Fret not: even if you have already purchased the graphics card, you can still claim the game using GeForce Experience (or GFE for short). Previously, bundled games were distributed as vouchers, which could be resold on the market and sometimes left the end customer disappointed. Now, the bundled game can be redeemed by all Titan X owners by following some pretty straightforward instructions. The instructions posted by Nvidia can be found right here. You will need a GOG.com account and must grant Nvidia access to your email address, avatar, and username. This GFE beta test ends on June 19.
This change will alter the future of bundled-game distribution, and frankly it should have happened long ago. It is an essential step towards preventing the resale of free bundled games. What do you think about it? Let us know in the comment section.
Thank you Nvidia for providing us with this information.
EKWB already released their take on a full-cover water block for Nvidia’s GeForce Titan X graphics cards, and now it’s Aqua Computer’s turn to do the same. The new Aqua Computer Kryographics full-cover water block for the GTX Titan X comes in six different versions to match your setup and preferences. It is CNC-milled from a 10mm-thick block of high-purity electrolytic copper and covers the GPU, RAM, and voltage regulators.
All relevant areas are covered by the flow path to make sure you have the best cooling everywhere for maximum potential.
The contact surface of the base is high-gloss polished and the Kryographics blocks allow for use of thermal grease instead of thermal pads on the RAM chips.
The water block comes with pre-assembled distance pieces to make sure you don’t damage your precious card when mounting it. You can tighten the screws as far as they will go without damaging anything, achieving optimal contact pressure.
The water blocks can be used with regular G1/4 fittings and the connection terminal offers threads in both directions. Those who run SLI setups can also replace the terminal with an optional Kryoconnect adapter.
There are also nickel-plated versions available and the Plexiglas cover of the acrylic glass edition and black edition is milled from a solid block, just like the base. To avoid the risk of cracks, Aqua Computer uses cast Plexiglas and the cover is held in place by a stainless steel frame instead of screws with drilled holes that could weaken the structure.
The six versions vary a little in price: the basic Kryographics for the GTX Titan X will cost you €94.90; the acrylic glass edition, black edition, and nickel-plated version will cost you €104.90; and the acrylic glass and black editions with nickel plating will cost you €114.89. The new full-cover water blocks are available now, so there’s nothing holding you back from taking your Titan X to the max; that is, if you’re lucky enough to own one of these amazing graphics cards.
There has been a lot of hype surrounding Rockstar’s Grand Theft Auto V for the PC, not to mention the debate over how the graphics would turn out and the AMD vs NVIDIA fan war. But nobody was looking at what the title eats up in terms of performance at 4K resolution and ultra-high settings… until now!
Have you tried playing the title on a 4K monitor with every setting at its highest level? If you have, then you probably know that it eats up an unbelievable 14GB of VRAM! Yes, you read that right… 14 gigabytes of video RAM is needed to keep GTA V running on a 4K monitor with ultra-high anti-aliasing enabled.
It seems that even the Titan X falls short here with its 12GB of VRAM. It’s insane! Here’s a pic to prove it.
I still can’t believe my eyes! I mean, what is going on here? Is Rockstar challenging Nvidia to roll out more powerful Titans in the future, or what? Let’s review the title’s specs:
Minimum specifications:
OS: Windows 8.1 64-bit, Windows 8 64-bit, Windows 7 64-bit Service Pack 1, Windows Vista 64-bit Service Pack 2* (*NVIDIA video card recommended if running Vista OS)
Processor: Intel Core 2 Quad CPU Q6600 @ 2.40GHz (4 CPUs) / AMD Phenom 9850 Quad-Core Processor (4 CPUs) @ 2.5GHz
Memory: 4GB
Video Card: NVIDIA 9800 GT 1GB / AMD HD 4870 1GB (DX 10, 10.1, 11)
Sound Card: 100% DirectX 10 compatible
HDD Space: 65GB
DVD Drive
Recommended specifications:
OS: Windows 8.1 64-bit, Windows 8 64-bit, Windows 7 64-bit Service Pack 1
Processor: Intel Core i5 3470 @ 3.2GHz (4 CPUs) / AMD X8 FX-8350 @ 4GHz (8 CPUs)
Memory: 8GB
Video Card: NVIDIA GTX 660 2GB / AMD HD 7870 2GB
Sound Card: 100% DirectX 10 compatible
HDD Space: 65GB
DVD Drive
Anything missing from the above? I think so! They should add an “insane specification”, right? Oddly enough, a triple-4K configuration, as TweakTown seems to have stated in their post, does not affect the VRAM requirements at all. At least that’s a relief, huh? If it required more, a quad Titan X setup might have been needed to handle this beast!
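The triple-4K result is less strange than it sounds: render targets are only a small slice of a modern game’s VRAM budget, and it’s mostly textures and streaming buffers, which don’t grow with monitor count, that fill those 14GB. A rough back-of-the-envelope sketch (assuming 4 bytes per pixel; a real engine allocates many more intermediate buffers):

```python
# Back-of-the-envelope render-target sizes at 4K. Assumes 4 bytes per pixel;
# real engines allocate many additional buffers (G-buffer, shadows, etc.).
def buffer_mb(width, height, bytes_per_pixel, samples=1):
    """Size of one render target in mebibytes."""
    return width * height * bytes_per_pixel * samples / 2**20

colour_1x = buffer_mb(3840, 2160, 4)             # plain 4K colour buffer
colour_8x = buffer_mb(3840, 2160, 4, samples=8)  # 8x MSAA colour target

print(f"{colour_1x:.1f} MB")  # 31.6 MB
print(f"{colour_8x:.1f} MB")  # 253.1 MB
```

Even a heavyweight 8x MSAA colour target at 4K is only around a quarter of a gigabyte, so adding monitors would never be expected to multiply a 14GB footprint.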
What do you guys think? Is your gear ready for Grand Theft Auto V in 4K at ultra-high settings? Let us know!
AMD is getting ready to launch their new 300 series very soon, and Nvidia isn’t just standing on the sidelines to watch; they want to be prepared. According to the latest leaks via SweClockers, who have an impressive track record of being spot on with Nvidia rumours, Nvidia already has their new GeForce GTX 980 Ti ready; they just don’t want to release it yet.
The timeframe for the GTX 980 Ti is still set for the end of Q2 or Q3 2015, but with the option to release it earlier in case AMD’s new flagship GPUs kick their butts. I know that many people are waiting for the card, based on comments on our previous articles, but this is also good news. It will give Nvidia time to optimise and tweak the card, give board partners more time to create better custom PCBs and cooler solutions, and allow for improvements in case AMD’s 300 series cards surpass the leaked performance figures.
The sad side of the news, however, is that it looks pretty much like a Titan X with half the memory. The GTX 980 Ti will feature the full version of the GM200 core with 3072 CUDA cores, 192 texture units and a 384-bit memory interface for its 6GB of VRAM. Where the Titan X comes in at a $999 price tag, the GTX 980 Ti will most likely cost around $699.
Thank you Sweclockers for providing us with this information
The rumour mill is spinning with full force this morning, as the guys at SweClockers report that Nvidia will be launching a more affordable GM200 based GPU this summer.
The new card, expected to be the GeForce GTX 980 Ti, will feature the GM200 hardware with 3072 CUDA cores, the same as the current Titan X. However, the card is expected to feature less memory and perhaps a few other refinements to help bring the cost down to something more affordable.
What’s crazy is that the Titan X is already proving a big hit; Nvidia is actually selling their $1000 cards like crazy, despite the concerns most of us have when we check our bank balances or try explaining such a fanciful purchase to our SO. If Nvidia can get much of the Titan X’s performance down to a price point that won’t be such a bitter pill to swallow, then that can only be a good thing for consumers, because I think we can all agree that flagship GPU prices are getting out of hand.
The 980 Ti seems like a sure thing, especially given that it will no doubt launch at the GAME24 event, around the same time as the AMD 3xx series launches; Nvidia would be mad not to have something to compete with at that time.
Yay, another update. You know what that means, don’t you? That’s right, another driver analysis. Well, this time around it’s a little different. NVIDIA released the newest driver (WHQL 347.88) mainly to add compatibility for the absolute monster, the GTX Titan X; there were a few other additions and bug fixes, but more on that later.
So what’s new? Not a lot, in a nutshell: there are no performance increases for any of the range of graphics cards and no advancements to any of the existing line-up, so this was shaping up to be a rather boring driver. Oh wait, did I mention that this driver delivers maximum compatibility for the highly anticipated Battlefield Hardline? That’s right folks, not only did we get a new driver and a new graphics card, but we were given a new game too.
So what about the not so interesting stuff? Well, NVIDIA are really pushing PhysX by giving it away for FREE to developers. It’s already featured in hundreds of current games and is even part of Unreal Engine 3 and 4, so it sort of makes sense to go Green, with some pretty big titles coming out using that engine. Enough on that; let’s move on to a few graphs showing how our current line-up of Maxwell-based graphics cards handles the new driver and new game.
NVIDIA, the company that brought you the Shield and the original Titan graphics card, has now released the newest iteration in the Titan series: the Titan X. Going by facts, figures and a few leaked performance graphs, this mighty graphics card should prove to be the cream of the crop, the best of the bunch and whatever other superlatives you can think of. As we’ve previously seen with the release of Titan graphics cards, they have always smashed the opposition. However, all this performance comes at a price; this particular card will set you back around $999, or £899 if you live in Blighty.
So, the Titan X… what’s new? Well, firstly, it sports a typical reference NVIDIA cooler, but this time it’s all black with minimal silver finishing; it keeps the green “GeForce GTX” logo along the side and a light-up “Titan” logo in the usual spot. So how about specifications? I’ll let the table explain:
So it’s bigger, better, faster and more expensive; just as you expected. Let’s see if the extras really warrant the extra price tag, especially compared to the AMD powerhouse, the R9 295×2. I predict a blow-for-blow fight between the two cards, with the Titan X taking the lead in games like Metro Last Light and Hitman Absolution, where SLI/Crossfire isn’t 100% optimised.
So, onto the box: a very “nothing to see here” approach, it seems. NVIDIA has designed this box to be as minimalistic as possible, without even a whiff of a specification list for what’s inside. The exterior is all black with minimal silver details. The entire box is finished in matte, while the TITAN X logo is glossy. Looking directly at the box in dull light, it would be hard to tell that there is any writing at all. As this is an NVIDIA-released card, there are no extras hidden inside the box.
The cooling shroud is the same as on recent reference design NVIDIA cards; however, this one is mostly black rather than silver, which I really like.
Along the top of the card, it’s very reference-like: just the green GeForce GTX logo, SLI bridge connectors and a set of 8-pin and 6-pin power connectors.
Around the back of the card, things get a little disappointing. With such a premium product, it would have made sense to use a backplate, even a basic one, just something to cover the PCB. All those RAM chips on the back might get a little toasty too, maybe hot to the touch.
As with recent NVIDIA reference cards, there’s a simple metal bracket with 3x DisplayPort, 1x HDMI and 1x Dual-Link DVI.
The Titan XXX features twice the VRAM of the Titan X, supports up to 12K resolution (though two in SLI are recommended for 4K and above), is overclocked out of the box and boasts a special 300W OC mode. The card also supports NVIDIA G-Sync.
The full specifications are as follows:
GPU: GeForce Titan XXX
Core Base Clock: 1268MHz
Core Boost Clock: 1493MHz
Memory Clock: 8600MHz Samsung GDDR5
Memory Size: 24576MB GDDR5
Bus Type: PCI Express 3.0
Memory Bus: 384-bit
CUDA Cores: 3072
DirectX 12: Yes
DVI Port: 1x Dual-Link DVI, 3x DisplayPort & 1x HDMI
The Titan XXX isn’t exactly available just yet, as Overclockers UK are working out a final price and availability, but it’s set up with a holding price of £1,666.66 and a projected release date of 1st April.
If the regular Titan X isn’t enough for you, then this no doubt has you twiddling your credit card!
Nvidia has just released their new and quite impressive TITAN X, and EK Water Blocks has introduced a new Full-Cover water block for the reference design (NVA-PG600) of the GeForce GTX TITAN X graphics card.
The EK-FC Titan X is a full-cover cooler, meaning that it directly cools all the vital parts of the graphics card: the GPU, RAM, as well as the VRM. Water flows directly over all these critical areas, allowing the graphics card and its VRM to remain stable even when heavily overclocked.
The EK-FC Titan X water block features EK’s unique central inlet split-flow cooling engine design for the best possible cooling performance, and it also works flawlessly with reversed water flow without adversely affecting the cooling performance. Moreover, the design offers great hydraulic performance, allowing this product to be used in liquid cooling systems with weaker pumps.
The base is made from electrolytic copper, either bare or nickel-plated depending on the variant, while the top is made of either acrylic or POM Acetal material. Plexi variants also feature two pre-drilled slots for 3mm LED diodes so you can light it all up and show off even more. Screw-in brass standoffs are pre-installed and allow for a safe and painless installation procedure.
EK are also offering a long list of backplates to go along with the new cooler, and in a lot of colours. Alongside the normal Black and Nickel versions, Blue, Gold and Red are also options to choose from.
The new full cover water blocks should be available now at both the EK Webshop and Partner Reseller Network starting from €99.95.
Thanks to EKWB for providing us with this information
GTC 2015: For the most part, GTC has already been a huge event for NVIDIA with the unveiling of the new TITAN X graphics card. While this is great for the average consumer, NVIDIA have a lot more to offer than just their GeForce segment. One aspect of this is deep learning, a segment within the company that crosses over into GeForce as well as the more industrial sectors that NVIDIA serve.
Jen-Hsun Huang (CEO and co-founder of NVIDIA) took to the stage at NVIDIA GTC 2015 to talk through the theme of the event this year and deep learning was at the core.
You can see part 1 of the video here:
As you can see from the video, Jen-Hsun went through the various milestones of the company last year, including GeForce, Shield, Tegra, enterprise graphics and, of course, what NVIDIA are doing within the car industry.
A big highlight was what the new TITAN X GPU can do in comparison to other offerings on the market. A comparison of a 16-core Xeon CPU, the original TITAN, the TITAN Black and the newest TITAN X was shown training AlexNet, and while the TITAN and TITAN Black managed to improve on what the Xeon CPU could do by cutting down the time it takes to train AlexNet, the new TITAN X dramatically improved even on that. This sees the AlexNet training take just 3 days as opposed to the 43 days that the Xeon takes, showing that NVIDIA are focussing on more than just gaming.
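The gap those keynote figures imply is easy to quantify; here’s a quick back-of-the-envelope sketch using only the two training times quoted above:

```python
# Training times for AlexNet quoted in the GTC 2015 keynote.
xeon_days = 43     # 16-core Xeon CPU
titan_x_days = 3   # TITAN X

# Relative speedup is simply the ratio of the two times.
speedup = xeon_days / titan_x_days
print(f"TITAN X trains AlexNet roughly {speedup:.1f}x faster than the Xeon")
```

In other words, the quoted numbers work out to a roughly 14x speedup over the CPU.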
Here we are at NVIDIA’s GTC event in San Jose, California, and with the buzz surrounding the Titan X in the air, we thought that even though we haven’t got our review sample in hand just yet, it would be a good idea to bring you a collective list of reviews from some of our friends from within the industry.
This is a comprehensive list of sites whose writers we know and trust within our small industry, so we invite you to check out their reviews. In the meantime, we will keep you up to date with our sample on our social media pages, so expect a full review in the coming week.
GTC 2015: Unless you’ve been under a rock for a while, you won’t have missed the huge happenings within the market, and we’re pretty sure that you’ve heard about the new NVIDIA GTX TITAN X launching in some capacity or another.
If we cast our minds back only a short while to GDC, you’ll remember that NVIDIA unveiled the TITAN X graphics card, but while a lot was said about the card, there were still so many unanswered questions, including pricing, availability and much more.
Scroll forward to the present day, and CEO and co-founder Jen-Hsun answered these questions, officially introducing the TITAN X graphics card at this year’s GTC 2015 event. As he took to the stage, a promotional video showed us the new card in all of its glory, but one of the main speculative questions on everyone’s lips came down to pricing.
Some were basing pricing off the TITAN Z, which launched at last year’s GTC in 2014, expecting the new TITAN X to cost at least $2000, but in a shock moment, in front of a stunned audience, the price of $999 came to light.
While $999 is considered ridiculously expensive by some, you have to remember that this is the world’s fastest GPU on the market with 8 billion transistors, 3072 CUDA cores, and a whopping 12GB of memory. This card allows 4K high quality gaming to be an easy task with room to spare and something that we are definitely looking forward to testing out.
AMD’s latest Fiji-based Radeon R9 390X cards are reported to come in both 4GB and 8GB variants in order to compete with their NVIDIA rival, the GeForce Titan X with its 12GB of VRAM.
It was revealed a while back that the company will be using High Bandwidth Memory on these cards. This means that the 8GB variant will not have the numbers to go mainstream compared to the 4GB variant, as it requires 8Gbit chips (1GB per chip), which, paired with the wide HBM interface, should provide outstanding memory bandwidth.
The Radeon R9 390X will be a true competitor to the GeForce Titan X, and considering that the GeForce GTX 980 already handles 4K extremely well with its 256-bit memory interface, the much wider 1024-bit-per-stack interface and next-gen HBM RAM should reveal some 4K numbers never seen before. AMD is planning to reveal the R9 390X at Computex in June.
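To see why the wide HBM bus matters, peak memory bandwidth is simply bus width times per-pin data rate. A rough sketch, using the GTX 980’s known 7Gbps effective GDDR5 and the commonly quoted first-generation HBM figures (four 1024-bit stacks at roughly 1Gbps per pin) as assumptions; the R9 390X’s final configuration is not confirmed:

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 980: 256-bit GDDR5 at 7 Gbps effective -> 224 GB/s
gtx_980 = peak_bandwidth_gbs(256, 7.0)

# Rumoured HBM setup: four 1024-bit stacks at ~1 Gbps per pin -> 512 GB/s
hbm_card = peak_bandwidth_gbs(4 * 1024, 1.0)

print(f"GTX 980: {gtx_980:.0f} GB/s, HBM card: {hbm_card:.0f} GB/s")
```

Even at a far lower per-pin data rate, the sheer width of the HBM interface more than doubles the available bandwidth.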
Thank you TweakTown for providing us with this information
While NVIDIA tends to keep information about its latest Titan X graphics solution under wraps, more and more sources are coming forth with information about the card. This time, VideoCardz comes forth with new benchmark results for the Titan X.
However, VideoCardz has not revealed details such as the testbed configuration, driver versions or other specifics about the tests performed on the card, nor whether the information comes from their own testers or third parties. Based on the results revealed by VideoCardz, we see the following alleged results for the GeForce GTX Titan X (performed at default clock speeds of 1GHz GPU and 7GHz memory):
22903 points in 3DMark 11 Performance, which is 34.8 per cent faster compared to reference GeForce GTX 980
7427 points in 3DMark 11 Extreme, which is 39.9 per cent faster compared to reference GeForce GTX 980
17470 points in 3DMark FireStrike, which is 36 per cent faster compared to reference GeForce GTX 980
7989 points in 3DMark FireStrike Extreme, which is 36 per cent faster compared to reference GeForce GTX 980
If the information above proves to be accurate, then NVIDIA’s GeForce GTX Titan X would perform as expected, namely around 35–40% faster than the GTX 980 in most cases.
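For anyone who wants to sanity-check the percentages, they follow directly from the quoted scores; a quick sketch (note that the reference GTX 980 score below is implied from the figures, not quoted by the source):

```python
# The four uplifts quoted above, in per cent faster than a reference GTX 980.
uplifts = [34.8, 39.9, 36.0, 36.0]
average = sum(uplifts) / len(uplifts)
print(f"Average uplift: {average:.1f}%")  # ~36.7%

# Working backwards from the 3DMark 11 Performance result:
titan_x_score = 22903
implied_gtx980 = titan_x_score / (1 + 34.8 / 100)
print(f"Implied reference GTX 980 score: {implied_gtx980:.0f}")
```

Averaging the four results puts the overall uplift closer to 37% than the often-repeated 33% figure.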
Thank you KitGuru for providing us with this information
We still don’t know the full specifications, but LegitReviews got hands-on with the real card, and from that we can glean a lot. It doesn’t have a backplate, which means we can see the many VRAM chips and a single GPU.
The new graphics card should be running the full GM200 GPU and has slightly over 8 billion transistors. The use of just an 8-pin and a 6-pin power connector for this single chip beast really shows how power efficient the new Nvidia architecture is.
You can create both triple and quad SLI setups, as the card comes with dual SLI connectors. Holy mother of total VRAM possibilities; this could mean 48GB of memory. Now couple that with what we’ve already heard about Microsoft’s new DirectX 12 API and the jaws start to drop!
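That 48GB figure is simple multiplication, with one caveat worth spelling out: under traditional SLI each card mirrors the same data, so the usable frame buffer stays at 12GB; it’s DirectX 12’s explicit multi-adapter mode that could, in principle, let games address each card’s memory separately. A quick sketch of both views:

```python
CARD_VRAM_GB = 12  # Titan X per-card memory

def total_vram(cards: int) -> int:
    """Raw VRAM across all cards in the system."""
    return cards * CARD_VRAM_GB

def effective_vram_sli(cards: int) -> int:
    """Traditional SLI mirrors the same data on every card, so usable VRAM stays flat."""
    return CARD_VRAM_GB

for n in (2, 3, 4):
    print(f"{n}-way: {total_vram(n)}GB raw, {effective_vram_sli(n)}GB usable under classic SLI")
```

So quad SLI does put 48GB of silicon in the box, even if today’s games can only treat it as 12GB.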
The new Nvidia GeForce GTX Titan X features the basic set of display connectors we see on almost all cards: three DisplayPort, one HDMI and one DVI.
Thanks to LegitReviews for providing us with this information.