NVIDIA’s G-SYNC is a proprietary module embedded in select monitors which synchronizes the monitor’s refresh rate directly with the GPU’s frame output. This creates a smooth experience, eliminates screen tearing and avoids the stutter you would typically get from V-Sync; some users argue it’s a more seamless experience than AMD’s FreeSync technology. Panels with G-SYNC incur a hefty price premium, however, which means consumers have high expectations.
Recently, a software bug emerged which causes a significant increase in the GPU’s power draw at idle. Bizarrely, the clock speeds ramp up too, but only by a significant amount on monitors with a 144Hz+ refresh rate. Ryan Shrout of PC Perspective investigated the behaviour and said:
“But the jump to 144Hz is much more dramatic – idle system power jumps from 76 watts to almost 134 watts – an increase of 57 watts! Monitor power only increased by 1 watt at that transition though. At 165Hz we see another small increase, bringing the system power up to 137.8 watts.”
“When running the monitor at 60Hz, 100Hz and even 120Hz, the GPU clock speed sits comfortably at 135MHz. When we increase from 120Hz to 144Hz though, the GPU clock spikes to 885MHz and stays there, even at the Windows desktop. According to GPU-Z the GPU is running at approximately 30% of the maximum TDP.”
NVIDIA has acknowledged the strange power draw issue and is currently working on a fix to be included in a driver revision. The NVIDIA response reads:
“We checked into the observation you highlighted with the newest 165Hz G-SYNC monitors.
Guess what? You were right! That new monitor (or you) exposed a bug in the way our GPU was managing clocks for GSYNC and very high refresh rates.
As a result of your findings, we are fixing the bug which will lower the operating point of our GPUs back to the same power level for other displays.
We’ll have this fixed in an upcoming driver.”
I’d love to hear from people who own a G-SYNC display. Do you feel the module is worth the added cost when selecting a monitor?
Google is known for rewarding fans and developers who find bugs in its Android operating system and provide fixes. Through its reward system, Google ensures that bugs are found and dealt with accordingly, and through its open-source nature, Android provides a lot of potential for both development and hacking.
The Android Security Rewards program was created for developers to submit vulnerabilities that don’t fall under Google’s other reward programs. The company is interested in AOSP, OEM, kernel and TrustZone OS bugs, but firmware bugs can also be submitted if they pose a potential security risk to the Android OS.
The new rewards program currently covers only two devices, namely the Nexus 6 and Nexus 9, but Google says it will add more in the future. Bugs must fall into the moderate, high or critical severity bands to qualify, so minor issues such as a simple misspelling won’t be rewarded.
As far as rewards are concerned, Google offers between $500 and $2,000, depending on the bug’s severity and how well it is documented and/or patched. For example, Google will reward anyone who submits a critical bug together with a patch and a CTS test with up to $8,000. On top of that, Google offers larger sums for working exploits, as follows:
An exploit or chain of exploits leading to kernel compromise from an installed app or with physical access to the device will get up to an additional $10,000. Going through a remote or proximal attack vector can get up to an additional $20,000.
An exploit or chain of exploits leading to TEE (TrustZone) or Verified Boot compromise from an installed app or with physical access to the device will get up to an additional $20,000. Going through a remote or proximal attack vector can get up to an additional $30,000.
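For illustration only, the bonus structure above can be sketched as a small calculation. The function, tier names and the base amounts assumed for moderate/high bugs are our own; the authoritative rules live in Google’s official programme terms.

```python
# Sketch of the published reward tiers (illustrative only).
# Base amounts for "moderate" and "high" are assumed splits of the
# $500-$2,000 range described above.
BASE_REWARDS = {"moderate": 500, "high": 1000, "critical": 2000}

def estimate_reward(severity, with_patch_and_cts=False,
                    kernel_exploit=None, tee_exploit=None):
    """Rough upper-bound estimate of a payout.

    kernel_exploit / tee_exploit may be None, "local" (installed app or
    physical access) or "remote" (remote/proximal attack vector).
    """
    total = BASE_REWARDS[severity]
    # A critical bug submitted with a patch and CTS test tops out at $8,000.
    if severity == "critical" and with_patch_and_cts:
        total = 8000
    # Additional bonuses for working exploit chains, per the tiers above.
    total += {"local": 10000, "remote": 20000}.get(kernel_exploit, 0)
    total += {"local": 20000, "remote": 30000}.get(tee_exploit, 0)
    return total
```

For example, a critical bug with a patch, a CTS test and a remote kernel exploit chain would estimate at $8,000 plus $20,000, i.e. $28,000.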
If you like a challenge that could potentially bring you a lot of cash, you can head over to Google’s Android Rewards page and see more details.
We’ve been waiting for details on AMD’s new memory architecture for a while now, ever since we heard the possible specifications and performance of the new R9 390X, thanks largely to the High Bandwidth Memory (HBM) that will be utilised on this graphics card.
Last week, we had a chat with Joe Macri, Corporate Vice President at AMD. He is a firm believer in HBM and has championed it since the original product proposal. For a little background: HBM has been in development for around seven years and was the idea of an engineer who was new to AMD at the time. Even then, they knew that GDDR5 was not going to be an everlasting architecture and that something else needed to be devised.
The basis behind HBM is to use stacked memory modules to save footprint and to integrate them alongside the CPU/GPU itself. This way, the communication distance within a stack of modules is vastly reduced, and the distance between the stack and the CPU/GPU core is reduced again. With the reduced distances, bandwidth increases and the required power drops.
When you look at graphics cards such as the R9 290X with 8GB of RAM, the GPU core and surrounding memory modules can take up roughly the footprint of a typical SSD, and then you also need all of the other components, such as voltage regulators. This requires a huge card length to accommodate everything, and the communication distances are large.
The design process behind this, in theory, is very simple: decrease the size of the RAM footprint and get it as close to the CPU/GPU as possible. Take a single stack of HBM – each stack is currently only 1GB in capacity and only four ‘DRAM dies’ high. What makes this better than a conventional DRAM layout is the distance between the stacks and the CPU/GPU die.
With the reduced distance, bandwidth is greatly increased and power is reduced, as there is less distance over which to send information and fewer circuits to keep powered.
So what about performance figures? The per-pin data rate isn’t amazing – just 1Gbps compared to GDDR5 – but that shows just how powerful and refined the stacks are in comparison. Over three times the bandwidth per watt and a lower voltage; it’s ticking all the right boxes.
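To see why the modest per-pin rate still wins, it helps to run the numbers. The figures below assume HBM1’s publicly quoted 1024-bit bus per stack and a fast 7Gbps, 32-bit GDDR5 chip; width, not clock, is where HBM’s bandwidth comes from.

```python
# Peak bandwidth = bus width (pins) x per-pin data rate / 8 bits per byte.
def bandwidth_gbs(bus_width_bits, rate_gbps):
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits * rate_gbps / 8

hbm_stack = bandwidth_gbs(1024, 1.0)   # one HBM1 stack at 1Gbps/pin: 128 GB/s
gddr5_chip = bandwidth_gbs(32, 7.0)    # one 7Gbps GDDR5 chip: 28 GB/s
four_stacks = 4 * hbm_stack            # a four-stack (4GB) card: 512 GB/s
```

A four-stack card would therefore offer 512 GB/s of peak bandwidth, comfortably above the R9 290X’s GDDR5 subsystem, while running at far lower clocks.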
There was an opportunity to ask a few questions towards the end, sadly only regarding HBM memory, so no confirmed GPU specifications.
Will HBM only be limited to 4GB due to only 4 stacks (1GB per stack)?
HBM v1 will be limited to just 4GB, but more stacks can be added.
Will HBM be added into APUs and CPUs?
There are thoughts on integrating HBM into AMD APUs and CPUs, but the current focus is on graphics cards.
With the current limit of 4GB, will we see a performance hit in demanding games, such as GTA V at 4K, that want more than 4GB?
Current GDDR5 memory management is wasteful, so despite the lower capacity, HBM should perform like higher-capacity DRAM.
Could we see a mix of HBM and GDDR5, sort of like how an SSD and HDD work together?
Mixed memory subsystems are set to become a reality, but nothing is planned yet; the main goal for now is graphics cards.
I like the sound of this memory type; if it really delivers the performance stated, we could see some extremely powerful GPUs enter the market very soon. What are your thoughts on HBM memory? Do you think that this will be the new format of memory or will GDDR5 reign supreme? Let us know in the comments.
If you thought your PC was great at playing 4K videos from YouTube and Netflix, you should put it to the test again, this time at a higher frame rate. YouTube announced some time ago that it would be adding support for 4K videos at 60 frames per second, but there are still only a few videos out there that come at the higher frame rate. Still, it does look nice. So the question is, can your PC handle it? Even more importantly, can your internet bandwidth cope with it?
Thank you KitGuru for providing us with this information
We’ve recently reported on Linshof’s i8 smartphone, which offers users 80GB of storage space through an interesting pairing of 64GB and 16GB memory modules. Now, news has come to light that this German manufacturer will launch ‘clean Android’ versions of its smartphones and tablets as soon as the first quarter of 2015. This is an expanded news article, with more pricing and product information having been made available to us.
This new OEM has been officially revealed to the public, utilizing Android 5.0 Lollipop at a reportedly low price. Linshof claims that its i8 5-inch smartphone and its 10-inch tablet will be released with an unnamed octa-core processor at 2.1GHz, paired with 3GB of RAM and the aforementioned 80GB of storage. The company sent out an email this last Saturday, further clarifying that the 80GB is split between a 64GB chip and a 16GB chip – the 16GB chip being a “super-high data rate” device, allowing for SSD-like performance inside your mobile phone.
The tablet is set to be listed at $360 US, alongside the smartphone at $380 US. Pricing of these sharp-angled products is expected to shift slightly upon release, according to reports, but should remain around the same mark.
Linshof is looking to prove that Germans have what it takes to enter the phone market globally. Can they compete with the likes of Apple and Samsung?
Want to view Earth from space but haven’t got the time to train as an astronaut? Well now you can, via a live webcam.
The ISS High Definition Earth Viewing Experiment allows you to get a view from a window on-board the International Space Station as it circumnavigates the earth.
The project, which has been viewed by 32 million people since April, aims to give people a 24-hour view from the ISS on its daily orbits around Earth. It is also a NASA experiment, designed to test the effect of space on video cameras, with the objective of finding the best way to record video in zero gravity.
Quite often, university or high school students are looking for a simple solution for typing up reports and completing various simple processing tasks to get them through the day-to-day grind.
ASUS has answered with its sub-$200 laptop, the X205TA Signature Edition. Coming in at $179 US in the Microsoft Store (lower than the announced $199 RRP), this laptop should complete those tasks with ease.
Coming complete with Windows 8.1 on an 11.6-inch 1366×768 panel, the Signature Edition utilizes Intel’s Z3735F quad-core processor running at 1.33GHz, with the standard turbo options up to 1.83GHz and a handy 2MB of secondary cache. The laptop runs on 2GB of DDR3-1333 memory, built-in eMMC storage and an expandable microSD card slot.
Given all of the specifications above and the overall look and ‘feel’ of the product, it seems that it will be $179 well spent. We’d like to see it come with a little more RAM, however, as Windows 8.1 might start to chug when running on only 2GB.
As for accessory and connection options, this model comes with two USB 2.0 ports, a 480p webcam, a combined audio and microphone port, HDMI output, 802.11n WiFi and Bluetooth 4.0. Measuring 11.25 x 7.61 x 0.68 inches and weighing 2.11 pounds (0.96kg), this laptop is small and light enough to carry around campus for days on end.
Ubisoft has been in the headlines a bit recently, unfortunately not for positive reasons. We first covered how their developers believe that 60 FPS “looks weird”, and then how they pinned the blame on Microsoft, saying that 30 FPS gaming is being pressured onto PCs by the console manufacturers.
In the latest gaming news, Assassin’s Creed: Unity’s technical requirements have been released. The image above, said to be locked at 900p and 30 FPS, shows the graphical prowess of the game on release. Interested in giving it a crack? Unity will require you (at a minimum) to be running a GTX 680 or HD 7970 graphics card coupled with at least a Core i5-2500K or an AMD FX-8350. The stats you’re looking at aren’t a figment of your imagination – they come from Ubisoft’s official blog, which also mentions that a minimum of 6GB of RAM is needed.
Here’s hoping that the full PC release won’t require a $2,000 computer to run at a measly 30 FPS. In what can only be described as outrage from the PC gaming scene, we’re all waiting with bated breath to see what Ubisoft will deliver next.
A certification listing on Sony Indonesia’s Postel website has given away the possible near-release of a new Xperia tablet. The device is listed as SGP621 and follows a similar naming convention to its older tablet brothers in the Xperia series – the Xperia Tablet Z (SGP321) and the Xperia Tablet Z2 (SGP521) – oddly skipping an “SGP421” in the sequence.
“The number “2” in the model number suggests that the upcoming tablet will feature LTE connectivity. For instance, the Wi-Fi only edition of the Xperia Z2 Tablet has a model number SGP511.”
This new tablet will possibly be revealed this year at IFA in Berlin, Germany.
We’ve learned it’s likely that this new model will be smaller in size, as rumors indicate that Sony is considering production of an 8-inch form factor. We have no further information on hand in regards to the specifications, price or release date – unfortunately, all we have is the above model number.
We do know however, that Sony’s last few tablets have all been of high-end quality and specification, so this new edition could very likely follow the same trend.
Here’s hoping all will be revealed in Berlin next month at IFA. When it is, we’ll let you all know.
Bell Labs, part of Alcatel-Lucent, has achieved a jaw-dropping world record of 10Gbps over a standard copper telephone line. The result came from a research project investigating the possibility of bringing Gigabit internet to broadband networks that combine copper lines with high-speed fibre; Bell Labs was able to maintain the 10Gbps speed over a distance of 30 metres using two pairs of lines in a bonded connection.
Although the top speed was recorded over a 30-metre stretch of standard telephone cable, past this distance, and particularly past 70m, the speed drops to 1Gbps. This is still good news, however, as it means that 1Gbps connections in both directions may be possible for broadband users with FTTC (Fibre To The Cabinet) connections.
High-speed connections over short runs of copper cable are a common sight these days thanks to the aforementioned FTTC connections, with speeds in the region of 80Mbps possible depending on your proximity to the nearest cabinet. To go beyond this and on past the 1Gbps barrier, Bell had to rethink how data is sent across the copper, which meant using a new DSL standard known as G.fast. Furthermore, Bell Labs has developed an extension of the G.fast standard known as XG-FAST, allowing faster speeds to be obtained over shorter distances – say, 10Gbps over 30m.
Following the new record, Federico Guillén, President of Alcatel-Lucent’s Fixed Networks gave a statement saying:
“The Bell Labs speed record is an amazing achievement, but crucially in addition they have identified a new benchmark for ‘real-world’ applications for ultra-broadband fixed access. XG-FAST can help operators accelerate FTTH deployments, taking fiber very close to customers without the major expense and delays associated with entering every home. By making 1 gigabit symmetrical services over copper a real possibility, Bell Labs is offering the telecommunications industry a new way to ensure no customer is left behind when it comes to ultra-broadband access.”
Simply put, this news means that the prospect of 1Gbps internet connections without the need for an FTTP (Fibre To The Premises) connection is on the horizon, and in the next few years Google (which currently offers Gigabit fibre internet in the US) is likely to have a whole lot more competition to deal with. Should we be excited? Simply put, yes – especially if you struggle to get high-speed internet right now.
Hong Kong-based DRAM manufacturer I’M Intelligent Memory has announced 8Gb (1GB) DDR3 components on a single chip, doubling the amount of memory per chip compared to other DRAM devices on the market. Based on this chip, the company has introduced 16 GB DDR3 UDIMM and SO-DIMM memory modules, with ECC error correction available as an optional upgrade.
It is said that the JEDEC DDR3 specification, JESD79-3, has always allowed for an 8Gb capacity per device. However, it seems that most manufacturers are waiting for a 2x nm process in order to fit the higher density onto smaller dies. I’M Intelligent Memory has apparently made the leap by developing its own way of manufacturing 8Gb DDR3 components on a single chip using existing 30 nm technology.
The company states that its memory devices are compatible with the JEDEC-standard pinout, timing and row/column/bank addressing. The range includes a x8 (1Gx8) configuration in a 78-ball FBGA package, a x16 (512Mx16) type in a 96-ball FBGA package and a x32 (256Mx32) configuration in a 136-ball FBGA package, as well as DDR3L low-voltage 1.35V versions, all of which are currently available on the market.
Using the 8Gb device, the company has released its first 16 GB DDR3 240-pin unbuffered DIMMs and 204-pin SO-DIMMs, which are also available in 72-bit width for ECC error correction. These modules are said to be compatible with processors and micro-controllers from AMD, Cavium, Freescale, Tilera and others.
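As a sanity check on those module sizes, the arithmetic is straightforward. The chip counts below assume a typical single-rank layout built from the x8 devices; they are our own illustration, not a confirmed board design.

```python
# Capacity of a DIMM = number of chips x per-chip density.
# A 1G x8 device stores 1Gi addresses x 8 bits = 8Gb = 1GB per chip.
def module_capacity_gb(chips, depth_gi, width_bits):
    """Raw capacity in GB of a module built from identical DRAM devices."""
    return chips * depth_gi * width_bits / 8

udimm = module_capacity_gb(16, 1, 8)   # 16 x (1G x8) chips -> 16.0 GB
# A 72-bit-wide ECC module adds two more x8 chips (64 data + 8 check bits);
# the extra 2GB holds check bits, so usable data capacity stays at 16GB.
ecc = module_capacity_gb(18, 1, 8)     # 18.0 GB raw, 16GB usable
```

The same arithmetic shows why doubling the per-chip density matters: with 4Gb chips, the same 16-chip layout tops out at 8GB.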
While not all processors used in desktop PCs are compatible with the high-capacity memory, Intel has offered support for the Atom C2000 series and Atom E3800 series with a new BIOS version available to download now. Also, ASUS has confirmed support for this memory on its X79-DELUXE, RAMPAGE IV BLACK EDITION and other ASUS X79 motherboards. Other manufacturers, such as ASRock, Supermicro, AIC and Portwell, have already verified and approved the IM 16 GB DDR3 memory modules for a variety of their motherboards based on AMD, Tilera, Intel’s C2000 series and other processors.
I’M Intelligent Memory has clearly noticed the potential in embedded markets, networking and telecommunication applications, as well as PCs and laptops, allowing all of them to reach a memory capacity previously untouched by any manufacturer out there.
Looking for someone with similar interests and intellectual ability to your highly developed self? Match.com has paired up with Mensa to create Mensa Match, allowing some of the world’s smartest people to match up with one another.
Mensa is deemed by some to be a ‘snobbish’ community of intellectuals, but there is no denying that it’s hard to get your foot in the door. This elite society only admits those at or above the 98th percentile of IQ-based intelligence, which means approximately 1 in 50 people are eligible.
“American Mensa isn’t just about encouraging adults to think; connections over ideas and interests have always been a part of our culture, and now Mensa Match helps Mensans connect on a more emotional level. In partnership with Match.com®”
Some people may see this as elite-level snobbery, but when some ‘Mensans’ were asked about their dating lives and their opinions on this new development, they replied positively.
“I’m looking for people who are intellectually curious. And when all you’re talking about is sports teams and barbeques … when you’re talking about physical traits and not existential philosophy, I’m not going to get the vibe,” stated Mensa member Anne Sereg on CNN.
According to Match.com’s data, 80% of their user base say that finding someone of a similar intelligence level is a must, or at least very important. Taking this to the extreme, Mensa member Peter Baimbridge remarked (on national television) that those with an IQ of “around 60 are probably a carrot”, as seen on Mirror.co.uk.
In addition to Mensa Match, Mensa members can also elect to have a special badge applied to their profile to flaunt their achievement.
Over the last few weeks I’ve been looking at a few of MSI’s latest gaming notebooks, each featuring NVIDIA’s brand new 800 series GPUs. The first of the two units we’ve seen already, the GS60-2PE ‘Ghost Pro’, wowed me as a mid-range system with the amount of power it has tucked away inside its slim and lightweight frame, giving it credentials that could almost earn it the right to be called an Ultrabook. The GP70-2PE ‘Leopard’, which we looked at more recently, brought fresher and more up-to-date components than its older brother, the straight-up GP70. Although it featured a full solid-state boot drive and the latest run of 800 series graphics from NVIDIA, the imbalance between some of the components, pairing a high-end processor with an entry-level graphics card, along with a couple of gripes around the chassis such as the track pad, left me a little disappointed, as it has the right foundation to be a cracking budget option for a rapidly changing market.
The third and last system from MSI’s gaming series range that we’ve picked out for review today is targeted right at the top end of the gaming notebook market. Compared to the Ghost Pro, which seeks a balance between performance, cost and functionality, and the Leopard, which is aimed at the budget-conscious gamer, the GT70-2PE ‘Dominator Pro’ is a no-holds-barred, out-and-out performance system. Featuring a top-end 880M GPU from NVIDIA, an i7 CPU from Intel, a SuperRAID SSD array and the capacity to be upgraded even further, the GT70 is set to be MSI’s flagship gaming notebook.
As we work through our tour of the GT70, we will see that, like the Leopard, its design and basic framework have been brought forward from a previous generation. Even though it may look like an older system on the outside, on the inside it is full of young blood waiting to get to work pushing pixels about like there is no tomorrow. Naturally, like many other things in life, top-end performance inherently brings a strong price tag, and at £1,799.00 (inc VAT) this notebook is not for the faint-hearted. Ultimately though, the most important question we must ask as we look around this system is: does spending this much really pay off? After all, you’re not going to feel that pleased if you’ve bought a Ferrari only to find that it’s got the guts of a typical four-door saloon.
Looking down the crib sheet for this system, there’s no mistaking that MSI has packed a ton of power behind the scenes, especially in the processing and graphics departments. I’ll also note at this point that the SSD array and the stock memory have the potential to be pushed further, and in some regions the system may come in a slightly different default configuration to what I’ve got here today.
The package that comes alongside the notebook is strikingly similar to that of the Leopard, with a bundle of manuals and setup guides, a regional kettle lead and a 180W power adapter – this alone carries a bit of weight to it and, considering the spec, it’s no wonder why.
The whirlwind of information and rumours regarding NVIDIA’s latest graphics card, dubbed the Titan Z, has left many of us wondering if the card will actually surface and, if it does, when that will be. Additionally, the price of the card has been a little hazy; however, following a spot on eBay, things may be taking another step closer to reality.
Now, obviously, having something crop up on eBay first is a little suspicious, especially when we take into account the product in question, but an appearance has also been made on the US Amazon pages, although the $1099 price tag should be taken with a pinch of salt. Either way, looking at the details on both listings, we can see that both forecast a release date around 1-2 months from now, with the eBay listing in particular showing a shipping date between the 10th and 16th of July.
Personally, I’m not convinced that these listings are the real deal just yet; only when we start seeing the big names list this behemoth of a card will I take more of an interest. After all, anybody can take the information on an upcoming product, along with a couple of stock photos, and create a listing on eBay or Amazon.
When it comes to home networking, a number of big names come to mind, and fortunately I have been able to put products from many of them to the test. There has been one particular brand, however, that I have been keen to get in touch with and establish a line of communication with – namely Linksys. Believe it or not, it is not always as simple as firing off an email and instantly getting products sent back in return, as some may believe. After a few months of patiently waiting, and after a meeting with a few representatives from Linksys at this year’s CES in Las Vegas, I can finally say that I’m glad to have Linksys on board, and I look forward to having a good sniff through the stack of products that they have to offer.
Like some of the other big names in the consumer networking market – Netgear and TP-Link being just a couple – Linksys has a massive following and a long history backing a successful line of products. After being formed in 1988, Linksys was bought by Cisco Systems in 2003, and over the ten years that followed, its name became synonymous with the WRT line of networking products. To put it simply, if you were into your home networking, then Linksys’ WRT54G was the way to go – the OpenWRT project, founded to develop custom firmware for the hacked router, caused its popularity to explode to a new level. On a personal level, I have owned a number of Linksys routers over the years, in particular the WRT54G, WRT54GS and the ADSL2MUE modem, amongst others. The power and flexibility on offer set these products head and shoulders above all else. More recently, Linksys underwent a second acquisition when Belkin purchased the company early last year, ready to take it to the next level. Today, Linksys is branded under its own name with the enthusiast and power user at the heart of its designs, whilst Belkin-branded items target the home and entry-level user.
As we all know, wireless networking has been going through a radical set of changes over the last couple of years, at a rate similar to core desktop components, and in the last ten years we have seen wireless speeds rise from 54Mbps right up to the Gigabit WiFi speeds of over 1000Mbps that we are now seeing. In simple terms, that is well over 20 times the wireless bandwidth we had only ten years ago. As technology has moved on and our homes have become more enriched by and entangled in our digital lives, the power and speed we demand from our home networks has risen to greater and greater levels, hence the rapid growth in wireless technology.
The EA6900 router I’m taking a look at today is one of the latest generation of Gigabit wireless routers to come to market, bringing some of the fastest wireless speeds we have seen to date. Like many other current wireless routers, we get a pair of dual-band radios, offering both 2.4GHz and 5GHz wireless networking, with 802.11n speeds of up to 600Mbps on offer from the 2.4GHz band and the latest 802.11ac connectivity on the 5GHz band at a whopping speed of up to 1300Mbps – yes, that is faster than the current standard for wired LAN connections. Around the super-fast wireless, the EA6900 also offers four wired Gigabit Ethernet ports and a Gigabit WAN port for high-speed broadband connections, plus dual USB ports (1x USB 2.0 and 1x USB 3.0) for sharing storage and printers across the local network, all topped off with a compelling user interface that offers all the functionality one would need from a high-end consumer router in an easy-to-use package.
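Incidentally, the sum of those two peak link rates is where the marketing class for this tier of router comes from, as a quick sketch shows:

```python
# Peak PHY rates of the two radios; their (rounded) sum gives the
# "AC1900" class label used for routers in this tier.
radios_mbps = {"2.4GHz 802.11n": 600, "5GHz 802.11ac": 1300}
ac_class = sum(radios_mbps.values())
print(f"AC{ac_class}")  # AC1900
```

Keep in mind these are theoretical link rates across all spatial streams, not the throughput any single client will see.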
On paper things are looking good, but for me the real question is whether the Linksys that many of us knew in its heyday has been kept close to heart, or whether this brand has sadly become just another name on the shelf.
Inside the box, which itself gives a good insight into what the router looks like and has to offer, we get a simple and to-the-point item list. Alongside the router and power cord we get three external antennas, a single CAT5e patch lead, a system resources CD and a quick setup guide to get things up and running. The packaging also points out that, like a few other current routers, we have the option of downloading a mobile app for setting up and managing the router without the need for a desktop system.
If you recall, last night I talked about how Asus and Lenovo have both brought a 28″ 4K monitor to market with a seemingly remarkable price tag of sub-US$800. Eager to find out a little more about this panel, I decided to pop by the Asus suite in the Trump hotel for a hands-on look at what the monitor has to offer, and to see how a 4K resolution fares in such a small frame size (relative to its resolution).
When it comes to clarity and colour definition, first impressions are amazing. I’ve seen some impressive displays in my life, but even up close, as we can see below, it is easier to pick out particles of dust than individual pixels, even with a high level of zoom on the camera lens. Apart from the fact that the image is moving – obvious enough, considering it is displaying a video stream – if the image were left still and a dummy screen placed beside it, I could probably go as far as saying that at a glance it would be hard to tell the two apart. That is the only real way I can describe how good the picture looks.
Looking further into the exact specifications of the PB287Q, we note that, due to the ultra-high resolution, this panel packs a whopping 157 pixels into every inch of the screen, and it features the same Splendid display characteristics but with a bit more refinement, hence the Splendid Plus title. We also note that the display inputs comprise dual HDMI ports (both with MHL capability), a single full-sized DisplayPort and a legacy VGA input for older display adaptors. The panel offers a quoted dynamic contrast ratio of 100,000,000:1 (yes, that is one hundred million to one), a brightness of 300cd/m² and a pair of 2W speakers, with 3.5mm analogue audio in/out also on hand for older systems.
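That 157 figure is easy to verify from the resolution and diagonal alone; a quick sketch:

```python
import math

# Pixel density (PPI) = diagonal length in pixels / diagonal in inches.
def pixels_per_inch(width_px, height_px, diagonal_in):
    """Pixel density of a display from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

ppi = pixels_per_inch(3840, 2160, 28)  # the PB287Q's 4K panel
print(round(ppi))  # 157
```

For comparison, the same formula gives a 28″ 1080p panel only about 79 PPI, which is why individual pixels all but vanish here.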
Having read about and now seen this screen first hand, I’m even more excited to hopefully get one of these panels in for review in the near future. Asus says the panels should be on the market in early-to-mid February, and I’ll tell you what: it’s going to shake up the display line-ups we’re already presented with, for sure.
Early last month, Overclockers UK became home to a new breed of high-end systems that has taken the process of hand-picking components for saleable systems to a whole new level. The ultra-high-end systems have certainly stirred up the extreme performance community, and whilst some regard them as having a steep price tag, it has to be considered that each system is painstakingly put together after each major component is hand-picked and binned based on its out-of-the-box performance.
Following on from 8Pack’s launch of three ultra-high-end systems, a second line of systems has been in development, offering the same essence as the ultra-high-end machines but at a lighter price point that will put them within reach of many more buyers. Drawing inspiration from 8Pack’s dedication to quality and performance, the Infinity line consists of six systems, including three water-cooled builds known as the Tesseract, Quasar and Eclipse.
The cheapest of the three, the ‘Tesseract’, comes with a baseline price tag of £3071.99 (inc VAT) and is based around an overclocked 4770K and twin GTX 780s in SLI on a Gigabyte Z87X-OC with 8GB of RAM, housed in Corsair’s Carbide Air 540 high-airflow cube chassis. Moving up the performance ladder, the ‘Quasar’, like the Tesseract, has an overclocked 4770K, but is built on an Asus Z87 Maximus VI Formula with 16GB of RAM and twin GTX Titans in SLI; this step up in performance will set you back just over £1,000 extra.
At the top of the new range is the £4967.99 ‘Eclipse’. Built around a high-performance, overclocked 4930K CPU on an Asus Rampage IV Extreme motherboard, this system also comes with 16GB of 2133MHz RAM and twin GTX Titans, again all water-cooled and this time built into Corsair’s huge Obsidian 900D chassis that has recently come to market.
All the systems come with a minimum 24-month collect-and-return warranty and are available to buy now at Overclockers.co.uk.
Overclocker 8Pack Introduces His Three New Custom Overclocked Systems
It’s been a busy week for pro overclocker Ian ‘8Pack’ Parry. Following his achievement of breaking a benchmarking world record first thing Tuesday morning, and his appearance at the O2 Arena in London to take part in an overclocking workshop, he has now launched a line of three systems custom-built to his own specifications, with components selected specifically for these machines.
We’ve already shown you what these three systems entail here, and whilst some people have regarded them as expensive, what has to be taken into account is that you are also paying for an aftercare service unlike any other after-sales program currently on offer.
Following the unveiling of the three systems in Stoke-On-Trent, Ian was kind enough to give us an overview of his new systems and what each has to offer: