Ever Wondered What’s In Google’s Data Centres?

Google is known for a lot of things, but the company was built on data: the storage and searching of information from all over the internet. Typically these things are locked behind closed doors, but Google has now opened them up with an eight-minute video tour of its data centres.

First off, you need security clearance, as even for Google employees the sites are normally locked down. The tour opens with a short interview covering the different systems that help keep everything running 24/7. Stepping into the actual data centre requires more than just a pass: you need to get through a circular door locked by an iris scanner as part of the dual authentication.

Throughout the video, you get a sense of just how large a data centre is, even though it shows only a small glimpse of the building. In an interview with Virginia, one of the people responsible for the network, it's revealed that a single building can support up to 75,000 machines while transmitting over a petabit of data per second.
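To put those figures in perspective, here is a quick back-of-the-envelope sketch using only the numbers quoted in the video (real per-machine traffic will of course vary):

```python
# Rough estimate based on the figures quoted in the video:
# up to 75,000 machines per building, over a petabit per second of traffic.
machines = 75_000
building_bandwidth_bits_per_s = 1e15  # 1 petabit per second

per_machine_bits_per_s = building_bandwidth_bits_per_s / machines
print(f"~{per_machine_bits_per_s / 1e9:.1f} Gbit/s per machine")  # ~13.3 Gbit/s
```

In other words, the quoted figures work out to over 13Gbit/s of bandwidth per machine.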

They even go into detail about how data and drives are removed from the system. First, the drives are wiped, then they are fed into what is essentially a wood chipper designed just for hard drives.

Take the tour in the video below and see for yourself just how big a company Google is and how many steps it takes to protect both company and customer data. Be warned, though: the video is a bit of an advert for Google's cloud platform, so it may be a little cheesy at times.

PIA Running Traffic Through Second VPN to Avoid BitTorrent Ban

With a number of large data centres now banning heavy BitTorrent traffic on their networks, popular VPN provider Private Internet Access (PIA) has started routing its traffic through another VPN which, while slowing connection speeds, ensures its customers are not prevented from downloading torrents.

Many BitTorrent users rely on VPN services to keep their downloading private and prevent their IP address from being tracked by ISPs or third-party copyright enforcers. Since it is one of the few VPNs that does not keep logs on its users, meaning there is no data to hand over if served with a warrant, PIA is a favourite amongst torrenters.

“Certain regimes/regions and data centers have strict discriminatory policies towards the BitTorrent protocol. In order to provide a free and open internet to everyone, we were forced to create a technical fix,” a PIA spokesperson told TorrentFreak.

PIA believes that its “double VPN” solution is the best compromise for its customers, as it does not require invasive techniques, such as DPI.

“Due to the fact that packets were routed in an unidentifiable manner and double hop is a known and accepted technology by privacy advocates, we believe this technical solution adheres to the strongest of privacy ideals,” the spokesperson said.

“We want to make clear, that privacy is in fact our single policy. However, in order to help our users who are censored in certain regions, we needed to find a way to provide close servers while still being able to provide users with true and free/open internet access,” they added. “This was our solution and we still think that using technology to create a solution is better than waiting for politicians to fix this problem.”

PIA has posted a full statement on the matter to its website.

Image courtesy of FreedomHacker.

Lightning Strike Wipes Google Data Centre

Google has suffered permanent data loss after one of its data centres in Belgium was struck by lightning four times. The electricity surges from the lightning strikes wiped portions of data from the Google Compute Engine storage systems, although some of the disks affected by the strikes were later recoverable.

“Although automatic auxiliary systems restored power quickly, and the storage systems are designed with battery backup, some recently written data was located on storage systems which were more susceptible to power failure from extended or repeated battery drain,” Google said in an online statement.

Google’s GCE service provides users with cloud storage and virtual machine operations. It’s not yet clear how many customers could have been affected, but Google claims that only 0.000001% of its data was permanently wiped.

So, what caused the data centre to be struck by lightning an unbelievable four times? According to Justin Gale, the sheer surface area of such a building, with its plethora of power and telecommunications cables, makes it more susceptible than regular buildings. “The cabling alone can be struck anything up to a kilometre away, bring [the shock] back to the data centre and fuse everything that’s in it,” he said.

“Everything in the data centre is connected one way or another,” James Wilman, Engineering Sales Director for Future-Tech, added. “If you get four large strikes it wouldn’t surprise me that it has affected the facility.”

Thank you BBC News for providing us with this information.

SanDisk Showcase Industry’s First 2TB SATA and Portable SSDs at Computex 2015

Computex 2015 – The highlight for me at the SanDisk booth was the new, industry-first 2TB SATA SSD. The CloudSpeed Eco Gen. II is optimized for cloud data services, video streaming, social media analytics and content repositories.

The CloudSpeed Eco Gen. II is intended for data centres, but there is good news for consumers too, as they also presented an external 2TB portable drive. Okay, that's not entirely true, as it only holds up to 1.92TB. The SanDisk Extreme 900 portable SSD doesn't just look good, it's also fast.

SanDisk set up a demonstration of the drive, and we saw transfer speeds of up to 714MB/s on sequential reads and 717MB/s on writes.

That is some seriously fast external storage, so fast that it beats what most people have as internal storage.
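For a sense of scale, here is a rough sketch of how long it would take to fill the 1.92TB drive at the write speed seen in the demo (sustained real-world speeds will differ):

```python
# How long would it take to fill the drive at the demonstrated write speed?
capacity_mb = 1.92e6        # 1.92 TB expressed in MB (decimal units)
write_speed_mb_per_s = 717  # sequential write speed from the demo

seconds = capacity_mb / write_speed_mb_per_s
print(f"~{seconds / 60:.0f} minutes to fill the drive")  # roughly 45 minutes
```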

IBM Chip Uses Light to Transfer Data at 100Gbps

After a decade of research, IBM has finally developed a new silicon photonics chip that can transfer data at 100Gbps. The reference chips can transfer data over a distance of two kilometers using light pulses and are designed to link storage, networking and servers within data centers. This technology is not going to appear in personal computers or handheld devices anytime soon, as IBM is aiming it at data centers, where its high-bandwidth optical fiber connections are most advantageous.

There is also demand for more computing power in servers, driven by applications like analytics, machine learning and big data. Optical connections could help dozens of processors communicate on a server rack, making it easier to break up processes over multiple resources, said Richard Doherty, research director at The Envisioneering Group. “Optical connections could make servers much like storage drives, which can be easily hot-swapped depending on processing needs in data centers,” he added.

The optical technology used in telecoms is different from what IBM offers: its silicon photonics chip is cheaper and is meant for shorter distances. IBM's single-fiber implementations are considered to be better than Intel's MXC optical cables.

Thank you PCWorld for providing us with this information.

Image courtesy of WallSide.

Zynga Shuts its Data Centers, Returns to Amazon

Remember those FarmVille and CityVille game requests on Facebook? Their creator, Zynga, is a provider of social games for various platforms, from social networking sites to smartphones running Android, iOS and Windows Phone. The company has had to change its business because things did not work out as planned; games are unpredictable, and the world has moved on from those web-based Facebook offerings to mobile games.

Zynga was a customer of Amazon's cloud-computing services, but then built its own data centers because operating them was supposed to be cheaper than paying Amazon. The price-to-performance ratio did not meet expectations, however, and the data centers cost $100 million to build, meaning Zynga now has to shut them down and move back to Amazon's services. “Running a data center is expensive, there are lots of mouths to feed when you have your own data center,” he said.

“There’s a lot of places that are not strategic for us to have scale and we think not appropriate, like running our own data centers. We are going to let Amazon do that,” Zynga CEO Mark Pincus told investors on a conference call.

When Zynga built its data centers back in 2011, it was still dependent on Amazon for some tasks. It created software called zCloud for the data centers, built on the open-source platform CloudStack, which enabled it to switch easily between Amazon's servers and its own. Zynga may have simply decided that it wanted to be a gaming company, not a technology company.

Thank you Wall Street Journal for providing us with this information.

Image courtesy of Compstak.

From A Router To The Cloud: How Gaming Companies Manage Online Bandwidth

Online streaming and multiplayer gaming have progressed rapidly in the past few years, which is great for gamers, but it has also meant that the hardware and software powering these advances have had to progress just as fast. Superfast broadband and souped-up fibre optic speeds have helped more and more people connect effortlessly with each other, and it is the providers who have felt the pressure to bump up their services to ensure that customers get the very best from their seamless streaming capabilities.

The rising popularity of massively multiplayer online role-playing games (MMORPGs) means that players can connect with millions of others from around the world and spend hours and hours playing to their hearts' content. This of course places a terrific strain on the broadband connection, and if the rest of the family wants to hop on the web and do their own thing, they may have a few issues if you aren't hooked up to a suitably speedy broadband service.

From the likes of StarCraft, to World of Warcraft, to online card rooms – which have millions of players every day – brands have to invest heavily to ensure that they can accommodate such large volumes of players.

Coping With Volume

At PokerStars, a site with over 50 million members, heavy investment goes into the platform to deal with large surges of players. With over 700 hands dealt every second, and potentially almost half a million people seated at a time, bandwidth congestion could be a major issue. Yet at the PokerStars data centre, that issue is resolved with an incredible infrastructure similar to those in place at Google, Microsoft, and Amazon.
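To illustrate the scale those figures imply, here is a rough sketch; the average of six players per table is our own assumption for illustration, not a figure from PokerStars:

```python
# Back-of-the-envelope based on the figures above.
# Assumption (not from PokerStars): an average of ~6 players per table.
seated_players = 500_000
hands_per_second = 700
players_per_table = 6  # hypothetical average

tables = seated_players / players_per_table   # ~83,000 concurrent tables
seconds_per_hand = tables / hands_per_second  # ~119 s between hands at any one table
print(f"~{tables:,.0f} tables, one new hand roughly every {seconds_per_hand:.0f}s per table")
```

Even under those rough assumptions, that is tens of thousands of concurrent tables, every one of which needs its state pushed out to players in real time.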

It's almost like building an internet on top of the internet, with a sea of servers at their HQ in the Isle of Man keeping gameplay up to speed and guarding against loss of connection, whether players are on mobile or desktop.

And loss is of major importance to them – or rather, avoiding it. Alongside the servers that keep players connected and playing fluidly, the brand also has plenty of storage, which saves every hand played on the real-money tables.

It isn't just online casinos, though, that have hugely advanced systems in place to keep gameplay free-flowing and users happy and communicating with each other in their multiplayer communities.

Gaming providers have to take the brunt of the millions of online gamers looking to piggy-back off their servers in order to play and compete in the colossal online arenas of MMORPGs. And let's not forget game streaming services from the likes of Twitch.tv and Steam Broadcasting, which are becoming very popular amongst the gaming community. Twitch's service in particular allows the live streaming of all sorts of gaming-related content, including live coverage of some of the biggest esports tournaments taking place around the world. Twitch users can broadcast their own channels of gaming sessions, playthroughs, speedruns and more. Steam is now doing much the same and already has over 100 million users. So with all this usage and live streaming, the impact on these providers' servers will surely be immense. But that is the power and the brilliance of cloud gaming.

Cloud gaming allows you to play high-quality games anywhere you have an internet or WiFi connection. Once you have the connection, you can tap into it with most modern devices effortlessly. Much like your music or podcast collection, you can access various games, new or saved, directly from the cloud's library and play or continue your gameplay from whichever device you wish. What also makes this a popular choice for a lot of gamers is the memory they save on their tablet, smartphone or hard drive. There are no downloads, no installs and no need to constantly update games and your system with upgrades and patches.

This can save players a significant amount of memory and drive space, which is usually packed to the brim with other apps and memory-hungry data, leaving players struggling for space to squeeze in another game or two. There is no hardware needed and no overly complicated set-up involved to get going. You just log in and away you go. Now, what gamer in the world wouldn't find that an attractive prospect?

A Little Closer To Home

Of course, it's important that a gamer has their own connection sorted too.

For avid gamers, the last thing you want is a monthly cap on the amount of bandwidth you use. There are providers out there with fixed restrictions on the bandwidth they offer: a certain amount is capped and anything over it is billed on top of your normal monthly rate. Some providers even go as far as to limit traffic at certain times of the day and then offer unlimited broadband at others. These caps and unnecessary restrictions are certainly not ideal, especially if you're in the middle of an intensive game of World of Warcraft or Call of Duty.

Typically there will always be particular periods of the day and of the week that are noticeably more traffic-heavy than others. We often find a slight dip in broadband speeds at our offices when the schools empty out and most people return home after work. Characteristically, most people head onto the internet on their computers or handheld devices in their downtime after a hard day's work or during the weekends.

But periods such as Christmas have another major impact, particularly for the online gaming industry. Christmas Day and the week that follows are a particularly popular time for gamers to spend online. Most people have time off work during this period, not to mention it is the peak period for brand new and exciting games being given as gifts – so it is the perfect time to try them out.

But with the likes of PokerStars having their 'internet on top of the internet', such surges create fewer issues, while World of Warcraft offers a host of different realms to play on, letting you choose based on the current population, whether there is a queue, and the realm's main language.

It's a clever way to run what could otherwise be slow and lethargic gameplay. By splitting players across different rooms and realms, and investing millions to ensure that users get the best experience, coupled with a good connection at home from router to cloud, we are enjoying the quickest and smoothest gameplay we've ever had.

Big brands will continue to push boundaries as demand for gaming soars. We want higher-quality, high-definition titles, and with that come demands for faster CPUs and higher bandwidth. Brands with their hundreds and hundreds of servers are constantly improving to make this happen; it's just a question of whether we can get our own broadband connections up to date enough to keep up.

Researchers Achieve Fastest Data Transfer In The US

Data centers rely on being able to shift huge amounts of data around on demand, and as global demand for data grows, both in terms of the amount of data we use in our daily lives and in terms of file sizes, the need to move that data around ever faster becomes more and more important.

Scientists at the University of Illinois may have found one solution: lasers. The laser devices, known as VCSELs, are faster, more efficient and more accurate than wired solutions, and the ones tested at the university have proven capable of transmitting data at a rate of 40 gigabits per second!

“Information is not useful if you cannot transmit it. If you cannot transfer data, you just generate garbage. So the transfer technology is very important. High-speed data transfer will allow tele-computation, tele-medicine, tele-instruction. It all depends on how fast you can transfer the information,” said Milton Feng, a professor of electrical and computer engineering at the university.

Feng said that with some adjustments the laser system could operate at 60 gigabits per second. This technology could pave the way for more efficient and powerful cloud computing solutions.

Thank you Gigaom for providing us with this information.

Image courtesy of Gigaom.

Part Of The Internet Archive Burns Down

The Internet Archive, an impressively huge, entirely non-profit data center that holds every webpage created and indexed on its archive servers, partially caught fire on November 6th. The archive center has a total of 30 buildings, one of which was lost in the fire, while surrounding buildings such as a church, a local business and a residence were damaged.

Although the fire destroyed some of the servers that Archive.org uses to store data, none of the saved data was actually lost. The saved data had been saved again elsewhere to prevent any loss in case it was lost in the first place. Still keeping up? I think I've lost myself.

Some of the equipment lost in the data center during the fire includes a scanner, a high-definition camera, some lights and a few sets of office desks and chairs, among other items that can't yet be accounted for due to the shock of the fire.

Brewster Kahle, the founder of the Internet Archive, said that there was no estimate yet of the cost of the lost property and that the situation is “quite a bummer”.

The employees who previously worked in the now burned-out building have been relocated to a different office to continue their work. The digital archive states that it isn't expecting any further disruption of service, including website downtime or employee inaccessibility, meaning that the site will function as normal for the foreseeable future.

If you’re interested in contributing to the non-profit organisation, you can donate over here: https://archive.org/donate/index.php

Thanks to Mashable for the information and image!