IPv6 Adoption Still Shockingly Low Despite 20th Anniversary

The IPv6 specifications aren’t new on the block; in fact, the Internet Engineering Steering Group and the Internet Engineering Task Force (IETF) officially published the IPv6 specification back in December 1995, a full 20 years ago now. Despite its age, the IPv6 adoption rate is still shockingly low and only a few have embraced it so far.

It has long been known that we are running out of IPv4 addresses, and most people connected to the internet these days are routed through their ISP’s internal network before ending up behind a shared public IP address. It works, so why change it, right? Upgrading hardware means increased costs, which would most likely be passed on to the consumer. That is, after all, where the final bill usually ends up.

When we take a look at Google’s statistics, we see that only about 10.41% of users reach Google over IPv6, which can only be described as very little, and very slow growth for a 20-year-old standard. So why is that, you may ask? Well, for starters, the two standards aren’t directly compatible with each other: an IPv6-only host can’t talk to an IPv4-only host. They can, however, coexist; you just need a dual-stack setup that can handle both, and all modern systems can easily do that.
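To give an idea of what “a setup that can handle both” looks like in practice, here is a minimal sketch in Python: asking the resolver for both address families and simply trying each candidate in turn. The host name is just an example of a dual-stacked service; nothing here is specific to any particular ISP or application.

```python
import socket

def connect_dual_stack(host, port):
    """Try every address the resolver offers (IPv6 first on most systems),
    falling back until one connection succeeds."""
    last_error = None
    # AF_UNSPEC asks for both AAAA (IPv6) and A (IPv4) records.
    for family, socktype, proto, _, addr in socket.getaddrinfo(
            host, port, socket.AF_UNSPEC, socket.SOCK_STREAM):
        try:
            sock = socket.socket(family, socktype, proto)
            sock.connect(addr)
            return sock          # first address that answers wins
        except OSError as err:
            last_error = err
    raise last_error

# Example: google.com publishes both AAAA and A records.
conn = connect_dual_stack("google.com", 80)
print("Connected via", "IPv6" if conn.family == socket.AF_INET6 else "IPv4")
conn.close()
```

On a dual-stacked machine this will usually pick IPv6 first and silently fall back to IPv4 otherwise, which is exactly why end users rarely notice which protocol they are on.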

A select few ISPs have made the switch, along with large companies like Google and Facebook, but other than that adoption is sparse. Another possible reason for the ISPs’ lack of willingness to upgrade to IPv6 could be the benefit they get from the current setup: increased income. Back in 2012, when the European pool of IPv4 addresses was exhausted, ISPs began to hike prices for customers with static IP addresses. In my own case, I got a 400% price increase just to keep my static IP address.

Maybe 2016 will be the year where IPv6 takes off, although I’m not holding my breath for it.

“2G Tuesdays” Arrives For Facebook Employees

Facebook is launching a new initiative by the name of 2G Tuesdays (it sounds like a tech version of TFI Friday), which will give all employees a taste of a super slow connection, to better reflect the speeds found in developing markets such as India. While this is certainly essential to a better understanding of how to design and test the Facebook app in areas with atrocious speeds, I can also see a textbook case of slow-connection rage coming.

Surely the speeds cannot be that slow? Well, engineering director Tom Alison remembers the first time he opened Facebook on a phone with a 2G connection: “It definitely tested my patience — it felt like parts of the product were just broken”. While US citizens are accustomed to faster 3G or even 4G, millions of people access the World Wide Web over 2G, where a single webpage can take around 2 minutes to load, or as western audiences would say, $@%$@.

This is why Facebook’s team of “emerging market engineers” (yes, apparently there is a division dedicated to this) has spent an extensive amount of time reworking Facebook’s News Feed for slow connections.

So, how will 2G Tuesday work? Well, when a Facebook employee logs into the app on a Tuesday, “they’ll see a prompt at the top of their News Feed asking whether they want to try out the slower connection for an hour”. For that hour, their experience will be akin to that of a person in India or any other country stuck on a slow connection.

A better understanding of the varying speeds around the world has already led to some fascinating projects, including an open-sourced network connection class system (which sounds like a citizen-reviewed social class status) that lets Facebook and its app work out how fast your connection is, with the aim of serving a different News Feed depending on the speed. A rough idea of how such a classifier could work is sketched below.
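Facebook’s actual library is an Android component, so the following is only an illustrative sketch of the idea in Python: keep a moving average of observed download throughput and map it onto a handful of quality buckets that the app can react to. The bucket thresholds and decay factor here are made-up example values, not Facebook’s real numbers.

```python
class ConnectionClassifier:
    # Rough throughput buckets in kilobits per second (illustrative values).
    BUCKETS = [(150, "POOR"), (550, "MODERATE"), (2000, "GOOD")]

    def __init__(self, decay=0.05):
        self.decay = decay          # weight given to the newest sample
        self.avg_kbps = None        # exponential moving average of throughput

    def add_sample(self, bytes_downloaded, seconds):
        """Record one observed download and update the moving average."""
        kbps = (bytes_downloaded * 8 / 1000) / seconds
        if self.avg_kbps is None:
            self.avg_kbps = kbps
        else:
            self.avg_kbps = (1 - self.decay) * self.avg_kbps + self.decay * kbps

    def quality(self):
        """Map the averaged throughput onto a connection class label."""
        if self.avg_kbps is None:
            return "UNKNOWN"
        for limit, label in self.BUCKETS:
            if self.avg_kbps < limit:
                return label
        return "EXCELLENT"

# A 2G-like connection: 40 KB downloaded in 4 seconds is roughly 80 kbps.
c = ConnectionClassifier()
c.add_sample(40_000, 4.0)
print(c.quality())   # -> "POOR", so the app could serve a lighter feed
```

The appeal of this approach is that it needs no extra measurement traffic: the app just times the downloads it was going to make anyway and lets the class label drive how much it requests next.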

Facebook reckons a large proportion of employees will opt into this experiment; what mood they will be in by the end is another matter. On a side note, many tech employees enjoy the luxury of developing on the fast connections typical of their own area and can easily slip into the mindset that the whole world is the same. Slowing them down once a week is a simple way to keep in view the consumers who suffer appalling speeds to the web.

Solar-Powered Plane Getting Ready to Fly Around the World

The solar-powered Solar Impulse 2 is about to take off on a journey around the world after years of planning. The aircraft has been designed from the ground up with this mission in mind, with over 17,000 solar cells lining its wings, supplying a series of electric motors and charging four on-board lithium batteries.

The aircraft is said to be designed in such a way as to be endlessly powered by solar energy, and thanks to its batteries, the plane should be able to fly day and night. The trip is expected to take around 25 flight days, split into 12 legs, starting and ending in Abu Dhabi.

Swiss aviators Bertrand Piccard and André Borschberg are the project’s organisers, and they state that testing energy efficiency is the main objective here. While the project seems like a step towards testing how renewable energy could help fly planes in the near future, Piccard and Borschberg describe it as more of a publicity stunt than a technological milestone.

A closer look at the plane itself reveals that it has only one seat with a built-in toilet, and no heating or oxygen. As for food provisions during the flight, “dehydrated and vacuum-packaged” seems to be the key description of what the pilots should expect. No wonder the trip has been broken down into smaller chunks.

Summing it up, the aircraft has a long way to go before it can compete with the far comfier Boeings and Airbuses everyone is accustomed to. However, the project does showcase the potential of renewable energy and its impact in the near future. If the project turns out to be a success, industry scepticism regarding future solar-powered planes might dissipate and the debate about their future could reopen.

Thank you Gizmodo for providing us with this information

What Happens Inside a DSLR Camera at 10,000 fps

Cameras are a modern marvel. How they capture such high-resolution images under a host of different conditions is incredible, but their mechanisms are often so quick and so small that we never get to see how they do everything at once.

Well, now we can, thanks to a great video by YouTube’s ‘Slo Mo Guys’, who slowed down the process of taking a picture at different shutter speeds, giving us an amazing look at how it all works.

Slow motion video recording is becoming more and more advanced, allowing us to see usually mundane things in ways we’ve never seen before. Just this week we reported on the story that scientists had managed to capture a laser pulse travelling through the air for the first time, recorded at an unbelievable 20 billion frames per second.

This video was shot at a more reasonable 10,000 fps, but it’s still amazing to see something we all take for granted in such detail. A quick calculation, sketched below, shows why that frame rate is enough to watch a shutter move.
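At 10,000 fps each recorded frame covers just 0.1 ms of real time, so even a brisk exposure is stretched across many frames. The 1/1000 s shutter speed used below is an assumed example, not a figure taken from the video.

```python
fps = 10_000                 # capture rate of the slow-motion camera
frame_time_ms = 1000 / fps   # 0.1 ms of real time per recorded frame

shutter_speed_s = 1 / 1000   # assumed exposure time of the DSLR being filmed
frames_per_exposure = shutter_speed_s * fps

print(f"each frame covers {frame_time_ms} ms")                          # 0.1 ms
print(f"a 1/1000 s exposure spans ~{frames_per_exposure:.0f} frames")   # ~10
```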

Source: The Verge

First Video of a Laser Beam Travelling Through The Air

Light travels fast. Very fast. So much so that it just seems instant to us. But, for the first time ever, we can see for ourselves that it isn’t instant, and that it does indeed travel.

Researchers at Heriot-Watt University in Edinburgh have managed to capture a laser beam travelling through the air for the first time. In the video below, which was recorded at 20 billion frames per second and covers just 6 nanoseconds, we see the laser pulse coasting through the air, with the photons of the beam scattering off particles in the atmosphere.
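To put those numbers in perspective, a quick back-of-the-envelope calculation: at 20 billion frames per second each frame covers 50 picoseconds, during which light moves only about 1.5 cm, and the whole 6 nanosecond clip corresponds to roughly 1.8 metres of travel.

```python
c = 299_792_458            # speed of light in m/s
fps = 20e9                 # 20 billion frames per second
frame_time = 1 / fps       # 50 picoseconds of real time per frame

per_frame_cm = c * frame_time * 100   # distance light covers per frame
total_m = c * 6e-9                    # distance covered over the 6 ns clip

print(f"light travels ~{per_frame_cm:.1f} cm per frame")   # ~1.5 cm
print(f"and ~{total_m:.2f} m over the whole 6 ns clip")    # ~1.80 m
```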

Source: The Verge

Story Develops – iPhone TLC and MLC Memory Tests

Yesterday we reported on initial tests of Apple’s TLC and MLC memory capabilities, after news broke that Apple had disabled all TLC memory in various iPhone models due to a high failure rate, without any warning to consumers or word of a replacement.

Through these findings, we were able to determine that TLC is the much faster ‘burst’ option of flash memory, greatly outperforming MLC from the beginning to the middle of a ‘zero fill’ test on a 64GB iPhone 6, but falling off sharply toward the end. The takeaway is that TLC memory is good for opening applications quickly and processing small amounts of data, whereas MLC will provide a steady rate of transfer no matter what the task. If you’re looking to run multiple applications at once, MLC is for you. If you’re curious roughly how such a test works, there’s a rough sketch below.
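A ‘zero fill’ style test is easy to approximate yourself: write a stream of zero bytes in fixed-size chunks and log the throughput of each chunk, which is how the burst-then-fall-off behaviour described above shows up on a graph. This desktop sketch is only an approximation of the phone-side testing; the file path and chunk sizes are arbitrary example values.

```python
import os, time

def zero_fill_test(path, chunk_mb=64, chunks=16):
    """Write zero-filled chunks and report the speed of each one,
    so a fast-burst-then-slowdown pattern becomes visible."""
    chunk = b"\0" * (chunk_mb * 1024 * 1024)
    speeds = []
    with open(path, "wb") as f:
        for i in range(chunks):
            start = time.perf_counter()
            f.write(chunk)
            f.flush()
            os.fsync(f.fileno())          # force the data out of the OS cache
            elapsed = time.perf_counter() - start
            speeds.append(chunk_mb / elapsed)
            print(f"chunk {i + 1}: {speeds[-1]:.1f} MB/s")
    os.remove(path)                        # clean up the ~1 GB scratch file
    return speeds

# Example invocation against the current working directory.
zero_fill_test("zero_fill.bin")
```

On drives that buffer writes into a fast cache, the first few chunks come out quick and the later ones slow down, which mirrors the burst behaviour seen in the iPhone figures.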

New data has come to light thanks to ‘Gforgames’ reporting on further findings, and the TLC results are quite interesting. It goes to show that you shouldn’t judge a product by just one method of testing; look into it further before making a choice. A random fill test was completed, pouring random amounts of data into the same iPhone models we reported on yesterday, and this time the results were significantly in MLC’s favor, showing a steady upwards curve in transfer speed, whereas TLC stayed consistent for the most part, with a slight decline toward the larger transfer sizes.

Following these results, they also noted that memory usage during these transfers was quite interesting. Below you will see an MLC-equipped phone pouring most of its resources into the data transfer (left image), compared to the TLC alternative, which has over 200 MB of inactive memory sitting idle (right image).

We will continue to report on these findings as the story develops.

Images courtesy of Chiphell

Apple has Disabled TLC Memory

Due to a high reported rate of functional defects, Apple has decided to stop using TLC NAND flash memory technology in its iPhone 6 devices. According to insiders talking to IThome, on November 6th Apple decided to deactivate all TLC NAND flash technology, which it believes to be plaguing its 64 GB iPhone 6 and 128 GB iPhone 6 Plus models with defects stemming directly from the nature of the flash memory chip.

The chip is manufactured by Anobit, a company commonly known for its SSD manufacturing facilities. Anobit was acquired by Apple in 2011 and has since been used to create Apple’s TLC NAND solid-state flash memory.

Why did Apple choose TLC NAND if it’s so unreliable? It’s cheaper and reads/writes data faster than its SLC and MLC equivalents. Apple’s plans now revolve around MLC NAND memory, including its announced iOS 8.1.1 update, alongside the 100,000 units already sold in South Korea and Taiwan alone.

Reportedly, not all latest-gen iPhones have TLC NAND installed. MLC NAND is found in the 16 GB iPhone 6 and iPhone 6 Plus and in some 64 GB editions; however, all 128 GB models have shipped with TLC NAND hardware.

The adoption of TLC NAND appears to be down to cost-saving constraints on the 128 GB edition of Apple’s latest release, so we may see a higher sale price once MLC NAND takes over. So, if your iPhone has been performing a little slower out of the blue, it’s possibly not the apps you’re running or your 40 open Google Chrome tabs; it could be because Apple has disabled your TLC NAND.

Understandably, many users are outraged. A growing number of Korean customers have been flooding Apple’s online community portals asking for their slowing iPhones to be replaced for free, and we think rightfully so.

Image courtesy of IThome

Samsung Bug in Evo 840 SSD

During the last couple of weeks, more and more reports have surfaced about problems with the Samsung 840 and Evo 840 drives. The trouble in question is extremely low speeds when reading old data, meaning data that was written more than a month ago. Freshly written files perform just as the drive advertises.

The biggest collection of information about this bug is over at the Overclock.net forums. The thread, which started about a month ago, is close to 750 replies at the time of writing, and more are coming in all the time. As more people become aware of the problem, reports keep popping up across Reddit and other user forums as well.

The HD Tach graph below illustrates the issue with these drives. The odd part, for now, is that the bug only affects LBAs that have old data associated with them; freshly written data runs at full speed. That explains rather well how such a severe bug could have stayed hidden for so long. The good news is that it’s most likely a firmware issue and can be fixed with a simple update.

Anandtech has reached out to Samsung by phone, and it seems they are both aware of the problem and working on a fix, so presumably the bug has been located. There’s sadly no ETA for a fix yet, but it’s great to see Samsung working hard on this issue. Then again, it is in their own interest to do so.

I am running a 1 TB Samsung Evo 840 in my personal system, so of course I had to test this out myself, and I can verify that the bug is present on my drive as well. It could very well account for some of the lag and slow loading times I’ve been experiencing on my PC. A normal disk-to-disk copy of fresh data runs at about 250 MB/s, which is normal for that folder. A copy of old data, however, swings between 300 KB/s and 2.5 MB/s with peaks up to 6 MB/s. That’s not what I’d call SSD speeds.
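If you want to check a drive of your own for the same symptom, the comparison boils down to reading a long-untouched file and a freshly written one sequentially and comparing throughput. A minimal sketch, assuming you point it at suitable files on the affected drive (the paths below are placeholders):

```python
import time

def read_speed(path, chunk=8 * 1024 * 1024):
    """Sequentially read a file and return the average speed in MB/s."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            total += len(data)
    elapsed = time.perf_counter() - start
    return total / (1024 * 1024) / elapsed

# Placeholder paths: one file that has sat untouched for months, and one
# written moments ago on the same SSD. Use files larger than your RAM
# (or reboot first) so the OS file cache doesn't mask the drive's speed.
old = read_speed("D:/archive/old_video.mkv")
fresh = read_speed("D:/scratch/fresh_copy.mkv")
print(f"old data: {old:.1f} MB/s, fresh data: {fresh:.1f} MB/s")
```

On a healthy SSD both numbers should land in the same ballpark; a drive affected by this bug shows the old file crawling along at a tiny fraction of the fresh one.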

Update September 25: The estimated date for a firmware update is set to October 15; we’ll keep you updated.

Thank you Anandtech for providing us with this information

Image courtesy of Anandtech.

Leaked Benchmarks Show NVIDIA’s GTX 750 Ti Slower Than The GTX 660

After a few rumors about the Maxwell chipset and its February release date, we now have leaked figures for the GeForce GTX 750 Ti suggesting that NVIDIA’s first Maxwell GPU is slower than the current GTX 660, according to WCCF’s article. Take the news with a grain of salt though; nothing official has been released regarding the Maxwell GPU or its performance, and nobody can confirm or deny the benchmarks’ authenticity.

The figures show the GeForce GTX 750 Ti around 10 to 15 percent slower than the GTX 660. A bit of good news comes from WCCF, which states that the benchmarks were run in single-precision mode, while the new Maxwell GPU is known for its double-precision performance. It would also make little sense for NVIDIA to release a graphics card that is slower than its previous series.

Whether the benchmarks are valid remains to be seen. In the meantime, we are awaiting more official information about the Maxwell GPU and, possibly, some more benchmarks of the GTX 750 Ti after its release. Maybe we will even see the GTX 750 Ti put through the new PCMark 8 that was released a few days ago.

Thank you WCCF for providing us with this information
Image courtesy of WCCF

Japanese Camera Manufacturers Suffer Losses

Panasonic and other Japanese camera manufacturers are struggling to keep up with today’s smartphones. Since most people nowadays prefer portable, mobile and, most importantly, compact devices, smartphones and tablets appear to be the preferred solution. As a result, giants such as Panasonic and Fujifilm Holdings have been losing money, with the camera market falling by more than 40% to only 59 million cameras sold.

On the other hand, sales of single-lens reflex (SLR) cameras are floundering as users prefer connectivity over picture quality. Looking at the mid to long term, the imaging companies need to turn around and rethink their strategy or face the consequences, and those aren’t looking pretty. Panasonic is said to have seen its share drop by 0.7 percentage points, from 3.8% to 3.1%, over the course of this year.

“If you look mid-to-long term, digital camera makers are slipping and the market is becoming an oligopoly,” said Credit Suisse imaging analyst Yu Yoshida. “Only those who have a strong brand and are competitive on price will last – and only Canon, Nikon and Sony fulfil that criteria,” he added.

In the meantime, Panasonic, Fujifilm and Olympus are trying to fend off the smartphone threat by cutting compacts, targeting niche markets such as deep-sea diving, and launching the higher-margin mirrorless models. The mirrorless format promised mid-tier makers an area of growth as the dominance of Canon and Nikon all but shut them out of SLRs, where Sony is a distant third. Neither Panasonic nor Fujifilm makes SLRs, and Olympus stopped developing them this year.

“SLRs are heavy and noisy, whereas mirrorless are small and quiet. While some people say SLRs still have better image quality, mirrorless (cameras) have improved to the point where they’re equivalent, if not superior,” said Hiroshi Tanaka, director of Fujifilm’s optical division.

However, Sony is keeping pace with its two QX lenses released this quarter. These come with their own sensors and processors, and clip onto smartphones, through which the user operates them wirelessly. They are pocket-sized and produce photographs of a quality rivaling that of a compact camera, and Sony appears to have connected with consumers, as demand soon outstripped production. Some are even using the lenses in ways Sony didn’t intend, such as placing them at a distance and pressing the shutter on their smartphone to take self-portraits, or selfies.

“We had no idea how much the QX would sell initially when we put it out. We didn’t set any targets,” said Shigeki Ishizuka, president of Sony’s digital imaging business. “There are so many consumers that were hungry for Sony to do this,” said Chris Chute, IDC’s digital imaging research director. “They’ve (waited for Sony) to come out with something really innovative, almost like the Walkman (portable music player).”

Thank you Chicago Tribune for providing us with this information