Facebook’s campus in Menlo Park, California, has super Wi-Fi. What do we mean by super Wi-Fi? Well, anything that serves speeds of over one gigabit per second would count as super. That’s more than 100 times the average internet speed of a typical house in the US!
Facebook doesn’t want to stop there, and is looking to expand the test to a large-scale system in downtown San Jose later this year and then to other areas around the world. Jay Parikh, head of infrastructure and engineering at Facebook, says that rolling out other high-speed options, such as Google Fiber, can prove difficult in urban areas, so a wireless infrastructure would be both cheaper and easier to deploy.
The project is named Terragraph and is based on a technology known as WiGig. By placing WiGig hubs on light poles and common street furnishings, Facebook hopes to create a fast wireless network that anyone can use to send and receive data over the 60 GHz radio waves the system is designed around.
We’ve all heard about 3G and 4G, the standards that define the technology that has helped shape mobile communications and mobile phones for the last generation. Samsung looks to get ahead of the next generation by hosting a meeting in hopes of agreeing standards for its successor, 5G.
Hosting the 3GPP RAN (3rd Generation Partnership Project – Radio Access Network) group, Samsung Electronics hopes that the meeting taking place in Busan, Korea, will help encourage companies to “discuss ways to support the effective integration of new services such as IoT (Internet of Things) into 5G, and measures to ensure the compatibility of 5G technologies”.
5G is not a new technology, having been in development at Samsung since 2011, but with more and more companies aiming to have the first standards ready by June 2018, we could soon see a network capable of 1.2 Gbps for moving vehicles and 7.5 Gbps for anyone who stands still for a minute.
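Those quoted speeds are easier to picture as download times. A quick back-of-the-envelope sketch in Python (the 50 GB game size is an illustrative assumption, not a figure from the article; the speeds are the widely reported 7.5 Gbps stationary and 1.2 Gbps in-motion figures):

```python
# Rough download-time comparison for the quoted 5G figures.
# The 50 GB download size is an illustrative assumption.

def download_seconds(size_gigabytes: float, speed_gbps: float) -> float:
    """Time to move `size_gigabytes` at `speed_gbps` (gigabits per second)."""
    size_gigabits = size_gigabytes * 8  # 1 byte = 8 bits
    return size_gigabits / speed_gbps

stationary = download_seconds(50, 7.5)  # standing still
moving = download_seconds(50, 1.2)      # in a moving vehicle

print(f"Stationary: {stationary:.0f} s")  # ~53 s
print(f"Moving:     {moving:.0f} s")      # ~333 s
```

In other words, a full game download in under a minute while stationary, and still only a few minutes from a moving vehicle.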
With companies looking at rolling out the technology by 2020, the meeting hopes to cover everything from energy and cost efficiency to security and availability, all key factors in releasing a successful piece of technology that people not only adopt but continue to support years down the road.
Content delivery company CloudFlare is no stranger to the spotlight, having been accused by Anonymous of protecting pro-ISIS websites. Now it is on the throwing end of a claim of its own, saying that 94 percent of the requests it sees from the Tor network (a network designed around allowing anonymous browsing of the web) are malicious. Tor accuses CloudFlare of mischaracterising its users and blocking its network, going so far as to impact normal traffic.
Tor claims that its users often get stuck in CAPTCHA loops or hit outright failures, stopping them from accessing content in even the simplest of ways. Citing external research, Tor states that CloudFlare was found to block at least 80 percent of IP addresses from its service, with the number increasing over time. The CAPTCHA loop is caused by a measure CloudFlare has introduced that requires users of the Tor network, and only them, to fill out CAPTCHAs.
Tor isn’t happy about the accusation and wants to see the evidence behind CloudFlare’s 94 percent figure. Many are wondering how the figure was reached, or even how a connection is deemed trustworthy. With so many people now using networks and systems like Tor, blocking users or making their experience worse can’t be seen as a positive step when it comes to providing content.
Which console do you play on? The choice is impacted by a lot of things, from being able to play those exclusive releases to the unique hardware options one console features. One of the biggest factors for consoles is often the ability to play with others, with groups of friends often getting the same console in order to play together. Microsoft wants to enable cross-platform play and recently invited others to join them in the plan, an invite Sony has now responded to.
Sony’s statement notes that the company has been “supporting cross-platform play between PC on several software titles starting with Final Fantasy 11 on PS2 and PC [since] back in 2002”. The statement goes on to say that they “would be happy to have the conversation with any publishers or developers who are interested in cross platform play”.
Given that Sony and Microsoft are responsible for the leading consoles when it comes to online multiplayer, it would be interesting to see the two gaming networks merge into a single united network, letting people on every console play together with nothing but their skills separating them.
With friends who own one or both of the consoles, I think that creating a uniform platform for cross-console play could only put more focus on the hardware that people use and the skills they hone to play their games.
A few years ago Sony suffered a rather bad hack, which affected around 70 million of its customers. In the wake of the hack, Sony promised to renew its efforts to increase security and offered some gifts to appease players who suffered during the 23-day outage. As of March 2nd, you may find that the promised free game codes have finally arrived.
Depending on the services you were signed up to when the hack happened (PlayStation Network, Qriocity and Sony Online Entertainment), you can claim a variety of rewards. Sony’s initial scheme offered one free game, but don’t worry if you didn’t manage to claim one back then: you can claim two now.
The games available vary based on which platform you wish to collect your reward for, with the PlayStation 3, Vita and PSP all being offered free titles as an apology. You can now get inFamous, LittleBigPlanet and even the God of War HD Collection for free, though they are limited to the aforementioned consoles.
With the lawsuit that spawned this reward scheme valued at $2.75 million, Sony must be happy to settle for a few free games, a little account credit or some PSN time, almost five years after the hack that put security on the agenda of so many companies still catching up on it.
In this day and age, most laptops and devices come with a wireless adapter built in; even the latest Raspberry Pi includes wireless. That’s lucky for when you can’t get to the router in your house, the cable just won’t reach your favourite seat, or your Ethernet connection is disabled by an update.
Some people found this out the hard way when Apple published an update over the weekend which disabled the Ethernet port in the El Capitan release of its operating system. The reason for the disabled port? An update to System Integrity Protection, a system designed to keep your computer secure by disabling malicious kernel extensions (kexts, the OS X equivalent of drivers for Windows or Linux users). Sadly, a small update blacklisted the Ethernet port’s kext.
While an update was quickly released to fix the issue, some people still had their systems disabled before they were able to update to the latest version. The idea is that it all happens behind the scenes, without you having to select the update or even know about it: kexts are updated silently, and these updates run even if you have disabled standard automatic updates.
Do you use cabled connections or are you constantly on the wi-fi? Could you live with the other? What would you do if someone accidentally disabled the wireless in your computer, laptop or even your phone?
Buffering, downloading, pausing, even trying to make out the shapes in a low-resolution video have become commonplace for so many people as their internet speed caps out, normally well below the advertised (and purchased) speeds. It seems we aren’t the only ones annoyed by this, as a group of business leaders have now spoken out, accusing the UK government of a “poverty of ambition” on internet speeds.
The Institute of Directors (IoD) is made up of business leaders within the UK, and in its report titled Ultrafast Britain it states that the UK is lagging behind when it comes to enabling faster broadband connections. The government states that 90% of UK properties have access to superfast speeds, a figure set to reach 95% by next year.
The IoD doesn’t think this is good enough, calling for speeds of 10 gigabits per second (Gbps) by 2030. Currently, the government is targeting just 10 megabits per second by 2020, a speed which many are already getting.
This isn’t the first time the internet as a structured provision has been discussed this week, with Ofcom telling BT that its cable network should be opened up to other companies. BT currently comprises two parts: the core company and Openreach, the division responsible for the cable, fibre and network infrastructure that the UK relies on for its internet.
What is your internet speed? Is it ever what you were actually advertised to be getting? Do you know anyone with super fast/slow internet and does it have a big impact on them?
Computers are weird things: they get smaller each year, and yet their power and capabilities increase every time we blink. A prime example of this is the recent surge of mini-computers, with some hardware as small as your phone while still letting you add and customise to your heart’s content. From touchscreens to the next generation of robot wars, the small computer has inspired a generation, but without built-in wireless it seemed to lack something. That could change, with an FCC document showing that the next generation of Raspberry Pi may solve that problem.
First, let’s be clear: you can connect the older Raspberry Pis to a wireless network, but you needed to buy a wireless dongle, which means another thing to forget and a USB port taken up in order to use it. The documents show that the next Raspberry Pi will not only include everything you need for Wi-Fi connections but will also include Bluetooth.
The documents don’t reveal much else that’s different, with everything pointing to the same specification as the Raspberry Pi 2, but that doesn’t mean it won’t change.
Do you use a Raspberry Pi, or maybe something similar and if so what do you use it for?
Our recent upgrade to 10 Gbit networking here in the eTeknix review section wasn’t just about the awesome 12-port smart switch that we saw a little while ago; D-Link also supplied me with a DXE-820T dual-port 10 Gbit Ethernet network card to connect to the switch at full speed. After all, what good does a 10 Gbps switch do when my test rig doesn’t have matching performance?
The D-Link DXE-820T is a dual-port 10 GBASE-T RJ-45 PCI Express high-performance adapter designed for the high-speed PCI Express bus architecture. This adapter offers the increased bandwidth needed in modern environments as well as being a reliable and functional PCI network card. It has been specifically designed to allow throughput at rates up to 40 Gbps, thus eliminating the bottleneck that exists with current 32 and 64-bit PCI bus architectures.
The DXE-820T requires a PCIe v2 x8 or x16 slot for enough bandwidth, but it doesn’t need any extra power connectors or anything else besides your network cables, naturally. The card is capable of a transmitting distance of up to 100m with Cat 6A or higher in 10GBASE-T mode and up to 100m with Cat 5 type cables in Gigabit mode. This makes the placement of the switch a lot easier.
The DXE-820T features TCP, UDP, and IP checksum offloading, which moves checksum processing from the computer’s CPU onto the network card itself. The DXE-820T’s ability to handle the checksum processing means that the CPU’s processing power can be used for other tasks while still achieving 20 Gbps network speeds. It also means the network card needs some extra cooling, in the form of an active fan as well as a passive heatsink.
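To see what is actually being offloaded, here is the 16-bit ones’-complement “Internet checksum” (RFC 1071) that IP, TCP and UDP headers use, as a pure-Python sketch; the NIC performs this same per-packet computation in hardware:

```python
# The kind of work being offloaded: the 16-bit ones'-complement
# Internet checksum used by IP, TCP and UDP headers (RFC 1071).

def internet_checksum(data: bytes) -> int:
    """Compute the 16-bit ones'-complement checksum over `data`."""
    if len(data) % 2:              # pad odd-length input with a zero byte
        data += b"\x00"
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]       # 16-bit big-endian words
        total = (total & 0xFFFF) + (total >> 16)    # fold carry back in
    return ~total & 0xFFFF

# A valid IPv4 header (checksum field already filled in) sums to 0:
header = bytes.fromhex("45000073000040004011b861c0a80001c0a800c7")
assert internet_checksum(header) == 0
```

Trivial per packet, but at millions of packets per second it adds up, which is why moving it onto the card frees noticeable CPU time.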
The adapter also features an onboard screening of 802.1Q VLAN tagged Ethernet frames, allowing you to assign multiple subnets to each server and isolate devices within each VLAN from the rest of the network for better traffic control and security. With support for advanced features such as 802.3x flow control, jumbo frames, and SNMP for network management, the DXE-820T can easily interoperate with your current networking equipment.
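To make the 802.1Q screening concrete, this is roughly what a VLAN tag looks like on the wire: a 4-byte tag inserted into the Ethernet frame after the source MAC address. A minimal sketch (the helper name is my own):

```python
import struct

# A minimal sketch of an 802.1Q VLAN tag: TPID 0x8100 followed by the
# Tag Control Information (3-bit priority, 1-bit DEI, 12-bit VLAN ID).

def vlan_tag(vlan_id: int, priority: int = 0) -> bytes:
    """Build the 4-byte 802.1Q tag for the given VLAN ID and priority."""
    assert 0 <= vlan_id <= 4095 and 0 <= priority <= 7
    tci = (priority << 13) | vlan_id   # DEI bit left at 0
    return struct.pack("!HH", 0x8100, tci)

tag = vlan_tag(vlan_id=100, priority=3)
print(tag.hex())  # 81006064
```

A switch or NIC screening tagged frames simply reads those 12 VLAN ID bits to decide which subnet a frame belongs to and whether to forward or drop it.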
With two ports at your disposal, you can increase the network throughput even further than the 10 Gbit per second on each port. With Smart Load Balancing™, the DXE-820T can configure multiple adapters to work as a team, sharing traffic and ensuring data reliability. This both creates a faster network and provides fault tolerance, resulting in a stable and efficient network.
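Teaming schemes like this typically hash each flow’s addresses so that all packets of one connection stay on the same port (avoiding reordering) while different flows spread across ports. A toy sketch of the idea; the hash and function name are illustrative, not D-Link’s actual algorithm:

```python
# Illustrative flow-to-port mapping for a teamed dual-port NIC.
# Real NICs use CRC-style hashes over the address tuple; this toy
# version just shows the property that matters: same flow, same port.

def pick_port(src_ip: str, dst_ip: str, ports: int = 2) -> int:
    """Deterministically map a flow to one of `ports` ports."""
    flow = f"{src_ip}->{dst_ip}"
    return sum(flow.encode()) % ports  # toy hash over the flow key

# Packets of the same flow always take the same port:
assert pick_port("10.0.0.1", "10.0.0.2") == pick_port("10.0.0.1", "10.0.0.2")
```

With many simultaneous flows, the hash spreads them roughly evenly, which is how two 10 Gbit ports can behave like one fatter pipe.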
The low-profile design allows deployment in space-restricted areas, and D-Link also includes a low-profile slot cover right away. The card itself is powered by Broadcom’s BCM57810 chip.
The card naturally supports jumbo frames for optimised setups, with values up to 9K. The DXE-820T is compatible with all major operating systems, with drivers available for both desktop and server systems such as Windows Server 2012 or Solaris 11.
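Why jumbo frames help: per-frame header overhead shrinks relative to payload. A quick calculation, assuming standard Ethernet framing (18 bytes) plus IPv4 and TCP headers (20 bytes each, no options):

```python
# Payload efficiency at standard vs jumbo MTU, assuming Ethernet
# framing (18 B) outside the MTU and IPv4 + TCP headers (20 B each)
# inside it. Header sizes assume no IP/TCP options.

ETH, IP, TCP = 18, 20, 20

def payload_efficiency(mtu: int) -> float:
    """Fraction of on-wire bytes that is actual TCP payload."""
    payload = mtu - IP - TCP   # the MTU covers the IP header onward
    on_wire = mtu + ETH        # Ethernet framing sits outside the MTU
    return payload / on_wire

print(f"1500 MTU: {payload_efficiency(1500):.1%}")  # ~96.2%
print(f"9000 MTU: {payload_efficiency(9000):.1%}")  # ~99.4%
```

The efficiency gain looks small, but the bigger win at 10 Gbit speeds is that a 9K MTU means roughly six times fewer frames, and therefore six times fewer per-packet interrupts and processing steps for the CPU.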
High-speed data transmission at rates up to 40 Gbps allows for seamless data transfer.
10GBASE-T technology supports distances of up to 100m over Cat 6A or better copper cable.
Advanced Features: 802.3x flow control for traffic management, 802.1Q VLAN tagging for increased security, and checksum offloading to reduce CPU processing burden.
Bandwidth Management: NIC partitioning enables administrators to manage bandwidth for greater network efficiency.
Packaging and Accessories
The DXE-820T comes in a very neutral box that only really reveals that there is a D-Link product inside. But it’s a network card, and it isn’t like you would put the box on display after installing the card anyway, so simple is good.
We do find a little sticker on the rear of the box that reveals what is inside: the model number along with the serial number, MAC address, and hardware version.
There is both a driver disk and a quick installation manual inside the box, and D-Link also included a low-profile bracket for use in small-form-factor systems. Everything you need to get going with that extra speed.
Building a datacenter can be a costly and time-consuming endeavor, but the latest project from Microsoft may have solved quite a few problems all at once. Typically it can take two years to build a datacenter on land, and even then you’ll find that they need to be built quite far from built-up residential areas and city centers, which then leads to increased latency for users. Then you’ve also got the issue of cooling the datacenter, as all that hardware generates a lot of heat and the cost of cooling it can quickly become a headache. Microsoft’s solution to all of this? Build the datacenter at the bottom of the ocean!
While it may seem like a wacky idea, it’s pretty clever. The ocean water is a very efficient way of cooling the datacenter and it’s passive too, so there are no ongoing running costs for the cooling as there would be in a building that used air conditioning systems. Microsoft said that during their testing, they observed no heating of the marine environment beyond a few inches from the device, so there should be no major impact from these units.
Microsoft’s Project ‘Leona Philpot’ was deployed about 1 kilometre off the Pacific coast, where it stayed for 105 days and worked perfectly. Having it deployed in water like this means that it can be located close to populated areas, reducing latency and not taking up valuable land space around cities. What’s better, for Microsoft at least, is that these units can be deployed in just 90 days, much quicker than land-based datacenters. Microsoft’s researchers are already working on a follow-up to Leona Philpot, where they will deploy a unit three times the size and perform further tests. It will be interesting to see how well these datacenters perform and whether they’ll become more popular than the current land-based ones, although we suspect that may not be for a long time.
Artists are always finding exciting and fresh ways to promote their artwork, and this is no exception: the China Youth Network recently reported on a craftsman who carves his artwork onto eggshells. The artist in question is based in China and goes by the name Huangwei Xiang; he is 64 years old this year, and below are images of his quite amazing yet delicate artwork.
As you can see, the first image below is of this gentleman’s latest series, entitled “108 Water Margin Heroes”, which was carved over 108 days. The fragility of the eggshell means a delicate hand is needed when applying pressure.
The next image below is of a phenomenal carving within a hollow eggshell; the intricate patterns and imaginative design certainly open the door to a range of possibilities. On a side note, not long ago a “Hong Kong business person spent 30,000 yuan (£3,195.89) with the aim of buying two hollow egg carvings as well as 12 zodiac egg carvings”.
Below are two images which convey a close-up of two of the Water Margin Heroes; patience is certainly needed when attempting to sculpt the artwork within the shell.
It is certainly unique and requires a high level of skill to achieve these results. I would not have thought this possible when you consider how fragile an eggshell is, but it just goes to show what can be achieved.
Some like PC, some like console. People choose different platforms for different reasons, but the creators behind the ever-popular PC gaming platform Steam came up with an idea to bridge the gap. Enter Steam Machines, a combination of PC and console, meaning you could play games as if you were on a console but upgrade it like a PC. So why not grab your controllers and play on the latest SteamOS version?
SteamOS is based on the popular operating system Linux. One of Linux’s main selling points is that it is open source: the software is freely available, both the finished product and the code that builds it. This means that you are able to see how it works and add functionality as you wish.
The latest update, 2.60, brings SteamOS not only security patches and updates to the underlying Linux system but also support for an extra controller. The Xbox One Elite controller features a variety of buttons, including some hidden underneath, where your hands naturally rest while holding a controller. If you feel like giving the controller a spin on SteamOS, you now can, but sadly only if it’s wired.
If that wasn’t enough, why not check out the new Bluetooth support? With the ability to connect a range of devices over Bluetooth, you could soon be playing with PlayStation 4 controllers and headsets without a wire in sight!
While updates will help the platform, SteamOS recently took a hit when it was revealed that it played games worse than Windows 10 did. We might see that changing soon and with updates coming out more and more it might be worth retesting that comparison soon.
Steam is a global name in video games. As a platform for everything from selling games to networking players, the service lets you download a small client and gain access to a collection of thousands of games. Not surprising, then, that they’ve recruited Level 3 Communications to increase their network speed.
Level 3 delivers a collection of high-speed connections all around the world, a service Steam users will soon be able to enjoy, as Valve has approached Level 3 to upgrade its network with their “100 gigabits per second” connections. They state several good reasons for the upgrade, the first being the service’s growth: with a 75% increase year on year, you can imagine the strain on their servers when new game releases cause massive spikes in downloads. 400-500 petabytes of data are downloaded worldwide per month, with 4-5 exabytes downloaded per year, figures that will only increase as average game sizes grow. With Steam games having gone from megabytes to gigabytes, the “standard” Steam download is roughly 10-40 gigabytes per user. That is quite hefty given the service has over 100 million users, who are often online at peak times such as sales, when it’s not uncommon for millions of users to be connected at once.
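The quoted figures hang together: a quick sanity check of the monthly-to-yearly conversion, plus the sustained average bandwidth those downloads imply (taking the midpoint of the 400-500 PB range):

```python
# Sanity-checking the quoted Steam traffic figures.

PB = 10**15  # petabyte, in bytes
EB = 10**18  # exabyte, in bytes

monthly_bytes = 450 * PB                 # midpoint of the 400-500 PB range
yearly_eb = monthly_bytes * 12 / EB      # yearly total in exabytes

# Average sustained bandwidth over a 30-day month, in Gbps:
avg_gbps = monthly_bytes * 8 / (30 * 24 * 3600) / 10**9

print(f"~{yearly_eb:.1f} EB/year")        # ~5.4 EB/year
print(f"~{avg_gbps:,.0f} Gbps average")   # ~1,389 Gbps sustained
```

An average of well over a terabit per second, before any peak-time spikes, shows why Valve is shopping for 100 Gbps links in bulk.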
During those busy times, you should soon notice that your connection stays at peak performance thanks to this array of upgrades. Now, if only we could all get stable, fast internet at home, game time would be that much less stressful.
There has been a huge explosion of online ransomware in the last year or two, with a huge number of consumers unfortunately falling victim to this ever-present and growing technique. Now there is a new campaign being served to consumers via the PopAds network, delivering the Magnitude exploit kit via pop-under ads.
For those unfamiliar with pop-under ads, these are online advertisements that appear behind the main browser window and remain open until the user manually closes them. Consumers who failed to update their version of Flash Player (which we are constantly being told to do) were immediately infected with the CryptoWall ransomware.
The infection campaign began around the 1st of January 2016, with ads placed on avenues that included both NSFW and video streaming sites. Below is an image conveying the geographic spread of infections caused by this new technique; as you can see, Spain leads with 14.3%, with the Netherlands, France and Poland next, level at 11.4% each. According to this data the spread is mostly within Europe, with South Korea the exception.
Once infected, a user will typically see a CryptoWall ransom page like the one conveyed in the image below. It is a bit of an insult to be told “Congratulations, you have become a part of large community Cryptowall”. Users will then need to pay a ransom, as is commonly associated with these types of ransomware infections.
These cases highlight the need for a strong and reliable backup system, which will help mitigate the damage if your hard drive is encrypted. It is also essential to keep your browser, plugins and OS updates current. If you wish to add further defences, it may be worthwhile to disable or uninstall Flash Player, as well as running up-to-date anti-virus and malware scanners.
These types of infections will only become more advanced and more common in 2016, and vigilance is required from users to help avoid such attacks.
It is one thing to buy a wireless network card and set it up, but a lot of the time you can improve the performance of these cards with a few aftermarket items. Today I’m taking a little look at some of the upgrade offerings from SilverStone: the WAB1B magnetic WiFi antenna base, the WAD17 7 dBi high-gain antenna, and the WA219 9 dBi high-gain antenna.
An antenna base helps in two ways. First, you can move the antennas to a more open space, away from the crowded area behind your PC with all its add-in cards and cables. Second, you can move the antennas somewhere with better reception than behind a chassis that’s possibly located under your desk. SilverStone’s WAB1B antenna base is also magnetic, which lets you attach it to the side of things, such as your PC chassis if it is made of metal, increasing the placement options even further. To finalise the whole thing, SilverStone added gold-plated connectors at both ends to ensure the best possible signal transfer.
The antennas included with a network card might not be totally to your liking, for one reason or another. The most common reason to get aftermarket antennas is the higher performance they offer over mainstream, included antennas. Antennas with higher gain also come with an increased size, as you can see in the photo below: the smallest antenna is the default one included with SilverStone’s ECWA1 PCIe card, and the other two are the WAD17 and WA219 respectively.
The WAD17 is a dual-band antenna for use with both the 2.4GHz and 5GHz bands, while the WA219 is a single-band antenna for the 2.4GHz band only. Both have their usage scenarios, so which you pick comes down to what you need. The long WA219 has a gain of 9dBi while the WAD17 offers 7dBi, both quite a bit better than the default 5dBi (or lower) antennas.
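Those dBi figures are logarithmic, so a couple of dB is a bigger jump than it looks. Converting dBi to a linear power ratio (relative to an ideal isotropic antenna) is a one-liner:

```python
# Converting antenna gain in dBi to a linear power ratio,
# using the standard relation: ratio = 10^(dBi / 10).

def dbi_to_ratio(dbi: float) -> float:
    """Linear power gain relative to an isotropic radiator."""
    return 10 ** (dbi / 10)

for dbi in (5, 7, 9):
    print(f"{dbi} dBi -> {dbi_to_ratio(dbi):.1f}x")
# 5 dBi -> 3.2x
# 7 dBi -> 5.0x
# 9 dBi -> 7.9x
```

So the step from a stock 5 dBi antenna to the 9 dBi WA219 is roughly a 2.5x increase in directional power, at the cost of a narrower radiation pattern.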
China’s air quality is poor, to say the least; recent reports of families having to use air purifiers in their own homes to avoid breathing the air are quite shocking. The atmosphere in China has worsened over time to the point where it is difficult to recall the country’s skies not being full of pollution. Luckily, China’s Weather Network has produced a visual representation, and it’s certainly worth a look.
Below is a series of snaps which have been stitched together to convey the Beijing sky conditions for most of 2015. It is quite revealing when you consider how toxic the pollution is, particularly in the images labelled 2015-12-1 and 2015-12-25.
The image below appears to be a colour chart of images representing around 300 days of 2015. How do I know? Yes, I did count them: I came to around 290 days, which I rounded up to 300, so give or take it is close to a full year. The image also conveys the extremes that exist and how they can be a danger to anyone’s health.
Below is a quite fascinating image: the sky here looks pretty natural, and there is a good reason for this. During the celebrations for “the Chinese People’s Anti-Japanese War and the World Anti-Fascist War commemoration of the 70th anniversary of victory”, the government imposed air “quality protection measures”, including a shutdown of factories and a ban on cars and high-emission facilities. The result was a dramatic change that lasted from August 20th until August 24th, 2015.
These images are interesting because tech has been used to document one of today’s hot-topic climate issues.
Facebook is certainly a phenomenon which has travelled to huge swathes of the globe and, in turn, gained huge adoption. From celebrities being able to instantly update followers regarding their latest escapades to Mr and Mrs Bloggs posting various life events, including cat pictures, the tech giant has certainly been a force. But did you know Facebook has been around for 46 years? Well, a new bug has been unearthed, congratulating many users on reaching the milestone of “46 years of friendship on Facebook”.
As you would expect, this is slightly odd considering the social network has not even been around for 15 years; below is an image illustrating the notification which many confused users have received. So, what is going on? Facebook has not disclosed the exact cause of the bug but has released the following statement.
“We’ve identified this bug and the team’s fixing it now so everyone can ring in 2016 feeling young again.”
There is speculation that the bug originates from Unix time. Let me explain: Unix underlies many of the world’s servers and keeps time by counting up at one-second intervals from the date at which the clock began, also known as the epoch. That date conveniently happens to be January 1st, 1970 at midnight Greenwich Mean Time, or 46 years ago today going by Eastern Standard Time. Every second from that point on counts up the epoch time, which is why some gadgets may, in theory, fall back to December 31st, 1969.
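The epoch explanation is easy to verify: timestamp zero is midnight on 1 January 1970 UTC, which is still the evening of 31 December 1969 in US Eastern Standard Time (UTC-5), exactly the date buggy gadgets fall back to:

```python
from datetime import datetime, timezone, timedelta

# Unix timestamp 0 ("the epoch") viewed in two timezones.
utc = datetime.fromtimestamp(0, tz=timezone.utc)
est = datetime.fromtimestamp(0, tz=timezone(timedelta(hours=-5)))  # EST, UTC-5

print(utc.isoformat())  # 1970-01-01T00:00:00+00:00
print(est.isoformat())  # 1969-12-31T19:00:00-05:00
```

So a "46 years of friendship" message in early 2016 is consistent with some date field defaulting to zero and being interpreted as the epoch.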
Has Facebook suffered its very own Y2K or millennium bug? If so, it’s been a very small one, considering it was subsequently rectified at speed. On a side note, even users significantly under 46 years of age have seen this message; perhaps someone at Facebook started on the New Year drinks too soon.
Christmas is a time for giving and sharing. More often than not these days, games and consoles are shared, resulting in a very busy period where everybody is online creating new accounts and downloading their new game updates. Sony’s PlayStation Network (PSN) seems to be having some problems with the Christmas boom, resulting in slower than average delivery of the required emails.
The PlayStation Network appears to be having trouble sending the verification emails that go out when you first create a new account or a sub-account, meaning you won’t be able to create a new user without a major delay: these emails are taking far longer than usual to arrive.
As late as 5 PM GMT, AskPlayStation, the official Twitter handle for PlayStation support questions, tweeted that they were still looking into the issue regarding account validation and password resets.
We're still looking into the issue with new PSN account validation and password reset emails being delayed. Thanks again for your patience.
While users are still able to play single-player games and even download game updates, without a verified account you won’t be able to play multiplayer games or even visit the PlayStation Store to spend those gift cards you received and download new games. While this should be a temporary glitch, how long it will last is anyone’s guess.
This is bad news for Sony: after problems with PSN hacking in the past and now this, it’s no surprise that people are upset with Sony and the PlayStation brand, and you can see why Microsoft’s Xbox is a market leader in certain regions and countries across the world.
Have you been affected by this delay? Do you know someone who got a new Playstation device this Christmas or even a new game that they have been longing to play online? Let us know in the comments below.
Cyber security and the integrity of applications are essential for consumers to have confidence that their details will be kept safe and not intercepted by a third party. Well-known internet hardware company Juniper Networks has issued a warning concerning a discovery made within its firewall software, which could have allowed a third party to decrypt data sent through an encrypted VPN (Virtual Private Network).
During a recent internal code review, it was discovered that “unauthorised code” had somehow made its way into Juniper’s ScreenOS software; it’s interesting to note that many ISPs (Internet Service Providers) and large firms use the company’s routers and network switches. The vulnerability could have allowed a third party, or as the company puts it a “knowledgeable attacker” (who could be a 12-year-old for all we know), to gain administrative access to NetScreen devices and to decrypt VPN connections.
The unwanted slice of extra code has been present in various versions of ScreenOS since 2012. Juniper has confirmed that it is not aware of, nor has it received, any reports of the vulnerabilities being exploited, and it urges everyone running the affected devices to quickly apply the released patches, which strip the unauthorised code out of its firewall software, ASAP.
It’s a serious breach and questions will surely be asked concerning how the code managed to make its way into the software.
Virgin Media recently implemented a significant speed increase for customers on 50Mbps, 100Mbps and 152Mbps connections, taking them to 70Mbps, 150Mbps and 200Mbps. Initially, the speed boost appeared to be a free promotion to encourage more customers to join Virgin Media. However, Virgin Media has admitted there will be a price hike of up to £3.99 per month. Additionally, the line rental fee will increase by £1, a 5.4 percent change. The company explained on social media why the higher prices are necessary and said:
“The price changes are put in place to sustain and build on our network to give you even better quality”.
Virgin Media’s managing director, Gregor McNeil, told The Register:
“We are doing everything we can to keep prices as competitive as possible. Through the continuing investment in our network we are again upgrading our customers’ broadband speeds and providing unlimited downloads – meeting the growth in data consumption we see.”
Unhappy customers can cancel their Virgin Media subscription, as the increased prices are technically a breach of contract. I think the company has to be extremely careful, as consistently increasing package prices will result in mass cancellations. Faster broadband speeds are vital when you consider the huge data demands of modern gaming and 4K video streams. However, the packages need to remain affordable, or people will simply go for a cheaper, slower option.
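As a quick sanity check on the figures quoted above, here is the arithmetic behind the stated 5.4 percent line-rental rise (a sketch only; the pre-rise line rental is inferred from the article's own numbers, not an official Virgin Media figure):

```python
# Percentage change = increase / original price * 100, so the implied
# pre-rise line rental can be recovered from the two stated figures.
increase = 1.00        # the £1 per month line-rental rise
percent_change = 5.4   # the stated percentage change

implied_base = increase / (percent_change / 100)
print(f"Implied pre-rise line rental: £{implied_base:.2f}")  # ≈ £18.52
```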
Internet connection speeds have been something of a hot topic over the last decade or so, with consumers demanding ever-increasing speeds while internet service providers lag behind in certain regions of the world. Well, researchers developing super-fast new 5G mobile technologies now have what has been described as a “playground” which they can visit in South Korea.
It has been reported that service provider SK Telecom (a South Korean wireless telecommunications operator) unveiled its research and development “5G Playground” on Thursday, with partners including Ericsson, Nokia, Intel and Samsung Electronics. It was also announced that a collection of regional standards bodies will host a series of events with the aim of building a global consensus on the emergence of 5G.
5G is potentially an important breakthrough after SK Telecom and Nokia demonstrated the possible capabilities of the network, which ran at a super-quick 19.1Gbps (gigabits per second). SK plans to launch a 5G trial service in 2017.
As technology becomes more advanced, so does the need for a turbo-charged infrastructure. The new standard is expected to be completed by 2020, although rollout speed may vary from country to country. Until at least then, many people will have to make do with current speeds.
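To put the quoted speeds in perspective, a rough back-of-the-envelope calculation helps (a sketch only; the 50GB file size is an arbitrary example, and real-world throughput is always below the headline rate):

```python
def seconds_to_download(size_gb: float, rate_mbps: float) -> float:
    """Idealised transfer time: size in GB * 8000 megabits per GB / rate in Mbps."""
    return size_gb * 8000 / rate_mbps

# A hypothetical 50GB download at the demoed 19.1Gbps versus a 200Mbps cable line:
print(f"19.1Gbps 5G demo: {seconds_to_download(50, 19100):.0f} s")  # ~21 s
print(f"200Mbps cable:    {seconds_to_download(50, 200):.0f} s")    # 2000 s
```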
Just over a week ago, TalkTalk’s website fell victim to a cyber-attack, and revelations have emerged regarding the company’s poor security infrastructure. It seems these attacks are becoming more prevalent, as today Vodafone admitted a data breach involving 1,827 customers’ personal information, including their bank details and telephone numbers. A spokesperson from Vodafone confirmed the attack and insisted that it wasn’t due to a failure of the company’s security measures:
“This incident was driven by criminals using email addresses and passwords acquired from an unknown source external to Vodafone. Vodafone’s systems were not compromised or breached in any way.”
“Whilst our security protocols were fundamentally effective, we know that 1,827 customers have had their accounts accessed, potentially giving the criminals involved the customer’s name, their mobile telephone number, their bank sort code, the last four digits of their bank account,”
“Our investigation and mitigating actions have meant that only a handful of customers have been subject to any attempts to use this data for fraudulent activity on their Vodafone accounts.”
A number of sources on Twitter have suggested the attack came from The Dark Web:
Vodafone says seen attempts to access 1,827 customers accounts after data theft – but says data came from dark web, no breach of its systems
The spokesperson went on to discuss the data loss’ ramifications and said:
“However, this information does leave these 1,827 customers open to fraud and might also leave them open to phishing attempts,”
“These customers’ accounts have been blocked and affected customers are being contacted directly to assist them with changing their account details.”
I do find Vodafone’s excuse fairly laughable, and the company has to take responsibility for the data loss. Professionals aren’t going to hack a major network without some form of protection in place, and they will make the attack difficult to trace. The information gathered is more than enough to cause chaos with a person’s bank balance, and it can be used to help uncover other details such as an email address.
Facebook suffered a major outage, which at the time of writing is still ongoing. It’s unclear what the issue is at this time, but it’s immediately apparent that it’s a major problem.
The site went down just shortly after 5pm BST, with the desktop version of the site throwing up the error message seen in the image above. Often, the mobile site will still work if the main site is down, but that isn’t the case at the moment and as you can see from the image below, many people are simply getting a blank feed.
Of course, Facebook being down isn’t all bad news, as Twitter traffic is now booming, with people flocking to the service to complain about their other social media of choice.
BT’s Chief Executive, Gavin Patterson, has promised broadband speeds of between 300 and 500Mbps by 2020. Currently, BT is one of the major UK internet service providers and aims to provide super-fast broadband to over 10 million homes. The company also said it will offer a 1Gbps service to cope with severe network demands from heavy users. This could include 4K streaming, downloading huge games or backing up data to a home server.
2020 seems like an ambitious target for rural areas which struggle to access even relatively low speeds of 5Mbps. BT is hoping the combination of its G.fast technology and Fibre-to-the-Premises (FTTP) connectivity can help revolutionise its internet speeds. Patterson argued speed increases are integral to BT’s market strategy:
“BT would ‘never say no’ to providing faster broadband to communities, promising the company would instead explore innovative funding and technical solutions.”
Even if BT manages this feat, I’m not entirely convinced it will be able to beat Virgin Media’s network speeds, and a great deal depends on network traffic management. It’s unknown whether the latest BT network will begin to throttle speeds after a certain amount is downloaded, or during peak times. Throttling is becoming an increasingly well-known practice, and customers should be able to access the speeds they pay for at all times.
Thank you The Next Web for providing us with this information.
The Skype client is currently experiencing a whole range of connection issues and sporadically showing contacts as offline. Skype has released a statement regarding the network hiccups and clarified:
“We have detected an issue that is affecting Skype in a number of ways.
If you’re signed in to Skype, you will not be able to change your status and your contacts will all show as offline even if they are online. As a result, you won’t be able to start Skype calls to them.
A small number of messages to group chats are not being delivered, but in most cases you can still instant message your contacts.
If you aren’t signed in to Skype, you may be experiencing difficulty when attempting to sign in. Any changes to your Skype account such as your Credit balance or your profile details might take a little while to be displayed.
You may also have difficulty loading web pages on the Skype Community. For that reason, please check back here for future updates.
We’re doing everything we can to fix this issue and hope to have another update for you soon. Thank you for your patience as we work to get this incident resolved.”
I’ve personally tried to access the Skype service for around three hours to no avail, but I am still able to send messages despite appearing as offline. Therefore, it’s possible to communicate with others through text, though this could depend on a number of factors, including your location. I would also recommend using the web interface, which doesn’t seem to exhibit any network flaws at this time. Whatever the case, the Skype client is suffering quite badly and has angered a great deal of its users.
Planning on joining a remote team, or maybe building one yourself? Knowing what it takes to make remote teams work is an invaluable asset. Drawing on the Clarity blog, Zapier.com and Fast Company, here are some ways you and everyone else on your team can keep in touch throughout the workday:
Offering cloud-based video solutions, Blue Jeans has a presence in many international markets, such as the UK and Australia. The service is easily available and convenient to use. Because it’s cloud-based, you won’t have to worry about hardware upgrades or hiring IT guys. It will run on any smartphone, so you and your mates won’t have trouble figuring this one out.
It’s free, so if you and your team want to save on costs, this is the way to go. Do video calls, voice or text chat with everyone without putting a dent in your company’s pockets. The group call feature lets you have up to 9 people in one call. Better features come with the subscription, like call forwarding or calling phones directly. But if basic communication is what you had in mind, this one is gold.
Need an easy way to organize everything? Hand out tasks? Inform your team of a new project? This remains one of the best ones out there. It looks like an online version of a project board, making it easy for everyone on the team to see what their assignments are for the day. And yes, like many of the great collaboration tools out there, it’s free.
Connecting with this one is easy. All you have to do is sign up for a Gmail account and you basically get one. Like Skype, video calling allows for as many as 9 people in one call. However, if your team’s a bit bigger, you may want to look elsewhere. This one’s also free, which is a major plus point for small teams that just need an easy way to stay connected.
Share and manage files with ease. Keep records and collaborate on projects. Manage the daily task distribution. You can do all these with Huddle. The first 14 days are free. After that, you’ll need to sign up for a subscription.
Create projects, assign tasks, join threads and more. Staying in touch is easy with Basecamp’s features. Every person, project and file is accessible. However, only the first 45 days of the service are free; continuing after that will set you back at least $20.
This is one of the best file sharing tools out there. You can organize your docs, spreadsheets, PDFs, EPUBs and more, and send or receive files without any trouble. The best thing about it is that you and your team can read and edit files in real time. Need to send out a proposal? You won’t have to keep 99 drafts of a single document on your desktop. Google Docs keeps a revision history, so reviewing previous versions is easy. Best of all, it’s free.
Need a virtual office? This one works great. A Slack group chat room can accommodate the entire team. Make sure to explore the channels. However, too many of you in one channel could get too noisy. No worries though if that happens. Simply sectioning off rooms will solve the problem.
Create projects and send assignments to your team. The ticket history is on there as well so it’s easy to track where the projects are or if one fell through the cracks. Make sure everyone has a steady internet connection though. Also, too many people in one project could slow down Asana. The free version already works great but if you want to explore more options or want to add more of your team, going with the paid version is a great idea.
The thing about GoToMeeting is that it’s basic. If you want fashionable, hip and cool, you won’t find it here. But if you and your team of 10 need to do a video call, this one packs solid audio and video quality. If you want to video call 2-3 people, Skype works great. But if you’re 10, this option is better.
Working remote often means you work with people on different time zones. That means people work late at night or early in the morning. With f.lux, you won’t have to worry about constantly adjusting your screen—and wasting minutes every day just to get the right tint back for nighttime or early morning work settings. The app adjusts the tint so your eyes can work without the glare.
These are simply some of the apps and tools you and your team can use to work, stay connected and improve collaboration. You don’t need to use them all. Just make sure to explore your options to find which ones your team will go for.
The Tor network is commonly referred to as ‘The Dark Web’ and perceived as an encrypted space used to exchange illegal goods or engage in unscrupulous activities. While this is partially true, it only accounts for a specific portion of Tor users, and there are legitimate use cases. This viewpoint is shared by the Internet Assigned Numbers Authority (IANA), the Internet Corporation for Assigned Names and Numbers (ICANN) and the Internet Engineering Task Force (IETF).
These three major internet regulators have publicly advocated the use of Tor in certain circumstances and designated .onion as the domain for sites hosted on the Tor network. Additionally, .onion was described as a “special-use domain”, which enhances its legitimacy. Richard Barnes, Mozilla’s security head for Firefox, told Motherboard:
“This enables the Tor .onion ecosystem to benefit from the same level of security you can get in the rest of the web,”
“It adds a layer of security on top.”
This also means that sites can be verified through SSL and TLS security certificates to confirm who the real owner is. Using Tor is a contentious issue, as many users feel it’s a mysterious and unknown portion of the internet. Governments have overstepped the mark and intruded on people’s privacy in the last couple of years; therefore, Tor could bring about improved privacy and protect individuals’ data. However, there are concerns about the type of individuals using ‘The Dark Web’, including drug smugglers and other criminals.
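The certificate verification described above is the same machinery any TLS client uses. A minimal sketch in Python, using only the standard library (note: actually reaching a .onion address would additionally require routing the socket through a Tor SOCKS proxy, which is not shown here):

```python
import ssl

# A default context verifies the server's certificate chain against the
# system's trusted CAs and checks that the certificate matches the hostname,
# which is how a site's "real owner" claim is validated.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True: a valid certificate is mandatory
print(ctx.check_hostname)                    # True: the name must match the cert

# Usage (requires network access): wrap a TCP socket so the TLS handshake
# enforces both checks, then inspect the presented certificate.
# import socket
# with socket.create_connection(("example.com", 443)) as sock:
#     with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
#         print(tls.getpeercert()["subject"])
```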
Thank you Motherboard for providing us with this information.