Many will remember playing RuneScape, a firm favourite from the days before large multiplayer online games flooded the video game market. From the village of Lumbridge to the battlegrounds of Castle Wars, players have levelled and traded entirely within their browser for the past fifteen years, but that is all set to change with the game’s relaunch, new game engine and all.
With a new visual engine and game client, RuneScape will no longer be played in your browser, instead sitting on your computer awaiting your adventures. The relaunch isn’t just a change of location either, bringing a wide range of technical improvements including support for DirectX 12 and Windows 10.
The graphics are on a whole different level to the pixels and blocks that once strained your eyes as you mined for copper and tin, with increased draw distances, water effects, and dynamic lighting and shadows now welcoming you into the world.
Jagex isn’t stopping at the new game client either, promising further enhancements to the game’s visuals, including volumetric lighting, improved animations, and higher-resolution textures.
I remember starting out on RuneScape many years ago, and the new graphics definitely bring the urge to boot it up again to the surface. If you’re interested, you can download the new game client here.
We’ve all seen those huge URLs, be it for a website or a document you have saved in the cloud; they just seem to go on and on with no sign of ever stopping. Then you spot the tiny URL on offer instead, short and sweet, with only a few letters and numbers to copy and paste before you can open your document anywhere you want. Why not use it? Well, for starters, that small URL may be creating just as easy a path to spy on your data!
Research conducted by Martin Georgiev and Vitaly Shmatikov looked at the abbreviated “short URLs” used by companies such as Google, Microsoft, and even bit.ly, a company dedicated to creating and sharing short URL addresses. It revealed that, using a simple trial-and-error method, an attacker could gain access to your cloud storage files.
In particular, Georgiev and Shmatikov were able to find and access files shared through Google Drive and Microsoft’s OneDrive with short URLs. If this wasn’t scary enough, someone could place malicious code in files that had write permissions enabled, infecting and spreading through one of your files stored in the cloud. The pair estimate that around 7 percent of the OneDrive and Google Drive accounts they scanned were vulnerable to this flaw, which is scary, to say the least.
More worrying may be the companies’ differing responses to being alerted to this result: Google doubled the character length of its short URLs, while Microsoft stated that the vulnerability “does not currently warrant an MSRC case”, while quietly removing the short link function on OneDrive so as not to expose others to the problem while they no doubt investigate.
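The trial-and-error attack comes down to simple arithmetic: a token of only five or six letters and digits has a keyspace small enough to enumerate by brute force, and lengthening the token makes the keyspace explode. A quick back-of-envelope sketch (the 1,000 requests-per-second scan rate is an illustrative assumption, not a figure from the research):

```python
# Back-of-envelope: how many short-URL tokens exist, and how long
# a naive scan at a fixed request rate would take to try them all.
ALPHABET = 62  # a-z, A-Z, 0-9 -- typical for URL shorteners

def keyspace(length: int) -> int:
    """Number of distinct tokens of the given length."""
    return ALPHABET ** length

def scan_years(length: int, requests_per_second: int = 1000) -> float:
    """Years needed to enumerate every token at a given request rate."""
    seconds = keyspace(length) / requests_per_second
    return seconds / (365 * 24 * 3600)

for n in (5, 6, 12):
    print(f"{n}-char tokens: {keyspace(n):.3e} possibilities, "
          f"~{scan_years(n):.2f} years to scan at 1k req/s")
```

A 5-character token can be fully scanned in a matter of days at that rate, while doubling the length to 12 characters pushes the scan time past the age of the universe, which is exactly why lengthening tokens was Google’s fix.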
Home security is an ever-growing trend as technology gets smaller and better, and at the same time it has become a lot more affordable for the everyman. Today I’m taking a look at one of the best options in this category, with D-Link’s DCS-2630L Full HD 180-Degree Wi-Fi camera in my testing area.
Where most home surveillance cams still come with a 720p resolution, the D-Link DCS-2630L goes one better and delivers the full 1080p HD experience. While 720p was already a great step up from the old 480p CCTV resolution, it just doesn’t beat full 1080p. The Wi-Fi camera doesn’t just feature a higher than usual resolution, it also has a wider field of view, allowing you to monitor an area up to 180 degrees wide. Basic cameras only offer a 90 or 120-degree view, so the DCS-2630L can monitor a lot more real estate and might save you from having to set up multiple cameras. The wide view is also a lot easier on the eye than a traditional fisheye view that distorts things.
Recording video during daylight is an easy thing, but you’ll also want to monitor what is going on when it is dark. After all, that is the time of day we usually associate with the need for protection. D-Link added night vision capabilities to the DCS-2630L with the help of infrared LEDs. The camera is able to see in the dark at a distance of up to 5 meters (16 ft) with as little as 0 Lux light. The IR LEDs can be turned on manually, but the camera also features a light sensor that determines when the LEDs are needed.
Next to the six infrared LEDs, it also features two PIR sensors that detect infrared radiation when a person or animal passes, for enhanced motion detection. And that is another of this camera’s features beyond simply showing and recording what is going on: with motion and sound detection, the camera can start recording automatically and push notifications to your Windows, Android, and iOS device to let you know that something is happening.
D-Link didn’t just build in a high-quality microphone that picks up loud noises, such as breaking glass, and sends you notification alerts about them; the camera also features two-way audio that lets you send audio back. A high-quality speaker lets you respond to what you see using your mobile device.
The camera pairs a 1/3-inch 3-megapixel Full HD sensor with a glass lens for the best possible results, delivering images in sharp and rich detail. All this, coupled with de-warping technology, provides you with clear, ultra-wide 180º surveillance. The camera comes on a wall-mountable metal stand with flexible tilt for the perfect angle, and it allows for 360-degree camera rotation to ensure that your video is perfectly level no matter where you mount it.
It isn’t an ordinary camera on the Wi-Fi front either, sporting a proper IEEE 802.11ac dual-band connection for reliable streaming. It also features a convenient WPS button for quick and simple connection to the rest of your network infrastructure; everyone can press a button. Besides the wireless connection, the DCS-2630L also features a microSD/SDXC card slot with support for up to 128GB memory cards, enough for up to a week of continuous clip recordings in 1080p.
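That “up to a week” figure is easy to sanity-check: dividing the card’s capacity by seven days of continuous recording gives the average bitrate the encoder has to stay under. A rough sketch (the real figure depends on D-Link’s actual encoding settings, which aren’t published here):

```python
# Rough check: what average video bitrate fits a week of
# continuous recording onto a 128 GB card?
CARD_BYTES = 128 * 10**9   # 128 GB, decimal units as marketed
SECONDS = 7 * 24 * 3600    # one week of continuous recording

bits_per_second = CARD_BYTES * 8 / SECONDS
print(f"Average bitrate budget: {bits_per_second / 1e6:.2f} Mbit/s")
```

That works out to roughly 1.7 Mbit/s on average, which is on the low side for 1080p, suggesting the one-week claim assumes fairly aggressive compression or motion-triggered clips rather than a truly constant high-bitrate stream.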
A status LED lets you know what your camera is doing, which is a simple but very useful thing. The camera itself is very power efficient and only requires a USB connection, portable battery, or power adapter to run, which also allows for very flexible placement.
2 PIR Sensors – Enhanced motion detection senses when a person or animal passes for accurate detection
microSDXC card slot – Record continuous, scheduled, or detection-triggered video clips directly to a microSDXC card up to 128 GB
Unique De-warping technology maximizes the video quality with less distortion to eliminate a fish-eye view
Built-in two-way audio
mydlink app support for iPhone, iPad, Android, and Windows Phone
Packaging and Content
The D-Link DCS-2630L is a consumer camera, and the packaging is designed to catch the eye just as the camera catches images. The front displays the camera and a usage scenario as well as the product highlights, such as the 1080p resolution and wide-angle lens.
On the rear, you will find more detailed information about the functions and features, all in full colour.
Each of the two sides is full of information too. On one side you find the features listed above the minimum requirements for usage.
On the other side, you’ll find a simple representation of the camera’s easy setup and usage as well as what’s inside the box.
Besides the Wi-Fi camera itself, you also find the Quick Install Card with default information and QR-code for easy setup as well as the quick install guide, a GPL code statement, and an assistance card if you should have trouble.
There’s also an AC/DC adapter included with a plug fitting the region where you bought it. Due to the clever usage of micro-USB, the camera is also easily powered otherwise.
In this day and age, keeping your customers up to date is as important as getting them on board in the first place. Reports started circulating yesterday that Yahoo users weren’t able to access their email accounts, and all they got for their troubles was a single tweet.
Originally reported in the thread titled “Yahoo Mail has been down for 14 hours, affecting thousands of users in Europe”, users went from calling it unacceptable to have their service shut down without any response, to a barrage of comments asking if the service was ever truly running these days or how many people were actually affected by the problem (including a rather large barrage joking about the use of Yahoo mail for business reasons).
Checking Yahoo Mail’s Twitter page (the quickest way to update people on issues these days, it would seem), we found nothing but advertisements spread out over days, with no communication regarding the reported outage. That was until we checked out their support page, Yahoo Care. Amongst a slew of advertisements for their fantasy baseball teams was a single tweet saying that some users were experiencing issues.
Some @YahooMail users are experiencing issues due to an undersea cable cut by a 3rd party. Fix is a few days out. We’ll keep you updated.
In this day and age, taking days to fix a problem with little to no support for your users seems like a quick way to lose people to other webmail solutions like Gmail and Outlook. We will try to keep you updated (as well as we can with the little information that seems to be available at this moment).
We’ve all seen the competitions you can enter online, ranging from entering a competition on a forum to having to create and upload a piece of work. A common type of online contest is where you upload pictures, but be warned, some people may own patents to the entire concept of online contests.
Ruth Taylor is a Pennsylvania-based photographer who often runs photo contests on her website, BytePhoto. Along comes Garfum.com, a video website owned by New Jersey’s Michael Garofalo, claiming that the competitions run on BytePhoto infringe US Patent No. 8,209,618. The patent covers the ability to create user accounts, upload content, organise that content, and have users vote on it, all rather vague terms in the digital age.
Initially requesting $50,000 in the lawsuit, Garofalo’s lawyers reduced this to $5,000 and later to $2,500. To defend herself, Taylor got in touch with the Electronic Frontier Foundation (EFF), a group that deals with digital rights, which took up the case pro bono. Filing a motion to dismiss, the EFF argued that the case should be thrown out under the Alice Corp. precedent, which holds that a patent must cover something more than an abstract idea, even if that idea is implemented in software.
Garfum dropped the case before it went to court, however, the EFF didn’t end it there, filing a motion to seek attorneys’ fees for the case. EFF lawyer Daniel Nazer stated, “the idea that you could patent an abstract idea, find innocent enthusiasts online and demand settlement money—and then slink away once challenged and before the court issues a ruling—goes against any sense of fair play”.
The total cost to cover the fees would come close to $30,000, with even more added because of the latest motion. That could become a reality sooner than expected, with US Chief District Judge Jerome Simandle stating in an opinion that, due to its “unreasonable” behaviour during the case, Garfum should end up paying the fees.
With the release of their new operating system, Windows 10, Microsoft has been keen to get users onto their new software. It isn’t just the operating system that’s new, though, with Edge replacing the demonised Internet Explorer. One thing that has kept users from accepting the new browser is its lack of extensions, something that is set to change this year thanks to a tool Microsoft is currently working on.
It’s been clear for a while that rather than open another market for extension developers to create their tools in, Microsoft would look to bring Chrome’s extensions to Edge. In a tweet from Jacob Rossi, an engineer working on Edge, the picture becomes a little clearer on how they want to do this.
Lots of questions on this: yes we're working on a porting tool to run Chrome extensions in Edge. Not yet finished and not all APIs supported
So it would appear that they are working on a tool that will enable you to port your favourite Chrome extensions over to the Edge browser, while a further response showed that they would still be creating a list of extensions directly for Edge.
@jacobrossi extn's in Store will at first be a carefully selected set covering top scenarios and API coverage, opening up to more in future
In this day and age, people enjoy customizing their experience with everything, and the same goes for browsing online. With the likes of Mozilla’s Firefox and Google’s Chrome offering countless extensions, Microsoft’s latest browser, Edge, seems to be lacking the feature. That looks set to change, with Microsoft due to test browser extensions for Edge soon.
The revelation comes from a change on their Edge extensions website, giving us an idea not just of what’s going to happen with extensions but also of what the first three will be.
First up will be a translation tool, followed by an extension for Reddit and finally an extension based on mouse gestures. If this doesn’t interest you the next part may, with the extensions compatible with Chrome as well.
Given their recent decision to end the porting of apps from Android to Windows phones, the ability to use the same extensions on Edge and Chrome could entice people to explore the browser a little more, even if it does come with Windows 10.
Using extensions may have to wait, though, with the feature set to arrive in a future Insider Preview build, meaning that those of you who want to keep running a “stable” operating system may have to stick to Firefox or Chrome for that personal web feeling.
When people hear “search engine”, a few names jump out at them, with many instantly going straight to Google. The popular search engine company has helped create everything from the Chrome web browser to self-driving cars, and one of its latest endeavours is to recommend articles directly in your web browser.
Currently still in testing, the new feature is not available for public use or even in beta, but it would see a list of articles recommended based on your most-visited sites. Recommended articles would appear on the new tab page of the Chrome browser, meaning that opening a new tab could bring you to a site you never even thought about visiting before.
Currently, the feature can only be uncovered by reading the tickets on Chromium code reviews, something which VentureBeat has done in amazing detail. According to the discovered tickets, the feature (currently known as “ChromeReader” or “Morning Reads”) uses a hard-coded set of search parameters, meaning that everyone would see the same results no matter what they visit. This will obviously change before release, and will need to for the “snippets” to become something most people would use.
Snippets would include everything from a few words in the header to a brief description, with proposed refinements including changing how often snippets and information are fetched based on how much power your device has or what time of day it is.
There is no knowing what you might find on the internet, sometimes a quick ten minutes at your computer can turn into 20 minutes of YouTube videos of cats playing music before you realise what you were originally going to do.
Do you think that a new feature like this would help you? Would it just be a gimmick to give Chrome another feature on an already impressive arsenal?
Often the simplest domain names are the best, and unfortunately for Elon Musk, Tesla Motors has had to settle for the teslamotors.com domain instead of the more highly prized tesla.com. The tesla.com domain has been held since 1992 by a Nikola Tesla fan, Stu Grossman; however, it has remained largely unused for the last 24 years. Now a Tesla spokeswoman has confirmed that the electric car company has taken over the domain, with tesla.com now redirecting to the existing teslamotors.com.
It is unclear what prompted the acquisition of the domain after settling for teslamotors.com for so long. The domain issue was raised with Elon Musk during the launch of Tesla Motors’ Tesla Energy wing, when he was asked if he had plans to rebrand the company to simply ‘Tesla’ to match its new scope. With the domain secured, a rename could be on the cards, giving Tesla a better grasp on the web for both its electric car business and battery units, as well as whatever Musk may have planned for the company’s future.
What caused Grossman to give up the Tesla domain is currently unknown. According to Bloomberg, John Berryhill, an attorney in Pennsylvania who represented Grossman in a past dispute with Tesla Industries Inc stated that Grossman had bought the domain for personal use due to his affinity for the inventor Tesla. Berryhill said that “Grossman had been approached by many people about giving up the name.” He went on to surmise that Tesla Motors’ acquisition of the tesla.com domain name was part of a voluntary arrangement between the two parties.
Whether it was due to a voluntary arrangement or a lapse in occupancy of the domain by Grossman, the tesla.com domain undoubtedly belongs to Tesla Motors now. This occupation of the tesla.com domain could kick-start a rebrand of the Tesla Motors company and allow them to expand their web presence beyond just cars, whether that expansion will include Musk’s desired electric VTOL remains to be seen.
Twitter didn’t have the best stability record when it first launched, and there were so many downtimes that they even earned a nickname: the fail whale. Recently Twitter has raised its service standard and outages have become rare, especially a complete one like that experienced this morning.
The web service, mobile services, and the APIs for Twitter went down and became inaccessible at 8:20 am GMT, with users getting error messages warning that the network was over capacity or suffering an internal error. Roughly 40 minutes later, Twitter confirmed the outage, but via a tweet on the @support channel that no one could see, because the service was down. They later emailed the same statement to several news sites to let people know that they were aware of the issue and working hard to get it resolved.
“Some users are currently experiencing problems accessing Twitter. We are aware of the issue and are working towards a resolution.”
Twitter’s own status board also began updating at 9 am this morning, confirming the outage. Four out of five public APIs went down at the same time, but the company hasn’t revealed whether it was a hardware issue, a failed software update, or some form of attack. We’ve seen quite a lot of high-profile sites being hit with severe DDoS attacks lately, including the BBC.
At the time of writing, the servers are responding again, but the service isn’t available yet. Visiting the site will show the well known “Something is technically wrong. Thanks for noticing, we’re going to fix it up and have things back to normal soon.”
Are you missing Twitter or wouldn’t you have noticed it at all if we or someone else hadn’t told you? Let us know in the comments section below.
Recently, Dell has received a lot of attention regarding its security, or to be more precise, a digital certificate. Certificates are small pieces of data used to encrypt the traffic between your system and any website or online service you use; remember that little padlock in your browser’s URL bar? That means a certificate has been used to verify that this is a legitimate website and not a fake one.
The problems started when Dell shipped systems with a certificate, private encryption key included. This is like giving somebody the mould to create their own keys, letting them conduct man-in-the-middle attacks, where an attacker acts as a midway point for communication and, armed with the encryption details, can easily read the information being sent.
When Duo Security, a digital security company, continued to search, they found at least 24 IP addresses with certificates that had a different digital fingerprint but the same name, eDellRoot. Different lock, same name.
The problem is that some of those systems appear to be SCADA (Supervisory Control and Data Acquisition) systems, which are pretty important given they are often used in the energy and manufacturing industries. While these systems are normally closed off from the internet (no access = minimal risk), they could have been misconfigured and still pose a potential risk.
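“Different lock, same name” is exactly why certificates are identified by their fingerprint, a hash of the certificate’s encoded bytes, rather than by the name they carry. A minimal illustration of the idea, using two made-up byte strings as stand-ins for real DER-encoded certificates that both claim the eDellRoot name:

```python
import hashlib

def fingerprint(der_bytes: bytes) -> str:
    """SHA-256 fingerprint of a certificate's DER encoding,
    in the conventional colon-separated display form."""
    digest = hashlib.sha256(der_bytes).hexdigest()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2)).upper()

# Stand-ins for two different certificates that both carry the
# subject name "eDellRoot" -- the names match, the key material doesn't.
cert_a = b"CN=eDellRoot" + b"\x01" * 32
cert_b = b"CN=eDellRoot" + b"\x02" * 32

print(fingerprint(cert_a))
print(fingerprint(cert_b))
assert fingerprint(cert_a) != fingerprint(cert_b)  # same name, different locks
```

Any change to the certificate’s contents changes the fingerprint completely, which is how Duo Security could tell that the 24 certificates it found were distinct despite all being named eDellRoot.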
You can try the sites yourself, but popular up-time checker Down for Everyone is showing similar reports. We’ve even seen Chillblast reporting on Twitter that their services have been down, but seem to be back up at the moment.
What’s weird about this attack is that it specifically targets PC system integrators and retail websites in the UK, so it seems someone has picked this market as flavour of the week. Scan have reported that they received an email asking for Bitcoins if they want their service restored, but it seems that none of the sites hit so far has been stupid enough to pay, as that would open the door for further attacks without guaranteeing their recovery in the first place.
Could this be the same? It certainly seems so and we’ll be keeping an eye on the situation as it develops.
Have you noticed any major UK tech websites down today? Let us know in the comments section below.
Typical Microsoft; the tech giant has more backdoors than Disneyland and Disney World put together. The latest vulnerability unearthed by researchers is a pretty serious breach and allows an attacker to steal e-mail authentication credentials from major organizations.
So what is it this time? The Microsoft Outlook Web Application, or OWA, in question is an Internet-facing webmail server deployed within private companies and organisations to provide internal emailing capabilities. Research and subsequent analysis undertaken by security firm Cybereason has uncovered a backdoor of sorts in the form of a suspicious DLL file. This file was found to be loaded onto the company’s OWA server with the aim of siphoning decrypted HTTPS requests.
The clever part of this attack is the innocuous nature of deployment in the form of the file name that was the same as another legitimate file; the only difference was the attack file was unsigned and loaded from another directory. According to Cybereason, the attacker (whoever it might be, mentioning no names) replaced the OWAAUTH.dll file that is used by OWA as part of the authentication mechanism with one that contained a dangerous backdoor.
This allowed attackers to harvest login information in plain decrypted text. Even more worrying is the discovery of more than “11,000 username and password combinations in a log.txt file in the server’s C:\ partition. The log.txt file is believed to have been used by attackers to store all logged data”.
The attackers ensured the backdoor could not be removed by creating a filter in IIS (Microsoft’s web server) that loaded the malicious OWAAUTH.dll file every time the server was restarted.
Indeed, same old, same old. Breaches of passwords are worryingly common in the digital age, and there needs to be a radical rethink of security infrastructure. I feel companies are using tech as a cheaper alternative without investing in system protection or even real-time analysis; servers and communication lines are being ignored to the point where attackers have free rein over such systems. I wonder, as I write this, what else is being siphoned away to individuals and attackers; if I next see the formula for Coke in a Chinese own-brand cola, it will make sense.
Thank you, Cybereason, for providing us with this information.
Gianni Gnesa is a security consultant with Ptrace Security, a company based in Switzerland. He was set to speak at the Hack in the Box GSEC Conference hosted in Singapore, but has since decided to cancel his presentation.
The presentation was set to reveal breaches in security systems that utilize internet-connected video cameras (IP cameras). Gnesa cancelled his presentation after “legal pressure from manufacturers affected”. In the talk, Gnesa was set to “expose vulnerabilities found on major surveillance cameras and show how an attacker could use them to stay undetected”.
Traditionally, security consultants work on a “responsible disclosure” policy, in which they only release data about defects or security issues once the manufacturers or developers have had time to develop and release patches to fix them.
I think we can all agree that Adobe Flash Player has very much been knocked to its knees in recent months, from endless, and I do mean endless, vulnerabilities that put countless users at risk, to the annoyance of running a plug-in that enjoys crashing and breaking functionality on a regular basis. Well, now the BBC has also seen the light and is implementing the HTML5 web standard within its BBC iPlayer service.
The move is seen as progress and an update which modernizes the service and security aspect of the site. The BBC state that it is “now confident [it could] achieve the playback quality you’d expect from the BBC without using a third-party plug-in such as Flash player”. Users have also been invited to visit a BBC site where they can set a cookie in their browsers that will allow them to access the HTML5 player when they visit iPlayer in future. However, the Flash version will remain available.
Security analysts have responded positively to the news but have also noted that Adobe Flash still has a role; this has been echoed by security expert Chris Green, who says, “The industry has moved on from trying to shoehorn one thing in, whether that is Flash or Microsoft’s Silverlight. It continues to be very effective in delivering rich content into web pages.”
The BBC is testing the new, improved player on a range of browsers, including Firefox 41, Safari on iOS 5 and above, Opera 32, Internet Explorer 11 (good luck with that piece of, let’s say, junk, as this is a family site), Microsoft Edge on Windows 10 (to be fair, I have not yet tried Edge, but anything with the words browser and Microsoft in the title concerns me), and BlackBerry OS 10.3.1. The BBC added that it was also going to “move away from the BBC Media Player app on Android devices”, with users invited to join a limited beta test.
HTML5 is considered the standard in content delivery, and the BBC is implementing it with the aim of modernizing the service. It will be interesting to see how it works and how rapid the decline of Flash will be in the coming months and years. It is worth noting that Flash is still used by Amazon and Hulu among others, which is positive for Adobe, but frustrating for consumers, who have to put up with a range of exploits that make services insecure.
Thank you, BBC, for providing us with this information.
Mozilla has officially unveiled the latest version of Firefox, which incorporates an intriguing messaging service called “Hello Beta”. According to Mozilla, this is the world’s first communication tool embedded into a browser that allows users to send and receive messages during a video call. The company said about this latest venture:
“Firefox Hello Beta, developed with our partner Telefónica, is the global communications system built directly into a browser and it will now let you send and receive instant messages when you’re in a video call in Firefox for Windows, Mac and Linux.”
While this might be true, I’m fairly certain other browsers implemented something similar a long time ago. Also, many users might feel that this could make Firefox take up more system resources and become rather bloated. It’s certainly an interesting addition, but I highly doubt many people are going to use it for an extended period. There has been some confusion regarding this announcement, so to clarify: this only works during video calls.
Ideally, I’d love to see Firefox adopt a simple layout without unnecessary features and optimize RAM usage to make for the most efficient browser out there.
Bit of a Tubthumping play on words in the title there; readers who remember the 90s will get it, surely. Oh come on, I am not that old! Anyway, many web services are intrinsically integrated with each other to bring benefits to consumers, all well and good then, right? Yes and no, as having many services rely on a single destination, whether a server or a piece of software, brings its own challenges, including a domino effect where any technical glitch inevitably affects other connected services.
Yesterday (20th September 2015) there was a problem with a server in Virginia which affected most of the north-east of the US. This glitch in turn killed the infrastructure for many popular products and services, including Netflix, SocialFlow, GroupMe, and Amazon Echo among others. The error was described as “elevated API error rates” but was resolved, with normal operation restored within the same day.
Any outage in these services could lead to a painful financial loss for the US giant; let’s take a 2013 technical outage as an example. AWS suffered a similar problem which took services including Instagram, Airbnb, and Vine offline, and it was reported that Amazon accumulated a loss of about $1,100 per second in average net sales.
To keep track of any potential errors, there is a handy website by the name of the “Amazon Web Services health dashboard”, which publishes up-to-the-minute information on the health of services across four tabs: North America, South America, Europe, and Asia-Pacific. It’s pretty interesting, and I have said before that I do in fact have a life, just one intertwined with stats and tech, that is all.
“Amazon reminds us of the good times We sing songs that remind us of the best times” ha!
The Massachusetts Institute of Technology (known as MIT) is renowned throughout the world for its technological prowess and skills. Producing proud graduates, it is known for being at the forefront of the information technology that we as a world use on an everyday basis. Once again it has come first; this time, however, it is not good news.
Security Scorecard, an information security assessment company, conducted an assessment of several high-value universities and nearly gave MIT a failing grade. MIT scored low in several areas, including hacker chatter (the number of times the school was mentioned in online forums used by hackers, and the amount of user details revealed on those forums), patching cadence (how quickly patches were applied to deal with the vulnerabilities reported during the scan period), and IP reputation (the amount of malware communications coming from IPs registered to the school).
MIT did score highly in several areas, though, such as its web application security, the health of its DNS records, and the quality of security at its endpoints. As with all things, security is not something that can be fixed once and left alone; it should always be reviewed and updated.
Over the past few years, online streaming services have boomed and have recently started to take over from mainstream television channels, with companies like Netflix and Amazon Prime creating and buying up series to entice users to their services. The BBC is not one to shy away from online services: iPlayer offers up most of its radio and TV shows for users to watch, and even download through its app to play later, should they be on a long journey with no internet. With television licences being discussed, it’s not surprising that more information about the BBC’s digital future is coming to light, with more services aimed at a broader audience.
One of the steps the BBC wants to take is to open up its iPlayer streaming platform to other content creators, meaning you could soon be watching shows from other channels through iPlayer or variations of the service. Other internet-led plans include a video-led mobile news channel, titled BBC Newstream, and iPlay, a variation of iPlayer designed for use by children.
Amongst the plans is also a new digital music discovery service, possibly expanding on the already large catalogue of music used on the BBC’s website and radio shows on a daily basis.
Do you use iPlayer, or do you avoid it? If so, why? And could these features entice you to come back and use the BBC’s services on a more regular basis?
If I were to ask you to quantify the world’s economic activity and the regions experiencing the biggest growth, you might say, “I’m not sure, and what are you doing in my kitchen?!” But there might be an easier way, thanks to Harvard University’s Owen Cornec, who has devised a web-based map that allows the user to explore Earth’s economic relationships through 3D “confetti.”
Below are two screenshots of the map in action. When you first land on the interactive map page, it asks you to either take a tour of the website or, if you wish to skip this, jump straight to the start to “visualize over 15 trillion dollars worth of trade”. If you click the “Visualize” button, you will see the following screenshot of lots and lots of multi-coloured dots being dropped onto the globe; each dot represents $100 million worth of exports. You can also view the globe by clicking on the virtual Earth, or “Select a country” to view trade information for a specific nation.
The screenshot below is the “map view”; clicking this link shows a flat map, which might make it easier to see more data in one place. Either option offers a “Full Screen” feature. At the bottom of the page there is a colour-coded key illustrating trade within sectors, including metals and transportation, along with other commodities.
This map adds a new dimension to the world’s trade while being immersive and thought-provoking. The colour scheme looks like a giant firework party, but it’s certainly worth taking a minute to explore the range of features and options available within this educational yet fun website.
Thank you Harvard for providing us with this information.
For a while now we have been introduced to many games called “exclusives”, games normally built especially for a single console or family of consoles. Their rarity often means that people will flock to those consoles to play these games, and the games themselves are often made in part by the company that owns the console. Examples include the likes of Killzone for Sony’s PlayStation (even if Sony did face a legal battle over its advertising) or the Gears of War series for Microsoft’s Xbox (a new remastered version is available for PC and Xbox). One of the more popular series is Halo; originally built for the PC and Xbox, it quickly became an Xbox exclusive, with the occasional game for mobiles and tablets. Now, though, you can enjoy all of Halo’s main menus (music included) in your web browser.
Halome.nu hopes to scratch that itch you get when you leave your Xbox and TV on just so that you can listen to the soothing music that once welcomed you to the field of battle with the Covenant. With the menus and music from not only Halo 1, 2 and 3 but also ODST (Orbital Drop Shock Trooper), Reach and Halo 4, it’s possible to listen to the music for hours on end, with a surprisingly calming and relaxing tone.
With series like Halo: Forward Unto Dawn and Nightfall expanding the Halo universe from books to movies and TV series, soon you might hear just a little bit more about Master Chief.
Ofcom has conducted a survey into the browsing habits of UK citizens and discovered the majority of users prefer mobile devices for internet browsing. Previously, laptops held the top spot with 40% of the vote while smartphones accounted for only 22%. The 2015 results signify a major shift, as 33% of people said their smartphone was their primary device for getting online, compared with 30% who chose a laptop. In Q1 2015, 66% of households contained a smartphone, overtaking laptop ownership, which stands at 65%.
Interestingly, the report says smartphone users spend an average of 144 minutes browsing the internet, which is more than twice the figure for a laptop or PC at 69 minutes. The research suggests smartphones are more of a digital companion than a communication tool. For example, 42% of those surveyed watch video clips, 21% stream TV shows, 45% engage in online shopping and 44% have set up online banking.
The largest increase stems from tablets, and it’s surprising to read that 54% of households now own a tablet device, although the growth of ultra-cheap Android tablets means this shouldn’t be a shocking revelation.
Rather alarmingly, almost half (48%) of smartphone users admit to being “hooked” on their mobile phone and rate their addiction 7 or above out of 10. The data emphasizes how revolutionary connected smart devices have become: consumers cannot exist without checking their social media, browsing for the latest deals and watching a variety of their favourite programmes. This doesn’t mean the laptop is dead, as productivity on a proper keyboard is significantly higher, but for casual usage the smartphone looks set to dominate.
We see all kinds of cool gadgets pass through the eTeknix office, but the Netatmo Welcome is something I’ve been really looking forward to testing. We first saw the product demoed at CES 2015, but in a busy trade show environment it can be tricky to get a really good look at a product. The Welcome is a relatively simple device: it acts as a home security camera, but with a twist. Most of you will be familiar with the kind of security cameras that you fit to the wall as full-on surveillance equipment; the Welcome is a much more casual device than that.
Pop the Welcome in your house and it detects when someone comes home, sends a message to your smartphone via a custom app and tells you who is home, be that your kids, your partner, or an unknown face.
You can tune into live video feeds or view recorded footage to make sure your kids remembered to wipe their shoes when they got home, or whatever you need to double-check.
Sounds pretty cool right? What about when you’ve got someone unwelcome in your house?
The packaging is pretty straightforward, showing the Welcome camera as well as the accompanying mobile app, which is available for Android and iOS.
On the side of the box, you can see some of the major features of the Welcome. There’s facial recognition, notifications, privacy customisation, night vision and more. An important aspect, though, is that there are no monthly fees whatsoever, so you can use the app and its features free of charge!
The Welcome camera itself is very nicely designed: a slim vertical cylinder with a brushed aluminium finish that oozes high quality.
There’s a wide field-of-view camera towards the top. It features an HD sensor with high sensitivity for low-light conditions, and it’ll make easy work of seeing what is going on in your house.
Towards the bottom is a small LED indicator, as well as a small Netatmo logo.
There’s very little around the back: just a microSD slot, a micro-USB port for mains power and an RJ45 port for those who want to use the device wired, although wireless connectivity is built in.
There’s an 8GB ADATA microSD card pre-installed in the back of the Welcome, but you can always upgrade this if you feel the need for more storage.
The top and bottom of the Welcome are finished with glossy white plastic. The whole unit has a good amount of weight to it, it feels robust overall and the minimalist design means that you’ll find it incredibly easy to operate.
Chrome, Internet Explorer, Opera, Firefox and Safari: these are the five big names when it comes to web browsing, and each comes with its own strengths and weaknesses. Engineers at Google, Microsoft, Apple and Mozilla, however, are laying down their arms and working together to create WebAssembly, a new low-level code format that looks to speed up web browsing by up to 20 times.
The concept behind WebAssembly is that it will be closer to machine-level code (a series of numeric instructions) than to higher-level languages (such as Java, C#, Python, etc.). With a lower-level format, the aim is that both desktop and mobile browsers will be able to decode it far quicker than your average web page’s script.
Being able to browse the internet at 20 times its current speed would greatly reduce how much time people spend waiting in an average day, and with companies like Mozilla, Apple, Google and Microsoft taking the helm, you know they are serious about getting this technology developed. Personally………PLEASE WAIT WHILE LOADING
Thank you Ars Technica for providing us with this information.
It looks like Facebook has brought its Messenger counterpart to web browsers. Though users can still chat with their Facebook friends inside Facebook.com as they did up until now, Messenger.com looks and works like the mobile version of Facebook’s Messenger, but for the web.
A Facebook representative told Re/code that Messenger for the web is dedicated to messaging, so it doesn’t display all the News Feed distractions you would have on the Facebook.com page. The company also has no plans to remove the messaging feature from Facebook.com, at least for now.
The chat feature was previously removed from the mobile Facebook app and wired to work alongside it via the Messenger app in order to add more features, according to Facebook. Since then, a peer-to-peer payments feature and a new developer platform have been added to the messaging app.
It turns out that Messenger.com will include support for all the features available in the mobile version, as well as multiple languages. However, the web variant of Messenger currently supports only English, with more languages to come in the near future.
Thank you Re/Code for providing us with this information
Though the spirit of Christmas has passed, Google brings it back with an open-source announcement of its Santa Tracker app code. For those unaware, the Santa Tracker is a fictional online tracker that follows Santa’s journey around the world in real-time during the Christmas festivities.
The code is now available on GitHub for anyone who wants to implement it on whatever they desire. Developers can download the web version from here, while the Android version can be downloaded from here.
The general idea of tracking Father Christmas may seem a bit pointless unless you are making an app for kids, but the code does contain some pretty nice libraries and samples that can be adapted for use in various other applications.
Thank you The Register for providing us with this information
Mozilla users will soon see a new message in Google’s search engine urging them to switch their default search engine to Google. Users can also press the “No, Thanks” button to dismiss the message, hiding it until the browser’s cache is cleared.
This comes as a result of Mozilla changing its default search engine to Yahoo! in November 2014. Default search contracts are the main route to monetizing third-party browsers: search providers like Google and Yahoo pay browser makers tens or even hundreds of millions of dollars for this privileged placement, as it is a major driver of search traffic from modern browsers.
Firefox had used Google as its default search engine since 2004, but the recent change ended the long partnership with the top web search engine. Google’s new pop-ups make it clear that the company isn’t happy with the shift, and that it considers Firefox search traffic a prime target. Firefox is the third most-used personal computer browser, after Google’s Chrome and Microsoft’s Internet Explorer.
Thank you Daily Tech for providing us with this information