Tesla Picks Up Tesla.com Domain

Often the simplest domain names are the best, and unfortunately for Elon Musk, Tesla Motors has had to settle for the teslamotors.com domain instead of the more highly prized tesla.com. The tesla.com domain had been held since 1992 by a Nikola Tesla fan, Stu Grossman, but it remained largely unused for 24 years. Now a Tesla spokeswoman has confirmed that the electric car company has taken over the domain, with tesla.com now redirecting to the existing teslamotors.com.
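
A redirect like this is visible at the HTTP level: the old domain answers with a 3xx status and a Location header pointing at the new one. Below is a minimal sketch using Python's standard library, assuming tesla.com issues an ordinary HTTP redirect (the exact status code and target are not confirmed here):

```python
import http.client

# Ask tesla.com for its front page without following redirects, so the
# redirect itself stays visible. Assumes a standard 301/302 + Location header.
conn = http.client.HTTPSConnection("tesla.com", timeout=10)
conn.request("HEAD", "/")
resp = conn.getresponse()
print(resp.status, resp.reason)           # e.g. 301 Moved Permanently
print("Location:", resp.getheader("Location"))
conn.close()
```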

It is unclear what prompted the acquisition after the company made do with teslamotors.com for so long. Elon Musk raised the domain issue during the launch of the company’s Tesla Energy wing, when he was asked whether he planned to rebrand the company as simply ‘Tesla’ to match its new scope. With the domain now secured, a rename could follow, giving Tesla a firmer grasp on the web for both its electric car business and its battery products, as well as whatever else Musk may have planned for the company’s future.

What caused Grossman to give up the tesla.com domain is currently unknown. According to Bloomberg, John Berryhill, a Pennsylvania attorney who represented Grossman in a past dispute with Tesla Industries Inc, stated that Grossman had bought the domain for personal use due to his affinity for the inventor Nikola Tesla. Berryhill said that “Grossman had been approached by many people about giving up the name.” He went on to surmise that Tesla Motors’ acquisition of the tesla.com domain name was part of a voluntary arrangement between the two parties.

Whether it came about through a voluntary arrangement or because Grossman let his hold on the domain lapse, tesla.com undoubtedly belongs to Tesla Motors now. Taking over the domain could kick-start a rebrand of the company and allow it to expand its web presence beyond just cars; whether that expansion will include Musk’s desired electric VTOL aircraft remains to be seen.

Internet To Leave US Control

While the internet is designed to be independent, free from the control of any single country or government, the reality falls a little short of that dream. As with any large system, someone has to maintain and support its complexity, and a network connecting the entire world is no different in this respect. Now, the internet could soon be leaving US control.

The Internet Corporation for Assigned Names and Numbers (ICANN) is a non-profit organisation that coordinates the internet’s domain name system and the allocation of IP addresses. With the ability to register custom domain names and with newer protocols like IPv6, the internet keeps expanding, with new services and systems taking up everything from bits to petabytes.
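
As a rough illustration of the name-and-address plumbing ICANN coordinates, the sketch below uses Python’s standard library to ask the system resolver for both IPv4 and IPv6 records of a domain (example.com is used here purely as a placeholder):

```python
import socket

# Look up IPv4 (A) and IPv6 (AAAA) addresses for a hostname via the system
# resolver; IPv6's far larger address space is what lets the internet keep
# growing beyond the limits of IPv4.
for family, label in ((socket.AF_INET, "IPv4"), (socket.AF_INET6, "IPv6")):
    try:
        infos = socket.getaddrinfo("example.com", 443, family)
        print(label, sorted({info[4][0] for info in infos}))
    except socket.gaierror:
        print(label, "no records found")
```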

The transition away from US control, if it goes ahead as planned, will take place on September 30th. While there will be no change to the fundamental workings of the internet, the oversight the US once held will give way to a more global arrangement, something countries like Russia and China have long requested be handed to an international body such as the UN.

Many who use the internet believe in a principle known as net neutrality: that all traffic on the internet, no matter its destination, content, or type, should be treated the same. This means that if you and your neighbour were both streaming, one watching football and the other League of Legends, neither connection would be favoured over the other; everyone and everything on the internet is treated equally. Many countries don’t follow this principle, with nationwide firewalls in place and companies looking for new ways to prioritise certain connections.

While sharing control across the world is a good thing, making sure that nobody uses that new control to enforce restrictions, censorship, or global monitoring is just as important. The freedom of one cannot come at the cost of another.

World Wide Web Inventor Says “No” to Internet.org

Tim Berners-Lee, the inventor of the World Wide Web, is vehemently opposed to Mark Zuckerberg’s Internet.org plan to bring a limited internet to poor countries, an initiative that has long been criticised for violating net neutrality and branded an internet “ghetto”.

“When it comes to compromising on net neutrality, I tend to say ‘just say no’,” Berners-Lee said, regarding Internet.org. “In the particular case of somebody who’s offering […] something which is branded internet, it’s not internet, then you just say no. No it isn’t free, no it isn’t in the public domain, there are other ways of reducing the price of internet connectivity and giving something […] [only] giving people data connectivity to part of the network deliberately, I think is a step backwards.”

After getting so much bad press, Zuckerberg, founder of Facebook, changed the name of Internet.org, which launched last year, to ‘Free Basics’, but the same problems remain. Users will only be given access to sites that Free Basics deems appropriate – likely those that sign up to financial agreements with the initiative – restricting free use of the internet, flagrantly flouting the rules of net neutrality. Free Basics is still operating in India, despite a walkout by a number of its publisher partners.

Thank you Times of India for providing us with this information.

Image courtesy of Wikimedia.

HTTP/2, The First HTTP Update Since 1999, Is Complete

HTTP, the fundamental internet protocol used to transmit formatted data across the web, is getting its first update in 16 years. The new standard, HTTP/2, was completed on Wednesday, according to Mark Nottingham, chair of the IETF HTTP Working Group. After a series of editorial stages, HTTP/2 will be published as the new standard for websites and browsers across the globe, becoming the first update to the protocol since HTTP/1.1 back in 1999.

HTTP/2 should speed up page loading times, strengthen connections, and help servers push data to your cache. But the most important change addresses a burden on developers since the inception of the web: the introduction of multiplexing. Previously, making many HTTP requests at once would slow things down, sometimes preventing pages from loading altogether, but HTTP/2 allows simultaneous requests over a single connection with no such slow-down.
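
To see multiplexing in practice, the sketch below uses the third-party httpx library (assumed installed via `pip install httpx[http2]`) to fire several requests concurrently over a single HTTP/2 connection; under HTTP/1.1 the same requests would have to queue or open extra connections:

```python
import asyncio
import httpx  # third-party; assumed installed with: pip install httpx[http2]

async def main() -> None:
    # One client = one connection pool; with http2=True the requests below are
    # multiplexed as separate streams over a single TCP connection.
    async with httpx.AsyncClient(http2=True) as client:
        urls = ["https://example.com/"] * 5  # placeholder URLs
        responses = await asyncio.gather(*(client.get(u) for u in urls))
        for r in responses:
            print(r.http_version, r.status_code, r.url)

asyncio.run(main())
```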

Source: Gizmodo

Shorter .uk Web Addresses To Replace Some Sites’ Domains Next Year

New .uk web addresses are set to be introduced next summer as an alternative to .co.uk and .org.uk domains. Nominet, the non-profit organisation in charge of the naming system, said bringing in the shorter suffix was the biggest change to the system in years.

The new domain can be used in addition to, or instead of, an existing address. Owners of existing .co.uk or .org.uk sites will have first refusal on the matching .uk address for up to five years. Brand new .uk web addresses, where there is no existing equivalent, will be given out on a first-come, first-served basis.

Where one person owns the .org.uk domain name and another owns the .co.uk, priority will go to the owner of the .co.uk site. Nominet had initially shelved the plans amid concerns that .uk domains would be confusing for some users, but is now going ahead with the scheme. The new domains will cost £3.50 per year for one-year registrations and £2.50 per year for multi-year registrations.
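
Purely as an illustration of the allocation rules described above (a sketch of the stated policy, not Nominet’s actual process), the order of precedence could be modelled like this:

```python
def uk_domain_goes_to(co_uk_owner=None, org_uk_owner=None, first_applicant=None):
    """Sketch of the described .uk allocation rules, in order of precedence:
    the existing .co.uk owner first, then the .org.uk owner, otherwise first
    come, first served for brand-new names."""
    if co_uk_owner:
        return co_uk_owner      # .co.uk holder takes priority
    if org_uk_owner:
        return org_uk_owner     # .org.uk holder is next in line
    return first_applicant      # no existing equivalent: first come, first served

# Example: a .co.uk and an .org.uk exist with different (hypothetical) owners.
print(uk_domain_goes_to(co_uk_owner="Acme Ltd", org_uk_owner="Acme Charity"))  # Acme Ltd
```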

Many countries already have a similar system, including France, which uses the .fr extension, and Germany, where websites are given the .de suffix. It was also announced last week that London is to get its own web suffix. The number of generic top-level domains, such as .com and .org, is set to expand massively from 22 to more than 1,400.

The Internet Corporation for Assigned Names and Numbers agreed the move last month, saying it wanted to “promote global innovation, competition and consumer choice”.

Thank you Sky for providing us with this information.

Server-Free Internet Could Become A Reality In The Future

Researchers at Cambridge University have developed a proof-of-concept for a new server-free Internet architecture. The prototype was developed as part of the €5.2 million PURSUIT project, which brings together representatives from European research institutes, universities and telecommunication companies. The revolutionary new Internet architecture is designed to meet the ever-growing traffic requirements of web services and the security concerns of Internet users.

As of today, online data is stored on servers residing at different locations around the globe. Data requests made by client devices like PCs, tablets or smartphones are fulfilled by the geographically closest server, making the information exchange quick but server dependent. This centralized approach opens the door to problems like server attacks and traffic overloading. Also, users have less control over how and when their data is accessed.

PURSUIT users wouldn’t face these security and privacy problems, as the architecture does away with the need for individual computers to connect to dedicated servers. Instead, it uses a peer-to-peer information-sharing technique that enables individual computers to copy and republish data on receipt.

This, if deployed, would replace the existing client-server based TCP/IP networking model and could radically change the way information is stored, searched and shared online. Users would be able to fetch the requested data (or smaller data fragments) from a wide range of computers around the world and online search engines would look for URIs (Uniform Resource Identifiers) rather than URLs (Uniform Resource Locators).
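
The URI-versus-URL distinction can be illustrated with a simple content-derived identifier; this is only a sketch of the general idea (naming data by what it is rather than where it lives) and not PURSUIT’s actual naming scheme:

```python
import hashlib

# A location-based URL says where to fetch the data from; a content-based
# URI names the data itself, so any peer holding a copy could serve it.
data = b"example document published into the network"
content_uri = "urn:sha256:" + hashlib.sha256(data).hexdigest()
location_url = "https://some-server.example.com/docs/42"  # hypothetical URL

print("URL (where it lives):", location_url)
print("URI (what it is):    ", content_uri)
```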

The project builds upon other successful projects like TRILOGY and PSIRP, and won the Future Internet Award 2013 at the FIA (Future Internet Assembly) in Dublin earlier this year.

Thank you TechSpot for providing us with this information.
Video courtesy of TechSpot