IBM’s Watson Now Has A Cook Book

Who doesn’t like their food? From the simple sandwich to a Sunday roast, there are plenty of meals you can make to enjoy in anything from five minutes to five hours. When it comes to trying something new, most people, including professional chefs, prefer to stick with combinations and flavours they know and like. IBM decided it didn’t quite like this and tasked Watson, its cognitive computing system, with creating a culinary cook book, and it delivered.

Available now, the cookbook is the product of Watson’s and the Institute of Culinary Education’s combined efforts, delivering over 65 different recipes through a mix of classical chef talent and cognitive science. Watson generated each recipe’s list of ingredients, presenting scientifically matched flavour combinations, while a chef combined those ingredients to create recipes that even a computer could love.

Starting with a Baltic Apple Pie, Kris Naudus of Engadget found out the hard way that some of the recipes are a little trickier than the originals they were based upon. The first thing that surprised Naudus was the inclusion of pork in the apple pie, and the two sauces and garnish included in the recipe only add to the restaurant feel the book looks to create.

Other recipes include Indian Turmeric Paella, which “brings simple Indian flavors to a classic paella”, and Turkish Bruschetta, a simple meal now laced with spices and even Japanese eggplants (also known as aubergines).

You can find the cook book on Amazon for £26.88, and so far the reviews seem positive, for the most part.

IBM Acquire Resilient Systems and Gain Security Expert Bruce Schneier

IBM has announced today that it will be acquiring Resilient Systems and, along with the company, will be bringing on board one of the biggest names in the security world, Bruce Schneier.

Resilient Systems specialize in developing an incident response platform that orchestrates and automates incident response processes in the case of cyber incidents, including security breaches and the loss of devices carrying vital data. Integrating the talents and platform of Resilient Systems into IBM Security gives IBM what it says is the industry’s first fully end-to-end system combining analytics, forensics, vulnerability management and incident response.

Part of the deal for the acquisition includes plans by IBM to bring on board Resilient’s full staff of around 100 people, including Bruce Schneier, cryptography and security expert and CTO of Resilient. Exactly when the deal would be closed was not revealed by IBM, nor were any further details of the terms between the two companies.

This is just the latest step by IBM to bolster their abilities in the field of security, already hiring over 1000 security experts in the last year and appointing Mark van Zadelhoff as the manager of the security division. Monday also saw the launch of IBM X-Force Incident Response Services which aims to work with clients to assist them in planning for, managing and responding to cyber attacks. The Resilient Incident Response Platform, as well as IBM’s QRadar Security Intelligence Platform, will both be a key part of these services, with the technologies planned to be integrated across IBM’s full security portfolio.

In the modern corporate world, where it is quickly becoming a case of how to respond to and handle cyber-attacks instead of just defending against them, the acquisition of Resilient helps IBM to provide an even greater security service to their customers.

IBM To Create MMORPG Based On Sword Art Online

There are many kinds of people in the world: some will finish the day with a drink, some with a movie, some even with a game, but for some there is nothing like losing yourself in a good anime. One of the biggest anime series of 2015 was created from a light novel written in Japan, titled Sword Art Online. The story follows players who become trapped in a video game, their very lives at risk from the monsters and player-killers that gamers normally shrug off with each respawn. Now it would seem that the series has attracted the attention of IBM, who want to create an online game based on it.

MMORPGs (massively multiplayer online role-playing games) were made famous by the likes of World of Warcraft and have featured in a range of manga and anime in recent years. The name currently being thrown about by IBM’s Japanese branch is Sword Art Online: The Beginning, which will be (much like in the series) a virtual reality game.

While many will be thinking that this is clearly a gimmick, a game based on a game from a manga, the project has already touted at least one unique feature. The twist currently being advertised is that players will be scanned to create a 3D model that becomes their avatar in-game. Talk about putting yourself in the action!

Alongside a VR headset, the game looks to include motion capture technology to drag players into the battles. The promotional video for the game so far shows more design elements from the game and clips from the anime but you can see where they are going with the video.

If that wasn’t enough for you, the game is starting a closed beta in Japan for 208 lucky gamers next month and will be powered by none other than Watson, IBM’s natural language and machine learning platform.

IBM Intend to Solve Computational Scaling Using “5D Blood”

One of the biggest obstacles currently facing the advancement of computing and electronic engineering is density. While current computer chips are already incredibly small, the hardware needed to power them and to dissipate the heat they generate safely, avoiding heat damage, is not. IBM hopes to solve both of those issues in one sweep, using a fluid termed “5D Blood”.

Over the years, while chips have got smaller and smaller, they have become less and less able to handle heat. A smaller surface area means less contact can be made with heatsinks, and chip heat generation is not uniform, with more heavily used sections of the chip generating hot spots. Reducing the chip’s size also puts the hot spots closer together, making it harder to draw heat away from them. Processing chips are also power-hungry, with most pins on a modern CPU used only to provide the chip with a stable supply of power. These two issues combined limit how densely chips can be packed, and all but forbid the safe stacking of chips for most usage scenarios.

This is where the 5D Blood comes in. The dimensions are not those usually thought of, instead being part of a 5-dimensional computing model. This model involves stacking 2D chips into a 3D pile, with the extra two dimensions being power and cooling. 5D Blood contributes those two dimensions, being an electro-chemical liquid capable of both carrying an electrical charge to a chip and carrying heat away from it. While it is currently at a very early stage of development, IBM scientists have so far been able to deliver 10 milliwatts of power to a chip. With liquid cooling being a far more developed field within IBM, due to advancements made for use in supercomputers, the real challenge is to expand the amount of power the fluid can deliver and make it easily rechargeable. According to IBM’s papers, the projected numbers look favorable, showing a charge-discharge cycle efficiency of over 80% and the capability to carry around 1V.
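As a rough sketch of what those figures imply, using only the numbers quoted above (1V, 10 milliwatts, 80% efficiency) and nothing from IBM beyond them: at around 1V, delivering 10mW corresponds to roughly 10mA of current, and an 80%-efficient cycle means drawing about 12.5mW to deliver that 10mW.

```python
# Back-of-the-envelope check of the 5D Blood figures quoted above.
voltage_v = 1.0      # the fluid carries around 1V
power_w = 0.010      # 10 milliwatts delivered to the chip so far
efficiency = 0.80    # charge-discharge cycle efficiency of over 80%

current_a = power_w / voltage_v        # I = P / V
input_power_w = power_w / efficiency   # power drawn to deliver 10mW

print(f"Current delivered: {current_a * 1000:.0f} mA")
print(f"Power drawn at 80% efficiency: {input_power_w * 1000:.1f} mW")
```

Small numbers for now, but they give a sense of how far the fluid has to scale before it can feed a real processor.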

The reason for it being called blood is its inspiration in biological processing. Mammalian brains use orders of magnitude less power than some of the greatest supercomputers, while fitting equivalent processing power into a far smaller package. If brain-like computers were to be developed from thousands or millions of tiny chips, they would still need a medium through which power delivery and cooling could be carried out; a medium that, for us, already exists in our bodies: blood.

As computational engineering aims to become more and more efficient like natural processes, what developments to emulate nature could be next?

Even Yahoo Turns to Google as Revenue Falls

AMD, IBM and now Yahoo. It seems we currently live in an age where large organisations either thrive or sink, and sink fast at that, hitting the ground as they fall. According to Sky News, Yahoo has announced a deal with rival Google, a surprising turn to work together on advertising and internet search after its latest financial results disappointed.

Chief executive Marissa Mayer has pledged to cut costs further and focus on a new strategy for growth. Mayer went on to state the following:
“We see a unique moment and opportunity for Yahoo as we move into 2016 to narrow our strategy and focus on fewer products with higher quality to achieve better growth and better results.”

The third-quarter figures, which showed another three months of declining revenues, prompted a 2% fall in its share price in after-hours trading. Yahoo revealed an 8% decline in sales on the same period a year ago, whilst in the last week Yahoo, just like IBM and AMD, has been hit hard by a significant drop in shares.

In an unexpected and risky move, Yahoo plans to grow revenues by sending some traffic to Google’s search engine while prudently still using Microsoft’s Bing. I’ve got to admit, other than going on Yahoo to look at the odd news article, I haven’t used it much at all these last ten years, whilst Google continues to grow stronger.

Personally, I’m even in the process of filtering out my Yahoo email account in favor of Gmail. Do you even Yahoo, bro? Let us know in the comments.

IBM Sinking at a Staggering Rate as Customers Transition to the Cloud

IBM’s shares are falling rapidly as customers transition to cloud computing, and it seems the only escape may be a process called “mergers and acquisitions”. FBR Capital Markets’ Daniel Ives said the following:

“It really comes down to M&A. If they went big on big data, cyber security, cloud that’s the only — in our opinion — solution to put fuel in the tank for growth. It’s not going to happen organically,” FBR’s senior analyst told CNBC’s “Squawk Box.”

IBM shares fell more than 5 percent in after-hours trading on Monday and continue to decline. IBM has now had to lower its full-year profit forecast, and it seems the company will need to rethink its business plan in a dramatic restructure to be in with a chance of salvaging some value. IBM is having to shift from making hardware to cloud computing, with the main aim of establishing itself in internet-based software and services sales to compete with the likes of Salesforce.com and Amazon.com’s web software units.

According to Daniel Ives, IBM should consider picking up big data firms like Splunk and Tableau, cybersecurity outfits like Fortinet and CyberArk, and enterprise software companies like Workday and NetSuite. Daniel Ives followed with:

“Big cap tech is in a horse and buggy in the right lane and all these companies are passing them in the Maseratis and Ferraris in the left lane,”

He then pointed to Dell’s announced takeover of EMC last week, saying EMC CEO Joseph M. Tucci would not have had to sell had he made acquisitions sooner. Daniel continued, saying legacy tech companies have become accustomed to blaming their results on currency headwinds, but in the end, earnings come down to core execution and mature product offerings.

IBM’s shares have been declining since April 2013, but never on this scale; whatever IBM has done to counteract it has clearly had little effect. Something has gone drastically wrong at IBM, and the company needs to be quick on its feet with a solution before it’s too late.

This is pretty drastic news for the 104-year-old tech giant; one of the first computers I ever used was made by IBM. What are your thoughts on the subject? Let us know in the comments below.

IBM Wants To Teach Robots Some Social Skills

The exploration and development of artificial intelligence is a boundary that is consistently being pushed, with the scientific and academic communities furthering their studies into robotic interaction in many different directions. One such path is IBM’s effort to incorporate “machine learning to teach robots social skills like gestures, eye movements, and voice intonations” through the Watson project.

During a keynote speech at a conference in San Jose, California, this week, Robert High, chief technology officer of Watson at IBM, demonstrated techniques his team is working on using a small humanoid robot. During the demonstrations the machine, a Nao model from the company Aldebaran, appeared to speak with realistic intonation, which Oxford Dictionaries defines as “the rise and fall of the voice when speaking”. The robot also achieved appropriate hand gestures, and even a little impatience and sarcasm, looking at its watch, for example, when asking High to hurry up with his talk.

Unfortunately, these interactions were pre-recorded rather than conveyed live on stage, owing to the system’s failure to work reliably in noisy environments. The team behind the R&D have implemented machine-learning algorithms that learn from video footage, with the aim of associating appropriate gestures and intonations with different phrases.

Artificial intelligence is often viewed as a soulless, mechanical entity that cannot be related to in any way. We humans, on the other hand, use subtle cues when we communicate with each other: our voices change pitch, our hands reinforce our points of view, and the muscles in our faces react to a conversation or an emotion. If you could download social skills into a robot, you would have a more believable form, one that tricks our brains into accepting it as normal. This research is still in its early stages; one has to wonder where robots will be in 10, 20 or 50 years’ time. Will there be, in my lifetime, a debate centred on a legal definition accepting that a robot can be classified as a “he” or “she”?

It makes you contemplate the lengths to which AI development can reach and the implications on us.

Thank you technologyreview for providing us with this information.

Image courtesy of aldebaran

Analysing Your Brain Could Be 30 Times Faster Than A Supercomputer

The human brain: fascinating, exciting and full of possibilities. The ability to create, form an opinion and challenge the environment in which we live is truly exceptional. We might now be able to find answers as to how powerful the human brain really is, following a project designed to compare a supercomputer with the brain.

An artificial intelligence project devised by two PhD students, from the University of California, Berkeley and Carnegie Mellon University, will be the first of its kind to compare the human brain with the world’s best supercomputer. The AI Impacts project aims to determine how fast the human brain sends signals through its internal network compared to a supercomputer.

The scholars compared the power of our brains with that of IBM’s Sequoia supercomputer, which sits in the top 3 of the most powerful supercomputers. Sequoia has a TEPS (Traversed Edges Per Second) benchmark of 2.3 x 10^13. AI Impacts estimates that the human brain should be at least as powerful as Sequoia at the lower limit, while at the upper estimate of 6.4 x 10^14 TEPS the human brain could surpass Sequoia’s speed by around 30 times.
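The roughly-30x figure follows directly from the two benchmark numbers above; a quick division shows it is really closer to 28x, which the article rounds up.

```python
# Checking the "30 times" claim from the TEPS figures above.
sequoia_teps = 2.3e13   # IBM Sequoia's TEPS benchmark
brain_teps = 6.4e14     # AI Impacts' upper estimate for the human brain

ratio = brain_teps / sequoia_teps
print(f"Brain / Sequoia: {ratio:.1f}x")  # roughly 28x, i.e. about 30 times
```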

Which is a lot to take in, but also potentially incredible: evolution has formed a quite amazing instrument, and it begs the question of what else we will find as research and technology advance with the aim of exploring ourselves. It is also interesting to consider whether the wiring of a genius brain, think Stephen Hawking, differs from that of an average mind, whether the best sportsmen evolved with more advanced genes, or whether we are all capable, given enough time learning a skill, of adapting to anything. It’s compelling nonetheless.

Thank you aiimpacts for providing us with this information.

Image courtesy of fossbytes

IBM Wants to Put a ‘Rodent Brain’ in Your Phone

Dharmendra Modha has an array of 48 circuit boards, lined 6 by 8 on a rack, each with its own processor. Modha describes this set-up as a small rodent. Or, more accurately, a digital recreation of a small rodent brain, and one that he wants to put in your smartphone.

Modha works for IBM, where, since 2008, as head of its Cognitive Computing Group, he has been developing the neuromorphic TrueNorth chip, which mimics the brain of a rodent with its cluster of 48 million artificial nerve cells. Researchers in Colorado working with the processor have developed software for it that can recognise spoken language and identify images using deep learning algorithms. The project is backed by a $53.5 million grant from DARPA, the US Department of Defense’s research arm.

“What does a neuro-synaptic architecture give us? It lets us do things like image classification at a very, very low power consumption,” Brian Van Essen, a computer scientist for the Lawrence Livermore National Laboratory, said. “It lets us tackle new problems in new environments.”

The TrueNorth CPU is a low-power conduit for the kind of deep learning artificial intelligence that is being utilised by Google, Facebook, and Microsoft, usually through more powerful GPUs. The low power consumption of the TrueNorth means it has the potential to outperform its GPU and FPGA-powered alternatives.

Though TrueNorth cannot yet be described as a digital brain, the rodent synapse-inspired chip is certainly a step in the right direction. “You don’t need to model the fundamental physics and chemistry and biology of the neurons to elicit useful computation,” Modha says. “We want to get as close to the brain as possible while maintaining flexibility.”

Thank you Wired for providing us with this information.

IBM Manufactures World’s First 7nm Chip

IBM, in collaboration with leading companies including GlobalFoundries, Samsung and SUNY, has finally cracked the sub-10nm process and produced a fully working 7nm chip. This technological marvel is based on commercial FinFET transistors, but utilizes a silicon-germanium (SiGe) alloy, self-aligned quadruple patterning (SAQP) and EUV lithography to produce chips on a minuscule process node. It’s important to reiterate, though, that this is still in the early stages of production, and it’s unlikely to become an integral component of mainstream appliances for at least 2-3 years.

The engineering teams have also managed to perform extremely dense stacking with a 30nm transistor pitch. According to IBM, this will result in a surface area reduction of close to 50% over the 10nm process. Allegedly, IBM is aiming for at least a 50% power-to-performance improvement, and feels the move from 10nm to 7nm will be more dramatic than 14nm to 10nm.
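That close-to-50% figure is roughly what simple geometry predicts: as a first-order sketch (ignoring real-world layout constraints), transistor area scales with the square of the linear feature size, so shrinking from 10nm to 7nm roughly halves the area.

```python
# First-order area scaling for a 10nm -> 7nm process shrink:
# area scales with the square of the linear feature size.
relative_area = (7 / 10) ** 2
print(f"Relative area: {relative_area:.2f}")                 # 0.49
print(f"Area reduction: {(1 - relative_area) * 100:.0f}%")   # ~51%
```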

So how does it all work? SiGe offers higher electron mobility than traditional silicon, making it the better choice for smaller transistors. At this scale, the gap between silicon nuclei is unbelievably small and cannot transfer current reliably through a standard atomic structure; this is where the germanium alloy comes into play, increasing electron mobility and allowing a proper current flow. EUV is another piece of intriguing technology, designed to help alleviate problems with light etching on smaller chips. This is vital because, as the chip size decreases, you have to use a narrower beam of light to etch the structure accurately. Currently, this procedure is complex and quite expensive, so it’s unclear how long it will be before it becomes a viable option on a large scale.

It’s always fascinating to see incredibly advanced technology in its prototype phase coming to fruition. Yes, 7nm is some time off, but today is the first step on this revolutionary journey.

Thank you ArsTechnica for providing us with this information.

GlobalFoundries Seals IBM Deal

GlobalFoundries has been given the all-clear to complete its purchase of IBM’s chips division, in a deal worth $1.5 billion. An off-shoot of AMD, GlobalFoundries was founded in 2009 on the back of massive funding from the Advanced Technology Investment Company, the tech investment arm of the Abu Dhabi government. IBM has been struggling to prop up its loss-making chips division for some time, and it is IBM that is paying GlobalFoundries $1.5 billion to take the division off its hands, with the promise of a further $3 billion investment over the next five years.

As part of the deal, GlobalFoundries has gained two new chip fabs and over 16,000 patents, but more broadly it positions the company as a key new player in the chip market. It will now take control of IBM’s manufacturing plants in East Fishkill, New York and Essex Junction, Vermont. As part of the agreement, GlobalFoundries will provide IBM with semiconductors for the next ten years.

Sanjay Jha, Chief Executive Officer of GlobalFoundries, lauded the deal as a huge boost to his company’s research and development, saying, “We have added world-class technologists and differentiated technologies, such as RF [radio frequency] and ASIC [application-specific integrated circuit], to meet our customers’ needs and accelerate our progress toward becoming a foundry powerhouse.”

Thank you Fudzilla for providing us with this information.

U.S. Navy Doesn’t Trust Lenovo With Their Weapon Systems

You don’t trust just anyone with control of your weapon systems, and it looks like the U.S. Navy isn’t too pleased with Lenovo’s recent purchase of IBM’s server division.

According to the Wall Street Journal, the Navy is looking at dropping the IBM servers from some weapon systems after the company’s server line was purchased by Lenovo Group Ltd, a Chinese company.

The last year’s headlines have shown more than once that security threats don’t just come from software; hardware is an equally viable access point. With something as crucial as weapon systems, one can understand that the Navy wants to be on the safe side.

On the other hand, there are more and more reminders of the Cold War era: Russia developing its own CPUs because it doesn’t trust the West, the U.S. placing trade restrictions on China forbidding the sale of the most powerful supercomputers, and now this story.

Lenovo spokesman Ray Gorman said that the company generally declined to comment on customer contracts and as such didn’t have anything else to say to this particular case and instead pointed the finger in the direction of the Ministry of Finance.

Lenovo paid $2.1 billion when it bought IBM’s low-end x86 server business last year.

IBM Chip uses Light to Transfer Data at 100Gbps

After a decade of research, IBM has finally developed a new silicon photonics chip that can transfer data at 100Gbps. The chips are designed for data centers, where they can link storage, networking and servers, and reference chips can transfer data over a distance of two kilometers using light pulses. This technology is not going to appear in personal computers or handheld devices anytime soon, as IBM is aiming it at data centers, where its high-bandwidth optical fiber connections are most advantageous.

There is also demand for more computing power in servers with applications like analytics, machine learning and big data. Optical connections could help dozens of processors communicate on a server rack, making it easier to break up processes over multiple resources, said Richard Doherty, research director at The Envisioneering Group. “Optical connections could make servers much like storage drives, which can be easily hot-swapped depending on processing needs in data centers” he added.

The optical technology used in telecoms is different from what IBM offers; its silicon photonics chip is cheaper and is meant for shorter distances. IBM’s single-fiber implementation is considered better than Intel’s MXC optical cables.

Thank you PCWorld for providing us with this information.

Image courtesy of WallSide.

IBM and FujiFilm Show That Tape Storage Still Has Potential

Most people would consider tape storage a thing of the past, but that’s far from the case. It is still the most efficient and cheapest-per-byte method of storing large amounts of infrequently used data; cloud storage comes to mind here, just as general archives do. Together, IBM and Fujifilm have figured out how to improve upon the current technology, fitting a whopping 220TB of data onto a 10 x 10 x 2 cm tape cartridge.

The new prototype Fujifilm tape packs 88 times as much data as current tape cartridges, which hold about 2.5TB of uncompressed data. You shouldn’t start saving up for this yet, however, as it will most likely take 5 to 6 years before it is ready for mass production. It’s a big accomplishment nonetheless.
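The 88x figure checks out against the two capacities quoted, and the cartridge dimensions above give a sense of the volumetric density involved:

```python
# Sanity-checking the tape capacity figures from the article.
new_capacity_tb = 220.0       # prototype cartridge
current_capacity_tb = 2.5     # current uncompressed cartridge capacity
cartridge_volume_cm3 = 10 * 10 * 2   # 10 x 10 x 2 cm

improvement = new_capacity_tb / current_capacity_tb
density = new_capacity_tb / cartridge_volume_cm3

print(f"Improvement: {improvement:.0f}x")       # 88x
print(f"Density: {density:.1f} TB per cm^3")    # 1.1
```

Over a terabyte per cubic centimetre, on tape, is a striking number for a technology so often written off.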

“The new technologies won’t come out in products for several years and may not be quite as extreme when they do, but the advances show tape can keep getting more dense into the future,” said Mark Lantz, manager of IBM’s Advanced Tape Technologies Group.

IBM is demonstrating the new technology this week at the National Association of Broadcasters show in Las Vegas. “The tracks on the tape are narrower, the heads are smaller, and even the particles of barium ferrite that store each bit are finer. All are now measured in nanometers, so the movement of the heads has to be more precise, too. It’s accurate to within less than 6 nanometers, IBM says.”

Thanks to ComputerWorld for providing us with this information


Dyre Wolf Attack Reels in over $1 Million in Wire Transfers

IBM’s Security division has been researching a malware attack they have named ‘The Dyre Wolf’ which is said to have been responsible for stealing over $1 million.

The hacking campaign is said to use targeted spear-phishing emails, malware and a phone conversation against organisations that use wire transfers.

IBM stated that the attack starts with a single user opening an infected email attachment, which contacts the attacker’s website and downloads the Dyre malware; the malware then hijacks the user’s address book and mails itself throughout the organisation.

After the infection mentioned above takes place, if a user attempts to log into a banking site, the malware loads a new screen saying the site is experiencing issues and showing a phone number for the user to call to make their transaction.

Once the attacker has all the user’s details, a wire transfer is made that runs through a series of international banks. IBM recommends that companies train their employees not to open suspicious attachments or links and remind them that banks do not request their banking credentials in any way.

Thank you Engadget for providing us with this information

Apple is Now Twice the Size of the World’s Second Largest Company

Apple’s market cap has just risen to $775 billion, a new record, making it twice the size of the world’s second-largest company, Exxon Mobil. For the last few years, it’s always been neck and neck between Apple and Exxon, a global oil giant. Many are seeing this as significant, as consumer technology is now a significantly bigger business than oil, a business which for years has been likened to printing money.

Apple is only the second company in the last 30 years to achieve such a feat; IBM was the last company to be double the size of second place, which was also Exxon. The difference between Apple and IBM today is symbolic, as Apple famously saw itself as the underdog to ‘Big Blue’, likening them to ‘Big Brother’ in its famous ‘1984’ ad. Just late last year, Apple announced that it was joining IBM in a partnership to bring iOS to the enterprise. Many saw this as IBM asking Apple, its former underdog, for help.

What’s next? Well, Apple isn’t far off becoming the world’s first trillion-dollar company. It was always a question of whether it could happen; now it’s more a question of when. Something incredible, when you consider Apple was 90 days from bankruptcy in 1997.

Source: The Wall Street Journal

Imminent IBM Reorganisation Could Result in Massive Layoffs

A potential restructuring of computer hardware company IBM could result in layoffs of 26% of staff, or over 111,000 people. If true, it would be the biggest corporate layoff in history, dwarfing the previous record – coincidentally held by IBM after a 1993 reorganisation – of 60,000.
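Those two numbers imply a total headcount figure (a back-of-the-envelope calculation of my own, not one from Cringely’s column): 111,000 people at 26% of staff points to a workforce of roughly 427,000, and a layoff nearly double the 1993 record.

```python
# What the reported layoff figures imply about IBM's total headcount.
layoffs = 111_000
fraction = 0.26            # 26% of staff
previous_record = 60_000   # the 1993 reorganisation

implied_headcount = layoffs / fraction
print(f"Implied total headcount: ~{implied_headcount:,.0f}")  # ~427,000
print(f"Versus the 1993 record: {layoffs / previous_record:.1f}x")
```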

Robert X. Cringely, a Silicon Valley journalist and author of the eBook The Decline and Fall of IBM, revealed the news in his Forbes column. Cringely blames former CEOs Louis Gerstner and Sam Palmisano for mismanagement, which current CEO Virginia Rometty has done nothing to stymie.

According to Cringely, the IBM reorganisation, nicknamed Project Chrome, has been in the planning stage since before Christmas, and has been triggered by another quarter of falling revenue, the 11th in a row for the company.

He remains sceptical that the drastic Project Chrome will do anything to save the ailing company, saying, “So while IBM is supposedly transforming, they are also losing business and customers every quarter. What are they actually doing to fix this? Nothing.”

“In saying the company is in a transition and is going to go through the biggest reorganization in its history, will this really fix a very obvious customer relationship problem? No, it won’t.”

Source: IT World

Intel, IBM, and Qualcomm Oppose Title II Net Neutrality

An alliance of 60 tech companies, including the likes of Intel, IBM, and Qualcomm, have signed a letter addressed to US Congress and the FCC opposing Title II reclassification of broadband services.

It was President Barack Obama who proposed classifying the internet as a utility service under Title II of the 1934 Communications Act in order to ensure net neutrality, but there has been backlash from ISPs, tech companies, and telecoms providers ever since the idea was pitched.

“For almost twenty years, national leadership, on a bipartisan basis, has nurtured the broadband internet with a wise, effective, and restrained policy approach that supported the free flow of data, services, and ideas online while creating a climate that supported private investment in broadband networks,” the letter claims. Then, attacking Obama’s net neutrality plan, it continues, “Title II is going to lead to a slowdown, if not a hold, in broadband build out, because if you don’t know that you can recover on your investment, you won’t make it.”

FCC chair Tom Wheeler had hoped to bring in legislation to protect the internet by the end of the year, but plans have been delayed until 2015.

Source: The Verge

Discarded Laptop Batteries Could Power Slums, Claims IBM

An IBM study has determined that old laptop batteries could be used to power slums. 70% of discarded batteries were able to hold enough charge to power an LED light for four hours a day over a whole year, an Indian IBM team has shown.

The idea was trialled in Bangalore earlier this year, and is seen as a positive step in recycling the growing quantity of ‘e-waste’. Many towns in deprived areas either have no access to electricity, due to not being connected to an electrical grid, or are too poor to afford it. Laptop batteries offer a cheap, portable, and environmentally friendly alternative.

The IBM team in India developed UrJar, an electrical hub powered by lithium-ion cells from old batteries.

IBM said, “UrJar has the potential to channel e-waste towards the alleviation of energy poverty, thus simultaneously providing a sustainable solution for both problems.”

One UrJar unit would be priced at 600 rupees (£7). IBM hopes to further develop the system after positive feedback from the initial trial.

Source: BBC

Access IBM’s Watson Supercomputer for Free

IBM has opened up its Watson supercomputing platform to everybody for free. The decision to open up a public beta for the data analytics platform means that we now all have partial access to a supercomputer, anytime, anywhere.

Using what is described as “the most powerful natural-language supercomputer in the world”, you can upload a dataset and let Watson analyse it all in incredibly accurate detail – producing correlations, predictive analyses, graphs, charts and even infographics that represent your data.

It’s a very interesting concept and is probably the first time anybody and everybody has been able to access a supercomputer for free. You can access Watson at IBM’s website here, where you will be required to set up a free account.

I know what some of you are wondering. Can it run Crysis?

Source: Gizmodo

Intel Announce 10nm Chip Capabilities and Release Plans

Early in 2014 we reported on Intel’s 14nm “Fab 42” plant remaining closed and the rumors of 10nm chip manufacturing that came with it – from recent reports it now seems to be a reality.

As IBM and NVIDIA have teamed up to win the next generation of top-level US government supercomputers, Intel is not to be left in the dust. After 50 years of global supercomputing, the Intel platform sits in a massive 85.5% of all machines, with a reported 97% share in new systems. Today, Intel announced details of some of their plans for high-performance computing.

The announcements come to us thanks to CNBeta and Chiphell, and are summarized in point form below:

  • The third-generation Xeon Phi family, codenamed “Knights Hill”, will use Intel’s 10nm process and Intel’s Omni-Path fiber-optic interconnect technology, making it the first publicly identified 10nm product. Intel’s upcoming second generation, “Knights Landing”, will be the first to ship commercially, set to debut early next year.
  • Industry interest in Xeon Phi is strong: a large number of customers are already looking to order Knights Landing, of which more than 50 percent will use the processor version, with the rest set to utilize the PCI-E accelerator card version. The total computing capacity on order is said to exceed 100 PFlops (100 quadrillion floating-point operations per second).
    NOTE: Using the current Intel Xeon Phi to accelerate your system is done through a PCI-E expansion card; this card must be paired with Xeon processors and can only be used as a coprocessor. The next generation will have separate processor and coprocessor versions.
  • Early Knights Landing deployments include: the Los Alamos and Sandia “Trinity” supercomputer, the US Department of Energy’s National Energy Research Scientific Computing Center (NERSC) supercomputer “Cori”, earth-science company DownUnder GeoSolutions, large-scale cooperative projects with SGI, and the IT4Innovations National Supercomputing Center projects in Europe (to deploy large clusters).
  • The Omni-Path interconnect architecture provides 100 Gbps of bandwidth, fiber-optic switching, and support for medium-sized clusters. Compared to the now-popular InfiniBand, latency can be reduced by up to 56%. The architecture will use 48-port switch chips, whereas InfiniBand switches top out at 36 ports.
  • The Intel Fabric Builders project has started, building an ecosystem around Omni-Path. Intel also announced that the Intel Parallel Computing Centers will expand to 40 facilities in 13 countries and regions.
  • Intel Enterprise Edition for Lustre v2.2 has been scheduled for release.
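Intel hasn’t published topology figures for these switches, but a common back-of-the-envelope way to see why the 48-port versus 36-port difference matters is a two-level fat-tree, where p-port switches can connect up to p²/2 endpoints without blocking:

```python
# Illustrative only: these topology figures are not from Intel, just
# the standard non-blocking two-level fat-tree formula p^2 / 2.

def max_endpoints_two_level(ports: int) -> int:
    """Maximum endpoints a non-blocking two-level fat-tree supports."""
    return ports * ports // 2

print(max_endpoints_two_level(48))  # 1152 nodes with 48-port switch chips
print(max_endpoints_two_level(36))  # 648 nodes with 36-port switch chips
```

In other words, the larger radix lets a cluster of the same depth grow almost twice as large before extra switching tiers (and extra latency) are needed.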

That’s quite a lot of information to take in at once, so we will continue to report on this as the story develops. We’re sure that Intel will sit down and go through these six points one by one in the near future, to help keep things streamlined and easy to track for consumers.

Image courtesy of Chiphell

U.S. Department of Energy to Spend $425 Million on Supercomputers

The US Government’s Department of Energy has announced it is to invest $425 million to build two supercomputers which, when built, will be the fastest computers in the world. The ultimate aim is to support scientific research projects, including nuclear weapons research.

The two computers, named Summit and Sierra, will be installed at Oak Ridge National Laboratory, Tennessee, and Lawrence Livermore National Laboratory in California, respectively.

NVIDIA, IBM, and Mellanox have provided the components for use in the two computers. Summit will run at 150 petaflops, with Sierra operating at 100 petaflops. For comparison, the world’s current fastest supercomputer, the Chinese Tianhe-2, runs at 55 petaflops.

An extra $100 million will go to fund research into extreme-scale computing, under the project name FastForward2.

Source: Reuters

IBM Pay $1.5B to Remove Chip Manufacturing Headache

IBM has just secured a deal with Globalfoundries Inc. to offload their chip-manufacturing unit for a cool $1.5 billion today. The news has been leaked by “two people with knowledge of the matter” and comes around the same time that IBM released an announcement telling all to watch out for a big announcement coming tomorrow.

Ginni Rometty, IBM’s CEO, has reportedly been in talks with Globalfoundries about this deal for months, finally securing an agreement to offload their chip manufacturing responsibilities for $1.5 billion, with a return of $200 million worth of assets – making it a net $1.3 billion expenditure.

The chip-manufacturing unit has been a long-term headache of IBM’s, with them seemingly happy to finally put it behind them following some frustrating unprofitable results. The company taking it on, Globalfoundries Inc, is owned by an investment arm of the Abu Dhabi government. It will be utilizing these facilities to increase expertise of its engineers in the fundamentals of semiconductor design and manufacturing.

The deal is said to see Globalfoundries supplying IBM with Power processors in return for IBM sharing its intellectual property, allowing Globalfoundries to access said technology and guarantee a steady flow of supply/demand. The $1.5 billion owed will be paid over the course of three years and will not see IBM losing its chip manufacturing process completely, just think of it as a partial outsourcing.

IBM and Globalfoundries have both currently declined to comment publicly on the matter, but all should be revealed tomorrow.

Image courtesy of Delimiter

Wonder What It Is Like to Unbox a Supercomputer?

There isn’t a much better feeling than receiving that new product and unpacking it, digging through bubble wrap and packing peanuts for every little thing. And then finally, we can peel off the protective plastic covers, slowly. But have you ever wondered what it would be like to unbox a supercomputer? Apparently some reporters did, and we got a lot of photos.

Unboxing a supercomputer isn’t much different than any other computer; it’s just at a much bigger scale. Crack open the crates, connect the cabinets and voila, your supercomputer is running.

The Pawsey Supercomputing Centre in Australia recently received a Cray XC30, dubbed Magnus2, and the unboxing was covered by reporters from ITNews. Each of the 7 new cabinets weighs about 1.4 tonnes and can hold up to 384 CPUs, cracking a whopping 99 teraflops. The new system features over 35,000 Xeon cores and a peak performance of around 1 petaflop.

The University of Arizona recently got their El Gato supercomputer, and the unboxing was covered as well. It’s composed of IBM’s x86 iDataPlex servers and Nvidia Tesla K20 accelerators, and El Gato also came fully built, shrink-wrapped but otherwise ready to go. With a peak performance of 145 teraflops, El Gato is one of the fastest supercomputers located at a US university.

Just as in our consumer world, not everything in the server world is plug-and-play like the above. The video below shows a time-lapse of engineers at the DoE’s Oak Ridge National Laboratory (ORNL) manually upgrading the Jaguar supercomputer to become the Titan supercomputer. With its hundreds of thousands of combined CPU and GPU cores, it’s pumping out a massive 17 petaflops, yet it still isn’t the world’s fastest.

[youtube]https://www.youtube.com/watch?v=S8Y77efFW-I[/youtube]

So there you have it: it isn’t much more difficult to unbox and set up a supercomputer than, for example, any random Dell PC.

Thank you Extremetech for providing us with this information.

Image and video courtesy of Extremetech.

Human Brain Inspired IBM Supercomputing Chip Making Great Progress

IBM have made great advances since they first revealed their prototype human brain inspired SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) chip. That single core prototype has been massively scaled up to a production ready model which features 1 million neurons, 256 million synapses and 4,096 neurosynaptic cores, and amazingly it requires only 70mW of power. That’s almost nothing in terms of power, about the same as what a hearing aid battery can provide.
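A quick back-of-the-envelope breakdown shows how evenly those headline figures divide across the cores (this assumes the exact counts behind the rounded “1 million” and “256 million” figures are the powers of two 2^20 and 2^28, which is our inference, not an IBM statement):

```python
# Back-of-the-envelope breakdown of the SyNAPSE chip's headline figures.
# Assumption: the rounded "1 million" / "256 million" figures are
# exactly 2**20 neurons and 2**28 synapses.

cores = 4096
neurons = 2**20    # ~1 million
synapses = 2**28   # ~256 million

print(neurons // cores)     # 256 neurons per neurosynaptic core
print(synapses // cores)    # 65536 synapses per core
print(synapses // neurons)  # 256 synapses per neuron

# Power budget: 70 mW spread over the whole chip
power_mw = 70
print(power_mw / cores)     # ~0.017 mW per core
```

Under that assumption each core is a tidy 256-neuron unit, which helps explain how the design scales so cleanly from the single-core prototype.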

Of course, the figures are a little meaningless to most of us; we work in MHz and GHz, terms we can scale against our own CPUs. However, this chip doesn’t work like most others: it works like the human brain, in that it can process massive amounts of sensory data in parallel by merging memory and computing. It’s so unique in its approach to processing that IBM have created their own programming language for it, as well as an educational outreach program called SyNAPSE University to help people work with it. They were helped in no small part by DARPA, who threw $53 million in funding at the project.

IBM are already building programmable, working boards with 16 of these chips operating in concert, which represents 16 million neurons. This single board of 16 chips is capable of blasting through data that would normally require “racks and racks of conventional computers.”

The chips use incredibly low power levels, meaning they give off very little heat compared to a conventional chip; they can churn through data at incredible rates and do it in a form factor so small that IBM claim, “You can carry our board in your backpack. You can’t carry four racks of conventional computers in your backpack.”

Thank you Engadget for providing us with this information.

Image courtesy of Engadget.

Google Aiming at Making Wi-Fi Hotspots out of Old NYC Payphones

Bloomberg reports that Google attended a meeting in New York for companies interested in offering free Wi-Fi, alongside IBM, Samsung and Cisco. Google is well known for its ambition to offer free or affordable internet connectivity, which suggests that the corporate giant is planning to submit a big proposal to the New York department of IT.

While Google already offers free Wi-Fi around its office in the Chelsea neighborhood, the company also has a number of initiatives to bring cheaper, more abundant internet connectivity to the US and abroad. People asking why Google is so keen to bring this to the public all over the world should ask themselves what most of them use as a start page in their browser, who developed that browser, or which search engine they use to find all their information. And yes, the answer to all of those questions is Google.

Google’s plan for NYC is to turn its payphone locations into Wi-Fi hotspots funded by phone services rather than ISPs, while also incorporating advertisements to actually make it ‘free’. The company has requested the plan be put into effect on all 7,300 payphones, meaning NYC would be covered in free Wi-Fi connectivity once the plan is approved and work is finished. Another interesting and beneficial feature mentioned is the ability to connect to every other hotspot automatically once you have authorized access to the Wi-Fi network.

Providing the project goes through, it would mean a new level of connectivity for New Yorkers that does not depend on cellular data subscriptions. Even so, it is a big step forward towards a new type of wide-area connectivity, rather than just a few wireless routers placed in key, remote locations.

Thank you Techcrunch for providing us with this information
Image courtesy of Techcrunch

Apple And IBM Coming Together To Challenge The Business Market

Apple and IBM, two well-known companies around the globe, want to take on the business world by working together and using each other’s expertise to position the iPad as your next computer. Yes, you read that right: in the partnership’s eyes, your desktop computer, probably a Dell, can be replaced by an iPad.

It looks like the first targets of this venture are the retail, healthcare, banking, insurance, travel and telecommunications sectors – quite a large chunk of the market, in other words – with a new series of Apple-developed apps that promise to address issues each of these sectors may be facing.

Developing the apps is obviously Apple’s deal, but getting them right for the business market is something they are less sure about, and this is where the expertise of IBM comes into play.

To be honest, I really can’t tell if this partnership is trying to flog a dead horse or not at the moment. I know Apple is not business driven, with the pocket of the consumer its primary target, but let’s look at the wider picture here. iPhones (and iPads) are already used as tools in the business sector, but the infrastructure that has been set up through a Windows environment is massive – really, really massive. If you’re looking at replacing every single desktop system with an iPad, you’ve got to consider the fact that a lot of training is required to use the new systems, and I wouldn’t be surprised if a major infrastructure change behind the scenes were also required to cater for the swarm of iDevices.

Simply put, the cost of changing the industry over to iOS is not as straightforward as giving everyone an iPad – once you look at all the other costs that have to be factored in, I just see this as a complete waste of time and money. Apple, you’ve already got millions of worldwide users and the business markets are set on Windows – let’s just leave it that way and save a ton of time and money.

Source: T3

Image courtesy: Mashable