I love when a briefing or conference call sends me searching for a new term I don’t recognize.  I had that happen during an HP briefing with Greg Battas about Big Data last month.  It was the first time I had heard of the concept of Dark Silicon, where there are more transistors on a chip than can be powered at once, leaving some of them dark.  To me, it sounded a lot like dark fiber – fiber laid in the ground but unlit because there is no use for it at the time.
When I began digging into Dark Silicon, there were relatively few sites with any information about it.  Most sites that referred to Dark Silicon were from 2010 and 2011, and they noted that the pace at which chip makers can pack transistors onto a chip was quickly outpacing the ability to keep them all powered.  Also, as the ability to pack more transistors increases, the physical size of the chip stays about the same, leading to more and more cores being present in the same space.  In many cases, the 4- and 8-core chips common in systems today physically have more cores on them than can be powered.  But at the same time, the cost of including these extra transistors is negligible for the manufacturer.
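To see why the powered fraction of a chip keeps shrinking, here is a quick back-of-the-envelope sketch. All the numbers are hypothetical (not from the briefing or any vendor), chosen only to show the trend: each process shrink roughly doubles the transistor count, but power per transistor no longer falls fast enough to match, so a fixed power budget covers less and less of the die.

```python
# Illustrative sketch of the "utilization wall" behind dark silicon.
# All numbers are hypothetical, chosen only to show the trend.

power_budget = 1.0             # fixed power envelope per chip
transistors = 1.0              # relative transistor count at node 0
power_per_transistor = 1.0     # relative power draw per transistor

fractions = []
for node in range(5):          # five hypothetical process generations
    fraction = min(1.0, power_budget / (transistors * power_per_transistor))
    fractions.append(fraction)
    print(f"node {node}: {fraction:.0%} of the die can be powered")
    transistors *= 2.0             # ~2x transistors per shrink
    power_per_transistor *= 0.65   # power per transistor falls, but slower than 2x

# Whatever can't be powered — the growing remainder — is the dark silicon.
```

Running this, the powered fraction drops every generation even though every individual transistor gets cheaper to run, which is exactly the squeeze those 2010–2011 articles were describing.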
From the conference call I was on, Battas relayed to us that some interesting things are happening, particularly in system on a chip designs.  These designs are particularly power-sensitive in mobile devices, leaving more dark silicon in those devices.  But the emerging idea is that instead of simply leaving the additional cores dark, they could be engineered for specific tasks and powered only when needed to improve the efficiency of a task.
Specialized Cores
I was able to find a great resource on the topic – a presentation titled “Is Dark Silicon Useful” by Michael B. Taylor, an associate professor at the University of California, San Diego.  The presentation has a good general explanation of dark silicon and expands on the topic that Battas mentioned on the call.  The slide deck is available online and I suggest that you take a look at page 40 and beyond in particular.
The slide on page 43 is extremely interesting, showing a 91% energy savings from using conservation cores – specialty dark silicon cores tuned for a specific task – rather than traditional RISC-based processing.  One of the two insights that Taylor shares about conservation cores is that “specialized logic can improve energy efficiency by 10-1000x.”  That’s pretty compelling.
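The arithmetic behind those two numbers lines up nicely. An Nx improvement in energy efficiency means the same work takes 1/N the energy, i.e. a (1 − 1/N) savings. A short sketch (my own arithmetic, not from Taylor's slides) makes the connection:

```python
# Quick arithmetic on the claim that specialized logic can improve
# energy efficiency by 10-1000x: an Nx efficiency gain means the same
# work takes 1/N the energy, a (1 - 1/N) savings.
for factor in (10, 100, 1000):
    savings = 1 - 1 / factor
    print(f"{factor}x efficiency -> {savings:.1%} energy savings")

# By the same math, a roughly 11x gain corresponds to about 91% savings,
# consistent with the conservation-cores figure on slide 43.
eleven_x_savings = 1 - 1 / 11
```

So even the low end of Taylor's 10-1000x range is a 90% energy reduction, which is why the 91% conservation-cores result sits right where you'd expect.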
In terms of HP and Big Data
In April, HP launched Moonshot for the world.  Moonshot is a hyperscale datacenter solution that uses server cartridges to massively scale out software.  The cartridge (think blade, but more simplistic) includes all the workings of a server while offloading the power, cooling and networking to the enclosure.
After a couple of briefings that have mentioned Moonshot, a few things became clear.  First, Moonshot cartridges can be introduced on short, iterative cycles, allowing for faster innovation and integration of the latest technologies.  Second, the cartridges can be customized to perform specific tasks.  One of the early briefings mentioned creating a cartridge with GPUs integrated on the board to handle facial recognition at ATMs.  There is unique flexibility in that design.
Now, bringing the idea of dark silicon to Moonshot cartridges, designers and engineers are thinking that silicon specifically tuned to certain processes – with regions that can be turned on and off as needed – will improve efficiency even further in these system on a chip designs in the datacenter.  There are applications, like voice service processors, which can greatly benefit from custom silicon.
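The power-gating idea is simple to sketch in code: specialized regions sit dark until the task they are tuned for arrives, get switched on for just that task, and go dark again afterward. The region names and power figures below are invented for illustration and don't correspond to any real HP or SoC design:

```python
# Hypothetical sketch of power-gating specialized dark silicon regions
# on a system-on-a-chip. Region names and wattages are invented.

class SoC:
    def __init__(self, regions):
        self.regions = regions    # region name -> active power draw (W)
        self.powered = set()      # every specialized region starts dark

    def run_task(self, region):
        self.powered.add(region)                      # gate the region on
        draw = sum(self.regions[r] for r in self.powered)
        print(f"running on {region}: drawing {draw:.1f} W")
        self.powered.remove(region)                   # back to dark when done

chip = SoC({"voice_dsp": 0.4, "db_accel": 1.2, "crypto": 0.3})
chip.run_task("voice_dsp")   # only the voice region is lit
chip.run_task("db_accel")    # only the database accelerator is lit
```

The payoff is that the chip carries many specialized regions "for free" (the transistors cost little to include) while paying the power cost of only the one currently in use.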
“Because adding transistors to an existing piece of silicon is really cost effective, it lowers the barrier for putting really interesting acceleration onto the chip for databases,” according to Battas.  HP sees a lot of this innovation coming from system on a chip vendors who are looking at applications for database acceleration.  This could have large impacts in scale-out architectures like Hadoop clusters and other Big Data applications.
In addition to the chip innovations, big data software is also changing as open source alternatives to traditional big data software emerge.  This accelerates the adoption of new hardware innovations into the software and allows new concepts to be rapidly incorporated into platforms.  For instance, Intel is working closely with the Apache Hadoop project to make sure that the software can take advantage of their hardware.  Battas expects to see close collaboration between other big data software vendors and hardware vendors to exploit these kinds of performance gains from specialized hardware.