The latest processors run hotter with almost every generation, but a study from the University of California, Los Angeles (UCLA) shows that heat-channeling thermal transistors might be the solution (via IEEE). Although these thermal transistors are still at the experimental stage, they offer an attractive way to remove heat from processors, one that may pique the interest of companies like AMD and Intel.
Modern processors, particularly high-end ones, have an acute heat problem. Dies are getting smaller, but power consumption isn’t falling nearly as fast, and since nearly all consumed power is dissipated as heat, more heat is being packed into a smaller area. That heat is often concentrated in one part of the processor (a hot spot), and even when a CPU’s average temperature is fine, the hot-spot temperature can hold it back from performing as well as it otherwise could.
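The arithmetic behind this squeeze is simple. As a rough illustration (all wattages and die areas below are invented round numbers, not the specs of any real chip), the same power over a smaller area means a higher power density, and a hot spot concentrates it further still:

```python
# Illustrative arithmetic only: the wattages and areas are made-up
# round numbers, not measurements of any real processor.

def power_density(watts: float, area_mm2: float) -> float:
    """Power dissipated per unit area, in W/mm^2."""
    return watts / area_mm2

# Same 100 W of heat, but the die shrinks from 200 mm^2 to 100 mm^2:
# power density doubles even though total power is unchanged.
old_die = power_density(100, 200)   # 0.5 W/mm^2
new_die = power_density(100, 100)   # 1.0 W/mm^2

# A hot spot concentrates a share of that power in a small region,
# say 30 W in a 10 mm^2 cluster of cores.
hot_spot = power_density(30, 10)    # 3.0 W/mm^2

print(old_die, new_die, hot_spot)
```

The hot spot's 3 W/mm² is what the cooler ultimately has to cope with, even though the chip-wide average is only 1 W/mm², which is why spreading heat away from hot spots matters so much.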
These new thermal transistors essentially channel that heat through the rest of the processor by applying an electric field, spreading it out more evenly. The design innovation that made this possible is a layer just one molecule thick that becomes thermally conductive when charged with electricity. Thermal transistors could move heat from a hot spot (often in the cores) to a cooler part of the chip, and compared with conventional cooling methods, the experimental transistors performed 13 times better.
Today’s heat density issues can be traced back to the end of Dennard scaling. Dennard scaling held that as transistors shrank, their operating voltage and current shrank with them, so power density (and therefore heat density) stayed roughly constant. However, Dennard scaling broke down in the mid-2000s, around the time the industry hit the 65nm process node, and ever since, power (and heat) per unit area has been gradually increasing.
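The textbook version of this breakdown can be sketched in a few lines. In classical Dennard scaling, shrinking dimensions by a factor 1/k cuts capacitance and voltage by 1/k each while frequency rises by k, so power per transistor falls exactly as fast as its area does. The sketch below (a simplified model, with the scale factor and the "voltage stops scaling" assumption chosen for illustration, not tied to any specific process node) shows power density staying flat while voltage scales and climbing once it stops:

```python
# Simplified Dennard-scaling arithmetic; the scale factor k and the
# binary "voltage scales or it doesn't" switch are illustrative
# assumptions, not a model of a real process node.

def power_density_ratio(k: float, voltage_scales: bool) -> float:
    """Relative power density after shrinking dimensions by 1/k.

    Dynamic power per transistor ~ C * V^2 * f, with capacitance
    C ~ 1/k and frequency f ~ k. Area per transistor shrinks by 1/k^2.
    """
    c = 1 / k
    v = 1 / k if voltage_scales else 1.0  # post-Dennard: V stops shrinking
    f = k
    power_per_transistor = c * v * v * f
    area_per_transistor = 1 / k**2
    return power_per_transistor / area_per_transistor

# Classical Dennard era: shrinking leaves power density unchanged.
print(power_density_ratio(1.4, voltage_scales=True))   # 1.0

# Post-Dennard: the same shrink raises power density by ~k^2.
print(power_density_ratio(1.4, voltage_scales=False))  # ~1.96
```

With voltage no longer scaling, every shrink multiplies heat density by roughly k², which is the squeeze the article describes.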
Additionally, as processor designers refine their frequency-boosting algorithms to extract more performance, it has become ever clearer how hard it is to beat the heat. If thermal transistors make it out of the lab and into consumer devices, they might at least stave off the heat density problem, if not solve it outright. Otherwise, more exotic cooling methods, such as immersion cooling, might become necessary.