
Scientists at Stanford University have found a way to cool microchips with thin diamond films: an ultrathin diamond coating grown directly on the chip that dissipates heat at its source.
Modern processors are reaching their power-density limits. New servers built around the NVIDIA B300 GPU, for example, draw nearly 15 kilowatts, which leads to chip overheating, degraded performance, and accelerated wear.
Professor Srabanti Chowdhury's team has successfully grown polycrystalline diamond at just 400°C, low enough to avoid damaging the underlying semiconductor structures.
The resulting coating, just a few microns thick, can reduce component temperatures by 50-70°C. Adding a diamond layer to gallium nitride-based high-frequency transistors (GaN HEMTs) can reduce channel heating by 70°C and increase signal gain by a factor of five.
During the growth process, a silicon carbide layer forms between the diamond and the semiconductor, acting as a bridge for more efficient heat dissipation. This new technology has already attracted the interest of major chipmakers such as Samsung, TSMC, and Applied Materials.
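The thermal advantage described above can be sketched with Fourier's law: the conductive resistance of a slab is R = t / (k·A), so a thin layer of a very high-conductivity material like diamond adds almost no resistance while spreading heat away from a hotspot. The sketch below uses textbook-order-of-magnitude material constants and an assumed hotspot size, not measurements from the Stanford work.

```python
# Back-of-the-envelope estimate of why a micron-scale diamond layer helps.
# Fourier's law for a uniform slab: R = t / (k * A), in K/W.
# All constants below are illustrative assumptions, not data from the article.

def slab_resistance_k_per_w(thickness_m: float,
                            conductivity_w_mk: float,
                            area_m2: float) -> float:
    """Conductive thermal resistance of a uniform slab, in K/W."""
    return thickness_m / (conductivity_w_mk * area_m2)

AREA = (100e-6) ** 2   # assumed 100 um x 100 um hotspot
K_DIAMOND = 1500.0     # polycrystalline diamond, W/(m*K) (typical textbook value)
K_GAN = 130.0          # gallium nitride, W/(m*K) (typical textbook value)

# Compare a 3-micron diamond layer with the same thickness of GaN:
r_diamond = slab_resistance_k_per_w(3e-6, K_DIAMOND, AREA)
r_gan = slab_resistance_k_per_w(3e-6, K_GAN, AREA)

print(f"diamond: {r_diamond:.1f} K/W, GaN: {r_gan:.1f} K/W")
# For a given heat flow Q, the temperature rise is dT = Q * R, so the
# order-of-magnitude lower resistance of diamond translates directly
# into a cooler transistor channel.
```

Running the sketch shows the diamond slab's resistance is roughly an order of magnitude below that of an equally thick GaN slab, which is the intuition behind the 70°C channel-temperature reduction reported for GaN HEMTs.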