Why does Intel hide internal RISC core in their processors?

No, the x86 instruction set is certainly not deprecated; it is as popular as ever. Intel uses a set of RISC-like micro-instructions internally because these simple, fixed-format operations can be processed more efficiently by the execution backend.

So an x86 CPU works by having a fairly heavy-duty decoder in the frontend, which accepts x86 instructions and converts them to an optimized internal format that the backend can process.
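To make the idea concrete, here is a toy sketch of what "cracking" an x86 instruction into micro-ops looks like. This is purely illustrative: real micro-op formats are undocumented and model-specific, and the micro-op names (`load`, `alu_add`, `store`) are invented for this example.

```python
# Toy decoder sketch. The x86 mnemonics are real; the micro-op
# encoding is hypothetical, invented only to illustrate the concept.
DECODE_TABLE = {
    # A register-register add is already simple: one micro-op.
    "add rax, rbx": ["alu_add rax, rax, rbx"],
    # A memory-destination add is a read-modify-write, so the decoder
    # cracks it into separate RISC-like load / add / store steps.
    "add [rdi], rax": [
        "load    tmp0, [rdi]",       # read the memory operand
        "alu_add tmp0, tmp0, rax",   # perform the addition
        "store   [rdi], tmp0",       # write the result back
    ],
}

def decode(instruction: str) -> list[str]:
    """Return the micro-op sequence for a (toy) x86 instruction."""
    return DECODE_TABLE[instruction]

if __name__ == "__main__":
    for insn in DECODE_TABLE:
        print(f"{insn!r} -> {decode(insn)}")
```

The key point is that the backend only ever sees the simple, uniform operations on the right-hand side, never the variable-length x86 encoding on the left.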

As for exposing this format to “external” programs, there are two points:

  • it is not a stable format. Intel can change it between CPU models to best fit the specific microarchitecture. This lets them maximize efficiency, and that advantage would be lost if they had to settle on a fixed, stable instruction format for internal as well as external use.
  • there’s just nothing to be gained by doing it. In today’s huge, complex CPUs, the decoder is a relatively small part of the chip. Having to decode x86 instructions makes the decoder more complex, but the rest of the CPU is unaffected, so overall there’s very little to be gained, especially because the x86 frontend would still have to be there to execute “legacy” code. So you wouldn’t even save the transistors currently spent on the x86 decoder.

This isn’t a perfect arrangement, but the cost is fairly small, and it’s a much better choice than designing the CPU to support two completely different instruction sets. (In that case, they’d probably end up inventing a third set of micro-ops for internal use anyway, simply because those can be tweaked freely to best fit the CPU’s internal architecture.)
