Benchmark scores for the Apple M1 Ultra match — and in some cases, exceed — the top-spec x86 CPUs from Intel and AMD, but does the difference between RISC and CISC processors still exist? Apple announced the M1 Ultra on March 8 as its new top-of-the-line desktop SoC. The chip powers the company’s all-new Mac Studio that comes with the form factor of a Mac mini but the power and versatility of the Mac Pro.
With the Apple M1 Ultra launch, many say the difference between RISC and CISC has become irrelevant. However, that may not be an accurate picture of today's microprocessor scene. So what exactly is the difference between modern RISC and CISC CPUs in an era when the performance gap between ARM processors and the x86-64 offerings from Intel and AMD keeps narrowing?
Related: M1 Ultra Vs. M1 Max: How The Two Apple Chips Compare
Apple’s M1 series of chips is based on the ARM architecture, which is a RISC (Reduced Instruction Set Computer) design. Over the past decades, CISC (Complex Instruction Set Computer) processors, which use a larger set of complex machine language instructions, have traded blows with RISC chips, which use a reduced set of simpler instructions. While CISC has dominated more recently, the M1 Ultra showed just how much the gap between the two approaches has narrowed. Still, despite the fast and efficient Apple silicon and continued gains for ARM in the data center market, x86 is far from doomed and will remain relevant for the foreseeable future.
ARM Vs. x86 Doesn’t Necessarily Mean RISC Vs. CISC Anymore
It is important to note that ‘x86 vs. ARM’ doesn’t necessarily translate to ‘CISC vs. RISC’ circa 2022. While it might have been true a few decades ago, the terms have become more ambiguous over the years with both ISAs borrowing technologies from each other. Many commentators believe that despite their names, RISC and CISC are much more than just the simplicity or complexity of an instruction set, and the terms are best consigned to the annals of history. Yet others believe that RISC and CISC are fundamentally different, and the two shall never meet.
However, the former viewpoint is becoming increasingly accepted with each passing day. According to this school of thought, microprocessors have evolved so much over the past decades that terms like RISC and CISC have become relics of the past. The two instruction sets, meanwhile, continue to have specific differences. For example, some characteristics of RISC instruction sets can make them more efficient than x86, including the use of fixed-length instructions and a load/store design, in which arithmetic operates only on registers and memory is touched solely through dedicated load and store instructions. No wonder, then, that the RISC vs. CISC debate rages on, and the answer to whether any difference between them still exists in modern times depends on who you ask.