Absurd: Apple and the M1

20201115

Apple has announced the use of its own ARM processors and its own GPU architecture in the new MacBook Air, MacBook Pro, and Mac mini. Apple also stated its intent to use future iterations of this technology in all of its other Macintosh models. The M1, the first iteration of this technology, uses 16 billion transistors and apparently delivers rather impressive performance given its power draw and thermal envelope.

This is not the first time Apple has done this. The Apple I, Apple II, and Apple III used the 6502. The first Macintosh models used the Motorola 68k series of CPUs. In 1994, Apple switched from the 68k to the PowerPC CPUs developed by the Apple-IBM-Motorola alliance. This transition also brought the PCI bus and standardized connectors for peripherals. The PowerPC CPUs also meant that Apple had closed the large performance gap that then existed between the Macintosh and PCs using Intel's 32-bit chips. The aging 68k line could not really compete with the 386 and 486; the PowerPC line of Macintosh computers easily could. The universal binaries and emulators employed in the 68k/PPC transition were then leveraged again in 2006, when Apple changed the Macintosh again. By that point, the PowerPC chips were far more electrically hungry than Intel's contemporary CPUs. The Intel CPUs made the Macintosh essentially a PC with a different OS, a different binary format, and higher build quality. This change also meant that Apple's advances would become general PC advances. Apple began to serve as the PC industry's R&D firm.

When Apple chose to use SSDs, SSDs became mainstream. When Apple chose to drop optical media, all computers began to drop optical media. When Apple chose to bring resolutions beyond 1366x768 to laptops, other manufacturers grudgingly began to do the same. EFI. Intel's Thunderbolt. Wider color gamuts as standard. Thin and light laptops that fit in manila envelopes...

People love Apple. People love to hate Apple. Apple's changes are not without error. Snow Leopard was, at first, hated. It dropped support for PowerPC Macs entirely (Rosetta, the PowerPC emulator, became an optional install and vanished in Lion), dropped AppleTalk networking, and dropped write support for classic HFS volumes. Dropping these legacy compatibility features wasn't good, but a bigger problem was that many of the older but nominally supported systems ran Snow Leopard radically slowly. Yet Snow Leopard was thoroughly 64-bit and multiprocessor-aware, which allowed far better performance on the newer Macintosh systems on which it was preinstalled.

Intel hasn't been doing well. Yields on its 14nm process were poor for quite some time after its introduction, the 10nm yields are still quite bad, and we do not yet have a clear idea of when Intel's 7nm process will arrive. This comes after some severe security problems were discovered, and the mitigations for those problems then destroyed much of Intel's performance lead in the market. It would seem that the lead was built largely on aggressive speculative-execution techniques, and it is precisely those techniques that created the security problems. Intel had largely stagnated otherwise. The "Core" microarchitecture descended from the P6, the design behind the Pentium Pro and, later, the Pentium M. While the Nehalem microarchitecture that came after Core was a more significant departure, it was still very similar to preceding designs, its main change being that the northbridge chipset moved onto the CPU die. Most CPUs since have been smaller iterative improvements. Without the ability to shrink the chip's feature sizes, and without a major logic redesign, Apple had very little reason to have much faith in Intel as a supplier. And while AMD has made some astonishing gains in speed and power draw, raw performance wasn't the main thing Apple was looking for.

Now, Apple is transitioning to its own ARM processors. They draw less power and deliver better performance per watt than Intel's chips, they are tuned for common user workloads as well as some power-user workloads, and they are thermally better than Intel's chips. Best of all, they have a future. By using the M1, Apple is able to decrease software development costs on macOS: once all Apple products share the same CPU architecture, Darwin development on x86 can cease, and iPhone and iPad apps can run on the Macintosh as well. So the power and performance metrics are there, the software advantages are there, the vertical integration is there, and a splashy headline is also there.

As usual, people on all sides find things to hate and things to like, but I personally believe that Apple had little choice. Its main chip supplier, which also happens to be the largest supplier of desktop-class CPUs, has serious yield issues and had serious security issues. Apple needed to do something. I am eagerly awaiting real-world reports of the M1's performance and macOS Big Sur's feature set on the platform.

Apple's general position in the market is also changing. macOS has been phoning home a lot, which has upset people. Apple's software for professional users was nerfed some time back, largely becoming slightly better versions of its consumer software products. Adding to this is a general decline in Apple's software and hardware quality, which I personally attribute to the overly rapid pace of development and release. Yet Apple's competitors are still often worse than Apple, and Apple is thereby still able to claim the throne of most-loved tech company. We can all certainly hope that the quality of macOS and Windows will improve, and we can even hope that the build quality of Macintosh and PC hardware will improve... but the M1 could be the first big step in that direction.


© MMIX - MMXX, absurd.wtf
Licentiam Absurdum