The Apple GPU and the impossible bug

13 May 2022

In late 2020, Apple debuted the M1 with Apple’s GPU architecture, AGX, rumoured to be derived from Imagination’s PowerVR series. Since then, we’ve been reverse-engineering AGX and building open source graphics drivers. Last January, I rendered a triangle with my own code, but a heinous bug has been lurking ever since:

The driver fails to render large amounts of geometry.

Spinning a cube is fine, low polygon geometry is okay, but detailed models won’t render. Instead, the GPU renders only part of the model and then faults.


[Figure: partially rendered bunny]

It’s hard to pinpoint how much we can render without faults. It’s not just the geometry complexity that matters. The same geometry can render with simple shaders but fault with complex ones.

That suggests rendering detailed geometry with a complex shader “takes too long”, and the GPU is timing out. Maybe it renders only the parts it finished in time.

Given the hardware architecture, this explanation is unlikely.


This hypothesis is easy to test, because we can control for timing with a shader that takes as long as we like:

    for (int i = 0; i < LOOP_COUNT; ++i) {
        /* spin */
    }
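Fleshing that idea out, a self-contained fragment shader for the experiment might look like the following sketch. This is an illustration, not the driver’s actual test shader; the `loop_count` uniform is a hypothetical knob for dialing the runtime up or down.

    #version 310 es
    precision highp float;

    /* Hypothetical knob: raise to make the shader take longer. Reading the
     * bound from a uniform stops the compiler from unrolling or deleting
     * the loop at compile time. */
    uniform int loop_count;

    out highp vec4 frag_color;

    void main() {
        float acc = 0.0;

        /* Busy-work loop whose cost scales linearly with loop_count. */
        for (int i = 0; i < loop_count; ++i) {
            acc += sin(float(i));
        }

        /* Fold the accumulator into the output so the loop's result is
         * observably used and cannot be optimized away. */
        frag_color = vec4(vec3(fract(acc)), 1.0);
    }

If the GPU really were timing out, cranking `loop_count` high enough should reproduce the fault even on a single triangle, with no complex geometry involved.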