This year’s match was personal.

“You call that parallelism?” DX12 laughed. He split the draw calls across eight threads in one breath. The scene assembled twice as fast. The crowd oohed. DX11’s frame rate dipped, then steadied.

DX11 pulled from his bag of tricks: mature drivers. Every AMD, NVIDIA, and Intel GPU knew his language. He slid through the scene like a hot knife through butter. No surprises. No glory. But no tears.

Exhausted, both APIs entered the final phase: rendering a 4K ultra-wide scene with 16x anisotropic filtering and dynamic global illumination.

DX12, eager to show off, executed every effect at full quality. He multi-threaded the glass, compute-shaded the fire, and async-computed the dust. For three seconds, he hit 144fps. The crowd cheered.

The teapot screamed.

The screen flickered.

DX11 handled it with grace. He paused a few shadow maps, lowered the LOD on distant debris, and kept the frame rate at a cinematic 45fps. No one complained.

Later, in the dimly lit shader cache, DX12 sat on a bench, his frame buffer cracked. DX11 walked over, leaned against a rasterizer, and handed him a bottle of VSync.

“You’ve got power, kid. More than me. But power without predictability is just a particle effect waiting to explode.”

DX12 looked up. “Then why do they keep trying to replace you?”

DX11 laughed, a low, draw-call rumble. “They don’t want to replace me. They want you to become me. Reliable. Predictable. But… you’ll get there. After a few more driver updates. And fewer teapots.”

Outside, the developers were already arguing about Vulkan. Inside, for one brief, perfectly synchronized moment, DX11 and DX12 rendered the same sunset. It was beautiful.

And somewhere, the teapot finally landed right-side up.