Nvidia Seems To Have Used Tile-Based Rendering For Maxwell's Huge Generational Leap
Nvidia’s previous-generation GPU architecture, Maxwell, was amazing technology at work, and there is no denying that. Maxwell GPUs offered a huge performance-per-watt increase over the previous generation while remaining on the same 28nm process node as Kepler.
Despite staying on the same node, jumping to Maxwell felt like a proper generational leap rather than a Kepler refresh. Nvidia has previously shared some details on how it achieved that feat: a significantly restructured SM design and partitioning that wasted less power on interconnects, a larger L2 cache, and some low-level transistor optimizations.
The company has, however, never gone into full detail on all of those changes, presumably to keep its secrets intact.
Well, thanks to some digging by David Kanter of Real World Tech, with coverage over at AnandTech, that secret might no longer be a secret. Kanter went digging around the Maxwell and Pascal architectures and discovered convincing evidence that Nvidia implemented a tile-based rendering system with Maxwell.
Tile-based rendering has been in use in mobile GPUs for quite some time now, with both Imagination's PowerVR and ARM's Mali using it. That makes sense, since Maxwell was, after all, Nvidia's first mobile-first architecture.
The significance of tiling is that, by splitting a scene up into tiles, the GPU can rasterize the scene piece by piece almost entirely on die, as opposed to the more memory-intensive (and power-intensive) process of rasterizing the entire frame at once via immediate-mode rendering.
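The idea can be sketched in a few lines of code. This is a toy software rasterizer, not Nvidia's implementation; the tile size, resolution, and helper names are all illustrative. The point is structural: every triangle touching a tile is rasterized into a small tile-sized buffer (standing in for on-die storage), and external memory is written exactly once per pixel when the finished tile is flushed.

```python
# Toy tile-based rasterizer. All names and sizes are illustrative,
# not a description of Nvidia's actual hardware.

TILE = 4             # tile edge in pixels (real GPUs use larger tiles)
WIDTH, HEIGHT = 8, 8

def edge(a, b, p):
    """Signed-area edge test: >= 0 when p is left of edge a->b."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def covers(tri, x, y):
    """Does triangle tri (counter-clockwise vertices) cover pixel (x, y)?"""
    a, b, c = tri
    p = (x + 0.5, y + 0.5)  # sample at the pixel center
    return edge(a, b, p) >= 0 and edge(b, c, p) >= 0 and edge(c, a, p) >= 0

def render_tiled(triangles):
    framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]
    writes_to_memory = 0
    for ty in range(0, HEIGHT, TILE):
        for tx in range(0, WIDTH, TILE):
            # Rasterize every triangle touching this tile into a small
            # buffer that stands in for on-die storage.
            tile_buf = [[0] * TILE for _ in range(TILE)]
            for tri in triangles:
                for y in range(TILE):
                    for x in range(TILE):
                        if covers(tri, tx + x, ty + y):
                            tile_buf[y][x] = 1
            # The frame buffer in external memory is written once per
            # tile, regardless of how many triangles overlapped it.
            for y in range(TILE):
                for x in range(TILE):
                    framebuffer[ty + y][tx + x] = tile_buf[y][x]
                    writes_to_memory += 1
    return framebuffer, writes_to_memory
```

With immediate-mode rendering, heavily overlapping triangles would each write their covered pixels out to memory; here the memory traffic stays fixed at one write per pixel of the frame, which is where the bandwidth and power savings come from.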
By playing around with some DirectX code specifically designed to look at triangle rasterization, Kanter came up with solid evidence that Nvidia's handling of triangles has significantly changed since Kepler, and that its current method of triangle handling is consistent with a tile-based renderer.
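The observable signal behind this kind of test can be illustrated with a simple model. The sketch below is hypothetical, not Kanter's actual DirectX code: it uses axis-aligned rectangles as stand-ins for triangle coverage and compares the order in which pixels get filled. An immediate-mode rasterizer finishes one primitive before starting the next; a tile-based one interleaves primitives tile by tile, which is exactly the kind of fingerprint a rasterization-order probe can detect.

```python
# Model of the fill-order fingerprint a rasterization probe can detect.
# Geometry and names are illustrative, not Kanter's actual test.

WIDTH, HEIGHT, TILE = 8, 4, 4

def pixels_of(rect):
    """Pixels covered by an axis-aligned rect (x0, y0, x1, y1),
    a stand-in for a triangle's coverage."""
    x0, y0, x1, y1 = rect
    return {(x, y) for y in range(y0, y1) for x in range(x0, x1)}

def immediate_order(prims):
    """Immediate mode: each primitive is rasterized completely
    before the next one starts."""
    order = []
    for i, prim in enumerate(prims):
        for _ in sorted(pixels_of(prim)):
            order.append(i)
    return order

def tiled_order(prims):
    """Tiled: all primitives touching a tile are rasterized
    together before moving to the next tile."""
    order = []
    for tx in range(0, WIDTH, TILE):
        tile = {(x, y) for y in range(HEIGHT) for x in range(tx, tx + TILE)}
        for i, prim in enumerate(prims):
            for _ in sorted(pixels_of(prim) & tile):
                order.append(i)
    return order
```

Feeding both functions two stacked rectangles shows the difference: immediate mode emits every pixel of primitive 0 before any pixel of primitive 1, while the tiled order revisits primitive 0 in the second tile after primitive 1 has already started, the telltale pattern of a tile-based renderer.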
Real World Tech also believes that, right now, Nvidia is the only PC GPU manufacturer to use tile-based rasterization, which could account for some of its efficiency advantage over Intel and AMD GPUs. The full detailed article can be read at Real World Tech.