Better Graphics, Cheaper Cards: Intel’s Latest GPU Research

Looking to cement its GPU cred, Intel shared several new papers on neural graphics and generative AI ahead of SIGGRAPH 2023.

This week, Intel detailed its latest advances in computer graphics with seven research papers and an upcoming course on photorealistic rendering, generative AI and neural graphics. Intel shared four of the papers at the High Performance Graphics (HPG) 2023 conference and the Eurographics Symposium on Rendering (EGSR) 2023, which are being co-hosted between June 28 and 30 in Delft, the Netherlands. Intel will present the course and three additional papers at SIGGRAPH 2023, the computer graphics conference to be held between August 6 and 10 in Los Angeles.

“The new building blocks presented at this year’s conferences, along with our wide offering of GPU products and scalable cross-architecture rendering stack, will help developers and businesses to do more efficient rendering of digital twins, future immersive AR and VR experiences, as well as synthetic data for sim2real AI training,” states an Intel blog post discussing the research.

The benefits of improving path tracing

The four papers presented at EGSR and HPG cover techniques for refining path tracing, also known as photorealistic rendering. Making ray tracing, shading and sampling more efficient brings path tracing to a wider range of affordable GPUs and moves it closer to real-time performance on integrated GPUs.

In path tracing, photons are simulated much as they behave in reality: they travel along straight lines, traced as rays, scattering off objects and continuing on to their next interaction. When a photon hits a surface, that surface is shaded, and the photon’s scattering off the surface is simulated using a statistical distribution model defined by the surface’s material.
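The bounce-and-attenuate loop at the heart of this process can be illustrated with a toy Monte Carlo estimator. In this hedged sketch (not Intel’s code; the one-bounce-probability scene, the escape probability and the albedo are invented purely for illustration), each path either escapes to a bright sky or scatters off a diffuse surface that attenuates its throughput:

```python
import random

def trace_path(p_escape=0.5, albedo=0.7, max_bounces=64, rng=random):
    """Toy path: at each step the photon either escapes to a sky
    emitting radiance 1.0, or scatters off a diffuse surface that
    attenuates the path throughput by its albedo."""
    throughput = 1.0
    for _ in range(max_bounces):
        if rng.random() < p_escape:
            return throughput * 1.0   # reached the light (sky emission)
        throughput *= albedo          # partial absorption, keep scattering
    return 0.0                        # path terminated before finding light

def render_pixel(samples=100_000, seed=42):
    """Average many path samples, as a renderer does per pixel."""
    rng = random.Random(seed)
    return sum(trace_path(rng=rng) for _ in range(samples)) / samples
```

With these toy parameters the estimate converges to the analytic value p / (1 − (1 − p)·a) ≈ 0.77; a real path tracer replaces the scalar scene with ray-geometry intersection and material models, but the averaging structure is the same.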

One of the papers on this topic covers how Intel simulated the appearance of a variety of glittery surfaces. The method runs in real time with stable performance on desktop PCs. The researchers accomplished this by deriving a statistical law for the average number of glints visible within a pixel of a virtual camera; this law underpins a new data structure that bounds the computation.
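Counting glints per pixel can be modeled with simple discrete statistics. As a loose illustration only (the binomial model and every parameter below are assumptions for exposition, not the paper’s actual law or its acceleration structure), one can draw the number of microfacets in a pixel footprint that happen to flash toward the camera:

```python
import random

def glint_count(n_microfacets, p_reflect, rng):
    """Draw a glint count for one pixel footprint from a binomial law:
    each of n microfacets independently reflects toward the camera
    with probability p. (Illustrative assumption, not Intel's method.)"""
    return sum(1 for _ in range(n_microfacets) if rng.random() < p_reflect)
```

The expected count is simply n·p, which is why a closed-form law for the average number of visible glints lets a renderer bound how much per-pixel work is worth doing.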

Simulations of different glittery surfaces in a video game. (Source: Intel.)

The other papers presented at EGSR and HPG cover:

  • sampling the visible normals of rough, metallic microfacet surfaces (the GGX distribution) using spherical caps
  • real-time ray tracing of micro-poly geometry with hierarchical level of detail
  • stochastic subsets for bounding volume hierarchy (BVH) construction
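The first item, visible-normal sampling with spherical caps, can be sketched as follows. This is an illustrative reimplementation based on the publicly described spherical-cap construction, not Intel’s reference code; it assumes a local shading frame where z is the geometric normal and takes an isotropic roughness:

```python
import math

def sample_vndf_ggx(u1, u2, wi, alpha):
    """Sample a visible GGX microfacet normal via a spherical cap.
    u1, u2 are uniform random numbers in [0, 1); wi is the view
    direction in the local frame (z = surface normal); alpha is the
    GGX roughness. Sketch only, after the spherical-cap construction."""
    # warp the view direction into the standard (alpha = 1) configuration
    wx, wy, wz = wi[0] * alpha, wi[1] * alpha, wi[2]
    inv = 1.0 / math.sqrt(wx * wx + wy * wy + wz * wz)
    wx, wy, wz = wx * inv, wy * inv, wz * inv
    # uniformly sample the spherical cap of directions with z in (-wz, 1]
    phi = 2.0 * math.pi * u1
    z = (1.0 - u2) * (1.0 + wz) - wz
    s = math.sqrt(max(0.0, 1.0 - z * z))
    cx, cy, cz = s * math.cos(phi), s * math.sin(phi), z
    # halfway vector in the standard configuration, then unwarp back
    hx, hy, hz = cx + wx, cy + wy, cz + wz
    nx, ny, nz = hx * alpha, hy * alpha, hz
    inv = 1.0 / math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx * inv, ny * inv, nz * inv)
```

The appeal of the cap construction is that it replaces earlier projection-based visible-normal sampling with a few arithmetic operations and no branches, which suits GPU shaders.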

Intel will also cover these efficiencies in the talk “Path Tracing a Trillion Triangles” at SIGGRAPH. Intel will show that, with efficient algorithms, real-time path tracing can run on less powerful GPUs; in the future, it may even be possible on mid-range and integrated GPUs. Intel will release the cross-vendor framework as an open-source sample and testbed.

Democratizing generative AI research

The first AI-related paper that Intel will present at SIGGRAPH covers how diffusion models can be made more accessible. Intel’s researchers will show how they developed an equivalent diffusion model using the concepts of simple mixing and de-mixing of images. Currently, specializing diffusion models still requires use of the cloud and multi-GPU hardware. Simpler models could help engineers find less time-consuming ways to train and use diffusion models. 
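The mixing and de-mixing idea can be illustrated with a minimal one-dimensional sketch. Here `drift` stands in for a trained network that predicts how to move a blended sample back toward the data distribution; the oracle used below is a placeholder assumption for testing, not Intel’s model:

```python
def blend(x0, x1, alpha):
    """Mix a 'noise' sample x0 with a 'data' sample x1."""
    return (1.0 - alpha) * x0 + alpha * x1

def deblend(x0, drift, steps=100):
    """De-mix: start from noise and iteratively follow the predicted
    drift. In a real model, drift(x, alpha) is a trained network."""
    x, da = x0, 1.0 / steps
    for k in range(steps):
        x += da * drift(x, k * da)
    return x

# toy check: an oracle drift that knows the target recovers it exactly
x0, x1 = -1.3, 2.0
oracle = lambda x, a: x1 - x0
```

The attraction of this formulation is its simplicity: blending is one linear interpolation, and sampling is a short deterministic loop, which is easier to train and reason about than a full stochastic diffusion process.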

Another AI-related paper that Intel plans to present at SIGGRAPH covers neural tools for real-time path tracing. Neural tools, which combine AI with graphics, provide a way to improve visual quality in games, movies, synthetic data and digital twins. The neural level-of-detail representation achieves 70 to 95% compression compared to “vanilla” path tracing, while offering interactive to real-time performance. Neural tools make it possible for images to be clearer and more realistic, even on low-power GPUs.

Intel shared in the blog post that it intends to present other progress in games and professional graphics later this summer.