---
layout: post
title: "Neural Graphics in an Afternoon"
date: 2025-04-04 17:00:00
categories: [ "blog", "featured" ]
tags: [slang]
author: "Shannon Woods, NVIDIA, Slang Working Group Chair"
image: /images/posts/2025-04-04-splatterjeep.webp
human_date: "April 4, 2025"
---

(For the next article in this series, click [here](https://shader-slang.org/blog/2025/04/30/neural-graphics-first-principles-performance/).)

The intersection of computer graphics and machine learning is creating exciting new possibilities, from scene reconstruction with NeRFs and Gaussian splats to learning complex material properties. But getting started with neural graphics can seem daunting. Between understanding graphics APIs, shader programming, and automatic differentiation, there’s a lot to learn. That’s why the Slang team is introducing [SlangPy](https://slangpy.shader-slang.org/en/latest/), a new Python package that makes it dramatically easier to build neural graphics applications with Slang. With just a few lines of Python code, you can now:

- Seamlessly call Slang functions on the GPU from Python
[... Slang listing for `perPixelLoss()` elided in this excerpt ...]

You can see in this code block that `simpleSplatBlobs()` is doing most of the work: iterating over our entire list of Gaussian blobs, and accumulating their contributions to the color of the pixel we are currently calculating. Keep in mind that `perPixelLoss()` is going to be invoked once for each pixel in the output image, so the function is figuring out the loss value for just a single pixel.
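
The listing itself isn’t reproduced in this excerpt, but the behavior described above is essentially a single accumulation loop. Below is a rough sketch of that idea only, not the post’s actual code: the `Blob` struct, the `accumulateBlobs` name, and its parameters are hypothetical, and the autodiff plumbing the real code uses (the `[Differentiable]` attribute and gradient tensor types such as `GradInOutTensor`) is omitted.

```slang
// Illustrative sketch, not the post's actual code. A 2D Gaussian "blob"
// with a center, a radius, and a color.
struct Blob
{
    float2 center;
    float  radius;
    float4 color;
};

// Accumulate every blob's contribution to a single pixel. The real
// simpleSplatBlobs() plays this role; names and blending are simplified here.
// Cost scales with (pixel count) x (blob count), because this loop runs once
// per output pixel.
float4 accumulateBlobs(StructuredBuffer<Blob> blobs, uint blobCount, float2 pixelUV)
{
    float4 color = float4(0.0, 0.0, 0.0, 1.0);
    for (uint i = 0; i < blobCount; i++)
    {
        // Gaussian falloff with distance from the blob center.
        float2 d = pixelUV - blobs[i].center;
        float sigma = blobs[i].radius;
        float weight = exp(-dot(d, d) / (2.0 * sigma * sigma));

        // Simple additive blend of the weighted blob color.
        color += weight * blobs[i].color;
    }
    return color;
}
```

A per-pixel loss then just compares the color this loop accumulates against the corresponding pixel of the target image.
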
You might wonder if iterating over our entire list of Gaussians for each pixel in the image is slow. It is. There are some clever things that we can do to speed up this calculation considerably, which I’ll cover in a [follow-up blog post](https://shader-slang.org/blog/2025/04/30/neural-graphics-first-principles-performance/), but for now, let’s just focus on the simple, but slow, version.
This set of functions is responsible for calculating all of the output pixels, as well as the difference between those values and our ideal target image, so they’re invoked not just for propagating loss derivatives (the `module.perPixelLoss.bwds` call we made in Python), but also during the rendering of our output texture, via `renderBlobsToTexture`, which looks like this:
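
The `renderBlobsToTexture` listing itself isn’t included in this excerpt. Going only by the description above (do the same splatting work, but write the color to the output texture instead of comparing it against the target), a minimal sketch might look like the following; it reuses the hypothetical `Blob` and `accumulateBlobs` definitions from the earlier sketch, and its parameter names are again illustrative rather than the post’s.

```slang
// Illustrative sketch, not the post's actual listing. Renders one pixel of
// the output texture by reusing the same blob-accumulation routine that the
// loss path uses (accumulateBlobs stands in for simpleSplatBlobs here).
void renderBlobsToTexture(
    RWTexture2D<float4> output,
    StructuredBuffer<Blob> blobs,
    uint blobCount,
    uint2 pixelCoord)
{
    // Work out this pixel's normalized coordinates from the texture size.
    uint width, height;
    output.GetDimensions(width, height);
    float2 pixelUV = (float2(pixelCoord) + 0.5) / float2(width, height);

    // No loss computation on this path: just write the accumulated color.
    output[pixelCoord] = accumulateBlobs(blobs, blobCount, pixelUV);
}
```

This rendering path only needs the forward evaluation; it is the loss path that also gets run backward via `module.perPixelLoss.bwds`.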