Commit 5c5090e

Update 2025-04-04-neural-gfx-in-an-afternoon.md (#81)
Add this post back to "featured", and also add links to the next in the series.
1 parent 54b54e0 commit 5c5090e

_posts/2025-04-04-neural-gfx-in-an-afternoon.md

Lines changed: 4 additions & 2 deletions
@@ -2,13 +2,15 @@
 layout: post
 title: "Neural Graphics in an Afternoon"
 date: 2025-04-04 17:00:00
-categories: [ "blog" ]
+categories: [ "blog", "featured" ]
 tags: [slang]
 author: "Shannon Woods, NVIDIA, Slang Working Group Chair"
 image: /images/posts/2025-04-04-splatterjeep.webp
 human_date: "April 4, 2025"
 ---

+(For the next article in this series, click [here](https://shader-slang.org/blog/2025/04/30/neural-graphics-first-principles-performance/).)
+
 The intersection of computer graphics and machine learning is creating exciting new possibilities, from scene reconstruction with NeRFs and Gaussian splats to learning complex material properties. But getting started with neural graphics can seem daunting. Between understanding graphics APIs, shader programming, and automatic differentiation, there’s a lot to learn. That’s why the Slang team is introducing [SlangPy](https://slangpy.shader-slang.org/en/latest/), a new Python package that makes it dramatically easier to build neural graphics applications with Slang. With just a few lines of Python code, you can now:

 - Seamlessly call Slang functions on the GPU from Python
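
As a rough illustration of that first bullet (calling a Slang function on the GPU from Python), here is a minimal sketch in the style of the SlangPy quick-start. The device-creation and module-loading calls, the `example.slang` file, and the `add` function are placeholder assumptions and may differ from the current SlangPy API.

```python
import numpy as np
import slangpy as spy

# Illustrative sketch only: the call names follow the SlangPy "first function"
# tutorial and may differ in the installed release; "example.slang" and `add`
# are placeholder names, not files from this post.
device = spy.create_device()                                 # create a GPU device
module = spy.Module.load_from_file(device, "example.slang")  # load/compile a Slang module

# Call a Slang function such as `float add(float a, float b)` directly from Python.
print(module.add(1.0, 2.0))

# The same call broadcasts over arrays, running element-wise on the GPU.
a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)
result = module.add(a, b, _result="numpy")
print(result.shape)
```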
@@ -268,7 +270,7 @@ void perPixelLoss(GradInOutTensor<float4, 2> output,

 You can see in this code block that `simpleSplatBlobs()` is doing most of the work: iterating over our entire list of Gaussian blobs, and accumulating their contributions to the color of the pixel we are currently calculating. Keep in mind that `perPixelLoss()` is going to be invoked once for each pixel in the output image, so the function is figuring out the loss value for just a single pixel.

-You might wonder if iterating over our entire list of Gaussians for each pixel in the image might be slow. It is. There are some clever things that we can do to speed up this calculation considerably, which I’ll cover in a follow-up blog post, but for now, let’s just focus on the simple– but slow– version.
+You might wonder if iterating over our entire list of Gaussians for each pixel in the image might be slow. It is. There are some clever things that we can do to speed up this calculation considerably, which I’ll cover in a [follow-up blog post](https://shader-slang.org/blog/2025/04/30/neural-graphics-first-principles-performance/), but for now, let’s just focus on the simple– but slow– version.

 This set of functions is responsible for calculating all of the output pixels, as well as the difference between those values and our ideal target image, so they’re invoked not just for propagating loss derivatives (the `module.perPixelLoss.bwds` call we made in Python), but also during the rendering of our output texture, via `renderBlobsToTexture`, which looks like this:

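To make the cost described in that hunk concrete, here is a small, self-contained NumPy sketch of the brute-force approach (illustrative only, not the post's Slang code): every pixel visits every Gaussian blob, so the total work scales with pixels × blobs. The blob parameterization and additive blending below are simplified assumptions.

```python
import numpy as np

# Toy CPU version of the brute-force splatting loss: for each pixel, loop over
# every blob, accumulate its contribution, then take squared error against the
# target image. Blob layout and blending are simplified assumptions here.
H, W, N = 64, 64, 200                       # image size and blob count
rng = np.random.default_rng(0)
centers = rng.uniform(0, 1, size=(N, 2))    # blob centers in [0, 1]^2
sigmas  = rng.uniform(0.01, 0.05, size=N)   # isotropic blob widths
colors  = rng.uniform(0, 1, size=(N, 3))    # RGB color per blob
target  = rng.uniform(0, 1, size=(H, W, 3)) # stand-in for the target image

def render_pixel(u, v):
    # Accumulate every blob's contribution at pixel (u, v): O(N) work per pixel.
    d2 = (centers[:, 0] - u) ** 2 + (centers[:, 1] - v) ** 2
    weights = np.exp(-0.5 * d2 / sigmas ** 2)
    return np.clip(weights @ colors, 0.0, 1.0)

loss = 0.0
for y in range(H):
    for x in range(W):
        pixel = render_pixel((x + 0.5) / W, (y + 0.5) / H)
        loss += np.sum((pixel - target[y, x]) ** 2)

print("mean per-pixel loss:", loss / (H * W))  # O(H * W * N) work overall
```

Even at this toy size the nested loops dominate the runtime, which is exactly the cost the linked follow-up post focuses on reducing.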