
Proof of concept: Use Shader Graph for point cloud rendering #265

Draft · wants to merge 2 commits into main

Conversation

@kring (Member) commented Mar 29, 2023

The point cloud rendering in #218 is awesome, but unfortunately it will only work with the Universal Render Pipeline. That's especially unfortunate because we've added support for the built-in pipeline in this release. I was curious what stopped us from implementing point cloud rendering using Shader Graph, which should make it much easier (perhaps even automatic) to support all the render pipelines. So this draft PR is a proof of concept of doing that, and I don't see any major barriers to it. Short version: it works.

URP: (screenshot)

HDRP: (screenshot)

It uses a custom function node to read from the structured buffer, so no increase in memory usage. In fact, it should be pretty easy to use this same approach in the non-attenuated case, too, so we only need one copy of the points.
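In HLSL, that custom function might look roughly like this. This is a sketch: the struct layout and buffer name are assumptions; only the function name `CesiumReadPointCloudData_float` and the divide-by-6 index math appear later in this conversation.

```hlsl
// Assumed per-point layout: position plus color, no normals (see the
// open question below about conditional compilation).
struct CesiumPointCloudPoint
{
    float3 position;
    float4 color;
};

StructuredBuffer<CesiumPointCloudPoint> _inPoints;

// Shader Graph invokes the _float-suffixed variant for float precision.
// vertexIndex comes from a Vertex ID node wired into the custom function.
void CesiumReadPointCloudData_float(float vertexIndex, out float3 position, out float4 color)
{
    // Each point is expanded into a 6-vertex (two-triangle) billboard.
    uint pointIndex = (uint)vertexIndex / 6;
    position = _inPoints[pointIndex].position;
    color = _inPoints[pointIndex].color;
}
```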

Because the output vertex position in shader graph is always in "object space", we need to transform the clip space position back to object space, only so Unity can go the other way again. I think this is the biggest downside to this approach. But GPUs basically do matrix * vector multiplications in their sleep, right? I haven't measured performance to be sure, but I think this is a worthwhile tradeoff for the compatibility that shader graph gives us.
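Concretely, the round trip might be sketched like this (assumed code, not from the branch; `UNITY_MATRIX_I_VP` and `UNITY_MATRIX_I_M` are Unity's built-in inverse view-projection and inverse model matrices):

```hlsl
// clipPos: the attenuated, billboarded corner computed in clip space.
// Returns the equivalent object-space position, which Shader Graph
// then transforms right back to clip space through its normal path.
float3 CesiumClipToObjectSpace(float4 clipPos)
{
    float4 worldPos = mul(UNITY_MATRIX_I_VP, clipPos);
    worldPos /= worldPos.w; // undo the perspective term
    return mul(UNITY_MATRIX_I_M, worldPos).xyz;
}
```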

It should also be possible to do nifty stuff like drape raster overlays over point clouds.

I'm currently using an "empty" Mesh to drive the rendering. It doesn't have any vertex data, but it does have index data because meshes require it. So this is pretty inefficient, but using a Mesh rather than DrawProcedural makes Unity set the model matrix and maybe other uniforms in the normal way that shader graph expects. We can definitely make this more efficient, I was just being lazy.
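A minimal sketch of building such an index-only mesh follows; the class and function names are made up, and using `SetIndexBufferData` with `DontValidateIndices` is one way to keep Unity from complaining that the indices reference vertices that don't exist:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public static class PointCloudMeshFactory
{
    // Creates a Mesh with no vertex attributes, only an index buffer:
    // six indices (two triangles) per point. The vertex shader derives
    // everything from SV_VertexID, so the index values are just 0..N-1.
    public static Mesh CreateIndexOnlyMesh(int pointCount)
    {
        int indexCount = pointCount * 6;
        var indices = new int[indexCount];
        for (int i = 0; i < indexCount; i++)
        {
            indices[i] = i;
        }

        var mesh = new Mesh();
        mesh.SetIndexBufferParams(indexCount, IndexFormat.UInt32);
        mesh.SetIndexBufferData(indices, 0, 0, indexCount,
            MeshUpdateFlags.DontValidateIndices);
        mesh.SetSubMesh(0,
            new SubMeshDescriptor(0, indexCount, MeshTopology.Triangles),
            MeshUpdateFlags.DontValidateIndices | MeshUpdateFlags.DontRecalculateBounds);
        // Bounds must be set manually so the mesh isn't frustum-culled;
        // the extent here is an arbitrary placeholder.
        mesh.bounds = new Bounds(Vector3.zero, Vector3.one * 1000000.0f);
        return mesh;
    }
}
```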

Stuff I haven't figured out (hopefully no deal killers here, but who knows?):

  • How to conditionally compile custom HLSL in a custom function node based on shader keywords. Right now it's just hard-coded that the buffer has colors but no normals.
  • Normals and Tangents. I've kind of ignored them and have the shader set to "unlit", because the lit shader doesn't work without vertex normals and tangents.
  • For rendering to work in HDRP, I had to force the tileset to render opaque. Something in the transparent branch in UpdateMaterial breaks it and causes it to render nothing. (actually, looking in RenderDoc, it renders all the right triangles, but every fragment of them is fully transparent, so nothing is visible on the screen)
  • The built-in render pipeline isn't working. Looking in RenderDoc, I can see the immediate problem is that the position coming out of the vertex shader is identical for every vertex, both within and across billboards. I don't know why, though.
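For the first item, Shader Graph Boolean keywords surface as ordinary preprocessor defines inside custom function HLSL once they're declared on the graph, so the conditional compilation could plausibly look like this (keyword names invented for illustration):

```hlsl
// HAS_POINT_COLORS / HAS_POINT_NORMALS would be declared as keywords
// on the Shader Graph asset; these names are hypothetical.
struct CesiumPointCloudPoint
{
    float3 position;
#if defined(HAS_POINT_COLORS)
    float4 color;
#endif
#if defined(HAS_POINT_NORMALS)
    float3 normal;
#endif
};
```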

@j9liu I'm curious what you think. I guess the first question is whether I've missed any major deal killers, such that this isn't as practical as it appears. And the second question (assuming the answer to the first is no) is whether we want to a) ship the current implementation in this release and perhaps switch to shader graph next release, b) hold shipping attenuation until we sort this out, or c) try to wrap up this approach this week and get it into this release.

@j9liu (Contributor) commented Mar 29, 2023

@kring this is the one method I forgot to test with Shader Graph 😭 I'm glad you were able to get something working!

I'm not sure how many of those items you'll be able to knock out, so while it would be nice to get it into this next release, it depends on how much time you'll be able to spend on the remaining items. But it would be nice to get some form of attenuation in the release for the upcoming blog post.

As for the task list you put together:

  1. I believe you have to explicitly add the keywords to the shader graph, as described here.
  2. We probably need to make two versions of the material, one lit and one unlit. This is fine; that's just how the shading models differ.
  3. I think that the keywords being set in the point material may only work for URP, so I'm not surprised. We can remove those keywords and just include this in our more concerted effort to implement translucency (re: Add support for 3D Tiles with translucency #186).
  4. Does vertexID work in built-in render pipeline? The only reason I could think of is that the vertexID attribute just isn't working, so it returns 0 for every vertex (and thus every vertex resolves to the same position).
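One quick way to isolate the vertexID question is a throwaway custom function that just visualizes the index as a gradient — essentially the experiment described below, with the same divisor. The function name here is made up:

```hlsl
// Debug node: map the vertex index to a grayscale gradient.
// A single flat color across all points would mean SV_VertexID is
// stuck at one value; a visible gradient means it works.
void CesiumDebugVertexID_float(float vertexIndex, out float3 color)
{
    color = (vertexIndex / 100000.0).xxx;
}
```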

Base automatically changed from point-cloud-shading to main March 30, 2023 04:13
@j9liu (Contributor) commented Mar 31, 2023

I messed around with the shader graph today and the vertex ID node does work in built-in. As proof, I removed the vertex deformations from the vertex shader and used this as the mesh color (the divisor is 100,000): (screenshot)

When I do Graphics.DrawMesh with the regular point mesh and the modified material, there's a clear gradient across the points: (screenshot)

Then I thought it had to do with retrieving the data from the StructuredBuffer. Hardcoding the position in CesiumReadPointCloudData_float to be (0, 0, 0), then applying the XY offset based on vertexID in world space, gave me a bunch of horizontal planes, which means the vertexID definitely works: (screenshot)

And it actually turned out that the data could be retrieved from the buffer if you hardcoded an integer. So when I add a uint cast, i.e. uint pointIndex = (uint)vertexIndex / 6;, it's able to retrieve the positions!

But enabling the rest of the code still makes the entire thing disappear, so there's still something else that's wrong. Have to find out where.

EDIT: I believe that UNITY_MATRIX_I_VP is the problem. The inverse matrices UNITY_MATRIX_I_M and UNITY_MATRIX_I_V work fine. But once you include the inverse view-projection matrix, the mesh won't show up again.
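If built-in really does leave UNITY_MATRIX_I_VP unset, one workaround would be to compute the inverse view-projection on the CPU and bind it as a regular material property. A sketch, where `_CesiumInverseViewProjection` and the class name are made up:

```csharp
using UnityEngine;

public static class InverseViewProjectionBinder
{
    // Built-in RP may not populate UNITY_MATRIX_I_VP, so upload it
    // each frame as an ordinary matrix uniform instead.
    public static void Bind(Camera camera, Material material)
    {
        // GetGPUProjectionMatrix applies platform quirks (flipped Y,
        // reversed Z) that camera.projectionMatrix does not include.
        Matrix4x4 gpuProjection =
            GL.GetGPUProjectionMatrix(camera.projectionMatrix, false);
        Matrix4x4 viewProjection = gpuProjection * camera.worldToCameraMatrix;
        material.SetMatrix("_CesiumInverseViewProjection", viewProjection.inverse);
    }
}
```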

@j9liu mentioned this pull request May 5, 2023