
Add Blender Cycles, using Principled BSDF 2.0, to the render fidelity test #4482

Open
bhouston opened this issue Sep 28, 2023 · 16 comments

@bhouston
Contributor

bhouston commented Sep 28, 2023

Description

Add Blender Cycles, using its latest Principled BSDF 2.0, to the Render Fidelity Test page. This will allow us to compare glTF PBR rendering in Blender Cycles with all of the other renderers currently featured on the fidelity page.

My recommendation on implementing this is to use the Blender Python interface, which allows you to drive Blender from the command line via Python. Set up the various backgrounds using the assets provided by model-viewer, import the various glTF assets with the Blender Khronos glTF importer, and then render the images using Python. Thus the test suite would consist of one or more Python scripts.
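As a sketch of how the entry point might look: Blender forwards any command-line arguments after `--` to the script it runs, so the per-scenario JSON could be passed that way. The script name and scenario fields below are hypothetical; the actual invocation contract is described in #4483.

```python
import json

def parse_scenario(argv):
    # Blender passes everything after "--" through to the Python script,
    # so the scenario JSON can travel there. (Sketch only; the real
    # argument contract is defined by render-fidelity-tools.)
    idx = argv.index("--")
    return json.loads(argv[idx + 1])

# Hypothetical invocation:
#   blender --background --python render.py -- '{"name": "khronos-SheenChair", ...}'
```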

This can be used to guide the development of the Khronos glTF plugin, as well as Blender itself if there are internal limitations in Blender's Principled BSDF 2.0 implementation.

I am willing to fund this on behalf of Threekit.

Live Demo

No live demo currently.

Version

No relevant version.

Browser Affected

Not relevant.

OS

Not relevant.

AR

Not relevant.

@vis-prime
Contributor

I can script this

@bhouston
Contributor Author

BTW, there is some discussion here on how to integrate these non-browser-hosted renderers into the test suite: #4483 (comment)

@hybridherbst
Contributor

Maybe it’s worth adding Eevee as well?

@bhouston
Contributor Author

Maybe it’s worth adding Eevee as well?

Sounds like a good idea, but let's do Cycles first. :). We can do Eevee as a separate GitHub issue after, and it will probably be easy to do.

@vis-prime
Contributor

vis-prime commented Sep 29, 2023

Yup, it'll be easy to switch.

One big difference is that Cycles does not support "backface culling" directly (it could be done by manually adding some extra shader nodes).
The rest of the properties should work in both.

2023-09-29.20-50-58.mp4

@bhouston
Contributor Author

Here are instructions for adding a command-line renderer to the fidelity test suite: #4483 (comment)

@vis-prime
Contributor

vis-prime commented Sep 29, 2023

Will refer to #4487 on how to do the PR.

For now: I'm done with loading the glb, the camera/target coordinates, and the HDRI lighting; will test more configs.

cycles

@vis-prime
Contributor

vis-prime commented Sep 30, 2023

more tests

  • blender 4.0.0 Alpha
  • 20 second time limit + OpenImage Denoise
  • the new default "AgX" colorspace

(resolution & bg transparency mismatch in some due to missing values in config.json)
compare

cycles 2

cycles

@vis-prime
Contributor

vis-prime commented Oct 1, 2023

Hi @jasondavies,
what config file did you use to get the golden outputs?

@jasondavies
Contributor

Hey! So you'll need to follow the instructions in the render-fidelity-tools README.

The key bit is that you can use npm run update-screenshots to update the "golden" screenshots. It reads the scenarios from test/config.json. To add a new renderer, you'll want to add it to config.json. For "offline" renderers, such as Blender, you can add a "command" property to call a command line script (such as a Python script) which will be called with the full JSON (including defaults) once for each scenario.

Note that you can simply run npm run update-screenshots [optional myRendererName] [optional scenarioName], where the additional arguments are interpreted as whitelisted renderers and/or scenarios to avoid re-generating the "golden" screenshots for other renderers.
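An offline-renderer entry in `test/config.json` might then look something like the fragment below. Every field shape here is a guess for illustration only; the authoritative schema for the `"command"` property is in #4483 (comment).

```json
{
  "renderers": [
    {
      "name": "blender-cycles",
      "description": "Blender Cycles (Principled BSDF 2.0)",
      "command": "scripts/blender-cycles/render.sh"
    }
  ]
}
```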

@jasondavies
Contributor

BTW, the external renderer config is explained in more detail in #4483 (comment).

@bhouston
Contributor Author

bhouston commented Oct 2, 2023

@vis-prime great work! BTW, as with the V-Ray test, we may want to render in linear space, save as EXR, and do the tone mapping afterwards in Python code. The reason is that the three.js tone mapping is very specific, and any changes or improvements Blender makes may skew the comparisons significantly. Definitely do not use AgX: while it is awesome, it will get in the way of matching the material correctly. Once we know the material is correct, we can do all the renders we want in any color space we want, but we need to ensure the materials match first.
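For reference, three.js's `ACESFilmicToneMapping` is Stephen Hill's fitted ACES approximation. Re-applying it to the linear EXR pixels in Python could look roughly like this (a sketch assuming numpy; note that sRGB encoding is a separate step that three.js applies after tone mapping):

```python
import numpy as np

# Matrices from three.js's ACESFilmicToneMapping shader (Stephen Hill's fit),
# transcribed row-major from the column-major GLSL mat3 constructors.
ACES_INPUT = np.array([
    [0.59719, 0.35458, 0.04823],
    [0.07600, 0.90834, 0.01566],
    [0.02840, 0.13383, 0.83777],
])
ACES_OUTPUT = np.array([
    [ 1.60475, -0.53108, -0.07367],
    [-0.10208,  1.10813, -0.00605],
    [-0.00327, -0.07276,  1.07602],
])

def rrt_odt_fit(v):
    # Fitted RRT + ODT curve applied per channel.
    a = v * (v + 0.0245786) - 0.000090537
    b = v * (0.983729 * v + 0.4329510) + 0.238081
    return a / b

def aces_filmic(rgb, exposure=1.0):
    """Map linear rgb values of shape (..., 3) to display-referred [0, 1],
    following three.js's ACESFilmicToneMapping (which divides by 0.6)."""
    c = np.asarray(rgb, dtype=np.float64) * (exposure / 0.6)
    c = c @ ACES_INPUT.T
    c = rrt_odt_fit(c)
    c = c @ ACES_OUTPUT.T
    return np.clip(c, 0.0, 1.0)
```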

@vis-prime
Contributor

vis-prime commented Oct 2, 2023

Ahhh... got it!
Will work on the PR itself with the correct writing pattern, to test the npm run update-screenshots stuff

and

will save a standard-colorspace 32-bit EXR (ZIP codec), then do the ACES pixel math


Tested all 87 configs to confirm the camera matches up correctly... looks good. (I missed a FoV-to-radians conversion earlier, which messed up a few configs; now all are correct.)
(cycles image | model-viewer image)
1
2
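The camera set-up mentioned above (orbit coordinates plus the degrees-to-radians FoV conversion) follows three.js's spherical convention; a sketch in plain Python, where the `theta`/`phi`/`radius`/`target` field names from config.json are assumed:

```python
import math

def orbit_to_position(theta, phi, radius, target=(0.0, 0.0, 0.0)):
    """Convert a spherical camera orbit to a Cartesian camera position,
    using the three.js convention: phi is the polar angle measured from
    +Y, theta the azimuth around Y measured from +Z."""
    x = target[0] + radius * math.sin(phi) * math.sin(theta)
    y = target[1] + radius * math.cos(phi)
    z = target[2] + radius * math.sin(phi) * math.cos(theta)
    return (x, y, z)

def vertical_fov_radians(fov_deg):
    # The conversion that was initially missed: the config's vertical FoV
    # is in degrees, while Blender's camera angles are in radians.
    return math.radians(fov_deg)
```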

@vis-prime
Contributor

vis-prime commented Oct 2, 2023

Results from converting the temp EXR file to LDR ACES.

Using the three-gpu-pathtracer goldens as reference, the colors look correct.

saw0001-0050.mp4
buggy0001-0050.mp4
att0001-0100.mp4

@bhouston
Contributor Author

bhouston commented Oct 2, 2023

Beautiful! Where the render results do not agree after ACES, such as with the attenuation dragon, the issue is material properties being interpreted differently in Blender compared to three-gpu-pathtracer. This is exactly what we wanted to identify! It is amazing!

This is pushing forward the fidelity tests so far ahead! Thank you!

@elalish
Collaborator

elalish commented Oct 2, 2023

Excellent work! I look forward to seeing a PR. And @bhouston, I see why you're pushing for a new repo for this - if we're going to grow it in a serious way, that really does make sense.
