To repro this, take any D2D shader that specifies `[D2DInputComplex(n)]` and replace it with `[D2DInputSimple(n)]`.

Two weird things happen:

1. No compile-time error, which I was expecting `d2d1effecthelpers.hlsli` to force, e.g. by mapping `D2DSampleInputAtOffset(n)` to some invalid syntax or the equivalent of a `#pragma error`.
2. The shader works fine?! What's the actual difference here, then? I wonder if it would confuse/break D2D if it tried to use the shader as a linked shader.
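For reference, here is a minimal sketch of the repro in raw HLSL, assuming the standard `d2d1effecthelpers.hlsli` conventions; the 3x3 averaging body is hypothetical, just something that clearly samples outside the current pixel:

```hlsl
#define D2D_INPUT_COUNT 1
#define D2D_INPUT0_SIMPLE    // declared simple, yet the body reads at offsets
// #define D2D_INPUT0_COMPLEX // what this shader actually requires
#include "d2d1effecthelpers.hlsli"

D2D_PS_ENTRY(Execute)
{
    float4 sum = 0;

    // D2DSampleInputAtOffset reads outside the current pixel, which should
    // only be valid for a complex input, yet this compiles without error.
    for (int y = -1; y <= 1; y++)
    {
        for (int x = -1; x <= 1; x++)
        {
            sum += D2DSampleInputAtOffset(0, float2(x, y));
        }
    }

    return sum / 9;
}
```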
My proposal is that ComputeSharp.D2D1 should detect this and emit an error.
rickbrew changed the title from "No error when using [D2DInputSimple(n)] but reading beyond current pixel" to "No error when using [D2DInputSimple(n)] but reading outside of current pixel" on Mar 2, 2024.
I only noticed this recently when someone on the PDN forum posted a shader that was using `D2DInputSimple` even though they were doing complex sampling: https://forums.getpaint.net/topic/124551-gpu-median-filter/