Unity Products:Amplify Shader Editor/Tips Tricks and Known Issues


Tips, Tricks and Known Issues

Tips and Tricks

Xbox UWP Limitations

Shader Model 5.0 is not supported and will render black; use Shader Model 3.0 instead.
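
In hand-written terms the fix amounts to lowering the compilation target; in ASE this corresponds to picking a lower Shader Model in the output node properties. In the generated shader it boils down to this pragma:

// Target Shader Model 3.0 instead of 5.0 so Xbox UWP renders correctly.
#pragma target 3.0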

LOD Fade Behavior difference between 2018 and 2019

We internally use the unity_LODFade.x value, a value between 0 and 1 that represents the fade amount.

On Unity 2018 and below this value is always positive across all LOD groups.
Take as an example a LOD group with two LODs: LOD 0 containing a sphere and LOD 1 containing a cube. When going from LOD 0 to LOD 1, the value on the sphere decreases while the value on the cube increases.
Both values complement each other, so if unity_LODFade.x is 0.42 on the sphere, it will be 0.58 on the cube.

On Unity 2019 this behavior changed. Taking the former example, the sphere keeps its 0.42 but the cube now has a negative value of -0.42. The sign now indicates whether the value represents a fade-out (positive) or a fade-in (negative).
To get the legacy behavior, check the sign and, if it is negative, calculate its positive complement (just add 1 to it).
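
A minimal HLSL sketch of that conversion, for use inside a Unity 2019+ shader where unity_LODFade is available:

// Recover the legacy, always-positive fade value on Unity 2019+.
float fade = unity_LODFade.x;
if ( fade < 0.0 )
    fade += 1.0; // negative values are fade-ins; adding 1 yields the positive complement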

Supported Texture Formats per platform

https://docs.unity3d.com/Manual/class-TextureImporterOverride.html

Depth Fetch on Mobile

If incorrect values are being fetched from Depth Texture related nodes on mobile, you may need to explicitly let Unity know that it must generate the depth texture. For that, create a simple script like the one below and place it on the Camera game object.

using UnityEngine;
[ExecuteInEditMode]
public class SetCameraDepth : MonoBehaviour
{
    private void Awake()
    {
        // Tell the camera to also generate a depth texture.
        GetComponent<Camera>().depthTextureMode |= DepthTextureMode.Depth;
    }
}
Baked Lightmaps with Alpha Clip

Unity has strict rules when baking lightmaps for objects that use alpha clip.
The texture property holding the alpha clip information must be called _MainTex, and the channel containing the clip values must be the alpha channel for the lightmapper to take it into account when doing the alpha clip. To meet this requirement you can do either of the following:

  • Rename the Texture Sample node to MainTex so ASE automatically creates a _MainTex property name
  • Go to the Texture Sample node properties, hit the locker icon right next to the Property Name field and manually set it to _MainTex
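
Either way, the generated shader ends up with a property declaration along these lines (a sketch; the inspector label is arbitrary):

Properties
{
    _MainTex( "Albedo (RGB) Clip (A)", 2D ) = "white" {}
}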
Baked Lightmaps on Lightweight

Besides the shader having the _EMISSION keyword enabled and a meta pass, the material must also be set to generate emissive light from baked lightmaps.

using UnityEngine;

[ExecuteInEditMode]
public class SetEmissionMaterialKeywords : MonoBehaviour
{
	public Material EmissiveMaterial;

	private void Update()
	{
		// Clear every other GI flag (e.g. EmissiveIsBlack) while keeping the emissive ones.
		EmissiveMaterial.globalIlluminationFlags &= MaterialGlobalIlluminationFlags.AnyEmissive;
		// Flag the material as contributing emission to both baked and realtime GI.
		EmissiveMaterial.globalIlluminationFlags |= MaterialGlobalIlluminationFlags.BakedEmissive;
		EmissiveMaterial.globalIlluminationFlags |= MaterialGlobalIlluminationFlags.RealtimeEmissive;
		EmissiveMaterial.EnableKeyword( "_EMISSION" );
	}
	}

}
Water on the PS4

Hi Vincent, so it happens that I do have an idea of what that is, simply because we had the same issue in our game and I spent a few days trying everything that made sense until I finally figured out the real issue.

For some reason I can't explain (I haven't found any information on this in the Unity docs or forums, nor on the Sony dev forums), the depth buffer doesn't seem to work exactly the same as in a PC build. This is important because what you are seeing is either the Depth Fade node or a calculation with the Screen Depth node failing, both of which use the depth buffer. They fail because, when they run, they are supposed to sample the depth buffer in order to do the depth coloring effect; however, you also want your shader to write to depth so it intersects with your characters and objects. This is usually done by setting your shader at a later queue, like Transparent, while leaving depth writing ON. But with both together it seems the depth buffer is not available, or the GPU can't figure out whether the shader's own depth contributes to the buffer or not. Usually transparent objects do not go to the depth buffer, but on PS4 they do (and by the way, the Nintendo Switch has a similar issue, so if you are planning on that platform, be warned).

In our case we solved it by turning depth writing OFF in the water shader while setting it to Opaque. Doing this visually fixes the artifacts, but it also ruins the intersection with other objects; they will simply float on top. We solved this last part by creating a second shader that only writes to the depth buffer (ColorMask 0) and does so at a later queue, AlphaTest+49 (just before Transparent). On the water object we added two materials to the material list, one for each shader, with the "depth only" shader in second place in the list. This solution is not perfect: you'll have two draw calls for the water instead of one (not a big deal, since the extra one isn't heavy). The biggest issue is that if you have different water bodies that can be seen at different heights on top of each other, the rendering order may be wrong and something that was supposed to be rendered below may end up rendered above. This never happens in our game, so it was fine for us; you could probably fix it by putting the different water shaders in different render queues.

The real final solution would be to manually resolve the depth buffer at a specific stage; with that you could guarantee the same behavior on all platforms. This seems far more complex, especially for legacy shaders; in SRP it seems to be somewhat trivial. This dirty hack did it for us.

The second shader was not created in ASE. Here's the full shader; nothing special, it just renders the depth:

Shader "WaterDepthOnly"
{
    SubShader
    {
        Tags{ "RenderType" = "Opaque"  "Queue" = "AlphaTest+49" "IgnoreProjector" = "True" "ForceNoShadowCasting" = "True" "DisableBatching" = "True" }
        Cull Back
        Pass
        {
            ColorMask 0
            ZWrite On
        }
    }
}

Original post: https://forum.unity.com/threads/best-tool-asset-store-award-amplify-shader-editor-node-based-shader-creation-tool.430959/page-99#post-4905377


Additional information given to ***removed email*** via email ("[support] transparent shader issue on ps4"):

"Hey ***Removed Name***,

No problem at all. We're here to help.

The fact that you are not seeing the intersection breaking over your scene may be caused by how Unity is ordering the objects in your scene. Your water plane is being rendered after all those logs, so having depth write on or off makes no visual difference.

But this may not always be the case, and if you have dynamic objects in your scene the probability of this issue happening is higher.

Here's a quick example:

[GIF: toggling ZWrite on the plane's shader while the plane intersects two cubes]

In this scene I have two cubes and one plane, all with the same render queue. The plane intersects both cubes at the same Y position. The shader I'm editing in the gif is the one applied to the plane. You can see that setting ZWrite to Off only affects the intersection with the left cube, although both cubes have the exact same queue.

By looking at the Frame Debugger you can see what is happening: Unity first draws the right cube, then the plane, and finally the left cube (thus the issue is visible).

[Screenshot: Frame Debugger showing the draw order]


You can control object rendering order by setting different render queues on different shaders. The bigger the render queue number, the later the object is rendered. When you set the render queue to Geometry, you are internally defining an order value by which objects will be rendered.

E.g. the internal value for Geometry is 2000 and for Transparent is 3000, so all shaders set to the Transparent render queue are rendered after the Geometry ones. You can check the official Unity documentation about render queues for more details.

If multiple shaders share the same render queue, how Unity orders them is a bit of a black box, but it is somewhat related to the objects' position relative to the current rendering camera. The problem is that it doesn't always order them the way we want, as you saw in the gif I posted.

One other thing to take into account: if you are using both the Deferred and Forward rendering paths, then by default all opaque objects are rendered in the deferred path, which always happens before the forward one.

So, getting back to the PS4 issue. As my colleague mentioned, the PS4 has issues when reading and writing the depth texture in the same shader, which is why we need to separate it into two shaders. The first one deals with actually rendering the water. The second one, the WaterDepthOnly shader mentioned in the post, only makes sure the water is correctly written into the depth buffer.

One detail I would recommend, which wasn't in the forum post: on the main water shader, set the Queue Index to 449 under General Options in the Master Node properties. Since Geometry is 2000, this puts the water at 2449, so it is rendered after all opaque objects, right before the Alpha Test render queue (2450) kicks in.

[Screenshot: Queue Index set to 449 under the Master Node's General Options]

This step is also important for the Grab Screen Color node, since it ensures that the grab pass texture given by Unity contains all the opaque objects under the water.

Here's also a shot of how you should set up the materials on the water Mesh Renderer: first your main water shader, and second the WaterDepthOnly shader we supplied in the forum post.

[Screenshot: Mesh Renderer materials list with the main water material first and WaterDepthOnly second]

Regarding the water effect we did for Decay of Logos, unfortunately I'm not allowed to share it, but hopefully with this email you'll be able to get it working on your end.

Please let us know if you have further questions.

My best regards"

Accessing Depth Data on the Post-Processing Stack

Depth is only sampled correctly if the PostProcessEvent is set to BeforeTransparent in the post-processing script, and if, for both _MainTex and _CameraDepthTexture, the Screen Position node is connected to the UV port.

https://forum.unity.com/threads/best-tool-asset-store-award-amplify-shader-editor-node-based-shader-creation-tool.430959/page-97#post-4748039
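
A minimal sketch of the script side, using the Post Processing Stack v2 API; the effect name DepthSample and its shader path are placeholders:

using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

[System.Serializable]
[PostProcess( typeof( DepthSampleRenderer ), PostProcessEvent.BeforeTransparent, "Custom/Depth Sample" )]
public sealed class DepthSample : PostProcessEffectSettings { }

public sealed class DepthSampleRenderer : PostProcessEffectRenderer<DepthSample>
{
    public override void Render( PostProcessRenderContext context )
    {
        // BeforeTransparent ensures _CameraDepthTexture holds valid depth when this runs.
        PropertySheet sheet = context.propertySheets.Get( Shader.Find( "Hidden/DepthSample" ) );
        context.command.BlitFullscreenTriangle( context.source, context.destination, sheet, 0 );
    }
}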


Accessing Depth Data on an Unlit Scene

If you need to access values from the depth buffer in an unlit scene (without any lights), DepthTextureMode.Depth must be manually set on the camera (Camera.main.depthTextureMode), e.g. using a script like the SetCameraDepth example above. This is needed because without lighting there are no shadows, so no shadow caster passes are called and the depth texture is never generated.

Supplying correct derivatives on transformed UVS

When UVs are modified and then used in texture fetches, incorrect UV derivatives are calculated, fetching values from the wrong mips. This results in artifacts that resemble seams. To fix it, just apply DDX and DDY to the original UVs and use them in the texture fetch.
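
A minimal HLSL sketch of the fix in code, where TransformUV stands in for whatever UV modification is being applied:

// Sample using derivatives taken from the original, continuous UVs.
float2 transformedUV = TransformUV( uv ); // hypothetical UV modification
float4 color = tex2Dgrad( _MainTex, transformedUV, ddx( uv ), ddy( uv ) );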

Shadow on Transparent Objects (Surface)

Our surface shader adds a custom shadow caster when in a transparent blend mode, which uses the alpha value on the shadow itself.
For transparency not to affect shadows, simply place a Static Switch using the UNITY_PASS_SHADOWCASTER keyword on the Opacity port and set its True port to 1.
This forces Alpha to always be 1 in the shadow caster pass.
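
The switch generates something along these lines (a sketch; opacityValue stands for whatever is connected to the False port):

#ifdef UNITY_PASS_SHADOWCASTER
    float alpha = 1.0; // fully opaque shadows
#else
    float alpha = opacityValue; // regular transparency everywhere else
#endif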

Render Mode on Point Lights
  • Auto: Unity decides for each light whether it is calculated per vertex or per pixel
  • Important: Light is always rendered per pixel
  • Not Important: Light is always rendered per vertex


Special Considerations for Not Important Lights

The POINT keyword and _WorldSpaceLightPos0.w are not defined, so they can't be used with per-vertex point lights. Unity only does per-vertex lighting for point lights; directional lighting is always done per pixel. The VERTEXLIGHT_ON keyword can be used to detect if a certain vertex is being affected by a per-vertex point light. It is really important to take into account that VERTEXLIGHT_ON is only defined in the vertex function, so if you need that information in the fragment, you'll need to pass it from the vertex to the fragment via an interpolator, as in the sketch below.
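A minimal sketch of that interpolator in a plain vertex/frag shader:

#include "UnityCG.cginc"

struct v2f
{
    float4 pos : SV_POSITION;
    float vertexLit : TEXCOORD0; // 1 when per-vertex point lights affect this vertex
};

v2f vert( appdata_base v )
{
    v2f o;
    o.pos = UnityObjectToClipPos( v.vertex );
#if defined( VERTEXLIGHT_ON )
    o.vertexLit = 1.0; // keyword only exists in the vertex stage
#else
    o.vertexLit = 0.0;
#endif
    return o;
}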

GBuffer Contents

All deferred GBuffer contents can be accessed via the global textures below.

  • half4 gbuffer0 = tex2D (_CameraGBufferTexture0, uv); // Diffuse RGB, Occlusion A
  • half4 gbuffer1 = tex2D (_CameraGBufferTexture1, uv); // Specular RGB, Smoothness A
  • half4 gbuffer2 = tex2D (_CameraGBufferTexture2, uv); // Normal RGB

The UVs must be the normalized screen-space position.
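
A minimal fragment sketch, assuming a v2f struct carrying a screenPos interpolator computed with ComputeScreenPos:

// The global GBuffer textures must be declared before sampling.
sampler2D _CameraGBufferTexture0;
sampler2D _CameraGBufferTexture1;
sampler2D _CameraGBufferTexture2;

half4 frag( v2f i ) : SV_Target
{
    float2 uv = i.screenPos.xy / i.screenPos.w; // normalized screen position
    half4 diffuseOcclusion = tex2D( _CameraGBufferTexture0, uv ); // Diffuse RGB, Occlusion A
    half4 specSmoothness   = tex2D( _CameraGBufferTexture1, uv ); // Specular RGB, Smoothness A
    half4 worldNormal      = tex2D( _CameraGBufferTexture2, uv ); // Normal RGB
    return diffuseOcclusion;
}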

Global Array
  • Arrays in shaders cannot be indexed by a property, only by constants. If you connect, e.g., a Float node set as a property to the Index port of a Global Array node, it won't work.
  • The first time you set either a local or global array, Unity fixes the size of the array. For instance, if you initialize an array defined in the shader as uniform float _Array[10]; with a C# array defined as float[] array = new float[5];, you will not be able to set more than 5 elements on it afterwards. Because of this, and to avoid bugs, always initialize arrays with the maximum size allowed, as in the sketch below.
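
A minimal C# sketch of that initialization; the _Array name and the size of 10 are assumptions that must match the shader declaration:

using UnityEngine;

public class InitGlobalArray : MonoBehaviour
{
    // Must match the shader-side declaration, e.g. uniform float _Array[10];
    const int MaxSize = 10;

    private void Awake()
    {
        // Send a full-size buffer on the first set so Unity locks in the maximum size.
        Shader.SetGlobalFloatArray( "_Array", new float[ MaxSize ] );
    }
}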
Code Comments
  • Add comments to specific parts of your shader using a Custom Expression in Call Mode.

Known Issues

GPU Instancing + Tessellation

At the moment these two are incompatible. Keep in mind that GPU instancing is useful when rendering the same object many times. As tessellation is expensive and about adding details, they're usually not a good combination. If you want to have many instances of something that should use tessellation up close, you could use a LOD group. Have LOD 0 use a non-instanced tessellated material, while all other LOD levels use an instanced non-tessellated material.

https://catlikecoding.com/unity/tutorials/advanced-rendering/tessellation/


However, this seems to be supported in URP:

"Upon investigating we've determined that the reason for near patches of terrain not rendering is that instancing and tessellation is unfortunately not supported by Unity. It is supported by the new render pipeline (URP) however. We are putting it under a feature in our backlog (rather than a bug), and will take this into consideration in the future. The request will be moved to our internal planning tool, and this case will be closed. This does not guarantee that the feature will be implemented, as objections may be raised once technical planning is started."

https://forum.unity.com/threads/any-chance-of-getting-tesselation-working-on-instanced-terrains.860461/

Big Delay When Creating/Updating Shaders

On Unity 2019.4 and above, some users started reporting long creation/import times not only for ASE shaders but also for other custom shaders (ShaderGraph files don't suffer from this). This was later found to be related to the Asynchronous Shader Compilation option being turned on under Project Settings > Editor > Shader Compilation. Turning this option off brings shader creation/update times back to normal.

Static Switch Enum

Static Switch enum mode is limited to a maximum of 9 options due to a Unity limitation. Be aware that using that many options increases the number of keywords (which are limited) and of shader variants (which greatly increase compile times), so it should be used with caution.

Huge Shader Compilation Times

Shader Features and Multi-Compile must be used with caution in surface shaders. Each surface shader goes through an intermediate process in which its code is transformed into several vertex/frag programs.
Each added Shader Feature or Multi-Compile option multiplies the number of shader variants, so the count grows exponentially, eventually reaching a point where the conversion from surface to vertex/frag fails with errors like:

Assertion failed: Shader Compiler Socket Exception: Terminating shader compiler process, 0 retries left

As mentioned, this only happens with surface shaders, so using Shader Features and Multi-Compile directly in a vertex/frag shader won't result in this issue.

Light Cookies
World Normal

(Jira ticket) Using the World Normal node in the vertex function generates issues when working with light cookies. It internally calls Unity's native UnityObjectToWorldNormal(...), which seems to be incompatible with how light cookies are calculated and applied in the shadow caster. One possible workaround is to do the object-to-world transform by hand.

Transparency

(Jira ticket) There's an issue with how Unity generates the code that copies the cookie information from the vertex to the fragment program. Somehow the cookie information is being incorrectly sent/written and ends up being treated as an alpha value. This behavior can be prevented by setting the Blend Mode to Transparent, which internally forces the Alpha:Fade surface option into the shader and then correctly passes/writes the light cookie.

GPU Instancing
  • GPU Instancing does not work correctly with the Deferred rendering path on Metal, as instanced values seem to overlap each other
    • Reported issues refer to this happening on Intel HD cards
    • There is a similar issue reported by us, Instanced properties don't have correct value on surface shader on macOS, which can be tracked here.

[Screenshot: both cylinders have the same material applied, yet incorrectly show different results]

GUI
  • Can't animate material properties.
Unity Internal Time/Timer
  • Unity's internal time values start to have precision errors after an hour or so of the application running
    • Although the Time node is affected by this issue, some other time nodes, like Sin Time, aren't
Depth Buffer and Consoles

The depth buffer is written at different stages on PC and on consoles.
On PC this buffer is written after all opaques are drawn, while on consoles this is not the case (the exact stage is unclear).
This means that nodes like Depth Fade won't work in a normal context, because while the fragment is being rendered it is also being written to the depth buffer (even when in the Transparent queue).
The solution is to separate the shader into two and add both materials to the mesh renderer's material list.
The first one must have ZWrite Off and its render queue must be within the Geometry range of values.
This shader is the main one, where you do your grab pass, apply refraction, etc.
The second one only needs to write to the Z buffer (ZWrite On), so its color mask should be set so that no color is written at all (ColorMask 0). Place its render queue at the very end of the Alpha Cutout range (AlphaTest+49) so it catches all the objects it needs to catch.

Shader "DepthOnly"
{
	SubShader
	{
		Tags{ "RenderType" = "Opaque"  "Queue" = "AlphaTest+49" "IgnoreProjector" = "True" "ForceNoShadowCasting" = "True" "DisableBatching" = "True" }
		Cull Back
		Pass
		{
			ColorMask 0 // disable all color channels; this pass writes depth only
			ZWrite On
		}
	}
}