AR world dissolve effect in Unity

I have been trying out ARCore & ARKit using Unity’s ARFoundation. It allows you to use both ARKit and ARCore without having to write platform-specific systems for each. This makes it a nice layer of abstraction that covers the basic AR types: point cloud data, detected plane visualisation, and AR ray casting that is more efficient than normal physics ray casting.
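As a rough idea of what that raycasting looks like, here is a minimal sketch using ARFoundation's ARRaycastManager; the component and field names are illustrative, not from the original project:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical example: move this object to wherever the user taps on a
// detected plane, using an AR raycast rather than a physics raycast.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        var touch = Input.GetTouch(0);

        // Raycast against detected planes only — cheaper than casting
        // against physics colliders in the scene.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            transform.position = hits[0].pose.position;
        }
    }
}
```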

I thought it would be cool to dissolve the game world away to reveal the real world.

Dissolve Shader

To get started I needed a dissolve shader. It dissolves the mesh it is rendering by sampling a noise texture; I used an fbm (fractional Brownian motion) noise texture.

The shader discards any pixels past the dissolution level to form holes in the mesh. It also adds a burn effect around the edges by mapping the greyscale noise values to a _BurnRamp texture and using that colour instead, similar to Unity’s toon shader ramp. The vertex & fragment shaders ignore lighting, which makes the shader easier to convert to an image effect later on for use in the post-processing stack.

Shader "Unlit/SimpleDissolve" {
    Properties {
        _Color ("Color", Color) = (1,1,1,1)

        _NoiseMap("Noise Map (RGB)", 2D) = "white" {}
        _DissolutionLevel("Dissolution Level", Range(0.0, 1.0)) = 0
        [NoScaleOffset] _BurnRamp("Burn Ramp (RGB)", 2D) = "white" {}
        _BurnSize("Burn Size", Range(0.0, 1.0)) = 0.15
        _BurnColor("Burn Color", Color) = (1,1,1,1)
        _EmissionAmount("Burn Edge Emission", float) = 2.0
    }
    SubShader {
        Tags { "RenderType"="Opaque" }
        LOD 200
        Cull Off

        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            fixed4 _Color;
            sampler2D _NoiseMap;
            sampler2D _BurnRamp;
            fixed4 _BurnColor;
            float _BurnSize;
            float _DissolutionLevel;
            float _EmissionAmount;
            float4 _NoiseMap_ST;

            v2f vert (appdata v) {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _NoiseMap);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target {
                // How far this pixel is above the dissolution threshold.
                half test = tex2D(_NoiseMap, i.uv).r - _DissolutionLevel;

                // Discard pixels past the dissolution level to punch holes in the mesh.
                clip(test);

                fixed4 col = _Color;
                // Tint pixels near the hole edges using the burn ramp.
                if (test < _BurnSize && _DissolutionLevel > 0) {
                    col += tex2D(_BurnRamp, float2(test * (1 / _BurnSize), 0)) * _BurnColor * _EmissionAmount;
                }
                return col;
            }
            ENDCG
        }
    }
    FallBack Off
}

It looks good as a dissolve effect, but once you add the AR camera feed underneath, it just looks like a weird piece of card held in front of the camera.

AR Basic dissolve

Dissolve Post-process Effect

To work around this I decided to use a custom dissolve post-process built for the Post-processing Stack V2. I followed their wiki on how to set it up, creating the greyscale example and then modifying it to meet my needs.
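For context, the Stack V2 boilerplate from their wiki looks roughly like this — a settings class paired with a renderer. This is a sketch adapted from their greyscale example; the class names, the parameter exposed, and the lookup of the shader by name are my assumptions, not the original project's code:

```csharp
using System;
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Exposes the effect and its parameters to the Post-process Volume UI.
[Serializable]
[PostProcess(typeof(DissolveRenderer), PostProcessEvent.AfterStack, "Custom/Dissolve")]
public sealed class Dissolve : PostProcessEffectSettings
{
    [Range(0f, 1f)]
    public FloatParameter dissolutionLevel = new FloatParameter { value = 0f };
}

// Renders the effect: push the settings into the shader, then blit.
public sealed class DissolveRenderer : PostProcessEffectRenderer<Dissolve>
{
    public override void Render(PostProcessRenderContext context)
    {
        var sheet = context.propertySheets.Get(Shader.Find("Mobile/AR/DissolveFromBackground"));
        sheet.properties.SetFloat("_DissolutionLevel", settings.dissolutionLevel);
        context.command.BlitFullscreenTriangle(context.source, context.destination, sheet, 0);
    }
}
```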

At first I thought it would be cool to base the dissolve effect on the camera’s depth instead of a noise texture. However, as the cameras used by ARCore and ARKit are not Project Tango cameras, they do not provide any depth information. I looked into generating depth from a single 2D image, but it turns out this is rather slow with current research, at about 1 fps on an Nvidia K1 chip.

Instead I just fake it: I greyscale the world and use that as the noise texture. This requires reworking the shader to follow the post-processing format for image effects. It uses the Post-processing Stack’s macros to keep the shader code cross-platform: TEXTURE2D_SAMPLER2D and SAMPLE_TEXTURE2D for declaring and sampling textures. It uses #pragma vertex VertDefault to handle the vertex shader, as we don’t need any custom vertex manipulation. Everything else is the previous code converted to HLSL, using _MainTex — the screen texture provided by the stack — instead of the noise texture. The greyscale calculation is slightly different: the values are inverted, which means the brightest areas shine through first, with dark areas coming in last.

Shader "Mobile/AR/DissolveFromBackground" {
    HLSLINCLUDE

    #include "PostProcessing/Shaders/StdLib.hlsl"

    #pragma fragmentoption ARB_precision_hint_fastest

    // Screen color
    TEXTURE2D_SAMPLER2D(_MainTex, sampler_MainTex);
    float4 _MainTex_ST;

    float4 _Color;

    float _DissolutionLevel;

    TEXTURE2D_SAMPLER2D(_BurnRamp, sampler_BurnRamp);
    float _BurnSize;
    float4 _BurnColor;
    float _BurnEdgeEmission;

    float4 Frag(VaryingsDefault i) : SV_Target {
        float4 original = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, i.texcoord);

        // Inverted greyscale: bright areas get low values and dissolve first.
        float invertedGreyscale = 1.0 - dot(original.rgb, float3(0.3, 0.59, 0.11));

        float luminance = invertedGreyscale - _DissolutionLevel;

        // Pixels past the dissolution level show the screen as-is,
        // letting the camera feed shine through.
        if (luminance < 0) {
            return original;
        }

        float4 col = _Color;
        // Tint pixels near the dissolve edge using the burn ramp.
        if (luminance < _BurnSize && _DissolutionLevel > 0) {
            col += SAMPLE_TEXTURE2D(_BurnRamp, sampler_BurnRamp, float2(luminance * (1 / _BurnSize), 0.0)) * _BurnColor * _BurnEdgeEmission;
        }
        return col;
    }

    ENDHLSL

    SubShader {
        Cull Off ZWrite Off ZTest Always

        Pass {
            HLSLPROGRAM
            #pragma target 3.0
            #pragma vertex VertDefault
            #pragma fragment Frag
            ENDHLSL
        }
    }
    Fallback Off
}

If you modify the _DissolutionLevel over time, this produces a nice effect of the real world slowly coming into focus while the game world disappears. I may try using a 360° video capture rendered in place of the grey background to make it appear like you are exiting a 3D game world.
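Animating the parameter can be done from a small MonoBehaviour. This is a sketch under assumptions: it presumes a PostProcessVolume in the scene and a settings class named Dissolve exposing dissolutionLevel as a FloatParameter (both hypothetical names — adapt them to your own effect):

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Hypothetical sketch: ramp the dissolution level from 0 to 1 over a few
// seconds, slowly revealing the camera feed through the effect.
public class DissolveOverTime : MonoBehaviour
{
    [SerializeField] PostProcessVolume volume;
    [SerializeField] float duration = 3f;
    float elapsed;

    void Update()
    {
        // "Dissolve" is the assumed PPv2 settings class for the effect.
        if (!volume.profile.TryGetSettings(out Dissolve dissolve)) return;

        elapsed += Time.deltaTime;
        dissolve.dissolutionLevel.value = Mathf.Clamp01(elapsed / duration);
    }
}
```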


Post-processing Stack V2 - AR Dissolve effect

