Introduction
Assigning a normal map to a material makes the surface appear to have bumps and dents.
This technique is called normal mapping.
In this article, I trace how the Normal Map is actually used inside Universal RP.
To give the conclusion up front: the normal map supplies the normal used in the Lambert diffuse term and in the specular BRDF.
Environment
Universal RP 10.2.2
Unity 2020.2.0f1
Reading Lit.shader
First, let's read through Lit.shader.
Properties
The normal map properties are defined as follows:
_BumpMap("Normal Map", 2D) = "bump" {}
...
[Normal] _DetailNormalMap("Normal Map", 2D) = "bump" {}
Inside the shader, the Normal Map is handled as _BumpMap.
Render pass
Objects are drawn by the UniversalForward pass, and the shading logic is implemented in LitForwardPass.hlsl.
#pragma vertex LitPassVertex
#pragma fragment LitPassFragment
#include "Packages/com.unity.render-pipelines.universal/Shaders/LitInput.hlsl"
#include "Packages/com.unity.render-pipelines.universal/Shaders/LitForwardPass.hlsl"
Definition of the ```UniversalForward``` pass
// ------------------------------------------------------------------
// Forward pass. Shades all light in a single pass. GI + emission + Fog
Pass
{
// Lightmode matches the ShaderPassName set in UniversalRenderPipeline.cs. SRPDefaultUnlit and passes with
// no LightMode tag are also rendered by Universal Render Pipeline
Name "ForwardLit"
Tags{"LightMode" = "UniversalForward"}
Blend[_SrcBlend][_DstBlend]
ZWrite[_ZWrite]
Cull[_Cull]
HLSLPROGRAM
#pragma exclude_renderers gles gles3 glcore
#pragma target 4.5
// -------------------------------------
// Material Keywords
#pragma shader_feature_local _NORMALMAP
#pragma shader_feature_local_fragment _ALPHATEST_ON
#pragma shader_feature_local_fragment _ALPHAPREMULTIPLY_ON
#pragma shader_feature_local_fragment _EMISSION
#pragma shader_feature_local_fragment _METALLICSPECGLOSSMAP
#pragma shader_feature_local_fragment _SMOOTHNESS_TEXTURE_ALBEDO_CHANNEL_A
#pragma shader_feature_local_fragment _OCCLUSIONMAP
#pragma shader_feature_local _PARALLAXMAP
#pragma shader_feature_local _ _DETAIL_MULX2 _DETAIL_SCALED
#pragma shader_feature_local_fragment _SPECULARHIGHLIGHTS_OFF
#pragma shader_feature_local_fragment _ENVIRONMENTREFLECTIONS_OFF
#pragma shader_feature_local_fragment _SPECULAR_SETUP
#pragma shader_feature_local _RECEIVE_SHADOWS_OFF
// -------------------------------------
// Universal Pipeline keywords
#pragma multi_compile _ _MAIN_LIGHT_SHADOWS
#pragma multi_compile _ _MAIN_LIGHT_SHADOWS_CASCADE
#pragma multi_compile _ _ADDITIONAL_LIGHTS_VERTEX _ADDITIONAL_LIGHTS
#pragma multi_compile_fragment _ _ADDITIONAL_LIGHT_SHADOWS
#pragma multi_compile_fragment _ _SHADOWS_SOFT
#pragma multi_compile_fragment _ _SCREEN_SPACE_OCCLUSION
#pragma multi_compile _ LIGHTMAP_SHADOW_MIXING
#pragma multi_compile _ SHADOWS_SHADOWMASK
// -------------------------------------
// Unity defined keywords
#pragma multi_compile _ DIRLIGHTMAP_COMBINED
#pragma multi_compile _ LIGHTMAP_ON
#pragma multi_compile_fog
//--------------------------------------
// GPU Instancing
#pragma multi_compile_instancing
#pragma multi_compile _ DOTS_INSTANCING_ON
#pragma vertex LitPassVertex
#pragma fragment LitPassFragment
#include "Packages/com.unity.render-pipelines.universal/Shaders/LitInput.hlsl"
#include "Packages/com.unity.render-pipelines.universal/Shaders/LitForwardPass.hlsl"
ENDHLSL
}
LitForwardPass.hlsl
Let's take a look at LitForwardPass.hlsl.
The color is computed by the LitPassFragment function.
Inside LitPassFragment, the texture sampling and the shading are implemented as follows:
SurfaceData surfaceData;
InitializeStandardLitSurfaceData(input.uv, surfaceData);
InputData inputData;
InitializeInputData(input, surfaceData.normalTS, inputData);
half4 color = UniversalFragmentPBR(inputData, surfaceData);
Definition of ```LitPassFragment```
// Used in Standard (Physically Based) shader
half4 LitPassFragment(Varyings input) : SV_Target
{
UNITY_SETUP_INSTANCE_ID(input);
UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
#if defined(_PARALLAXMAP)
#if defined(REQUIRES_TANGENT_SPACE_VIEW_DIR_INTERPOLATOR)
half3 viewDirTS = input.viewDirTS;
#else
half3 viewDirTS = GetViewDirectionTangentSpace(input.tangentWS, input.normalWS, input.viewDirWS);
#endif
ApplyPerPixelDisplacement(viewDirTS, input.uv);
#endif
SurfaceData surfaceData;
InitializeStandardLitSurfaceData(input.uv, surfaceData);
InputData inputData;
InitializeInputData(input, surfaceData.normalTS, inputData);
half4 color = UniversalFragmentPBR(inputData, surfaceData);
color.rgb = MixFog(color.rgb, inputData.fogCoord);
color.a = OutputAlpha(color.a, _Surface);
return color;
}
Texture sampling
The textures assigned to the material are sampled by the InitializeStandardLitSurfaceData function, and the sampled normal map is stored in surfaceData.normalTS.
InitializeStandardLitSurfaceData(input.uv, surfaceData);
inline void InitializeStandardLitSurfaceData(float2 uv, out SurfaceData outSurfaceData)
{
...
outSurfaceData.normalTS = SampleNormal(uv, TEXTURE2D_ARGS(_BumpMap, sampler_BumpMap), _BumpScale);
Definition of InitializeStandardLitSurfaceData
inline void InitializeStandardLitSurfaceData(float2 uv, out SurfaceData outSurfaceData)
{
half4 albedoAlpha = SampleAlbedoAlpha(uv, TEXTURE2D_ARGS(_BaseMap, sampler_BaseMap));
outSurfaceData.alpha = Alpha(albedoAlpha.a, _BaseColor, _Cutoff);
half4 specGloss = SampleMetallicSpecGloss(uv, albedoAlpha.a);
outSurfaceData.albedo = albedoAlpha.rgb * _BaseColor.rgb;
#if _SPECULAR_SETUP
outSurfaceData.metallic = 1.0h;
outSurfaceData.specular = specGloss.rgb;
#else
outSurfaceData.metallic = specGloss.r;
outSurfaceData.specular = half3(0.0h, 0.0h, 0.0h);
#endif
outSurfaceData.smoothness = specGloss.a;
outSurfaceData.normalTS = SampleNormal(uv, TEXTURE2D_ARGS(_BumpMap, sampler_BumpMap), _BumpScale);
outSurfaceData.occlusion = SampleOcclusion(uv);
outSurfaceData.emission = SampleEmission(uv, _EmissionColor.rgb, TEXTURE2D_ARGS(_EmissionMap, sampler_EmissionMap));
#if defined(_CLEARCOAT) || defined(_CLEARCOATMAP)
half2 clearCoat = SampleClearCoat(uv);
outSurfaceData.clearCoatMask = clearCoat.r;
outSurfaceData.clearCoatSmoothness = clearCoat.g;
#else
outSurfaceData.clearCoatMask = 0.0h;
outSurfaceData.clearCoatSmoothness = 0.0h;
#endif
#if defined(_DETAIL)
half detailMask = SAMPLE_TEXTURE2D(_DetailMask, sampler_DetailMask, uv).a;
float2 detailUv = uv * _DetailAlbedoMap_ST.xy + _DetailAlbedoMap_ST.zw;
outSurfaceData.albedo = ApplyDetailAlbedo(detailUv, outSurfaceData.albedo, detailMask);
outSurfaceData.normalTS = ApplyDetailNormal(detailUv, outSurfaceData.normalTS, detailMask);
#endif
}
Definition of TEXTURE2D_ARGS
TEXTURE2D_ARGS is defined in D3D11.hlsl:
#define TEXTURE2D_ARGS(textureName, samplerName) textureName, samplerName
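As a usage sketch (the helper below is hypothetical, not URP code), TEXTURE2D_PARAM on the callee side and TEXTURE2D_ARGS on the caller side simply pass a texture and its sampler around as a pair, so the same code compiles on graphics APIs that keep textures and samplers separate.
// Hypothetical helper: TEXTURE2D_PARAM declares the texture/sampler pair,
// and TEXTURE2D_ARGS forwards it at the call site.
half4 SampleRedChannelSketch(float2 uv, TEXTURE2D_PARAM(tex, sampler_tex))
{
    return SAMPLE_TEXTURE2D(tex, sampler_tex, uv).rrrr;
}
// Call site: half4 r = SampleRedChannelSketch(uv, TEXTURE2D_ARGS(_BaseMap, sampler_BaseMap));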
Definition of SampleNormal
SampleNormal is defined in LitInput.hlsl.
half3 SampleNormal(float2 uv, TEXTURE2D_PARAM(bumpMap, sampler_bumpMap), half scale = 1.0h)
{
#ifdef _NORMALMAP
half4 n = SAMPLE_TEXTURE2D(bumpMap, sampler_bumpMap, uv);
#if BUMP_SCALE_NOT_SUPPORTED
return UnpackNormal(n);
#else
return UnpackNormalScale(n, scale);
#endif
#else
return half3(0.0h, 0.0h, 1.0h);
#endif
}
UnpackNormal and UnpackNormalScale are defined in Packing.hlsl, but I will omit them here, since quoting everything would never end.
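Just to give the gist (a simplified sketch, not the actual Packing.hlsl implementation, and it ignores the compressed/packed normal formats): unpacking a normal map texel remaps the stored [0, 1] color back to a [-1, 1] tangent-space vector, and _BumpScale scales the XY components to strengthen or weaken the bumps.
// Simplified sketch of normal unpacking for a plain RGB normal map
// (the real UnpackNormalScale also handles packed formats).
half3 UnpackNormalRGBSketch(half4 packedNormal, half scale)
{
    half3 normalTS = packedNormal.rgb * 2.0h - 1.0h; // remap [0,1] -> [-1,1]
    normalTS.xy *= scale;                            // _BumpScale
    return normalTS;
}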
The sampled surfaceData.normalTS is then passed to the InitializeInputData function and set as inputData.normalWS. In other words, the normal map information ends up in inputData.normalWS.
InputData inputData;
InitializeInputData(input, surfaceData.normalTS, inputData);
#if defined(_NORMALMAP) || defined(_DETAIL)
float sgn = input.tangentWS.w; // should be either +1 or -1
float3 bitangent = sgn * cross(input.normalWS.xyz, input.tangentWS.xyz);
inputData.normalWS = TransformTangentToWorld(normalTS, half3x3(input.tangentWS.xyz, bitangent.xyz, input.normalWS.xyz));
#else
inputData.normalWS = input.normalWS;
#endif
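TransformTangentToWorld builds a tangent-to-world (TBN) basis from the interpolated tangent, the reconstructed bitangent, and the vertex normal, and expresses the tangent-space normal in that basis. Conceptually it boils down to this simplified sketch (not the actual library code):
// Simplified view of the tangent-to-world transform: the tangent-space normal
// is expressed in the (T, B, N) basis, yielding a world-space normal.
float3 TangentToWorldSketch(half3 normalTS, float3 tangentWS, float3 bitangentWS, float3 normalWS)
{
    return normalTS.x * tangentWS + normalTS.y * bitangentWS + normalTS.z * normalWS;
}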
Definition of InitializeInputData
void InitializeInputData(Varyings input, half3 normalTS, out InputData inputData)
{
inputData = (InputData)0;
#if defined(REQUIRES_WORLD_SPACE_POS_INTERPOLATOR)
inputData.positionWS = input.positionWS;
#endif
half3 viewDirWS = SafeNormalize(input.viewDirWS);
#if defined(_NORMALMAP) || defined(_DETAIL)
float sgn = input.tangentWS.w; // should be either +1 or -1
float3 bitangent = sgn * cross(input.normalWS.xyz, input.tangentWS.xyz);
inputData.normalWS = TransformTangentToWorld(normalTS, half3x3(input.tangentWS.xyz, bitangent.xyz, input.normalWS.xyz));
#else
inputData.normalWS = input.normalWS;
#endif
inputData.normalWS = NormalizeNormalPerPixel(inputData.normalWS);
inputData.viewDirectionWS = viewDirWS;
#if defined(REQUIRES_VERTEX_SHADOW_COORD_INTERPOLATOR)
inputData.shadowCoord = input.shadowCoord;
#elif defined(MAIN_LIGHT_CALCULATE_SHADOWS)
inputData.shadowCoord = TransformWorldToShadowCoord(inputData.positionWS);
#else
inputData.shadowCoord = float4(0, 0, 0, 0);
#endif
inputData.fogCoord = input.fogFactorAndVertexLight.x;
inputData.vertexLighting = input.fogFactorAndVertexLight.yzw;
inputData.bakedGI = SAMPLE_GI(input.lightmapUV, input.vertexSH, inputData.normalWS);
inputData.normalizedScreenSpaceUV = GetNormalizedScreenSpaceUV(input.positionCS);
inputData.shadowMask = SAMPLE_SHADOWMASK(input.lightmapUV);
}
The shading calculation
The shading is computed inside the UniversalFragmentPBR function.
UniversalFragmentPBR performs the lighting calculations for Global Illumination, the Main Light, Additional Lights, and so on.
half4 color = UniversalFragmentPBR(inputData, surfaceData);
Definition of the ```UniversalFragmentPBR``` function
///////////////////////////////////////////////////////////////////////////////
// Fragment Functions //
// Used by ShaderGraph and others builtin renderers //
///////////////////////////////////////////////////////////////////////////////
half4 UniversalFragmentPBR(InputData inputData, SurfaceData surfaceData)
{
#ifdef _SPECULARHIGHLIGHTS_OFF
bool specularHighlightsOff = true;
#else
bool specularHighlightsOff = false;
#endif
BRDFData brdfData;
// NOTE: can modify alpha
InitializeBRDFData(surfaceData.albedo, surfaceData.metallic, surfaceData.specular, surfaceData.smoothness, surfaceData.alpha, brdfData);
BRDFData brdfDataClearCoat = (BRDFData)0;
#if defined(_CLEARCOAT) || defined(_CLEARCOATMAP)
// base brdfData is modified here, rely on the compiler to eliminate dead computation by InitializeBRDFData()
InitializeBRDFDataClearCoat(surfaceData.clearCoatMask, surfaceData.clearCoatSmoothness, brdfData, brdfDataClearCoat);
#endif
// To ensure backward compatibility we have to avoid using shadowMask input, as it is not present in older shaders
#if defined(SHADOWS_SHADOWMASK) && defined(LIGHTMAP_ON)
half4 shadowMask = inputData.shadowMask;
#elif !defined (LIGHTMAP_ON)
half4 shadowMask = unity_ProbesOcclusion;
#else
half4 shadowMask = half4(1, 1, 1, 1);
#endif
Light mainLight = GetMainLight(inputData.shadowCoord, inputData.positionWS, shadowMask);
#if defined(_SCREEN_SPACE_OCCLUSION)
AmbientOcclusionFactor aoFactor = GetScreenSpaceAmbientOcclusion(inputData.normalizedScreenSpaceUV);
mainLight.color *= aoFactor.directAmbientOcclusion;
surfaceData.occlusion = min(surfaceData.occlusion, aoFactor.indirectAmbientOcclusion);
#endif
MixRealtimeAndBakedGI(mainLight, inputData.normalWS, inputData.bakedGI);
half3 color = GlobalIllumination(brdfData, brdfDataClearCoat, surfaceData.clearCoatMask,
inputData.bakedGI, surfaceData.occlusion,
inputData.normalWS, inputData.viewDirectionWS);
color += LightingPhysicallyBased(brdfData, brdfDataClearCoat,
mainLight,
inputData.normalWS, inputData.viewDirectionWS,
surfaceData.clearCoatMask, specularHighlightsOff);
#ifdef _ADDITIONAL_LIGHTS
uint pixelLightCount = GetAdditionalLightsCount();
for (uint lightIndex = 0u; lightIndex < pixelLightCount; ++lightIndex)
{
Light light = GetAdditionalLight(lightIndex, inputData.positionWS, shadowMask);
#if defined(_SCREEN_SPACE_OCCLUSION)
light.color *= aoFactor.directAmbientOcclusion;
#endif
color += LightingPhysicallyBased(brdfData, brdfDataClearCoat,
light,
inputData.normalWS, inputData.viewDirectionWS,
surfaceData.clearCoatMask, specularHighlightsOff);
}
#endif
#ifdef _ADDITIONAL_LIGHTS_VERTEX
color += inputData.vertexLighting * brdfData.diffuse;
#endif
color += surfaceData.emission;
return half4(color, surfaceData.alpha);
}
Reading the Main Light lighting calculation
Since all we care about here is how the normal map contributes to an object's apparent bumpiness, let's follow only the Main Light lighting calculation.
color += LightingPhysicallyBased(brdfData, brdfDataClearCoat,
mainLight,
inputData.normalWS, inputData.viewDirectionWS,
surfaceData.clearCoatMask, specularHighlightsOff);
LightingPhysicallyBased
The LightingPhysicallyBased function is defined as follows.
Since what we want to know is where the Normal Map is used, we will track where normalWS, the world-space normal derived from the Normal Map, shows up.
half3 LightingPhysicallyBased(BRDFData brdfData, BRDFData brdfDataClearCoat,
half3 lightColor, half3 lightDirectionWS, half lightAttenuation,
half3 normalWS, half3 viewDirectionWS,
half clearCoatMask, bool specularHighlightsOff)
{
half NdotL = saturate(dot(normalWS, lightDirectionWS));
half3 radiance = lightColor * (lightAttenuation * NdotL);
half3 brdf = brdfData.diffuse;
#ifndef _SPECULARHIGHLIGHTS_OFF
[branch] if (!specularHighlightsOff)
{
brdf += brdfData.specular * DirectBRDFSpecular(brdfData, normalWS, lightDirectionWS, viewDirectionWS);
#if defined(_CLEARCOAT) || defined(_CLEARCOATMAP)
// Clear coat evaluates the specular a second timw and has some common terms with the base specular.
// We rely on the compiler to merge these and compute them only once.
half brdfCoat = kDielectricSpec.r * DirectBRDFSpecular(brdfDataClearCoat, normalWS, lightDirectionWS, viewDirectionWS);
// Mix clear coat and base layer using khronos glTF recommended formula
// https://github.com/KhronosGroup/glTF/blob/master/extensions/2.0/Khronos/KHR_materials_clearcoat/README.md
// Use NoV for direct too instead of LoH as an optimization (NoV is light invariant).
half NoV = saturate(dot(normalWS, viewDirectionWS));
// Use slightly simpler fresnelTerm (Pow4 vs Pow5) as a small optimization.
// It is matching fresnel used in the GI/Env, so should produce a consistent clear coat blend (env vs. direct)
half coatFresnel = kDielectricSpec.x + kDielectricSpec.a * Pow4(1.0 - NoV);
brdf = brdf * (1.0 - clearCoatMask * coatFresnel) + brdfCoat * clearCoatMask;
#endif // _CLEARCOAT
}
#endif // _SPECULARHIGHLIGHTS_OFF
return brdf * radiance;
}
Step 1: Diffuse reflection (Lambert)
The first thing that stands out is the following code.
half NdotL = saturate(dot(normalWS, lightDirectionWS));
half3 radiance = lightColor * (lightAttenuation * NdotL);
It takes the dot product of the normal normalWS and the light direction lightDirectionWS, and multiplies the result by the light color lightColor and the light's distance attenuation lightAttenuation.
This is the formula for diffuse reflection, known as Lambertian reflectance.
https://ja.wikipedia.org/wiki/%E3%83%A9%E3%83%B3%E3%83%90%E3%83%BC%E3%83%88%E5%8F%8D%E5%B0%84
The radiance computed here is used as a factor in the return value.
return brdf * radiance;
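Pulled out on its own, the diffuse contribution amounts to the following standalone sketch (a hypothetical helper, not URP API; it just restates the lines above). The normal-mapped normalWS directly drives how much light each pixel receives, which is exactly what makes the surface look bumpy.
// Standalone sketch of the Lambert diffuse term shown above (hypothetical helper).
half3 LambertDiffuseSketch(half3 diffuseColor, half3 normalWS, half3 lightDirectionWS,
                           half3 lightColor, half lightAttenuation)
{
    half NdotL = saturate(dot(normalWS, lightDirectionWS)); // facing ratio
    half3 radiance = lightColor * (lightAttenuation * NdotL);
    return diffuseColor * radiance;
}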
Step 2: Specular reflection (Minimalist CookTorrance BRDF)
The second thing that stands out is the following code.
brdf += brdfData.specular * DirectBRDFSpecular(brdfData, normalWS, lightDirectionWS, viewDirectionWS);
DirectBRDFSpecular is defined as follows; it implements a specular BRDF called the Minimalist CookTorrance BRDF.
// Computes the scalar specular term for Minimalist CookTorrance BRDF
// NOTE: needs to be multiplied with reflectance f0, i.e. specular color to complete
half DirectBRDFSpecular(BRDFData brdfData, half3 normalWS, half3 lightDirectionWS, half3 viewDirectionWS)
{
float3 halfDir = SafeNormalize(float3(lightDirectionWS) + float3(viewDirectionWS));
float NoH = saturate(dot(normalWS, halfDir));
half LoH = saturate(dot(lightDirectionWS, halfDir));
// GGX Distribution multiplied by combined approximation of Visibility and Fresnel
// BRDFspec = (D * V * F) / 4.0
// D = roughness^2 / ( NoH^2 * (roughness^2 - 1) + 1 )^2
// V * F = 1.0 / ( LoH^2 * (roughness + 0.5) )
// See "Optimizing PBR for Mobile" from Siggraph 2015 moving mobile graphics course
// https://community.arm.com/events/1155
// Final BRDFspec = roughness^2 / ( NoH^2 * (roughness^2 - 1) + 1 )^2 * (LoH^2 * (roughness + 0.5) * 4.0)
// We further optimize a few light invariant terms
// brdfData.normalizationTerm = (roughness + 0.5) * 4.0 rewritten as roughness * 4.0 + 2.0 to a fit a MAD.
float d = NoH * NoH * brdfData.roughness2MinusOne + 1.00001f;
half LoH2 = LoH * LoH;
half specularTerm = brdfData.roughness2 / ((d * d) * max(0.1h, LoH2) * brdfData.normalizationTerm);
// On platforms where half actually means something, the denominator has a risk of overflow
// clamp below was added specifically to "fix" that, but dx compiler (we convert bytecode to metal/gles)
// sees that specularTerm have only non-negative terms, so it skips max(0,..) in clamp (leaving only min(100,...))
#if defined (SHADER_API_MOBILE) || defined (SHADER_API_SWITCH)
specularTerm = specularTerm - HALF_MIN;
specularTerm = clamp(specularTerm, 0.0, 100.0); // Prevent FP16 overflow on mobiles
#endif
return specularTerm;
}
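Rewriting the comments above as a single formula (this is just my reading of the code, with $r$ standing for the roughness in brdfData and $N \cdot H$, $L \cdot H$ for the clamped dot products):

$$
\mathrm{specularTerm} \approx \frac{r^{2}}{\left((N \cdot H)^{2}\,(r^{2}-1)+1\right)^{2}\,(L \cdot H)^{2}\,(r+0.5)\cdot 4}
$$

The half vector $H$ sits between the light and view directions, so the normal-mapped normalWS enters through the $N \cdot H$ term and is what shapes and moves the specular highlight over the bumps.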
References
Specular BRDF
https://qiita.com/mebiusbox2/items/8db00cdcaf263992a5ce
Introduction to Physically Based Rendering, Part 2: Specular BRDF and Diffuse BRDF
https://light11.hatenadiary.com/entry/2020/03/03/195249
Unity User Manual: Normal map (Bump mapping)
https://docs.unity3d.com/ja/2018.4/Manual/StandardShaderMaterialParameterNormalMap.html