tag:blogger.com,1999:blog-64818311917485902972024-03-21T21:18:57.650+08:00SmallBurgerSmallBurger encapsulates passion, convenience, and efficiency within its delicately crafted burger. It is dedicated to delivering user-friendly rendering plugins and related tools that are simple yet effective.
廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.comBlogger221125tag:blogger.com,1999:blog-6481831191748590297.post-2966141208747943032024-03-04T23:54:00.002+08:002024-03-04T23:54:43.424+08:00The Simulation Scheme for Volumetric Fog<p> The asset:<br /><a href="https://assetstore.unity.com/packages/tools/painting/fakevolumefogpainter-274645">FakeVolumeFogPainter</a></p><p>The youtube:<br /><br /></p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="495" src="https://www.youtube.com/embed/VQFBwxe_cOQ" width="595" youtube-src-id="VQFBwxe_cOQ"></iframe></div><br /><div>The reference medium:<br /><a href="https://medium.com/@pioneering_catawba_cheetah_996/the-simulation-scheme-for-volumetric-fog-55db7de87315">The Simulation Scheme for Volumetric Fog</a><br /></div>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-48760019778813931392024-03-03T15:05:00.003+08:002024-03-03T15:26:20.464+08:00SmallBurger Asset Home<p><br /><span style="font-size: x-large;"> <a href="https://assetstore.unity.com/publishers/86820?fbclid=IwAR0GvglGbFYqpE7LAwZJdfMJEhz_FAXjR4VhVk7QVbBjgGiUy6wYsH7E5rI">SmallBurger</a></span></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjR15-6ZOtTRZSAWbfR4BfwXQZA_9jVgGRwtEHHv5Es5BfhPYdo4TnuIFlOEhPcZERD0PPFe0mZb6A8b969v148P-QNZ9uC3C9F04q4YG2zApAItaYyVlVfbdQGCgkYu_ZIi7-y5KfrhSinekxpsPFX_ql9Ca0mbRmh93T4_pfogdsa8B9_BgXhlJpIY7U-/s1200/87f219a6-e43f-4ce1-9f3b-45961cd7e9bb.jpg" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="800" data-original-width="1200" height="426" 
src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjR15-6ZOtTRZSAWbfR4BfwXQZA_9jVgGRwtEHHv5Es5BfhPYdo4TnuIFlOEhPcZERD0PPFe0mZb6A8b969v148P-QNZ9uC3C9F04q4YG2zApAItaYyVlVfbdQGCgkYu_ZIi7-y5KfrhSinekxpsPFX_ql9Ca0mbRmh93T4_pfogdsa8B9_BgXhlJpIY7U-/w640-h426/87f219a6-e43f-4ce1-9f3b-45961cd7e9bb.jpg" width="640" /></a></div><br /><p><br /></p>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-84949138529669094732024-03-03T11:46:00.002+08:002024-03-03T11:46:38.116+08:00Challenges in River Creation<p>The asset:<br /><a href="https://assetstore.unity.com/packages/tools/painting/fastriverpainter-272846">FastRiverPainter</a><br /></p><p>The youtube:<br /><br /></p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="498" src="https://www.youtube.com/embed/kj7NJ1ct8Wg" width="599" youtube-src-id="kj7NJ1ct8Wg"></iframe></div><br /><div>The reference medium:<br /><a href="https://medium.com/@pioneering_catawba_cheetah_996/challenges-in-river-creation-83ad8538f271">Challenges in River Creation</a><br /></div>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-14210850738152044252024-03-03T10:32:00.002+08:002024-03-03T10:32:29.838+08:00OriginTerrainBlending<p> The asset:<br /><a href="https://assetstore.unity.com/packages/vfx/shaders/originterrainblending-272303">OriginTerrainBlending</a><br /></p><p>The youtube:<br /><br /></p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="492" src="https://www.youtube.com/embed/oXY45XjVCuM" width="592" youtube-src-id="oXY45XjVCuM"></iframe></div><br /><div><span style="background-color: white; color: #333333; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: 14.85px;">The reference Medium:<br /><a 
href="https://medium.com/@pioneering_catawba_cheetah_996/mountain-edge-and-surface-transition-management-fe37f4de07f2">Mountain Edge and Surface Transition Management</a><br /></span></div>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-73056938278767423552024-03-03T10:28:00.001+08:002024-03-03T10:28:42.884+08:00GPUPlantPainter<p>The asset:<br /> <a href="https://assetstore.unity.com/packages/tools/painting/gpuplantpainter-266965">GPUPlantPainter</a></p><p><br />The youtube:<br /><br /></p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="475" src="https://www.youtube.com/embed/c7MDPh_NHBo" width="572" youtube-src-id="c7MDPh_NHBo"></iframe></div><br /><div>The reference Medium:<br /><a href="https://medium.com/@pioneering_catawba_cheetah_996/integration-and-application-of-drawmeshinstancedindirect-5fc8df7a2bf9">Integration and Application of DrawMeshInstancedIndirect</a><br /></div>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-23405726793949251022023-12-26T15:07:00.007+08:002023-12-26T16:08:25.927+08:00The conclusion on Unity Instancing API (2022) testing note<p> <span color="rgba(0, 0, 0, 0.9)" face=""Microsoft Yahei", "PingFang SC", "PingFang TC", "Hiragino Sans", "Hiragino Kaku Gothic Pro", -apple-system, system-ui, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", "Fira Sans", Ubuntu, Oxygen, "Oxygen Sans", Cantarell, "Droid Sans", "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Emoji", "Segoe UI Symbol", "Lucida Grande", Helvetica, Arial, sans-serif" style="background-color: white; font-size: 15px;">Conclusion on Unity Instancing API (2022) Testing:</span></p><p><span></span></p><a name='more'></a><p></p><span color="rgba(0, 0, 0, 0.9)" face=""Microsoft Yahei", "PingFang SC", "PingFang TC", "Hiragino Sans", "Hiragino Kaku 
Gothic Pro", sans-serif" style="background-color: white; font-size: 15px;">1. Regardless of whether you use DrawMeshInstanced or RenderMeshInstanced, both are limited by the underlying per-batch instance count. Currently it is not possible to draw 1023 instances in a single batch: Vulkan and PC need two batches, while GLES 3.0 needs nine.<br /><br />2. If the SRP Batcher is enabled, it forcibly overrides auto-instancing (enabled with only the material checkbox checked). It takes precedence over auto-instancing, presumably because Unity considers it to have better performance.<br /><br />3. When using auto-instancing (only the material checkbox checked), instance batching is performed automatically. The result appears similar to DrawMeshInstanced, and culling is also handled automatically. This calls the necessity of manual instancing (DrawMeshInstanced or RenderMeshInstanced) into question; perhaps using the Job System to perform per-instance culling asynchronously would be a better alternative.<br /><br />4. The order of precedence is SRP Batcher > auto-instancing > dynamic batching.<br /><br />5. Manual instancing (DrawMeshInstanced or RenderMeshInstanced) is completely independent of the SRP Batcher.<br /><br />6. If there are per-instance properties (apart from the transform attributes), manual instancing plus a MaterialPropertyBlock is still required.<br /><br />7. Currently, DrawMeshInstancedIndirect + GPU culling + a result RenderTexture seems the most advantageous approach for rendering a large quantity of the same mesh.<br /><br />If there are any errors, I appreciate feedback from experts. Thank you.</span>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com2tag:blogger.com,1999:blog-6481831191748590297.post-16440741723419225652023-12-16T10:42:00.005+08:002023-12-16T11:02:24.854+08:00InifityQuadOcean(My first unity asset package)<p>InifityQuadOcean is my first Unity plugin. Through my continued work on graphics rendering techniques for mobile games, I have observed that vertex count significantly affects the performance of mobile devices. The primary goal here is therefore to render an infinite ocean using a single quad, while still meeting basic requirements such as Bidirectional Reflectance Distribution Function (BRDF) lighting, undulating water surfaces near the shore, and waves and foam.<span></span></p><p>The relevant plugin link: <a href="https://assetstore.unity.com/packages/vfx/shaders/inifityquadocean-258528">InifityQuadOcean</a></p><a name='more'></a><p></p><p>The first challenge lies in simulating waves. Here we primarily use a method based on flowing flat-water normals. Handling the undulating motion near the shore, however, is not as straightforward. To address it, we dynamically render a pull-depth texture and inverse-project it back to world positions. 
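The depth-to-world reconstruction at the heart of this technique can be sketched as follows. This is a minimal numpy sketch, assuming a D3D-style [0,1] depth range and a toy view-projection matrix; the function and variable names are mine, not the plugin's:

```python
import numpy as np

def world_from_depth(uv, depth01, inv_view_proj):
    """Reconstruct a world-space position from a depth-texture sample."""
    # Rebuild the clip-space position from the UV and the stored depth.
    clip = np.array([uv[0] * 2.0 - 1.0,
                     uv[1] * 2.0 - 1.0,
                     depth01,
                     1.0])
    world = inv_view_proj @ clip
    return world[:3] / world[3]  # perspective divide

# Round trip with a toy (orthographic-like) view-projection matrix:
vp = np.diag([0.5, 0.5, 1.0, 1.0])
p = np.array([1.0, 2.0, 0.25, 1.0])   # a known world-space point
clip = vp @ p
ndc = clip[:3] / clip[3]
uv = (ndc[:2] + 1.0) * 0.5            # what the rasterizer would give us
rebuilt = world_from_depth(uv, ndc[2], np.linalg.inv(vp))
print(np.allclose(rebuilt, p[:3]))    # True
```

The real shader does the same multiply with the camera's actual inverse view-projection matrix; the round trip above only demonstrates that the reconstruction inverts the projection.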
Those interested in further details can refer to my GitHub repository for this project: <a href="https://github.com/AkilarLiao/DynamicPullDepthMap">DynamicPullDepthMap</a><br /></p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="537" src="https://www.youtube.com/embed/qJc5gJPG17A" width="645" youtube-src-id="qJc5gJPG17A"></iframe></div><div class="separator" style="clear: both; text-align: center;"><br /></div>Next is the noise module, where we overlay second-order Value Noise to achieve a more undulating look, and intensify the disturbance to enhance the overall appearance, resulting in the following effect:<p></p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEi4ienXmviqzvV5bgAd-Mb0XRLQVBxkDZwlYAEzcLBiLI-T2nIJv70EI3WXBwz7cU4n5iXSZihW2DrOIVkpGfeNi1HpM0GCGAilJgN3L7PgoruDf070J-fWSbdl9nbBMn3AXsr6YH6kYpYNXKzk2vjguot6FvJjjsYU9c3-UrtZAmuzOUOq_RTwaZtJDCUx" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="424" data-original-width="834" height="326" src="https://blogger.googleusercontent.com/img/a/AVvXsEi4ienXmviqzvV5bgAd-Mb0XRLQVBxkDZwlYAEzcLBiLI-T2nIJv70EI3WXBwz7cU4n5iXSZihW2DrOIVkpGfeNi1HpM0GCGAilJgN3L7PgoruDf070J-fWSbdl9nbBMn3AXsr6YH6kYpYNXKzk2vjguot6FvJjjsYU9c3-UrtZAmuzOUOq_RTwaZtJDCUx=w640-h326" width="640" /></a></div><p>The reference link for the noise: <a href="https://www.shadertoy.com/view/4dS3Wd">1D, 2D & 3D Value Noise</a></p><p>Next is the lighting: from the depth information we derive the water depth, which determines the colors for diffuse and subsurface scattering (SSS). We then overlay the refraction processing, specular reflection, and foam color. 
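The two-octave value-noise overlay used for the surface detail can be sketched like this. It is a Python sketch with a hypothetical integer hash; the actual shader follows the Shadertoy reference above:

```python
import math

def _hash(ix, iy):
    # Integer hash mapped to [0, 1); any decent hash works here.
    h = (ix * 374761393 + iy * 668265263) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 0xFFFFFFFF

def value_noise(x, y):
    ix, iy = math.floor(x), math.floor(y)
    fx, fy = x - ix, y - iy
    # Smoothstep fade for smooth interpolation between lattice values.
    ux, uy = fx * fx * (3 - 2 * fx), fy * fy * (3 - 2 * fy)
    a, b = _hash(ix, iy), _hash(ix + 1, iy)
    c, d = _hash(ix, iy + 1), _hash(ix + 1, iy + 1)
    # Bilinear blend of the four corner values.
    return (a + (b - a) * ux) + ((c - a) + (a - b + d - c) * ux) * uy

def fbm2(x, y):
    # Two octaves: the second at double frequency, half amplitude.
    return (value_noise(x, y) + 0.5 * value_noise(x * 2, y * 2)) / 1.5

samples = [fbm2(i * 0.37, i * 0.61) for i in range(100)]
print(all(0.0 <= s <= 1.0 for s in samples))  # True
```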
Finally, we calculate the lighting based on the Fresnel effect, resulting in the following effect:</p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEgZusHPPWZA7dLUfp7DeA3xLXCHUXcynglIPQGc8jxqQZf9x0aizUG3pxM2vnjM4zfEkgRNf1BxB1aclLyEfvPMcwkFORa_Xy5oJbYgSGetfbkIv-_aVldEMruXq6pA74NdAbN3Zwimk16DOU3Mnen320-VVY0VyGB-qt6hxrD2vks_2xGAoXN1yX-_8P_B" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="374" data-original-width="734" height="326" src="https://blogger.googleusercontent.com/img/a/AVvXsEgZusHPPWZA7dLUfp7DeA3xLXCHUXcynglIPQGc8jxqQZf9x0aizUG3pxM2vnjM4zfEkgRNf1BxB1aclLyEfvPMcwkFORa_Xy5oJbYgSGetfbkIv-_aVldEMruXq6pA74NdAbN3Zwimk16DOU3Mnen320-VVY0VyGB-qt6hxrD2vks_2xGAoXN1yX-_8P_B=w640-h326" width="640" /></a></div><br />Regarding the details of foam, there are currently three types of foam effects: Wave, Edge, and FX (Interactive water waves). For the FX part, we employ the method of splatting particle weight maps. Finally, these are overlaid to create the FoamMask, resulting in the following effect:<p></p><p></p><div class="separator" style="clear: both; text-align: center;"><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEg0oDj-qVSpIxs9L9gzxT-Z4PjfbzzAgV1rJSOsx-9h-TVeHFIP4WUmNWBvab873qsXgoS06HerW_YSgljWTlIWrMWKc-tUd4J1GjBtxuBAHy4TTfFcl6HjFVzEFTvSzz20RtvUeiRicCDn6Ipgzq7GQ76Mq-0_9E5Tb-tV4m62_8hzh_RgftEote1fzhM7" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="314" data-original-width="616" height="326" src="https://blogger.googleusercontent.com/img/a/AVvXsEg0oDj-qVSpIxs9L9gzxT-Z4PjfbzzAgV1rJSOsx-9h-TVeHFIP4WUmNWBvab873qsXgoS06HerW_YSgljWTlIWrMWKc-tUd4J1GjBtxuBAHy4TTfFcl6HjFVzEFTvSzz20RtvUeiRicCDn6Ipgzq7GQ76Mq-0_9E5Tb-tV4m62_8hzh_RgftEote1fzhM7=w640-h326" width="640" /></a></div></div><p></p><p>In addition, we have added a caustic effect. 
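The Fresnel-based blend in the lighting step can be sketched with Schlick's approximation. This is an assumption for illustration, since the post does not show the asset's exact formula:

```python
def fresnel_schlick(cos_theta, f0=0.02):
    # Schlick's approximation; f0 of roughly 0.02 suits an air-water interface.
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def shade(water_color, reflection_color, cos_theta):
    # Blend the refracted water color toward the reflection at grazing angles.
    f = fresnel_schlick(cos_theta)
    return tuple(w + (r - w) * f for w, r in zip(water_color, reflection_color))

# Looking straight down: almost pure water color.
print(shade((0.0, 0.3, 0.5), (0.8, 0.9, 1.0), cos_theta=1.0))
# Grazing view: the result is dominated by the reflection color.
print(shade((0.0, 0.3, 0.5), (0.8, 0.9, 1.0), cos_theta=0.0))
```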
For performance considerations, we have opted for a simple flowing texture approach, resulting in the following effect:</p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEjN04si1njPsTHFQH9qoRcXxgQeaI83ZVIGCEMpSsxIZLdLHYDBH9YTXOMFV_Ea8-VtKl1t6J_RZR8W4bXO5Fe2_lu2GgGdK3xx4HQ1cBTiASO_Xj2o4kANn1L1iY8WJweetUjJyDcHJhG_OTdSrVhcJImVGxO8c1ruIozJTTYOEsPVhGB1tYtIZu7KxQY3" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="330" data-original-width="650" height="324" src="https://blogger.googleusercontent.com/img/a/AVvXsEjN04si1njPsTHFQH9qoRcXxgQeaI83ZVIGCEMpSsxIZLdLHYDBH9YTXOMFV_Ea8-VtKl1t6J_RZR8W4bXO5Fe2_lu2GgGdK3xx4HQ1cBTiASO_Xj2o4kANn1L1iY8WJweetUjJyDcHJhG_OTdSrVhcJImVGxO8c1ruIozJTTYOEsPVhGB1tYtIZu7KxQY3=w640-h324" width="640" /></a></div><br /><div class="separator" style="clear: both; text-align: left;">Here, I'll explain the issue regarding the intersection between the distant view and the skybox. To address this, we gradually fade out based on the camera distance, blending into the background color to avoid the problem. 
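The distance-based fade toward the background color is essentially a clamped lerp. A small sketch, with made-up fade distances:

```python
def distance_fade(ocean_color, background_color, dist, fade_start, fade_end):
    # 0 at fade_start, 1 at fade_end, clamped outside that range.
    t = (dist - fade_start) / (fade_end - fade_start)
    t = min(max(t, 0.0), 1.0)
    return tuple(o + (b - o) * t for o, b in zip(ocean_color, background_color))

ocean, sky = (0.0, 0.3, 0.5), (0.7, 0.8, 0.9)
near = distance_fade(ocean, sky, dist=100.0, fade_start=500.0, fade_end=2000.0)
far = distance_fade(ocean, sky, dist=3000.0, fade_start=500.0, fade_end=2000.0)
print(near)  # unchanged ocean color
print(far)   # fully blended into the background
```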
The effect is as follows:</div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEi-wxpSfadhdixuI-TyPNu4Jac8ws6tMAhH7w8O7OvwcWb2B0XPclqLJ53bxXHv0W80GLTsJTCjoO509boizcScc3A1NbuayoAAP7iTQqigBJAxqApooP4lSUAvoQmT0Upmy8UDcy0WUmNMJVSjqyO-7NjMdPNsLv6GnFRrXA0ek60kcOrwvr2kcSoZY2hg" style="margin-left: 1em; margin-right: 1em;"><img alt="" data-original-height="338" data-original-width="666" height="324" src="https://blogger.googleusercontent.com/img/a/AVvXsEi-wxpSfadhdixuI-TyPNu4Jac8ws6tMAhH7w8O7OvwcWb2B0XPclqLJ53bxXHv0W80GLTsJTCjoO509boizcScc3A1NbuayoAAP7iTQqigBJAxqApooP4lSUAvoQmT0Upmy8UDcy0WUmNMJVSjqyO-7NjMdPNsLv6GnFRrXA0ek60kcOrwvr2kcSoZY2hg=w640-h324" width="640" /></a></div><br /><p></p>Finally, I've included the full showcase video. Please pay special attention to the WireFrame display, where you can see that the water surface is represented by only four points. This is why the plugin is called InifityQuadOcean. 
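Because the visible mesh is a single quad, the shader has to recover the ocean-surface position per pixel. One common way to do that, shown here as an illustrative sketch rather than the plugin's actual shader code, is to intersect each camera ray with the horizontal water plane:

```python
def ocean_plane_hit(cam_pos, ray_dir, ocean_height=0.0):
    """Intersect a camera ray with the plane y = ocean_height.

    Returns the world-space hit point, or None when the ray looks above
    the horizon and never reaches the plane.
    """
    dy = ray_dir[1]
    if abs(dy) < 1e-8:
        return None
    t = (ocean_height - cam_pos[1]) / dy
    if t <= 0.0:  # the plane is behind the camera
        return None
    return tuple(p + d * t for p, d in zip(cam_pos, ray_dir))

print(ocean_plane_hit((0.0, 10.0, 0.0), (0.0, -1.0, 1.0)))  # (0.0, 0.0, 10.0)
print(ocean_plane_hit((0.0, 10.0, 0.0), (0.0, 0.5, 1.0)))   # None
```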
The video is provided below:<div><br /><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="527" src="https://www.youtube.com/embed/kcAEkprsAVE" width="633" youtube-src-id="kcAEkprsAVE"></iframe></div><br /><div><br /></div></div>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-596275405889359192023-08-04T12:20:00.006+08:002023-08-04T13:22:43.878+08:00The follow camera GPU instance particles<p>Implementing a follow-camera particle effect with a nine-grid is a good idea: it avoids particles visibly drifting along with the camera, as well as the need for pre-warming (without which particles cannot spawn in time). Moreover, the built-in particles do no per-instance culling, which has a considerable performance impact. Here, GPU instancing and GPU culling are used to implement the system, with particle displacement simulated in a ComputeShader. The overall pipeline is as follows:</p><span><a name='more'></a></span><p><br /></p><p></p><ol style="text-align: left;"><li>Generate a randomized transform buffer</li><li>Implement the follow-camera nine-grid</li><li>Allocate the buffers and RenderTextures the ComputeShader needs</li><li>Implement the ComputeShader (GPU culling plus particle displacement simulation)</li><li>Implement the instancing shader for rendering; for cross-platform reasons, SSBOs are not used</li></ol><p></p><p></p><div class="separator" style="clear: both; text-align: center;"><br /></div><div>Particles are spawned in the nine grid cells around the camera, ensuring the view is filled with particles no matter which direction the camera faces,</div><div>but only particles inside the view frustum are submitted to the GPU for rendering.</div><div><br /></div><div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEjKW6a8mgYnomNauoXu34ZpQ1xpz6rutWbW3t9XUjgEkkgP6EAK_w82jzMjNeT4vB_Zbvf98NUnaahUC3u_xMo9zMeCmICv4XcxrjicLAmAUr3GlX916V7YaVIdQGOkCf3Mb7lD4vjOfRk5jL-0w5-tskD926ble6YJ7qN6mIJ4RdDLLiLly0WJE2f-gMWY" style="margin-left: 1em; margin-right: 1em;"><img data-original-height="630" data-original-width="760" height="530" src="https://blogger.googleusercontent.com/img/a/AVvXsEjKW6a8mgYnomNauoXu34ZpQ1xpz6rutWbW3t9XUjgEkkgP6EAK_w82jzMjNeT4vB_Zbvf98NUnaahUC3u_xMo9zMeCmICv4XcxrjicLAmAUr3GlX916V7YaVIdQGOkCf3Mb7lD4vjOfRk5jL-0w5-tskD926ble6YJ7qN6mIJ4RdDLLiLly0WJE2f-gMWY=w640-h530" width="640" /></a></div><p>The resulting video:</p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="532" src="https://www.youtube.com/embed/O5TguKS4sbI" width="640" youtube-src-id="O5TguKS4sbI"></iframe></div><br /><p><br /></p></div>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-73240069334179737142023-07-25T13:40:00.001+08:002023-07-25T13:58:51.398+08:00The keep render frame<p>Keeping the previous render frame is a clever technique: unlike a TrailRenderer it does not need to maintain triangles dynamically, and unlike a particle system it does not need to keep emitting particles; it simply preserves the previous frame's result. The drawback is that it is bounded by the size of the whole texture, which makes it hard to apply to open scenes. Unity ships a related sample; I only handled the part it cannot do, namely fading the kept result out over time, and you can consult its implementation yourself.<br />The video is below…</p><span><a name='more'></a></span><br />
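The keep-and-fade step can be sketched per frame as follows. The exponential decay and max-merge are my assumptions for illustration, not necessarily what the Unity sample or my modification actually use:

```python
def keep_frame_step(kept, fresh, decay=0.9):
    # Fade last frame's kept result, then merge newly rendered marks on top.
    return [max(k * decay, f) for k, f in zip(kept, fresh)]

# A mark of strength 1.0 fades over three frames with nothing new drawn.
buf = [1.0, 0.0, 0.0]
for _ in range(3):
    buf = keep_frame_step(buf, [0.0, 0.0, 0.0])
print(buf[0])  # ~0.729
```

In the real system `kept` is the retained RenderTexture, `fresh` is the current frame's marks, and the blend runs in a shader; this only shows the per-texel logic.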
<div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="537" src="https://www.youtube.com/embed/GZXamMjlZXI" width="645" youtube-src-id="GZXamMjlZXI"></iframe></div><br /><p></p>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-73434175567636419862023-02-21T18:53:00.002+08:002023-02-21T18:58:34.751+08:00The Projected Grid Approach (a foundational module for infinite oceans, URP+SRP)<p>Many people simply drop a water plugin into their project, but in mid-to-late development all sorts of integration problems can surface. Spending some time understanding the modules the plugin is built from is then necessary, and it even opens the door to customization. The projected grid is one of the foundational modules of any infinite-ocean solution.<span></span></p><a name='more'></a><p></p><p>ChatGPT has become quite popular lately. It can help a great deal with general algorithms, math, and physics formulas, and can even write demo code for you, so writing every algorithm yourself is no longer essential. I personally believe that the integration work will matter more than the algorithm implementation; put simply, knowing how to use an algorithm may be more important than knowing how to write it. Deriving the projected grid algorithm is therefore not the focus of this article; the relevant theory link is attached at the end for reference.</p><p></p><ol style="text-align: left;"><li>Build the projected grid mesh according to the camera resolution</li><li>Each frame, recompute the projector matrix from the camera</li><li>Draw the projected grid mesh</li><li>Use a geometry shader to display the wireframe for debugging, drawn with a RenderObjects pass</li></ol><p></p><p>Demo video:</p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="537" src="https://www.youtube.com/embed/AOOFQI-i0s4" width="645" youtube-src-id="AOOFQI-i0s4"></iframe></div><br /><p>Theory article:</p><p><a href="http://habib.wikidot.com/projected-grid-ocean-shader-full-html-version">Real-time water rendering - Introducing the projected grid concept</a></p><p>Finally, the related GitHub link:</p><p><a href="https://github.com/AkilarLiao/ProjectedGridSRP">AkilarLiao / ProjectedGridSRP</a><br /></p><p>Dream continues 
in...</p>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-29127419411297389912023-02-10T17:52:00.007+08:002023-02-13T09:54:04.195+08:00A solution for FairyGUI or UI drawn via stacking cameras being affected by RenderScale<p>Since URP arrived, scaled rendering has become mainstream in Unity. It is a good way to optimize 3D rendering on mobile devices, and of course plenty of people built similar schemes on the built-in pipeline before.</p><p>URP later added a camera-stacking flow similar to the built-in pipeline's, but anyone who looks closely will find that it is affected by the RenderScale parameter. If you use it to draw UI, the rendering of text and icons can be a disaster.<br />Some suggest simply drawing the UI with an Overlay camera, but projects still have special needs that require rendering with a camera, and a system like FairyGUI, which renders mostly through cameras, is heavily affected. Nor can you disable RenderScale just for the sake of UI quality.<br /></p><p>I had not used FairyGUI before. After encountering it through work, I found it has some integration problems with URP, mainly because it draws almost everything with cameras and therefore must go through the stacked-rendering flow, unlike UGUI, which also offers an Overlay mode. Thus began the FitRenderScaleURP journey.<span></span></p><a name='more'></a><p></p><p>Goals:</p><p></p><ol style="text-align: left;"><li>Let the UI camera render at the RenderScale = 1.0 screen resolution</li><li>Allow stacked rendering in order</li><li>Use no additional full-screen RenderTexture</li></ol><p></p><p>The most intuitive idea is to open a full-screen RenderTexture and draw the UI into it, but anyone with experience knows that render textures cannot be compressed and occupy a great deal of memory; on mobile devices every saving counts, so this approach is unsuitable. The best option at present is to draw directly to the framebuffer. You may have noticed that the RenderPipelineAsset has a parameter called <b><span style="color: red;">IntermediateTexture</span></b>; it controls whether the RenderTexture path is forced. Testing shows that if we set this parameter to <span style="color: red;"><b>Auto</b></span> and <b><span style="color: red;">RenderScale is 1.0</span></b>, URP will try to draw using the <span style="color: red;"><b>FrameBuffer</b></span>, provided that <span style="color: red;"><b>DepthTexture and OpaqueTexture</b></span> are not used. This gives us a way to draw directly to the framebuffer without modifying URP's source code.</p><p><br /></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEi8MHV1F_gro6o8v9DPDACnc-zKfTGhzYWX_VUdNvDtsXjO436DnSyYF_yG7zrtmLyaHkcR9b28vJ9tF90cdhiOMpoby-j8lWrd-Luq8eAckoDDfYfOOiDFK0TM9sOMUSYycHa7mlkc1GV65zZ0SP1DxPwZVYCwepeIZRoL2cV31CL77Hs2Jb3tCW564g" style="margin-left: 1em; margin-right: 1em;"><img data-original-height="604" data-original-width="614" height="630" 
src="https://blogger.googleusercontent.com/img/a/AVvXsEi8MHV1F_gro6o8v9DPDACnc-zKfTGhzYWX_VUdNvDtsXjO436DnSyYF_yG7zrtmLyaHkcR9b28vJ9tF90cdhiOMpoby-j8lWrd-Luq8eAckoDDfYfOOiDFK0TM9sOMUSYycHa7mlkc1GV65zZ0SP1DxPwZVYCwepeIZRoL2cV31CL77Hs2Jb3tCW564g=w640-h630" width="640" /></a></div><br /><p></p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEi59SUL33MWudRNKwrkrIqdGpQ4NyhUSwQ6TzsHuE4YqEf3GYEd-2MdmDHJokcK4ebUuqG8AvmMt556ESQEODRj_vuewtBFT7Al_aN8bqK9hwg95_jjrZyLYAkvYxZWh6eOI41MIsOYm6mvT4yCxjFiIr9N9FsilxwJuOwY3vhl8TzK3Ml7BZJ-yYKYJQ" style="margin-left: 1em; margin-right: 1em;"><img data-original-height="531" data-original-width="600" height="567" src="https://blogger.googleusercontent.com/img/a/AVvXsEi59SUL33MWudRNKwrkrIqdGpQ4NyhUSwQ6TzsHuE4YqEf3GYEd-2MdmDHJokcK4ebUuqG8AvmMt556ESQEODRj_vuewtBFT7Al_aN8bqK9hwg95_jjrZyLYAkvYxZWh6eOI41MIsOYm6mvT4yCxjFiIr9N9FsilxwJuOwY3vhl8TzK3Ml7BZJ-yYKYJQ=w640-h567" width="640" /></a></div><br />That said, I still keep the DepthTexture and OpaqueTexture options checked here, because the project genuinely needs them. We solve the conflict with the camera's override settings: the UI camera simply overrides these options off.<p></p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEjJwtVxLvUx_uj6LcWn429zZMtU0Xk4HlRMGquUjYx7WJOZMKf0yklHZ-imOEvoQ5lelRq-FoNBtgUgT3pjT13U7buwRVLPw3GVXseNly87d5-kvVtCPEWGCHfZa5da307jxtJ8__4RtewK2ew-uuJ31DKk7Qgwb4wYg_9j87D-nhUS6oeVow-bYl85bA" style="margin-left: 1em; margin-right: 1em;"><img data-original-height="447" data-original-width="596" height="480" src="https://blogger.googleusercontent.com/img/a/AVvXsEjJwtVxLvUx_uj6LcWn429zZMtU0Xk4HlRMGquUjYx7WJOZMKf0yklHZ-imOEvoQ5lelRq-FoNBtgUgT3pjT13U7buwRVLPw3GVXseNly87d5-kvVtCPEWGCHfZa5da307jxtJ8__4RtewK2ew-uuJ31DKk7Qgwb4wYg_9j87D-nhUS6oeVow-bYl85bA=w640-h480" width="640" /></a></div><br 
/>Now for today's star, the FitRenderScaleControler script. Basically you just attach it to the relevant camera. Its principle: when this camera is about to be rendered, set RenderScale to 1; once it has finished rendering, set the parameter back to its original value. Listening to the beginCameraRendering and endCameraRendering events is all it takes.<p></p><p>In the FrameDebugger we can see that it draws directly into the FrameBuffer rather than into a RenderTexture.</p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEhQEDQ91HD2z7d1wdQUNnSgSDqhDLQoQMQsl6OdMyKB2aqqudqsXsxrjeWjsXW-X-9f3qg5TRfvtOMOcNzMN4L94fNGFGNGyW5aWVXKXo5vbvosv1kEhbstj8k2qTHWtdftoIwQYA-ddZl7DBohdrRkZ_TtSHlm17xFqtzYiN_ux7TWJGEdBZVt6YmdHw" style="margin-left: 1em; margin-right: 1em;"><img data-original-height="698" data-original-width="673" height="640" src="https://blogger.googleusercontent.com/img/a/AVvXsEhQEDQ91HD2z7d1wdQUNnSgSDqhDLQoQMQsl6OdMyKB2aqqudqsXsxrjeWjsXW-X-9f3qg5TRfvtOMOcNzMN4L94fNGFGNGyW5aWVXKXo5vbvosv1kEhbstj8k2qTHWtdftoIwQYA-ddZl7DBohdrRkZ_TtSHlm17xFqtzYiN_ux7TWJGEdBZVt6YmdHw=w616-h640" width="616" /></a></div><br />One more detail to watch: for the BaseCamera we stack on top, remember to set its BackgroundType to Uninitialized, so that it does not clear what has already been drawn and the layers can actually stack…<p></p><p></p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEgZ9znm3ZcLnecKWP_dm_iTB1yCr4GoYCWGW48iN0ubTB-lHlR72jKw57NRBUZR-z_4-sdsOO1Xgj_Ah-uwvK-J2d8doAJojSC4GEnrGSJZAdOmiz_rVJjomS3owtsDluZ2gHSKn3oJUiLcl3bHZ6UXOPtEjYf0Elp2QBWTM3q-0LFnD0NYwEfEO35mlA" style="margin-left: 1em; margin-right: 1em;"><img data-original-height="678" data-original-width="616" height="640" src="https://blogger.googleusercontent.com/img/a/AVvXsEgZ9znm3ZcLnecKWP_dm_iTB1yCr4GoYCWGW48iN0ubTB-lHlR72jKw57NRBUZR-z_4-sdsOO1Xgj_Ah-uwvK-J2d8doAJojSC4GEnrGSJZAdOmiz_rVJjomS3owtsDluZ2gHSKn3oJUiLcl3bHZ6UXOPtEjYf0Elp2QBWTM3q-0LFnD0NYwEfEO35mlA=w581-h640" width="581" /></a></div><br /><br /><p></p><p>The FitRenderScaleControler code is just a few simple lines…</p><p></p><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: 
center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEgash8RdLl-73neddjC-6pFPvs1E601bEZQuatqY7XSHE0LFG44VDFuZUWCYKSEj6q6N3bQZkThDYks7aYxgCOMnkrseQ3Ge0rjioIjyJmc8OfY_-V0119buxr-qIsbOIV3Ymbn4dMIffLmd_tpvUeCGulkLnDoE2zzf-PP2Se3u2Go-4zQ8RoR9chPqQ" style="margin-left: 1em; margin-right: 1em;"><img data-original-height="581" data-original-width="1261" height="294" src="https://blogger.googleusercontent.com/img/a/AVvXsEgash8RdLl-73neddjC-6pFPvs1E601bEZQuatqY7XSHE0LFG44VDFuZUWCYKSEj6q6N3bQZkThDYks7aYxgCOMnkrseQ3Ge0rjioIjyJmc8OfY_-V0119buxr-qIsbOIV3Ymbn4dMIffLmd_tpvUeCGulkLnDoE2zzf-PP2Se3u2Go-4zQ8RoR9chPqQ=w640-h294" width="640" /></a></div><br />Related video:<p></p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="540" src="https://www.youtube.com/embed/r-i5jjfnJes" width="650" youtube-src-id="r-i5jjfnJes"></iframe></div><br /><p>This approach is not entirely without drawbacks: if you want 3D UI (character health bars, title bars, and the like) that sorts against 3D objects (say, hidden behind a mountain), it cannot reference the DepthBuffer, so those elements can only be drawn by the MainCamera.</p><p>Finally, the related GitHub link:<br /></p><p><a href="https://github.com/AkilarLiao/FitRenderScaleURP">AkilarLiao / FitRenderScaleURP</a><br /></p>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-5971768136148401402023-01-09T16:33:00.005+08:002023-01-10T16:28:59.763+08:00A Simple Fragment-Shader Light-Culling Forward+ (for Mobile)<p>Forward+ has recently become one of the more popular rendering pipelines; even Unity has one in development for a newer version, using Jobs for the culling. My own implementation culls with a ComputeShader; see:<br /><a href="https://github.com/AkilarLiao/ForwardPlusURP">AkilarLiao / ForwardPlusURP</a><br />But the current mainstream approaches almost all use SSBOs and a fair amount of memory. Thinking about it, depth overlap in a top-down view is not that severe, so I wondered whether a single fragment shader could handle pixel-level culling, and thus the fragment-shader light-culling Forward+ journey began…<span></span></p><a name='more'></a><p></p><p>The steps:</p><p></p><ol style="text-align: left;"><li>PreDepthPass: one of Forward+'s standard stages, not very mobile-friendly. Later I may see whether this step can be dropped by reusing the previous frame's depth texture, though that feels a bit troublesome.</li><li>Write a FragmentLightCullingPass that generates the LightIndexTexture: simply gather the light data from the visibleLights Unity returns, fill the related UBO while excluding the main light, then pass the depth texture into the fragment shader and run it to collect lights.</li><li>In the fragment shader, loop over the light list; currently each pixel stores at most two lights. Collection compares each light's position with the world position reconstructed from the depth texture, keeping only the two nearest lights. The texture is also downsized, tile-like, since such fine granularity is unnecessary, which improves light-culling performance.</li><li>Finally, the lighting pass samples this LightIndexTexture, fetches the corresponding light indices and count, and performs regular BRDF lighting.</li></ol><p></p><p>The biggest benefit of this approach is avoiding SSBOs and ComputeShaders, which is friendlier for cross-platform work; the drawback is that fewer lights are supported per pixel. Still, it optimizes quite a bit over native URP's light-weight per-object light indexing, especially for large terrain or large objects. Related video:</p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="511" src="https://www.youtube.com/embed/fhkjpaBWgFw" width="614" youtube-src-id="fhkjpaBWgFw"></iframe></div><br /><p>Dream continues in...</p><p><br /></p>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-20758868284034955282022-12-31T03:31:00.006+08:002023-01-06T11:45:21.011+08:00Using MRT to Optimize SSAO Performance on Mobile<p> Regarding depth textures on mobile: the experienced will remember that Unity once forced a depth prepass on GLES no matter what, never taking the CopyDepth path. That restriction should really only be needed when hardware MSAA is in use; it troubled many people until the team adjusted it in a newer version, to everyone's relief…</p><p>While testing the official ScreenSpaceAmbientOcclusion, I found that its Depth Normals mode always runs a DepthNormalPrepass whether or not AfterOpaque is set, which is very unfriendly on mobile, the same problem as the depth prepass. The pure Depth mode plus AfterOpaque avoids it, but without normals the quality is worse, so the MRT + SSAO journey began (many equate MRT with deferred rendering, but it can also be applied as an optimization).<span></span></p><a name='more'></a><p></p><p>The steps:</p><p></p><ol style="text-align: left;"><li>Design a SetupScreenNormalMRTPass to configure the MRT render targets, executed at BeforeRenderingOpaques so that the MRT surfaces are ready before the opaques render. Remember to do this only in the GameView: using MRT in the SceneView currently causes refresh errors, and the SceneView already has the DepthNormal prepass underneath anyway. Note that the CameraNormalMap data here is not a floating-point texture, because the MRT format must match the ColorRenderTarget.</li><li>Make an SSAOPass as well; the difference from the native one is that the GameView takes the MRT path, while the SceneView simply calls ConfigureInput(ScriptableRenderPassInput.Normal) and lets the underlying DepthNormalOnlyPass handle it. The rest of the code matches the native version.</li><li>Next, the shader: add #pragma multi_compile _ PROCESS_MRT_NORMAL to the regular pass, which makes the GameView output normals via MRT while the SceneView leaves normal output to the regular DepthNormals pass.</li><li>Finally, the SSAO post-process shader: essentially identical to the native one, except the normal fetch branches on whether the source is the MRT or the regular CameraNormal texture. The MRT one is not a floating-point texture, so it needs UnpackNormalMap; the native CameraNormal is floating point, so it does not.</li></ol><p></p><p>The native FrameDebugger, with the extra DepthNormalPrepass:</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirehDRwOLQdt_Va0FGqqEXxIOXHtIucfOm4vakAzB_c3qnArSD_JhQrTeYIoB-v6CuznhLh9jpePAmxA8HstftPXkCMIskEAsgt_iJx6iL_dkH8GEyKczPVYoE2X5X0akzAE7FoTmrwh6wuAsu79Uog_XZpVpFWWk_4oyTXhO7-oSNc1V1jzY_EYrVxA/s813/OriginalSSAOFrameDebug.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="476" data-original-width="813" height="374" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirehDRwOLQdt_Va0FGqqEXxIOXHtIucfOm4vakAzB_c3qnArSD_JhQrTeYIoB-v6CuznhLh9jpePAmxA8HstftPXkCMIskEAsgt_iJx6iL_dkH8GEyKczPVYoE2X5X0akzAE7FoTmrwh6wuAsu79Uog_XZpVpFWWk_4oyTXhO7-oSNc1V1jzY_EYrVxA/w640-h374/OriginalSSAOFrameDebug.png" width="640" /></a></div><br /><p></p><p>The MRT SSAO FrameDebugger, without the DepthNormalPrepass:</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgTV46L5kKgFSNKauD407VYyhlgLHaKrRGceeO47UMxqH_CmhrOZRm3fuHYgdPsFW0y_Y6AZcoeGuSGKg2KVszR62JCKyns_QS38UivohB_lyq-GBGaUcTjX-nVAIIPBO5FlWVqxnvXDCHhzrLtu5mT-wiTIk5Nq5UGWBHeL-V1nV7v89rTebJeDHdPWQ/s811/MRTSSAOFrameDebug.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" 
data-original-height="477" data-original-width="811" height="376" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgTV46L5kKgFSNKauD407VYyhlgLHaKrRGceeO47UMxqH_CmhrOZRm3fuHYgdPsFW0y_Y6AZcoeGuSGKg2KVszR62JCKyns_QS38UivohB_lyq-GBGaUcTjX-nVAIIPBO5FlWVqxnvXDCHhzrLtu5mT-wiTIk5Nq5UGWBHeL-V1nV7v89rTebJeDHdPWQ/w640-h376/MRTSSAOFrameDebug.png" width="640" /></a></div><br /><p></p><p>Related video:</p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="434" src="https://www.youtube.com/embed/UCUenSfcj68" width="522" youtube-src-id="UCUenSfcj68"></iframe></div><br /><div class="separator" style="clear: both; text-align: center;"><br /></div><p></p><p>Dream continues in...</p>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-5985660205896226932022-12-26T15:29:00.002+08:002022-12-26T16:25:13.599+08:00First Experience with VRM and QT (QT+VRM+UDP+Unity)<p>VTuber + VRM is quite popular lately, so I wanted to play with it; it had also been a long time since I embedded a 3D engine into a native application, and thus the QT+VRM+UDP+Unity journey began. I had developed Ogre3D+MFC tools before, but embedding Unity into QT was a first. Since Unity provides no external messaging interface, every solution Google turns up points to local UDP message transport, so there was nothing for it but to build QT+VRM+UDP+Unity…</p><p>For the VRM integration, see <a href="https://github.com/vrm-c/UniVRM">vrm-c / UniVRM</a>. VRM 1.0 integrates automatic expression playback, so that was the focus; but since the current version only supports runtime parsing, I wrote an additional feature that exports a prefab and saves out the associated textures, meshes, and materials (much like the UniVRM plugin), which greatly reduces runtime parsing time and achieves resource sharing.</p><p>The resulting video (use the QT buttons on the right to switch the animation playback state):<br /><br /></p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="490" src="https://www.youtube.com/embed/iJmeb3M7YLs" width="589" youtube-src-id="iJmeb3M7YLs"></iframe></div><br /><p></p><p>Postscript: next up, a look at motion capture…</p><p>Dream continues 
in...</p>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-31839724833264758172022-06-08T11:01:00.015+08:002022-06-08T13:21:10.396+08:00Building Recast Navigation Data for Very Large Maps<p> The project needs a complex, very large map (15000×15000 Unity units), and building the Recast data ran into some problems…<br /></p><ol style="text-align: left;"><li>The whole map cannot be built in one piece; the following error occurs…<br />if (maxVertices >= 0xfffe)<br />{<br /> ctx->log(RC_LOG_ERROR, "rcBuildPolyMesh: Too many vertices %d.", maxVertices);<br /> return false;<br />}</li><li>Even when split into tiles, Recast's underlying build functions cannot run in parallel, so the build takes far too long (over a day).</li></ol>Solutions:<br /><ol style="text-align: left;"><li>Implement a console build tool and use a batch file to run multiple instances of the executable, building the tiles in parallel.</li><li>Finally, bring the tiles into Unity for consolidation and output the whole map for cross-region pathfinding.</li></ol>Generate columnar voxel-cell data from the art models. (The art meshes could have been used directly, but that data easily contains invalid geometry, and the project only needs 2D pathfinding, so this approach was used to reduce generation problems.)<br /><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEja4l3yPw_mJ6XXa8_WceT81c_igq-GKw1vNAFO3hYCWX9kmq72PilhmMoUvYAdy7ydBsufdIbmbbPW0GT0FWqqlncaq4ACYy_ERldbIc6eNtUE55ON8MeJK1Q1y2UfTlV3PLYl2O6wUTWJC2Gv5-Vv7oX4Q_9RP2Kog_VrphsdY2Sfshh3WBtlgrWeHg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img data-original-height="780" data-original-width="1411" height="369" src="https://blogger.googleusercontent.com/img/a/AVvXsEja4l3yPw_mJ6XXa8_WceT81c_igq-GKw1vNAFO3hYCWX9kmq72PilhmMoUvYAdy7ydBsufdIbmbbPW0GT0FWqqlncaq4ACYy_ERldbIc6eNtUE55ON8MeJK1Q1y2UfTlV3PLYl2O6wUTWJC2Gv5-Vv7oX4Q_9RP2Kog_VrphsdY2Sfshh3WBtlgrWeHg=w640-h369" width="640" /></a></div><br /><br /><br /><p></p><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div>Use an outer-wrapping computation (really just visiting the grid cells) to work out each tile's bounding region and partition the map into blocks:<br /><br 
/></div><div><a href="https://blogger.googleusercontent.com/img/a/AVvXsEilpAxtG5BQ70CKNCLYeYvGKKryajTGBTQqJ6jiYoWfLwtR0xzeEsgO09wxQI-KB9muqFtYR6AtvynnOClAzQxJx2u_MYOPrpSSfpWOp7pdnJuIqIuU6qTE-exOL-i0J4DHRORkQn_1wsFKiiiRgUKRDRvWYE4k8ZMC2qJF5E5ZS_5o59xTcCeUkjYiDw" style="clear: left; margin-bottom: 1em; margin-right: 1em; text-align: center;"><img data-original-height="781" data-original-width="1412" height="335" src="https://blogger.googleusercontent.com/img/a/AVvXsEilpAxtG5BQ70CKNCLYeYvGKKryajTGBTQqJ6jiYoWfLwtR0xzeEsgO09wxQI-KB9muqFtYR6AtvynnOClAzQxJx2u_MYOPrpSSfpWOp7pdnJuIqIuU6qTE-exOL-i0J4DHRORkQn_1wsFKiiiRgUKRDRvWYE4k8ZMC2qJF5E5ZS_5o59xTcCeUkjYiDw=w640-h335" width="640" /></a></div><div><br />Gather the tile maps computed by each console program into Unity, merge the maps, and provide the data for the later cross-region route computation:<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/a/AVvXsEhRc5hq_i0Wtxky82YUHL6PJs0h_mHvSx_fcYGeFrrOHs64aq2Ifes5pBuY524nWSJyn1BDgms3YWAx2qP8FQdmiZPz1P_yPfSqZRUeYnEabdD4ldyfVCYQVWs9UVC0pM1ZsmUhJWnOa6xDTm-2YFZ5wboOMkqI0BjAH5jhMsazNlHRcEeHo-rT62dbjg" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img data-original-height="777" data-original-width="1409" height="352" src="https://blogger.googleusercontent.com/img/a/AVvXsEhRc5hq_i0Wtxky82YUHL6PJs0h_mHvSx_fcYGeFrrOHs64aq2Ifes5pBuY524nWSJyn1BDgms3YWAx2qP8FQdmiZPz1P_yPfSqZRUeYnEabdD4ldyfVCYQVWs9UVC0pM1ZsmUhJWnOa6xDTm-2YFZ5wboOMkqI0BjAH5jhMsazNlHRcEeHo-rT62dbjg=w640-h352" width="640" /></a></div><br /><br /></div><div><br /><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div>Compute the cross-region routes and save them; done…<br /><br /><div class="separator" style="clear: both; text-align: center;"><a 
href="https://blogger.googleusercontent.com/img/a/AVvXsEiwoOApkLpH-aUPbGRgMEx8JBBkrB7OukAtyWevqR3ke_KvswzxikVPtj7TwR11LWVR1LAuxqWAbEXg3uztF3XhyO3braKSeM6cvrKbz3w5oipkWw_4zvAF-xZJhP-w6fsSPahtSf-8SXDpe3SVrlJOFe02PvVW8UfWr5PU5PLa-4VrAouou7PG19t3_g" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img data-original-height="775" data-original-width="1409" height="352" src="https://blogger.googleusercontent.com/img/a/AVvXsEiwoOApkLpH-aUPbGRgMEx8JBBkrB7OukAtyWevqR3ke_KvswzxikVPtj7TwR11LWVR1LAuxqWAbEXg3uztF3XhyO3braKSeM6cvrKbz3w5oipkWw_4zvAF-xZJhP-w6fsSPahtSf-8SXDpe3SVrlJOFe02PvVW8UfWr5PU5PLa-4VrAouou7PG19t3_g=w640-h352" width="640" /></a></div><br /><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div>Postscript: in practice this saves more than two-thirds of the build time. Having to come back to Unity for the merge is still one extra step; perhaps someday the splitting, merging, and route computation can finish in a single pass. The related video is below:</div><div><a name='more'></a><br />
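<p>The batch-file approach above can also be driven from a small launcher program; a minimal sketch, where "RecastTileBuilder.exe" and its command-line arguments are hypothetical placeholders for the console build tool, not the actual project code:</p>

```csharp
// Sketch of solution 1: one OS process per tile, run in parallel.
// "RecastTileBuilder.exe" and its arguments are hypothetical.
using System.Collections.Generic;
using System.Diagnostics;

public static class ParallelTileBuild
{
    public static void BuildAll(int tileCountX, int tileCountY)
    {
        var processes = new List<Process>();
        for (int y = 0; y < tileCountY; ++y)
            for (int x = 0; x < tileCountX; ++x)
            {
                // The OS schedules the processes across cores,
                // sidestepping Recast's lack of internal parallelism.
                processes.Add(Process.Start("RecastTileBuilder.exe",
                    $"--tileX {x} --tileY {y} --out tiles/{x}_{y}.bin"));
            }
        // Wait for every tile before merging the results in Unity.
        foreach (var p in processes)
            p.WaitForExit();
    }
}
```

<p>The tile outputs are then imported back into Unity for the merge step described above.</p>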
<iframe allowfullscreen="" frameborder="0" height="480" src="https://www.youtube.com/embed/B2vHNqUQgH0" width="640"></iframe><br />
<br /><br /><br /><br />Dream continues in...</div><div><br /></div><div><div class="separator" style="clear: both; text-align: center;"><br /></div></div>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-18956457656441295232021-10-27T16:35:00.017+08:002021-10-29T12:49:06.128+08:00關於GPUInstance與GPU VS CPU culling的恩怨情仇<p>大家應該都知道GPUInstance是拿來畫大量物件的重要利器,但他最大的問題,就是沒有處理Culling,當拿來畫同一種巨量草之類的,面數少的話,應該是沒什麼問題(同種類,只需傳送一次Mesh資料,數量多的話,因為面數少,所以vertex shader處理次數少,也還ok)。但當我們把鏡頭拉近的時候,其實剔除處理就變得很重要,否則有可能GPU instance會比一般用page culling的drawMesh API性能還來得差…</p><p><br />那我們如果幫GPUInstance也切page的話,情況就會變好嗎?其實不一定,在拉近的時候,太多數的情況是還蠻OK的,但如果有拉遠鏡頭的需求,會導致DrawInstance的API呼叫次數變多,進而導致mesh傳輸量增加,這種狀況下反而切page變成是一種負優化…<br /><br />最近比較流行的是直接使用GPU+ComputeShader來做GPU culling,並搭配DrawMeshInstancedIndirect這個API來畫,聽起來似乎是蠻理想的方案,但他最大的問題,就是當鏡頭拉近的時候,如果種類非常多,由於數量是由GPU直接算出來的,CPU完全不知道,所以還是得直接呼叫DrawMeshInstancedIndirect這個API來畫,就算GPU算出來的數量是零,還是導致了把mesh傳輸到GPU,造成了負優化,尤其是當種類越多,這個問題會越嚴重…另外還有硬體支持度的問題…<br /><br />所以種類物件越多的場景,如果想全場景GPU instance化來提升性能,目前可以想得到的方式就是使用CPU+JobSystem來處理剔除,整個處理流程如下:</p><p></p><ol style="text-align: left;"><li>Page Culling,看是要用八元樹還是四元樹,請隨意</li><li>依CullingPageList,將其送到JobSystem來進行剔除處理</li><li>等Job IsCompleted後,將其資料拷貝到相關要畫的陣列中(動態蒐集陣列資料)</li><li>畫出所有instancer的物件(這時候只會畫真正在視野範圍內的物件)</li></ol><p></p><p><br />試著在紅米三,種類高達127種,測試物件數量高達到30000個左右的正式場景,物件面數約幾百面到2000面左右,有搭配LOD來優化,鏡頭不論遠近,FPS幾乎都頂在60(在優化前,是使用一般的DrawMesh+PageCuling,性能從35~40提升到頂60),目前也支持跨硬體API的架構(也就是會視硬體狀況,自動切換DrawMeshInstancedProcedural、DrawMeshInstanced、DrawMesh API,而且Shader都是同一個,目前的狀況是會多了一個變體),以下的測試案例是100個種類,250000個物件,相關運作狀況的影片,請參考…<br /></p><span><a name='more'></a><br /></span>
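<p>The per-object culling of step 2 above can be sketched as a parallel job; a minimal sketch, assuming one bounding sphere per instance and the six frustum planes from GeometryUtility.CalculateFrustumPlanes (type and field names like CullJob are illustrative, not the actual project code):</p>

```csharp
// Sketch of step 2: per-instance frustum culling in a parallel job.
// All names here are illustrative.
using Unity.Collections;
using Unity.Jobs;
using UnityEngine;

struct CullJob : IJobParallelFor
{
    [ReadOnly] public NativeArray<Vector4> boundingSpheres; // xyz = center, w = radius
    [ReadOnly] public NativeArray<Plane> frustumPlanes;     // 6 planes from the camera
    public NativeArray<byte> visibility;                    // 1 = visible

    public void Execute(int index)
    {
        Vector4 s = boundingSpheres[index];
        var center = new Vector3(s.x, s.y, s.z);
        for (int p = 0; p < frustumPlanes.Length; ++p)
        {
            // A sphere fully behind any frustum plane is culled.
            if (frustumPlanes[p].GetDistanceToPoint(center) < -s.w)
            {
                visibility[index] = 0;
                return;
            }
        }
        visibility[index] = 1;
    }
}
```

<p>Once the job IsCompleted (step 3), the visible entries are compacted into the per-type arrays handed to the instanced draw call of step 4.</p>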
<div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="352" src="https://www.youtube.com/embed/gzsajHLqZGE" width="639" youtube-src-id="gzsajHLqZGE"></iframe></div><br />
Dream continues in...廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-36337030416380736932021-10-07T12:51:00.000+08:002021-10-07T12:51:07.021+08:00A Terrain Blend-Extent Checker to Improve Performance<p>With the current terrain shader, splat blending reads multiple textures and the blend math itself costs performance, so I made a tool that detects which regions currently have a blend weight other than 1. If, in most cases, blending can be confined to the borders, performance on low- and mid-range phones improves substantially. The concept:<br /><span style="color: #6aa84f;">//If the second layer's weight is less than or equal to zero, return the first tile map's color directly and skip the blend</span><br />if (subMapWeightRatio <= 0)<br /> return mainMapColor;<br />Related video:</p><span><a name='more'></a><br /></span><br />
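<p>The checker's core idea can also be sketched on the C# side as a scan over the terrain control map for texels whose secondary weight is non-zero; the channel convention (alpha stores the second layer's weight) and the names below are assumptions for illustration, not the tool's actual code:</p>

```csharp
// Sketch: measure how much of the control map actually needs blending.
// The alpha-channel convention is an assumption for illustration.
using UnityEngine;

public static class BlendExtentChecker
{
    public static float BlendedTexelRatio(Texture2D controlMap)
    {
        Color32[] texels = controlMap.GetPixels32();
        int blended = 0;
        foreach (var t in texels)
        {
            // Mirrors the shader early-out: weight <= 0 means the
            // expensive multi-texture blend is skipped entirely.
            if (t.a > 0)
                ++blended;
        }
        return (float)blended / texels.Length;
    }
}
```

<p>A low ratio confirms that most of the terrain takes the cheap single-layer path.</p>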
<iframe allowfullscreen="" frameborder="0" height="480" src="https://www.youtube.com/embed/kmHMyWyUu88" width="640"></iframe><br />
Dream continues in...廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com1tag:blogger.com,1999:blog-6481831191748590297.post-80350518730709716542021-09-08T18:09:00.002+08:002021-09-08T19:25:10.358+08:00The Terrain Instance Editor<p>In Unity you often see instanced scene objects authored by first placing regular objects and then converting them to instance data. That workflow is very inconvenient, and once the object count grows, editing performance falls apart and never shows off the power of GPU instancing…<br /><br />Since few editors can edit instances directly, I bit the bullet and wrote one myself. Of everything since my first Unity development video (2016/8/19), this is the most complete editor so far. It currently supports terrain and vegetation splatting, painting per instance, adjusting transforms, and more. Related video:<span></span></p><a name='more'></a><br /><p></p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="343" src="https://www.youtube.com/embed/znFa6nESleo" width="623" youtube-src-id="znFa6nESleo"></iframe></div><br />廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-28951580102981291232021-08-26T16:28:00.004+08:002021-08-26T18:19:33.414+08:00A Multi-Brush Solution for Large Maps<p>If Unity Terrain's splatting is to support more than four brush layers, the only options are opening ever more control maps or splitting into separate terrain pages, where page-to-page transitions may eat up yet another brush layer; basically, not a good multi-brush scheme.<br /><br />One alternative records brush indices directly in the control map's RGBA channels, which supports far more brushes. The design: the R and G channels store texture indices, the B channel stores the G channel's weight ratio, and that weight ratio drives the blend; whenever the weight ratio is zero, just return the R-channel result and optimize away the blending work. After testing, the results were good, so I wrote an editor for it.<br /><br />Editing video:<span></span></p><a name='more'></a><p></p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="394" src="https://www.youtube.com/embed/qFuRlEJvKag" width="658" youtube-src-id="qFuRlEJvKag"></iframe></div><br /><div>Then again, you very rarely blend more than four layers at once, so Unity Terrain's blend work is usually wasted… xd.<br />Dream continues 
in...</div>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-77564404279181668292021-08-18T15:18:00.005+08:002021-08-18T15:18:47.472+08:00Using Smooth Tile Processing to Solve the Repetition Problem<p>In the top-down view of war-strategy games, zooming out often exposes obvious terrain repetition. A common trick is random or hand-authored tile UV rotation, which removes the sense of repeating tiles but still cannot remove the grid feel itself.<br /><br />Dynamically adjusting the tile factor by camera height is not a good fix either; it causes popping. Here, smooth tile processing is introduced to solve the repetition problem. The result:<span></span></p><a name='more'></a><br /><p></p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="371" src="https://www.youtube.com/embed/ArZ5XBI0zh4" width="637" youtube-src-id="ArZ5XBI0zh4"></iframe></div><br />Dream continues in...廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com2tag:blogger.com,1999:blog-6481831191748590297.post-19688073044408277592021-08-05T16:37:00.002+08:002021-08-05T17:05:47.115+08:00A Massive Skinned-Character Solution: GPU Skinning + GPU Instancing + GPU Culling + CPU Jobs<p> Moving traditional CPU-side bone skinning to the GPU is a good solution for large numbers of skeletal characters, and adding GPU instancing greatly reduces the transferred data, lifting performance further. Without complete culling, however, far too much vertex data is processed, and performance still suffers.<br /><br />Adding GPU culling greatly reduces the number of processed vertices, lifting performance a lot, but the transform logic (rotation, in this example) runs on the CPU, creating a considerable CPU bound that hurts performance; moving that part into the Job System lifts performance up yet another level…<br /><br />Related video…<br /></p><span><a name='more'></a><br /></span><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="302" src="https://www.youtube.com/embed/5UScAHIuKwk" width="631" youtube-src-id="5UScAHIuKwk"></iframe></div><div><br /></div>Dream continues in...廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-15330106857852182672021-07-27T15:59:00.006+08:002021-07-27T16:06:54.113+08:00Massive Vegetation Rendering and Building Occlusion Culling with the GPU and ComputeShader<p> Generally, mass rendering naturally goes through the GPU instancing flow, but in practice vegetation sometimes needs to be hidden because of buildings. Done on the CPU, checking tens of thousands of buildings against more than a million vegetation instances would surely be a performance disaster.<br 
/>Since the buildings are also numerous, they are handled with GPU instancing too, so during the GPU culling stage this visible list gets prepared anyway and can be handed straight to the vegetation for its occlusion culling. Performance looks quite ideal so far; see the related video:<span></span></p><a name='more'></a><p></p><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="404" src="https://www.youtube.com/embed/EBG0C6tqckk" width="693" youtube-src-id="EBG0C6tqckk"></iframe></div><br /><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: center;"><span style="text-align: left;">Dream continues in...</span></div><br />廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-76607141507989392122021-05-20T13:59:00.004+08:002021-05-20T14:00:34.004+08:00Details of Planar Reflection under URP<p>Under URP, you typically register the RenderPipelineManager.beginCameraRendering event, check whether it is the main camera, then perform the reflection capture (copy the current camera's data, set up the reflection matrix, and so on), and finally call the global function UniversalRenderPipeline.RenderSingleCamera (one of the few globals URP provides) to render. Done naively, though, the original copy-opaque, copy-depth, render-shadow, and post-processing passes all execute as well, which is quite wasteful…</p><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiOYzQKWCB4zBtLd0J5TbXNNdvbOllGGbYmQXmpjjaqoKyiItMK4BU9v44GkuTSyhsExW9JA1_Pqn0dGcEBFh3lZLeoCpq6icrzaiPl8d2OmJPKmuj2FyynFG_jkxLcvz4l5zoZdKRThEpr/s951/extraPass.png" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="838" data-original-width="951" height="282" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiOYzQKWCB4zBtLd0J5TbXNNdvbOllGGbYmQXmpjjaqoKyiItMK4BU9v44GkuTSyhsExW9JA1_Pqn0dGcEBFh3lZLeoCpq6icrzaiPl8d2OmJPKmuj2FyynFG_jkxLcvz4l5zoZdKRThEpr/w612-h282/extraPass.png" width="612" /></a></div><br 
/><p>URP actually provides UniversalAdditionalCameraData, and it keeps being adjusted and extended; currently we have these to work with…<br /></p><p>additionalCameraData.requiresDepthTexture = false;<br />additionalCameraData.requiresColorTexture = false;<br />additionalCameraData.renderPostProcessing = false;<br />additionalCameraData.renderShadows = false;<br />//...<br />This way the unwanted passes no longer execute, saving performance. The related video:<br /></p><span><a name='more'></a></span><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="320" src="https://www.youtube.com/embed/TKPgORNfITA" width="570" youtube-src-id="TKPgORNfITA"></iframe></div><br />Dream continues in...<p></p>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-50156573288454254702021-05-13T18:00:00.003+08:002021-05-13T18:01:31.110+08:00A Dynamic Depth-Map Water Shader<p> The common way to make water rise and fall is vertex-displacement animation, but it needs a lot of vertex data or the water looks like jelly.<br />On phones, too large a vertex count hurts performance because of memory access, so here a dynamic depth map simulates the vertex-displacement animation; the vertex count drops dramatically, down to a single quad (four vertices).<br />The resulting video:<span></span></p><a name='more'></a><p></p><div class="separator" style="clear: both; text-align: center;"><iframe allowfullscreen="" class="BLOG_video_class" height="352" src="https://www.youtube.com/embed/MsYVogXwPxI" width="639" youtube-src-id="MsYVogXwPxI"></iframe></div><br />Dream continues in...廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0tag:blogger.com,1999:blog-6481831191748590297.post-923975876983140782021-03-15T16:34:00.004+08:002021-03-15T16:36:38.421+08:00How to Do a ReplacementShader under URP?In the built-in render pipeline, ReplacementShader was often used for special effects (post effects and the like), but since the SRP pipelines the feature is no longer available. The official answer is that LightMode can achieve something similar, but if the camera transforms differ or an orthographic camera is involved, switching LightMode currently cannot reproduce the same effect. Since an upcoming feature needs this, here is how to achieve a similar effect under URP. This example uses shader_feature; you can use multi_compile instead…<span><a name='more'></a></span><div><br /></div><div>Below is an example that captures the scene's height ratio in real time:</div><div><ol style="text-align: left;"><li>Define the shader feature:<br />#pragma shader_feature 
CAPTURE_HEIGHT_RATIO</li><li>Only under CAPTURE_HEIGHT_RATIO, add the height-ratio output member:<br />struct VertexOutput<br />{<br />#ifdef CAPTURE_HEIGHT_RATIO<br />float heightRatio : TEXCOORD0;<br />#endif<br />...<br />}</li><li>Only under CAPTURE_HEIGHT_RATIO, compute it and write it to the output:<br />#ifdef CAPTURE_HEIGHT_RATIO<br />output.heightRatio =<br />GetHeightRatio(worldPosition.y);<br />#endif</li><li>Only under CAPTURE_HEIGHT_RATIO, return the height ratio as the color:<br />#ifdef CAPTURE_HEIGHT_RATIO<br />return half4(input.heightRatio, 0.0, 0.0, 1.0);<br />#endif</li><li>Finally, the C# side: before the height-capture camera renders, enable the keyword; after it finishes, disable it again:<br />Shader.EnableKeyword("CAPTURE_HEIGHT_RATIO");<br />UniversalRenderPipeline.RenderSingleCamera(context, m_captureCamera);<br />Shader.DisableKeyword("CAPTURE_HEIGHT_RATIO");</li></ol>This achieves a per-camera replacement-shader effect without endlessly fiddling with layers or swapping materials… Reference video:</div>
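<p>Step 5 above can be wrapped in a small helper so the keyword can never be left enabled by mistake; a minimal sketch, where the component name and Capture method are illustrative, not the project's actual code:</p>

```csharp
// Sketch of step 5: toggle the shader keyword only around the
// height-capture camera's render. Names are illustrative.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

[RequireComponent(typeof(Camera))]
public class HeightRatioCapture : MonoBehaviour
{
    private Camera m_captureCamera;

    private void Awake() => m_captureCamera = GetComponent<Camera>();

    public void Capture(ScriptableRenderContext context)
    {
        Shader.EnableKeyword("CAPTURE_HEIGHT_RATIO");
        try
        {
            UniversalRenderPipeline.RenderSingleCamera(context, m_captureCamera);
        }
        finally
        {
            // Always restore, so regular passes never see the keyword.
            Shader.DisableKeyword("CAPTURE_HEIGHT_RATIO");
        }
    }
}
```

<p>Calling Capture from a beginCameraRendering handler keeps the replacement effect isolated to this one camera.</p>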
<div class="separator" style="clear: both; text-align: center;">
<br />
<iframe allowfullscreen="" frameborder="0" height="480" src="https://www.youtube.com/embed/yu-Js3nKJn4" width="640"></iframe><br />
<br /></div>廖峻漢http://www.blogger.com/profile/05331049252498881199noreply@blogger.com0