generation
September 30, 2024

Demo scene. Base colors. Part 1.

Telegram: Infinity World

Introduction

In the previous article, I described the process of developing a basic landscape shape. In this article, I will try to explain the first step towards visualizing the landscape.

The process will be divided into several parts since there is quite a lot of content. Specifically, in this part, I will discuss determining voxel types based on SDF landscape information, and I will also briefly touch on the topic of basic colors that will be present in the demo scene.


1. Voxel Types

A voxel (short for "volumetric pixel") is a small piece of data that describes a portion of space. It can be pictured as a cube of, for example, 1×1×1 units (the exact size depends on the LOD). This cube contains the SDF information used to generate the landscape shape, as well as some material information.

Figure 1. Voxel. From wikipedia.org

1.1 Voxel Materials

For greater detail and more visual flexibility, I decided to allow up to four materials per voxel. Conveniently, four values also map well onto a vector representation.

Before continuing, let me explain some of the terms I use:

  • Voxel Type: A generalized identifier that contains some common information about the voxel type, such as a textual identifier (name). The voxel type is not directly used in algorithms because that would be very costly.
  • Voxel Type ID: This is an integer identifier (short) that points to the global index of the voxel type in the set of voxel types. Since it's just a number, it can be used on both the CPU and GPU sides, and it can also be used to quickly (in O(1) time) retrieve the voxel type or other related data.
  • Voxel Type Data: These are local data for each voxel, consisting of the identifier (Voxel Type ID) and a weight. The weight is used in many algorithms, from determining what can be generated on the surface to deciding the landscape textures.

Voxel Type and Voxel Type ID are defined in the Editor and cannot be changed at runtime, whereas Voxel Type Data is determined for each voxel during the generation of each chunk on the GPU side. Since each voxel can hold up to four different types, and adjacent voxels may hold different sets of them, the mesh generation step must select the voxel types with the highest weights.
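To make the three terms concrete, here is a minimal sketch of the data layout in Python. All names (`VoxelType`, `VOXEL_TYPES`, `VoxelTypeData`, the material indices) are hypothetical; the article does not show the actual definitions, and on the GPU side these would of course be plain structs and buffers rather than Python objects.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VoxelType:
    """Editor-defined, immutable at runtime. Field names are assumptions."""
    name: str             # textual identifier (name)
    top_material: int     # indices into the material table (see Section 1.1)
    bottom_material: int
    side_material: int

# Global registry: a Voxel Type ID is just an index into this list,
# so looking up the full type (or related data) is O(1).
VOXEL_TYPES: list[VoxelType] = [
    VoxelType("grass", 0, 1, 2),
    VoxelType("light_grass", 3, 1, 2),
]

@dataclass
class VoxelTypeData:
    """Per-voxel data produced during chunk generation on the GPU."""
    type_id: int   # Voxel Type ID: a small integer (a 'short' on the GPU)
    weight: float  # drives surface generation and texturing decisions

# Each voxel carries up to four (id, weight) pairs.
voxel = [VoxelTypeData(0, 0.7), VoxelTypeData(1, 0.3)]
assert VOXEL_TYPES[voxel[0].type_id].name == "grass"
```

The key point is the indirection: algorithms only ever move the small integer ID around, and dereference it into the heavyweight `VoxelType` only when needed.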

For each vertex, I also choose up to four voxel types. Each vertex depends on nine voxels (one central and eight corner ones), so I gather up to 36 voxel-type entries, select the four with the highest weights, and compute their weighted average. These four weights are then normalized and used in all subsequent algorithms. Normalization matters here: to avoid artifacts, the weights of the selected voxel types must sum to one.
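The per-vertex selection described above can be sketched as follows. This is one plausible reading of the text (accumulate the weight of each type ID across the nine neighbor voxels, keep the top four, renormalize); the function name and exact accumulation rule are assumptions, not the article's actual code.

```python
from collections import defaultdict

def select_vertex_types(neighbor_voxels, k=4):
    """From the (type_id, weight) pairs of the nine voxels a vertex
    depends on, keep the k types with the largest accumulated weight
    and normalize the kept weights so they sum to one."""
    totals = defaultdict(float)
    for voxel in neighbor_voxels:          # up to 9 voxels x 4 entries = 36
        for type_id, weight in voxel:
            totals[type_id] += weight
    top = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:k]
    norm = sum(w for _, w in top)
    return [(type_id, w / norm) for type_id, w in top]

# Example: three occupied neighbor voxels, the rest empty.
voxels = [[(0, 0.8), (1, 0.2)], [(0, 0.5), (2, 0.5)], [(2, 0.9)]]
ids_weights = select_vertex_types(voxels)
assert abs(sum(w for _, w in ids_weights) - 1.0) < 1e-9
```

The final normalization is the step the article calls out as important: without it, the blended material weights in the shader would not sum to one and visible seams appear.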

The weights of the voxel types are stored in the Vertex Layout, while the Voxel Type IDs are stored in a separate per-primitive buffer because they cannot be interpolated. Storing them in the vertices would require creating hard edges, leading to a significant increase in memory usage.

The materials themselves are defined independently of the voxel types. Each material contains references to textures (base color, normal, smoothness, metallic, height, ambient occlusion) and some settings, such as roughness level, shading strength, and others. Textures are packed into Texture2DArray at build time. This separation of materials from types has helped save a lot of memory and allowed for texture reuse.

Figure 2. Voxel Materials and Voxel Type.

In Figure 2, it can be seen that the voxel type contains references to three materials: top, bottom, and side. This is necessary to draw different materials on different sides of the voxel: a simple grass material is not enough, as slopes need to show soil or clay, and beneath the surface, a dirt texture should be visible.

1.2 Shader

In the basic version, where we only need colors, we can use a simple shader that includes only voxel material sampling using the triplanar technique.

The triplanar technique involves sampling along three planes and blending the sampling results based on weights that depend on the surface normal's slope. This increases the number of samples, but it allows meshes to be textured without calculating UV coordinates, as the world position is used instead.

Figure 3. Voxel Material Sample function.

Since we have three voxel materials for each voxel type, which correspond to different planes, we can use a small trick. Instead of sampling all three materials separately using the triplanar technique, we can determine the necessary materials within the triplanar sampling itself.

For the XY and ZY planes, the Side material is used. For the XZ plane, we need to look at the sign of the Y component of the surface normal. If the sign is positive, we use the Top material; otherwise, we use the Bottom material. This way, we avoid increasing the number of samples.
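The trick from the last two paragraphs, written out as CPU-side pseudocode in Python (the real thing is a shader). `sample` stands in for a texture fetch; the function names and the blend-weight formula are assumptions, but the material routing per plane follows the text exactly: Side on XY/ZY, Top or Bottom on XZ depending on the sign of the normal's Y component.

```python
def triplanar_weights(normal):
    """Blend weights for the ZY, XZ and XY planes, derived from the
    absolute components of the surface normal (a common formulation)."""
    ax, ay, az = (abs(c) for c in normal)
    s = ax + ay + az
    return ax / s, ay / s, az / s

def sample_voxel_material(sample, position, normal,
                          top_mat, bottom_mat, side_mat):
    """Triplanar sample where the material is chosen per plane, so the
    sample count does not grow beyond the usual three."""
    x, y, z = position
    wx, wy, wz = triplanar_weights(normal)
    xz_mat = top_mat if normal[1] >= 0.0 else bottom_mat
    return (wx * sample(side_mat, (z, y))    # ZY plane -> Side material
          + wy * sample(xz_mat, (x, z))      # XZ plane -> Top or Bottom
          + wz * sample(side_mat, (x, y)))   # XY plane -> Side material
```

With a dummy `sample` that just returns the material index, an upward-facing normal yields the Top material and a downward-facing one the Bottom material, which is exactly the routing the text describes.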


2. Voxel Type Generation

The selection of voxel types is quite similar to generating the landscape surface: the same noise, possibly a bit simpler. The voxel types are determined during the SDF generation stage on the GPU side, so the voxel already has the complete set of data by the time it reaches the CPU.

2.1 First Layer

The first layer represents grass of a basic green color, from which color variations will later develop.

Figure 4. Texture of base grass.

The first layer should be positioned in such a way that it creates a gradient effect: since this is the base grass, it should primarily be located in low areas and on slopes.

We will need masks again to determine where the weight of the first layer is the highest and where it is the lowest. Since there is a dependency on height, we will take it and normalize it.

Figure 5. The mask for the first layer.

The next step is to add a dependency on the surface slope angle to our equation: where the surface tends towards a vertical position, we will increase the value.

Figure 6. Mask of the first layer considering slopes.
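Putting the two mask ingredients together, a minimal sketch might look like this. The exact formula is an assumption (the article only states the dependencies: normalized height, inverted so low areas win, plus a boost where the surface tends toward vertical); `first_layer_mask` and its parameters are hypothetical names.

```python
def first_layer_mask(height, min_h, max_h, normal_y, slope_boost=1.0):
    """Weight of the first (base grass) layer: strongest in low areas
    and on slopes. Result clamped to [0, 1]."""
    h = (height - min_h) / (max_h - min_h)             # normalize height
    height_term = 1.0 - h                              # low areas -> high weight
    slope_term = (1.0 - abs(normal_y)) * slope_boost   # near-vertical -> high
    return min(1.0, max(0.0, height_term + slope_term))

# A low, flat point gets full weight; a high, flat point gets none.
assert first_layer_mask(0.0, 0.0, 100.0, normal_y=1.0) == 1.0
assert first_layer_mask(100.0, 0.0, 100.0, normal_y=1.0) == 0.0
```

A high but steep point still receives weight from the slope term, which is what produces the gradient visible in Figure 7.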

If the mask is visualized on the actual landscape, it can create an interesting gradient effect (Figure 7). This gradient visually represents the distribution of the first layer, blending smoothly from low areas and slopes to other regions, enhancing the natural look of the terrain.

Figure 7. Mask of the first layer visualized on the landscape.

2.2 Second Layer

The second layer is a lighter shade of grass, as if slightly scorched by the sun. This layer should be positioned in contrast to the first one: only on horizontal surfaces and exclusively at the tops of hills.

Figure 8. Light grass.

Figure 9. Mask for the second layer.

In Figure 9, some artifacts can be seen; they arise because the height is taken from the base surface value without considering finer details. At the moment, it is not possible to use the final surface height, since there is no way to calculate everything first and then apply further manipulations such as filtering. This will become feasible once I switch to an iterative voxel generator.

Figure 10. Mask of the second layer visualized on the landscape.

2.3 Third Layer

The third layer will complement the first two: it's slightly more yellow, representing grass that is more sun-bleached. It should follow the rules of the second layer but appear in "patches." This adds a bit of variety to the landscape.

Figure 11. Sun-bleached grass.

To achieve the "patches" effect, a standard Simplex 2D noise can be used. This type of noise effectively creates random but natural-looking areas that blend seamlessly into the existing terrain layers, adding diversity without disrupting the overall gradient and material distribution.

Figure 12. Mask for the third layer.

As can be seen in Figure 12, the mask for the third layer starts to overlap with the second layer. This will create an effect of mixing different materials, so it is necessary to subtract the third mask from both the first and second layers to obtain more accurate results. This subtraction helps maintain distinct areas for each material, preventing undesired blending and ensuring a clean separation between the different grass types.
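The subtraction step can be sketched as below. The thresholding of the noise and the function names are assumptions; `patch_noise` stands in for a Simplex 2D noise value in [0, 1], and the third layer is restricted to where the second layer already applies, per the rules stated above.

```python
def combine_layer_masks(mask1, mask2, patch_noise, threshold=0.5):
    """Derive the third-layer mask from noise 'patches' inside the
    second layer, then subtract it from the first two masks so each
    grass type keeps a distinct area."""
    mask3 = mask2 if patch_noise > threshold else 0.0  # patches within layer 2
    m1 = max(0.0, mask1 - mask3)
    m2 = max(0.0, mask2 - mask3)
    return m1, m2, mask3

# Inside a patch, the third layer fully displaces the second.
m1, m2, m3 = combine_layer_masks(0.3, 0.8, patch_noise=0.7)
assert m3 == 0.8 and m2 == 0.0
```

Outside a patch (`patch_noise` below the threshold) the first two masks pass through unchanged, so the subtraction only carves out the patch areas.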

Conclusion

What is the Result? How Does the Landscape Look?

Figure 14. The landscape with some voxel types.

As seen in Figure 14, the landscape doesn't look good. Compare it with the colors from Substance Designer!

At this point I paused: the masks could still be adjusted, but the colors themselves could not. Lighting and color correction are simply absent, which is why the landscape looks genuinely bad.

What to do? Set up lighting and post-processing so that even during the texture generation phase in Adobe Substance Designer, it's possible to understand how the landscape will eventually look!

The next step: lighting and post-processing.