Unconventional ways to make Emissive Masks

I’m always looking for new ways to add interesting visual complexity to my projects.

Personally, I have a taste for abstract and geometric art that gives off an air of mysticism and mystery. Fractals, alchemical symbols, runes, and glyphs are things I find very visually stimulating and pay special attention to.

Some examples I found noteworthy are below:

(Image: geometric magic from Dr. Strange)

(Image: symbols from Journey)

Making these symbols is a process that depends heavily on what kind of game you are working on, what kind of world it is set in, what feel you want the symbols to convey, and many other factors like these.

This blog post is not intended to address the semiotics and design portion of these symbols; that is a separate topic for another day.

What it IS intended to do is give the reader some idea of how they might achieve cool-looking effects like this in their own projects.

For the purpose of this post I'm going to be using Unreal Engine 4, but in practice I believe all of this knowledge is entirely applicable to Unity or Godot, maybe even your own custom engine: really, any system whose renderer supports alpha transparency.

Using photographs of real-life symbols:

(Slideshow: photos of the metal plates)


Often the best inspiration is found in the real world. The metal plates you see above are used in rituals in Hindu culture and, in my opinion, have a very appealing geometric quality to them.

Here’s what the same symbols look like in UE4:

(Slideshow: the same symbols rendered in UE4)


What I did was:

  1. Take some high-contrast photos of these plates, trying to eliminate as much shadow and lighting information from the photo as possible.
  2. Open those photos up in Photoshop and edit them into what could be thought of as a mask or heightmap (if you're familiar with terrain generation, this idea will feel familiar). There's a scripted sketch of this step a little further down.

This is what the output should look like:

(Image: the resulting mask texture, PatternGlowMap)
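If you'd rather script this step than do it by hand in Photoshop, here's a rough sketch of the same idea in Python using Pillow and NumPy. The threshold value and the file names are placeholders I picked for illustration; you'd tune them per photo:

```python
# Rough sketch: turn a high-contrast photo into a black/white mask texture.
# The threshold and file names are placeholder values, not from the original workflow.
import numpy as np
from PIL import Image

def photo_to_mask(in_path, out_path, threshold=128):
    grayscale = Image.open(in_path).convert("L")   # drop all color information
    pixels = np.asarray(grayscale)
    mask = np.where(pixels > threshold, 255, 0).astype(np.uint8)  # hard white/black split
    Image.fromarray(mask, mode="L").save(out_path)

photo_to_mask("plate_photo.jpg", "PatternGlowMap.png")
```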

Then just import the texture into Unreal (or the engine of your choice) and set up the Material to look something like this:

(Screenshot: the material setup)
Whether to use this as a decal or just as a normal Surface material is up to you, but I personally find decals work great for applying these glyphs/runes to the world.
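I can't reproduce the node graph in text, but the core of a material like this is simple enough to sketch as plain code. This is just my reading of what the nodes compute per pixel; the color and intensity values below are made up for illustration:

```python
# Sketch of the per-pixel math an emissive mask material boils down to.
# glow_color and intensity are made-up example parameters, not values from the post.
def emissive_mask_shading(mask_value, glow_color=(0.2, 0.8, 1.0), intensity=20.0):
    """mask_value is the 0..1 sample from the mask texture at this pixel."""
    emissive = tuple(channel * mask_value * intensity for channel in glow_color)
    opacity = mask_value   # only the masked (white) areas of the decal show up
    return emissive, opacity
```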

Something to take advantage of, since Unreal supports decal normals, is generating a normal map from the mask you output from Photoshop (or your editing software of choice); this adds a little more lighting information and response to the decals.

I used an online normal map generator, but I've since found out that solutions like this (which includes tools like CrazyBump) are not advisable for generating production textures: the tangent space they operate in may or may not have any correlation with Unreal's own MikkTSpace implementation, and they are generally just bad anyway.

This results in textures that will produce incorrect/inaccurate lighting responses, especially in a PBR setup.

(And if you aren't using a PBR setup in Unreal or Unity right now, you really should be. Even stylized aesthetics gain a lot from respecting PBR rules.)
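For what it's worth, the basic math those generators perform is easy to reproduce yourself. Here's a rough sketch that derives a tangent-space normal map from the grayscale mask by taking height gradients; the same caveat applies, since nothing here knows about Unreal's MikkTSpace basis, so treat it as a prototyping aid rather than a production step. The strength value is an arbitrary knob:

```python
# Sketch: derive a tangent-space normal map from a grayscale height/mask image.
# 'strength' is an arbitrary knob; this ignores MikkTSpace, so prototyping only.
import numpy as np
from PIL import Image

def mask_to_normal_map(in_path, out_path, strength=2.0):
    height = np.asarray(Image.open(in_path).convert("L"), dtype=np.float32) / 255.0
    dy, dx = np.gradient(height)                               # per-pixel slopes
    normals = np.dstack((-dx * strength, -dy * strength, np.ones_like(height)))
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)  # normalize per pixel
    encoded = ((normals * 0.5 + 0.5) * 255).astype(np.uint8)   # remap [-1, 1] to [0, 255]
    Image.fromarray(encoded, mode="RGB").save(out_path)

mask_to_normal_map("PatternGlowMap.png", "PatternNormalMap.png")
```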

That about covers this method though.

Using WeaveSilk to generate masks:

This next bit is pretty straightforward, but is also fun to do!

There's an online interactive art generator called 'WeaveSilk' that lets you create complicated geometric art with just a few clicks and drags of the mouse, generated algorithmically from your input.

(Animation: generating a pattern in WeaveSilk)

This lets you generate some very interesting looking patterns and symbols with minimal effort and a high iteration speed.

Saving these symbols out is also just a right-click operation away.

Once you've saved an image out, you can edit it to remove the color information and convert it into a proper mask. That mask is then used in the same way as in the material above to create results like this in-engine:
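Since a WeaveSilk export is typically bright colored strokes on a near-black background, the conversion is even simpler than with the photos: rather than thresholding, you can take the brightest channel of each pixel as the mask value. A quick sketch, with placeholder file names:

```python
# Sketch: turn a WeaveSilk export (colored strokes on a dark background) into a mask.
# File names are placeholders.
import numpy as np
from PIL import Image

def silk_to_mask(in_path, out_path):
    rgb = np.asarray(Image.open(in_path).convert("RGB"))
    mask = rgb.max(axis=2)   # brightest channel = stroke intensity, color discarded
    Image.fromarray(mask.astype(np.uint8), mode="L").save(out_path)

silk_to_mask("weavesilk_export.png", "SilkGlowMask.png")
```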

(Slideshow: the WeaveSilk masks rendered in UE4)


And that’s about it for this post.

I hope that by sharing some unconventional ways to generate masks, I can give other people some ideas to help them start working on their own ways to create these cool geometric symbols and patterns we all love so much.

Hope this is helpful!

I'm on Twitter at @nightmask3 if you need any help or want something clarified.


Ground Fog in Unreal 4

I've been working on a third-person exploration and adventure game called 'The Nomad' for the past 6 months, and I wanted to share some of the techniques I've learned in that time.

My previous post dealt with implementing Distance Fog using a Post-Process material.

This time, we are going to explore how to implement a Ground Fog in Unreal 4.

Ground Fog is very important for a variety of reasons.

Here is the same scene without the Ground Fog.

(Image: scene without Ground Fog)

A few things you can notice:

  1. The scene still looks okay, but overall lacks visual complexity.
  2. The color of the sand is now too repetitive and dominates the view.
  3. It is harder to differentiate between the foreground and the background, though the distance fog helps somewhat.

So, we can see how Ground Fog can add to the overall aesthetic of a level. Let us now proceed to the implementation itself.

(Image: material nodes that get UV values for the fog texture)

This set of material nodes is responsible for raycasting forward a certain distance (ML_Raycast), finding a world position, and scaling it by NoiseSize.

(Image: the Raycast material function)

This world position is then fed into MF_NormalMaskedVector, which masks the input WorldPosition with the vertex normal to produce a UV value for the moving fog texture.

(Image: the Normal Masked Vector material function)

The output of the moving fog is then multiplied (in my case I use an Add instead; it works for this case but might give weird results otherwise), and then an if statement is used to define a World-Z cutoff for the fog.

If the world Z of the pixel being drawn is less than the cutoff value, we multiply the moving fog color into the post-process output. You can think of this as a simple conditional check:

if(PixelWorldZ < CutoffValue)
DisplayFog();

Then, to make the fog fade smoothly up to the cutoff value, we use another if statement to check the distance between the cutoff Z value and the current pixel's world Z. If the pixel is within the gradient fade range (as defined by GradientRange), we lerp between the color output of the fog and 1:

if(WorldZCutoff - PixelWorldZ < GradientRange)
LerpBetweenFogColorAndSceneColor();

If the output is 1, we use only the scene color.

(Image: the gradient and final material output)

The final output of all this is multiplied into the PostProcessInput, and then fed into the Emissive Color.
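Put together, the cutoff and the gradient fade amount to computing a single blend factor from the pixel's world Z and multiplying it into the scene color. Here is that logic as a small self-contained sketch; the parameter names are mine, and colors are treated as single floats to keep it short:

```python
# Sketch of the cutoff + gradient fade consolidated into one function.
# Parameter names are mine; fog_color stands in for whatever the moving fog
# nodes output, and colors are treated as single floats for brevity.
def ground_fog_factor(pixel_world_z, cutoff_z, gradient_range, fog_color):
    if pixel_world_z >= cutoff_z:
        return 1.0                                        # above the cutoff: no fog
    fade = min((cutoff_z - pixel_world_z) / gradient_range, 1.0)
    return 1.0 + (fog_color - 1.0) * fade                 # lerp(1, fog_color, fade)

def apply_ground_fog(scene_color, pixel_world_z, cutoff_z, gradient_range, fog_color):
    # The factor is multiplied into PostProcessInput, and the result drives Emissive Color.
    factor = ground_fog_factor(pixel_world_z, cutoff_z, gradient_range, fog_color)
    return scene_color * factor
```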

This material uses the Post-Process material domain. Assign it to a Post-Process Volume, and you should be good to go.

Hope this is helpful to someone!