Interactive Material and Element system from Breath of the Wild

One of the recent games that surprised me with the cleverness and depth of its gameplay systems was The Legend of Zelda: Breath of the Wild.


After watching the GDC 2017 talk given by the three lead developers of BotW, in particular the section given by the lead programmer, I was hooked; the talk blew my mind when I first saw it.

The simplicity of the system, and the potential it possesses, made this method seem like something every game going forward that wants a fully interactive world should have, in my opinion.

I started thinking about how you might implement something like this in Unreal, and moreover how you might do it in a way that stays easy to use and scales as you add more Materials and Elements.

This also coincided with the Unreal Summer Game Jam, so I decided to take the opportunity to create a small game/proof of concept for the system.

This is what I came up with:

[GIF: the jam prototype in action]
The rules of this system are stated quite simply and don’t leave much room for error:

1. Materials can interact with Elements
2. Elements can interact with other Elements
3. Materials cannot interact with other Materials
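Expressed in code, those three rules boil down to one small gate. Here's a quick sketch (the enum and function names are mine, purely for illustration, not from the game or my final implementation):

```cpp
// A minimal sketch of the interaction rules above; hypothetical names.
enum class EInteractorKind { Material, Element };

bool CanInteract(EInteractorKind A, EInteractorKind B)
{
    // Rule 3: two Materials never interact.
    if (A == EInteractorKind::Material && B == EInteractorKind::Material)
    {
        return false;
    }
    // Rules 1 and 2: any pairing involving at least one Element does.
    return true;
}
```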

The system as I implemented it in Unreal has a few basic pieces:

1) Material components – Registered as a C++ base class, this is subclassed into various Material types like WoodMaterial, MetalMaterial, etc. in Blueprints.

[Screenshot: the Material component]

2) Element components – Registered as a C++ base class, this is subclassed into various Element types like FireElement, WaterElement, etc. in Blueprints.

[Screenshot: the Element component]

3) Material/Element states – Every component has a state enumeration variable, “Active” by default, which represents that material/element after exposure to some element, e.g. “Burning”, “Wet”, “Electrocuted”, etc. (These are Material state examples; I haven’t quite found a use for Element states beyond “Active” and “Disabled” yet, but I’m sure they’ll come in handy at some point.)

4) Material and Element types – Another enumeration variable used to identify various types of elements and materials. This variable is chosen at design time.

[Screenshot: the State, Type and Class properties]
5) Material Class/Element Class – Another variable chosen at design time, this determines which subclass of Material/Element to actually use/spawn. It’s also why there are ‘CreatedMaterial’ and ‘CreatedElement’ variables: the component you see in the details panel is only the base Material/Element type, and the actual component you need is created at runtime and stored in ‘CreatedMaterial’ or ‘CreatedElement’ respectively.

I’m fairly certain there are better, less hacky ways to do this, but it worked for me.

It’s also a bit unintuitive to have to select both a ‘Wood’ type and a ‘WoodMaterial’ material class; you’d usually expect to do one or the other, not both. However, the system needs both values, in different places, to function properly.

It’s probably something you could abstract away with some boilerplate if-checks; I just didn’t find it necessary for a prototype.

An unexpected benefit of doing it in this somewhat unintuitive way is that you never need to change your Material or Element component classes when you add new types of Materials/Elements. If you branched on the selected ‘Type’ instead, you’d need to add a new if-check for every type of material/element you add, which can quickly get out of hand. The same applies vice versa if you were using only the Class.

This approach does invite user error: leaving one of those two properties unselected causes silent failures that are hard to track down. But if that’s a tradeoff you can live with, more power to you.
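Before moving on to the last piece, here's a rough sketch of how pieces 1-5 might look as the Material component's C++ base. This is an illustration, not my verbatim class; the Element component mirrors it, and the enum entries are just examples:

```cpp
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "MaterialComponent.generated.h"

// Example enum entries only; a real project defines its own lists.
UENUM(BlueprintType)
enum class EMaterialType : uint8 { Wood, Metal, Stone };

UENUM(BlueprintType)
enum class EMaterialState : uint8 { Active, Burning, Wet, Electrocuted, Disabled };

// These would live alongside the Element component; shown here so the
// sketch is self-contained.
UENUM(BlueprintType)
enum class EElementType : uint8 { Fire, Water, Electricity };

UENUM(BlueprintType)
enum class EElementState : uint8 { Active, Disabled };

UCLASS(Blueprintable, ClassGroup=(Interaction), meta=(BlueprintSpawnableComponent))
class UMaterialComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // 4) Type: chosen at design time; identifies this Material to the system.
    UPROPERTY(EditAnywhere, BlueprintReadOnly, Category="Interaction")
    EMaterialType Type = EMaterialType::Wood;

    // 5) Class: also chosen at design time; the BP subclass to actually spawn.
    UPROPERTY(EditAnywhere, Category="Interaction")
    TSubclassOf<UMaterialComponent> MaterialClass;

    // The runtime instance of MaterialClass; this is what the system uses.
    UPROPERTY(BlueprintReadOnly, Category="Interaction")
    UMaterialComponent* CreatedMaterial = nullptr;

    // 3) State: "Active" by default; becomes Burning, Wet, etc. on exposure.
    UPROPERTY(BlueprintReadWrite, Category="Interaction")
    EMaterialState State = EMaterialState::Active;

    // The response behavior, implemented in BP subclasses (WoodMaterial, ...).
    UFUNCTION(BlueprintImplementableEvent, Category="Interaction")
    void OnElementStateChange(EElementType OtherType, EElementState OtherState);

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // Spawn the design-time-selected subclass; the guard keeps a
        // subclass from recursively spawning itself again.
        if (MaterialClass && MaterialClass != GetClass())
        {
            CreatedMaterial = NewObject<UMaterialComponent>(GetOwner(), MaterialClass);
            CreatedMaterial->RegisterComponent();
        }
    }
};
```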

6) Blueprint subclasses of Material/Element – Defining the actual Material/Element behaviors in BPs lets designers use some handy flow-control systems that exist in BP, allowing intuitive and scalable scripting of the interactions between different Materials and Elements.

A picture is worth a thousand words, so here’s what the BP of the wood Material component looks like:

[Screenshot: the WoodMaterial Blueprint graph]

What’s happening in the gif is that the Player is tagged as a wood Material, and walks into a volume tagged as a fire Element.

This causes them to catch on fire (as wood in fire typically does), and once that happens the player is set to a ‘burning/on fire’ state.


This lets different objects own Material or Element components, have a type (e.g. wood for a Material, fire for an Element), and lets designers implement the response behaviors for when an Element and a Material interact, in Blueprints.

Having state values associated with both Materials and Elements also means you can script additional or altogether different interactions based on the state of the Material/Element in question. For example:

A wood Material in its normal active state does nothing on contact with water, but a wood Material in a ‘burning’ state has the fire removed.

[GIF: a burning wood Material being extinguished by water]

It also uses the neat visual flow control of the ‘Switch-On-Enum’ nodes to keep things simple and compartmentalized for designers.
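For the node-averse, here's roughly what the WoodMaterial graph's Switch-On-Enum logic amounts to, written against the hypothetical enums from the earlier sketch (a paraphrase, not my actual graph):

```cpp
// Roughly what the WoodMaterial Blueprint's Switch-On-Enum graph does;
// UWoodMaterialComponent is a hypothetical C++ twin of the BP subclass.
void UWoodMaterialComponent::HandleElementContact(EElementType OtherType, EElementState OtherState)
{
    if (OtherState == EElementState::Disabled)
    {
        return; // disabled Elements affect nothing
    }

    switch (OtherType)
    {
    case EElementType::Fire:
        if (State == EMaterialState::Active)
        {
            State = EMaterialState::Burning; // wood in fire catches fire
            // ...spawn the attached fire particle effect here...
        }
        break;

    case EElementType::Water:
        if (State == EMaterialState::Burning)
        {
            State = EMaterialState::Active; // water puts the fire out
            // ...remove the fire particle effect here...
        }
        // wood in its normal Active state ignores water
        break;

    default:
        break;
    }
}
```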

The actual functions of the Material/Element components are triggered using the BeginOverlap events from colliders, which looks something like this:

[Screenshot: BeginOverlap wiring into the Material/Element components]

For ease of use, I created a BP class that has a collider, a Static/Skeletal mesh, a Particle system and a Material/Element component; designers can then create new interactive objects by inheriting from this BP, swapping the mesh/particle system and selecting different Material/Element types.

This works okay for demo purposes, but in a production environment you’d probably make those C++ base classes too, as there’s nothing happening in the interactive item base classes that you couldn’t easily transcribe to code.

So essentially, any class that wants to use the Material/Element components just needs a collider, hooking that collider’s BeginOverlap events up to the Material/Element ‘OnStateChange’ events and providing the appropriate arguments from the object that caused the overlap.

Here’s what that looks like for a typical interactive object:

[Screenshot: a typical interactive object’s overlap events]
This is an Element item, which is why it checks for both Element and Material state changes from the other object. A Material item would only check for Element state changes. (remember the three rules!)
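Here's a hedged sketch of that wiring in C++, reusing the hypothetical component classes from earlier (an Element item, per the screenshot; the 'OnStateChange' event names follow the post, and the constructor setup is omitted for brevity):

```cpp
#include "GameFramework/Actor.h"
#include "Components/BoxComponent.h"
#include "InteractiveItem.generated.h"

// Hypothetical Element item; Collider and Element would be created in the
// constructor, omitted here.
UCLASS()
class AInteractiveItem : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY(VisibleAnywhere) UBoxComponent* Collider;
    UPROPERTY(VisibleAnywhere) UElementComponent* Element;

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        Collider->OnComponentBeginOverlap.AddDynamic(this, &AInteractiveItem::HandleBeginOverlap);
    }

    UFUNCTION()
    void HandleBeginOverlap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
        UPrimitiveComponent* OtherComp, int32 OtherBodyIndex, bool bFromSweep,
        const FHitResult& SweepResult)
    {
        // An Element item reacts to both Materials and Elements on the other
        // actor (rules 1 and 2); a Material item would only look for Elements.
        if (UMaterialComponent* Mat = OtherActor->FindComponentByClass<UMaterialComponent>())
        {
            Element->OnMaterialStateChange(Mat->Type, Mat->State);
        }
        if (UElementComponent* Elem = OtherActor->FindComponentByClass<UElementComponent>())
        {
            Element->OnElementStateChange(Elem->Type, Elem->State);
        }
    }
};
```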

 

A final tidbit, though it doesn’t have anything to do with the interactive Material/Element system per se: how to get particles to spawn all over a skeletal mesh. This is an invaluable feature of the Cascade particle system and, in my opinion, necessary to sell the effect of things interacting.

[Screenshot: the Cascade fire emitter and its details panel]
The circled Cascade emitter is where all the magic is; the details panel in the lower left is where you choose the parameter name that will point to the target actor of this particle effect.

When you want to actually create the effect, it’s just a matter of spawning the appropriate particle system and setting that parameter, using the name you assigned within the particle system earlier, to point at the actor you want covered in fire/water/etc.

[Screenshot: spawning the particle system and setting its actor parameter]
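In code form that's just two engine calls, SpawnEmitterAttached and SetActorParameter; the function, FireTemplate and the "TargetActor" name below are my own stand-ins for whatever you set up:

```cpp
#include "Kismet/GameplayStatics.h"
#include "Particles/ParticleSystemComponent.h"

// Sketch: spawn a fire effect on an actor and point the emitter's Actor
// parameter at it. "TargetActor" must match the name chosen in Cascade.
void IgniteActor(AActor* TargetActor, UParticleSystem* FireTemplate)
{
    UParticleSystemComponent* Fire = UGameplayStatics::SpawnEmitterAttached(
        FireTemplate, TargetActor->GetRootComponent());

    if (Fire)
    {
        // Same name you assigned in the emitter's details panel earlier.
        Fire->SetActorParameter(TEXT("TargetActor"), TargetActor);
    }
}
```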

That’s it for this post. I hope this provided some insight into how to make objects feel more interactive and bring into your games some of that lovely ‘game juice’ that Nintendo manages to capture time and again.

Hope this is helpful!

I’m on Twitter and you can reach me @nightmask3 if you need any help or clarification.


Technical Art Demo Reel

 

A compilation of work I’ve done using Blender, Houdini, Adobe Creative Cloud, Quixel Suite, xNormal and Unreal 4.

What follows is a breakdown of what’s happening in each shot:

1) Python tool for animating cinematics:

A tool that allowed the user to create manipulators bound to different properties, which they could then animate in the Sequencer tool. This gave access to lower-level C++ properties and to animating properties inside structs, something base Unreal does not provide.

It was also built into an editor interaction mode of its own and allowed the user to pick the actor to animate using an eyedropper tool.

2) Python tool to render curve data as a spline:

A tool that allowed the user to visualize their FBX animation data in the form of a spline curve. It utilized Unreal’s built in spline component.

3) Interactive Electricity/Lightning effect:

After watching the Unreal Dev Days talk given by Alan Willard, Senior Developer Relations Tech Artist at Epic, and his subsequent demonstration of the same effect on the UE4 livestream, I wanted to try to replicate it. The effect depends on the interplay of Blueprints, Materials, lights, particles and sounds to achieve the final result, something I had never done before, and it seemed like a good challenge to improve my knowledge of Unreal.

For my implementation I left out the sound but replicated everything else, as far as I can make out. I was particularly impressed by the range of options that the system provided to designers in order to tweak the effect however they want, and especially how the same asset could safely and effectively be used in multiple scenarios to achieve a variety of different visual compositions.

The system itself isn’t really that complicated once you break it down, which is all the more impressive to my mind. It just involves raycasting in a random direction, spawning a new electric arc and particles if that cast hit something, and repeating ad nauseam.

Most of the magic really happens inside the Material, which doesn’t use any textures at all to produce the electrical noise; instead it layers Fast Gradient noise from the built-in Noise node at different scales and tilings to produce the distortion.

The electric arc itself is a spline with a simple cylinder mesh chosen to be stretched along that spline, and the electricity Material is applied to that cylinder. This bit is accomplished using Unreal’s ‘SplineMeshComponent’ functionality.

When an arc is spawned, the starting location is the origin of the spline; if the raycast hits something, the impact location is set as the end of the spline, which is then updated along with its mesh.
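Sketched in C++ (my implementation lives in Blueprints; the class and member names here, AElectricArc, MaxArcLength, CylinderMesh, ElectricityMaterial, are hypothetical), one step of that loop might look like:

```cpp
#include "Components/SplineMeshComponent.h"

// One step of the arc-growing loop described above.
void AElectricArc::GrowArc(const FVector& Start)
{
    // Raycast a random direction from the current arc tip.
    const FVector Dir = FMath::VRand();
    const FVector End = Start + Dir * MaxArcLength;

    FHitResult Hit;
    const bool bHit = GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility);
    const FVector ArcEnd = bHit ? Hit.ImpactPoint : End;

    // Stretch the cylinder between the two points with a SplineMeshComponent.
    // (Positions here assume the actor sits at the world origin; otherwise
    // convert them into component space first.)
    USplineMeshComponent* Segment = NewObject<USplineMeshComponent>(this);
    Segment->SetStaticMesh(CylinderMesh);
    Segment->SetMaterial(0, ElectricityMaterial);
    const FVector Tangent = ArcEnd - Start;
    Segment->SetStartAndEnd(Start, Tangent, ArcEnd, Tangent);
    Segment->RegisterComponent();

    if (bHit)
    {
        // ...spawn impact particles at Hit.ImpactPoint, then repeat from there...
    }
}
```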

4) Caustics Generator:

I was working on a light study of Blade Runner 2049, of the scenes inside Wallace Corporation, a screenshot of which I’ll include:

[Screenshot: light study reference, Wallace Corporation interior from Blade Runner 2049]

In order to achieve a setup similar to this, I figured I’d need to learn how to generate caustics.

There are a couple of ways to achieve this that I read about:

1) Jos Stam’s method to generate periodic caustic maps: https://www.opengl.org/archives/resources/code/samples/mjktips/caustics/

This method looked interesting and, like most of Jos Stam’s work, is seminal, but it’s a bit outdated for current quality requirements.

2) Realtime caustics, a method described in a post on Medium.

This method seemed more feasible and delivered higher quality, but it still didn’t meet the bar for what I wanted in this scene for two reasons:

1) The final quality seems very dependent on the resolution of the wavefront mesh, the surface mesh and the grid plane you project onto. That’s not ideal for a realtime environment where you might want results that hold up in 4K, in cinematics and so forth; I don’t think this method could scale that far, even with GPU computation.

2) The method doesn’t seem to allow much tweaking of the final look, limiting its potential as an actual technical art tool.

I kept searching until I stumbled across the tech demo presented by Ryan Brucks, principal technical artist at Epic, during GDC 2017, where he demonstrated a method of baking the results of the caustic simulation into a flipbook that could then be played back at runtime at a much lower cost.

I had no idea how to achieve results like that in Unreal, but I had also seen another video where a Houdini user demonstrated something remarkably similar.

I put two and two together, and figured that I could probably generate the simulations and flipbooks in Houdini, import them into Unreal, and then play those flipbooks back in a Material and I’d have the caustics as I needed them.

I reached out to the Houdini user on a forum post and asked them about their method and they were very helpful in describing which nodes to use and what the principle behind the method was.

After a week or two of hacking away at Houdini I had what I wanted: a tool that lets you generate caustics and then bake out the flipbooks.

Doing this in Houdini has three advantages, as I see it:

1) You can bake out the flipbooks at any resolution, even really large ones that Unreal usually cannot support Render Targets for without massive slowdowns and sometimes crashes.

2) You can use different types of input noise to generate the caustics, and change a bunch of parameters and options to tweak their final look before baking.

3) The method is exactly the same as the one in the realtime caustics blogpost, but because it’s carried out offline, I could increase the mesh resolutions as much as I wanted until the quality met the bar I had in mind.

I also did a bit of work after this in order to make the flipbook textures tileable using information from this blogpost, also by Ryan Brucks:

https://www.shaderbits.com/blog/tiling-within-subuv-or-volume-textures

This gave me tileable caustic flipbooks that could be animated within Materials using a ‘Flipbook Animation’ or ‘SubUV_Function’ node and then used as a Light Function Material or a simple Surface Material or even as a Decal Material if you wanted.
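For reference, the lookup those nodes perform is only a few lines of arithmetic. Here's a sketch of the frame selection (minus the frame-to-frame blending the real nodes also do):

```cpp
// Given a flipbook laid out in Rows x Cols frames, map a base UV and a time
// into the UV rect of the current frame; mirrors what SubUV/Flipbook nodes do.
FVector2D FlipbookUV(FVector2D BaseUV, int32 Rows, int32 Cols, float TimeSeconds, float FramesPerSecond)
{
    const int32 NumFrames = Rows * Cols;
    const int32 Frame = FMath::FloorToInt(TimeSeconds * FramesPerSecond) % NumFrames;

    const int32 Row = Frame / Cols;
    const int32 Col = Frame % Cols;

    // Shrink the UV into one cell, then offset to the chosen cell.
    return FVector2D((BaseUV.X + Col) / Cols, (BaseUV.Y + Row) / Rows);
}
```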

5) “The Wanderer” character model:

I’m currently working on my own game, the working title being ‘Project Gilgamesh’.

It’s a third person parkour platformer, and I wanted to take the opportunity to learn how to make a next-gen character model that utilized Unreal’s newer cloth/rigid body physics and anim dynamics features that were introduced after 4.16.

The character modelling was done in Blender, and was based in large part on concept art for the Doctor Strange movie.

There’s also some influence from the Adeptus Mechanicus from Warhammer 40K:

[Image: Adeptus Mechanicus concept art]

UV mapping was also done in Blender.

The high poly normals were baked down in xNormal before being imported into Quixel.

Texturing was done using Quixel Suite and Photoshop CC. All the textures were worked on at 4K and then rendered down to 2K at export.

The final skeletal mesh had 4 material slots, meaning 4 draw calls for the player per frame. I might add one more for all of the emissive points on the player like the eyes, lights from the gas mask etc.

I remember reading somewhere that Unreal recommends staying within 3-5 draw calls/materials per object, though I could not find that link again for a citation so it might be hearsay, take with a grain of salt.

The character’s poly count is a little bit on the higher side, even with current gen hardware in mind, but my reasoning behind this is that I’m not going to have many (if any) other characters that will need this level of detail by far.

Most other models will be environmental models, props and maybe a few robotic enemy types.

The base biped model was rigged in Mixamo, brought back into Blender, skinned with clothes, gas mask and belt and then imported into Unreal.

Animations that were applied to the rigged, unskinned model in Mixamo could then be imported into Unreal, and they worked right away on the skinned version as the two skeletons were the same.

Cloth was animated using Unreal’s new NvCloth realtime solver and cloth toolset, which allows you to paint values for the cloth simulation directly onto the mesh in the editor.

This saves a HUGE amount of time compared to Nvidia’s APEX cloth pipeline, which was rather complicated and obscure, without much up-to-date documentation either.

However, there’s no free lunch, and there are still drawbacks to this cloth method: more complicated setups like thick cloth or multi-layered cloth are not currently possible with the toolset.


Also, there’s the obvious drawback that realtime cloth is never really going to look as good as cloth simulated offline. Not for a very long time, at least.

On the whole though I think this toolset is worth learning, if you can work around the problems with it and/or they don’t matter to your use case.

The character isn’t finished yet, but it’s close. I want to add a few more things, like an additional belt strap attached on two sides, which I’ll probably do with rigid bodies or anim dynamics; both let you drive additional bone/socket motion on top of the character’s skeletal animations.

6) Volumetric fog using custom node and HLSL:

A demonstration of a raymarched volumetric fog effect I’ve been working on for a while.

Based on the ShaderToy shader found here:
https://www.shadertoy.com/view/XtfSWX

The effect raymarches forward a certain distance, samples each point in world space, uses it to generate noise (‘triangle noise’ is what the ShaderToy author called it), and adds a specified color, which is then blended with the scene texture based on scene depth so that nearby objects remain visible rather than occluded.
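Here's a CPU-side paraphrase of what the custom node's HLSL loop does, with a cheap stand-in for the noise; all names and constants are mine, not the ShaderToy author's:

```cpp
// Cheap hash-based stand-in for the shader's "triangle noise".
static float FogNoise(const FVector& P)
{
    const float Dot = FVector::DotProduct(P, FVector(12.9898f, 78.233f, 37.719f));
    return FMath::Frac(FMath::Sin(Dot) * 43758.5453f);
}

// CPU-side paraphrase of the raymarch loop in the custom node.
FLinearColor RaymarchFog(const FVector& CameraPos, const FVector& ViewDir,
    float SceneDepth, const FLinearColor& SceneColor, const FLinearColor& FogColor)
{
    const int32 NumSteps = 32;
    const float StepSize = 50.f; // world units per step

    float Accum = 0.f;
    for (int32 i = 0; i < NumSteps; ++i)
    {
        const float Dist = StepSize * i;
        if (Dist > SceneDepth)
        {
            break; // stop at opaque geometry so nearby objects stay visible
        }
        const FVector SamplePoint = CameraPos + ViewDir * Dist; // world space
        Accum += FogNoise(SamplePoint);
    }

    // Blend the fog color over the scene by accumulated density.
    const float Density = FMath::Clamp(Accum / NumSteps, 0.f, 1.f);
    return FMath::Lerp(SceneColor, FogColor, Density);
}
```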

It does face some issues with translucency (always troubling in a deferred rendering setup), which can be solved by placing the post-process effect before the translucency step in the pipeline, at the cost of a performance hit.

The effect uses the Custom HLSL shader node in the Materials system.

I was working on this before Epic shipped their own volumetric fog solution. You could probably achieve the same effect using volumetric particles and their global fog but I haven’t attempted to port it or experiment with that so far.

 

 

Unconventional ways to make Emissive Masks

I’m always looking for new ways to add interesting visual complexity to my projects.

Personally, I have a taste for abstract and geometric art that gives off an air of mysticism and mystery; fractals, alchemical symbols, runes and glyphs are things I find very visually stimulating and pay special attention to.

Some examples I found noteworthy are below:

[Image: geometric magic circles from Doctor Strange]
[Image: glyph wall from Journey]

Making these symbols is often going to be a process completely dependent on what kind of game you are working on, what kind of world it is set in, what kind of feel you want the symbols to convey, and many other such factors.

This blog post is not intended to address the semiotics and design portion of these symbols; that is a separate topic for another day.

What it IS intended to do is provide the reader with some idea of how they might achieve cool-looking effects like this in their own projects.

For the purpose of this post I’m going to be using Unreal Engine 4, but in practice I believe all the knowledge would be entirely applicable to Unity or Godot, maybe even your own custom engine; really, any system that has a renderer with alpha transparency.

Using photography of real-life symbols:

[Slideshow: photos of Hindu ritual metal plates]


Often the best inspiration is found in the real world. The metal plates you see above are used in rituals in Hindu culture, and have a very appealing geometric aspect to them in my opinion.

Here’s what the same symbols look like in UE4:

[Slideshow: the same symbols rendered in UE4]


What I did was:

1) Take some high-contrast photos of these plates, trying to eliminate all shadow and light information from the photo.

2) Open those photos in Photoshop and edit them into what could be thought of as a mask or heightmap (if you’re familiar with terrain generation, this idea will be familiar).

This is what the output should look like:

[Image: the resulting mask/heightmap texture]
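If you wanted to automate that Photoshop edit, the per-pixel math is just a desaturate plus an optional threshold. A minimal sketch (names are mine):

```cpp
// Sketch of the mask conversion: take each pixel's luminance and optionally
// threshold it so only the bright engraved lines survive.
float ToMaskValue(const FLinearColor& Pixel, float Threshold = 0.5f)
{
    // Rec. 709 luma weights.
    const float Luma = 0.2126f * Pixel.R + 0.7152f * Pixel.G + 0.0722f * Pixel.B;
    return (Luma >= Threshold) ? Luma : 0.f;
}
```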

Then just import the texture into Unreal (or the engine of your choice) and set up the Material to look something like this:

[Screenshot: the emissive Material setup]
Choosing to use this as a decal or just as a normal Surface material is up to you, but I personally find decals to work great for the use case of applying these glyphs/runes to the world.

Since Unreal supports decal normals, something to take advantage of is generating a normal map from the mask output of Photoshop (or your editing software of choice); this adds a little more lighting information and response to the decals.

I used an online normal map generator, but I’ve since found out that solutions like this (which includes the likes of CrazyBump) are not advisable for generating production textures, as the tangent spaces they operate in may or may not have any correlation with Unreal’s own MikkTSpace implementation, and are generally just bad anyway.

This results in textures that will produce incorrect/inaccurate lighting responses, especially in a PBR setup.

(And if you aren’t using a PBR setup while working on Unreal or Unity right now, you should really be using a PBR setup. Even stylized aesthetics still gain a lot from respecting PBR rules)
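If you'd rather not trust a generator at all, the core height-to-normal conversion is simple enough to do yourself. A minimal sketch using central differences (all names and the Strength parameter are my own; note that DirectX- and OpenGL-style normal maps disagree on the green channel, so flip dY if the lighting looks inverted):

```cpp
// Derive tangent-space normals from a grayscale height mask via central
// differences. Height holds W*H values in [0,1]; Strength scales how
// pronounced the bumps look.
void HeightToNormals(const TArray<float>& Height, int32 W, int32 H,
    float Strength, TArray<FVector>& OutNormals)
{
    OutNormals.SetNum(W * H);
    auto Sample = [&](int32 X, int32 Y)
    {
        X = FMath::Clamp(X, 0, W - 1);
        Y = FMath::Clamp(Y, 0, H - 1);
        return Height[Y * W + X];
    };

    for (int32 Y = 0; Y < H; ++Y)
    {
        for (int32 X = 0; X < W; ++X)
        {
            const float dX = Sample(X + 1, Y) - Sample(X - 1, Y);
            const float dY = Sample(X, Y + 1) - Sample(X, Y - 1);
            // Z points out of the texture; XY encode the slope.
            const FVector N(-dX * Strength, -dY * Strength, 1.f);
            OutNormals[Y * W + X] = N.GetSafeNormal();
        }
    }
}
```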

That about covers this method.

Using WeaveSilk to generate masks:

This next bit is pretty straightforward, but is also fun to do!

There’s an online interactive art generator called ‘WeaveSilk’ that lets users create complicated geometric art with just a few drags and clicks of the mouse, algorithmically elaborating on their input.

[GIF: drawing patterns in WeaveSilk]

This lets you generate some very interesting looking patterns and symbols with minimal effort and a high iteration speed.

Saving these symbols out is also just a right-click away.

Once saved, you can edit the image to remove the color information and convert it into a proper mask. This mask is then used in the same way as in the material above to create results like this in-engine:

[Slideshow: WeaveSilk masks rendered in-engine]


And that’s about it for this post.

I hope that by sharing some unconventional ways to generate masks, I’ve given other people some ideas for creating the cool geometric symbols and patterns we all love so much.

Hope this is helpful!

I’m on Twitter and you can reach me @nightmask3 if you need any help or clarification.