Shader editor

Graphical shader editor


Shaders can be created in Nebula through directly writing shader code, or by using the shader editor. The editor can be found in the content browser, and is accessed through Window -> Show Shader Editor…

The shader editor provides a graphically guided method of designing shaders using a node-based network. This is the first screen:

The green node in the center cannot be deleted, and is referred to as a super node. To the right is the toolbox containing shader functions, sorted by the type of operation they provide.


Constants refer to variables which cannot be changed during runtime. Context allows for getting variables from the shader context, such as UV coordinates and vertex colors. Changing the value of a constant is done by clicking that node and editing its properties in the ‘Node properties’ window.


Contains functions for converting data, for example extracting only the alpha channel from a four-component color vector, or assembling a three-component vector from three separate values.


Contains typical mathematical operations, such as add, subtract etc.


Like constants, but they allow for runtime modification, meaning their actual value can be changed outside of the editor.


Contains variables exposed by the Nebula shading system in a global context, and gives a per-frame random number, the time and global light information.


Contains functions for texture sampling, such as using a fixed mip level, custom UV coordinates, and cube map samplers.



When the shader is done you might want to build it. This will generate a shader file in your project which matches the name of the project, so your project must be saved before the build can take place. When the build is completed, the generated code will appear in the window labeled ‘Shader code’. If curiosity strikes you and you want to see how the code is generated, you can click the ‘Debug code generation’ button in the toolbar to graphically visualize how the code is being generated. This will show the traversal of the graph and how the resulting code is produced. The button ‘Real-time visualize’, also found in the toolbar, will when toggled animate the node graph if an animated variable such as time is used. Disabling this is recommended if the node graph becomes slow or unresponsive.


Creating a shader isn’t all that needs to be done before it can be used. In order to tell the Nebula rendering system when to render and with which shader (shadows, picking, unlit, etc.) a material has to be defined. To create a new material using your new shader, open the material editor by going to Material -> Open material editor. This will bring you to this window:


Materials are defined in files not as single materials, but rather as lists containing several. The material editor creates an unnamed list by default, but it cannot be used until it is saved. To open an existing list, go to File -> Open list…


In this window I have created a testing material to demonstrate how to use this tool. To the right of ‘Material’ is a drop-down box, which contains all materials defined in this list file. Creating a new material in this list file is done by pressing the ‘Create new’ button, which will create a new empty material. Deleting a material is done by pressing the ‘Delete selected’ button. When a material is selected in the drop-down box, the UI will populate with the information defined by the material.


Materials can be defined as virtual, which prohibits them from being created as a Surface. The purpose of virtual materials is to create a stub material which can be implemented and used by other materials.


If your material is complicated and requires some type of explanation, you can write such a text using the ‘Edit description’ button.


Inheriting a material will cause all of the variables and passes from the parent materials to be copied to this material. Inheriting several materials is possible, and is done by separating the material names with the pipe character |. For example, Mat1|Mat2 will inherit both material Mat1 and Mat2. If an inherited material defines a variable with the same name as one in your material, or if any inherited materials have a name conflict with each other, the last defined material will replace all previous definitions. Using the previous example, let’s say Mat1, Mat2 and your material each define a conflicting name; the order of precedence will then be Mat1 -> Mat2 -> Yours.


To explain to Nebula when and where to render an object, one must define a pass. This is done by clicking the Add pass button. By default, this will create a ‘default pass’ which will render the shader you created before in the common pass FlatGeometryLit. To modify a pass after it’s been added, just click one of the three fields.


Batch denotes WHEN during the frame this object should render. Clicking this will bring up a window which looks like this:

Here, we can select which frame shader we want to add the pass from. By selecting the drop-down, we can select from all EXPORTED frame shaders. If for any reason your frame shader doesn’t appear in the list, it has not been exported yet. This window gives an overview of what is being done during a frame, but without the post effects, compute shaders and algorithms. Clicking on a pass will expand it to show all batches executed within that pass.


Double-clicking on a batch here will bring you back to the previous window, but now the batch has become the one you selected.


Selecting the shader box will bring up a drop-down selection of available shaders. Select which shader you want to use for the batch.


Shader files can contain more than one ‘pair’ of vertex and pixel shaders. Shaders generated from the shader editor will have only one, called ‘Generated’, but shaders created by hand may have several. Select from the list which variation within that shader file is to be used.


Materials also define which variables within the shader code are supposed to be changeable. Pressing ‘Add variable…’ will add a new variable. If you want to set this variable to one from the shader being edited in the shader editor, right-click on the item and select ‘Set to parameter’. This will bring up a menu of all parameters in the shader, and set the type accordingly. If you decide to change the type of a variable in the shader, you also have to reset the variable in this editor by using ‘Set to parameter’ again.


For programmers

This chapter covers ways to extend the shader editor beyond the default tools. This allows for adding your own code, super nodes and even ordinary nodes.

Creating custom nodes

The shader editor uses XML files to declare nodes, which are called ‘variations’. The super node XML declarations are called ‘super variations’ because of their singleton nature. Variations can be found in <toolkit>/work/shady or <project>/work/shady. Variations are sorted by their parent folder, so a new category of variations is created simply by grouping them in a new folder. This is how the previously mentioned nodes have been created and grouped under, for example, Context, System and Texture.


Variations take the form of an XML document. The files have to be saved with the extension .ndv for the shader editor to find them. The XML declaration needs to be contained within a Shady tag. Here follows an example:

    <Shady>
        <Variation desc="Adds two variables together to produce a sum. The two inputs must have the same type" simulationCommand="add">
            <Input type="any" name="input1"/>
            <Input type="any" name="input2"/>
            <Output type="any" name="product"/>
            <Source type="GLSL">
                <Code output="product">
                    [output] = input1 + input2;
                </Code>
            </Source>
        </Variation>
    </Shady>


The variation is opened with a Variation tag, containing a description and a simulation command. If no simulation command is defined, this node will not have a live preview. The simulation commands are hard-coded, since they are run through the engine code. However, using the command ‘copy’ will simply pass the input through the node, which keeps its preview from appearing black.

Inputs and outputs

Variations define inputs and outputs; these are used to determine what goes into the node, and what comes out of it. The type can be the special type we see here, called ‘any’, or an explicitly defined type. Using ‘any’ lets the shader editor determine the actual type from the highest common denominator of the connected types. This means that the most complex type (the type with the most components) will take precedence.

The types available for use are:

  • float
  • vec2
  • vec3
  • vec4
  • ivec2
  • ivec3
  • ivec4
  • bvec2
  • bvec3
  • bvec4
  • bool
  • int

So if we define two ‘any’ inputs and an ‘any’ output, and plug a vec4 into one input, then the second input will be expanded to vec4, and the output will be vec4. The name field used for inputs and outputs will dictate the graphical appearance in the editor.
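As an illustration (a sketch, not actual generated output), if the two ‘any’ inputs of the add variation receive a vec4 and a float, both resolve to vec4, and the resulting code would be equivalent to:

```glsl
// 'any' resolves to vec4, the most complex connected type;
// the scalar input is expanded to vec4 before the operation.
vec4 product = input1 + vec4(input2);
```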


A single variation can have several implementations. This allows the shader editor to target several platforms using the same node graph. Within the Source tag we must define a piece of code for each output. This means that if an output goes unused, it doesn’t produce code when generated. Here, we target an output by recalling its name (in this case ‘product’) and write some code. In our example, we want the output to be input1 + input2, so [output] is used as a special name that refers to the output declared by this Code tag.
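As a sketch, a hypothetical variation with two outputs would define one Code tag per output, so that an unused output generates no code (this node and its names are illustrative, not one of the built-in variations):

```xml
<Shady>
    <Variation desc="Splits a two-component vector into its parts" simulationCommand="copy">
        <Input type="vec2" name="input"/>
        <Output type="float" name="x"/>
        <Output type="float" name="y"/>
        <Source type="GLSL">
            <Code output="x">
                [output] = input.x;
            </Code>
            <Code output="y">
                [output] = input.y;
            </Code>
        </Source>
    </Variation>
</Shady>
```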

Super variations

Super variations are more like code path controllers than anything else. They need a template code file (per source type) which is used to generate a new shader, and can also be supplied a header file. The intention of the header file is to hold generic stuff, like functions, variables and whatnot, which will be used by the template. The template itself contains the shader stages to create (Vertex, Hull, Domain, Geometry and/or Fragment), but only serves as a stub. The intended way to use super variations is by declaring an input with a specific define. If that input is used, the define will automatically be put into the target code, and thus that input can produce a code path which generates an input value. If no graph is attached to the input, the define is not set, and thus a ‘default’ code path can be used. Here is the XML for Default.ndv.

	<SuperVariation desc="Typical deferred shader. Outputs to diffuse, emissive, specular, roughness and godray projection." defines="">
		<Input type="vec4" name="Diffuse" result="pixel" defines="USE_CUSTOM_DIFFUSE"/>
		<Input type="vec4" name="Normal" result="pixel" defines="USE_CUSTOM_NORMAL"/>
		<Input type="vec4" name="Emissive" result="pixel" defines="USE_CUSTOM_EMISSIVE"/>
		<Input type="vec4" name="Specular" result="pixel" defines="USE_CUSTOM_SPECULAR"/>
		<Input type="float" name="Roughness" result="pixel" defines="USE_CUSTOM_ROUGHNESS"/>
		<Source type="GLSL" target="gl44" template="toolkit:work/shady/supervariations/templates/glsltemplate.fxh" header="toolkit:work/shady/supervariations/templates/glslheader.fxh">
			<Include path="toolkit:work/shaders/gl"/>
			<Include path="proj:work/shaders/gl"/>
		</Source>
	</SuperVariation>


Super variations have no outputs, since they are the end point of a finite graph. Instead they define a set of typed inputs (which may not use the ‘any’ type), a resulting shader stage output, and what to define if this input is used.


In this image, we see graphs connecting to ‘Diffuse’, ‘Emissive’ and ‘Specular’. This will cause the aforementioned inputs to declare their definitions in the generated shader code. By using preprocessor directives in this manner, we can control the code being generated. An example of the output from the above image is the following:

#ifdef USE_CUSTOM_DIFFUSE
vec4 GetDiffuse(PixelShaderParameters params)
{
    vec4 Local0;
    Local0 = texture(Param0, params.uv);
    const float Local1 = 2.000000f;
    vec4 Local2;
    Local2 = Local0 * vec4(Local1);

    return Local2;
}
#endif

Notice how the USE_CUSTOM_DIFFUSE define seen in ‘Default.ndv’ is used. If we scroll down the generated code, we find this snippet, which is declared in the template file.

#ifndef USE_CUSTOM_DIFFUSE
vec4 GetDiffuse(const PixelShaderParameters params)
{
	return vec4(0, 0, 0, 0);
}
#endif

Notice how, if we don’t define USE_CUSTOM_DIFFUSE, this function will be used instead. This is one way inputs can control the resulting shader code, allowing for ‘sane’ defaults for these values. For example, if nothing is passed to Normal, we can safely assume to use the vertex normal instead of sampling from a normal map.


To generate code, we need to define a source type and target language. This is done using the Source tag. This tag is also responsible for loading the template and, if available, a header. The Source tag can also denote default include paths, which will be passed to the target language compiler when building the shader. This allows for using a hand-coded shader library and calling functions from it when designing shaders in the shader editor.

Particles and Effects



The particle system enables a mesh to emit particles (instanced 2D sprites) at its vertices along their vertex normals.


Left: a mesh (in Maya) with vertex normals displayed. Right: the resulting particles emitted based on vertex normals in Nebula.

Creating particles

First, create a polygon object in the shape you want the particles to be emitted from. (The particles will emit from the vertices of that mesh in the direction of their vertex normals – the object itself will not be visible in Nebula.) This gives you the opportunity to influence the emission direction by turning the vertex normals as you desire (in Maya: Edit Polygons>Normals>Vertex Normal Edit Tool). Further, the emission travels through the vertex order of the mesh, allowing the emission to “creep” across a surface.

In Nebula, you have two options for creating a particle system in the Content Browser.

The first (and usual) is to use the Particle Wizard from the Create menu. This way, the emitter mesh used won’t be rendered along with the particles.


The Particle Effect Wizard. (Note that, at the time of writing, the “Browse…” button does not in fact work. The emitter mesh can be changed afterwards, though.)

The second way (recently added) is to Append particle nodes to existing models – including other particles, allowing for stacking of multiple particle systems in one model. If one adds a particle node to an existing graphics mesh, the graphics mesh will be rendered along with the particle.


Particles overview

Particles are created as nodes in a model and there are 5 tabs for controlling various attributes. They are:

  • Appearance (attributes such as the emitter mesh used, sprite animation/phases, control over RGBA)
  • Emitter (attributes such as the emission duration, particle life time/length, spread of emission direction from vertex normal)
  • Motion (attributes such as gravity, wind direction, initial velocity along vertex normal)
  • Shape (attributes such as size, rotation, etc)
  • Shading (selecting which shader to use – unlit, additive – and which textures)

A few basic concepts about particles are:

  • They are emitted along vertex normals (unless you check “Use single point emitter mesh” in Appearance)
  • If you have an emitter mesh with several submeshes, you can pick which submesh (“Mesh fragment”) to use under Appearance.
  • There is an emission duration (set under Emitter) which can be looped for repeating effects (such as a camp fire) or played only once (such as for an explosion)
  • Most attributes can be animated over time – either the emitter duration time, or the individual particles’ life time (set under Emitter)
  • By default, particles are aligned according to their vertex emitter – however, they can be set to always face the camera (set under Shape, “View aligned?”). Further, for sprite sheets with animations you can use the Tiles, Phases and Phases/sec under Appearance.



A basic particle system emitted from the 4 vertices of a plane. Each particle has a 2D texture mapped to it (the Nebula logo) and various attributes that affect size, rotation, etc over its lifetime. The only force affecting these particles is gravity, making them fall down.


Many of the particle attributes can be controlled (animated) over time and share a set of similar attributes. Here is an example:


A particle system, where particles have an upwards velocity and their color animated over time. Early on in their life cycle, they have increased red, and towards the end of their life time, an increase in green color.




Particle settings in depth

The below is simply copied from the old documentation, when attributes were still being set up in Maya. Things have changed – this section will be updated eventually. Hopefully soon.


Main Section
Particle Shader 2 – Main Section

Render Priority:
Specifies the position of this particle system’s sprites in the render stack of all particle systems. (Note: doesn’t work with additive particles.)

Lock To Viewer:
Attaches the origin of the object with this shader to the camera origin of the view. This means its translation will always follow the camera, but its rotation will not.

Diffuse Map:
This is the texture mapped onto the emitted sprites. Keep in mind that Alpha Source/Dest set to One/One doesn’t make use of an alpha channel. For more info on the texture control buttons see Special Attributes of the Nebula Material Editor, but note that “Set As …” and “Get From …” don’t make much sense on a particle emitting mesh.

Bump Map:
New in this version is the ability to attach normal maps to the particles, so each particle can be shaded in real time, enabling you to make particle rocks, balls and clouds that look real.

Emissive Intensity:
Sets the light influence on the particles. Higher values make them glow.

Depth Density:
Another new feature: the per-pixel transparency of the particles is now checked against the depth buffer. Here you can set how soon the particle will fade when near solid surfaces.

Smoke with depth density 5.0 on the left and 0.5 on the right. In both cases the sprites actually interpenetrate the ground, yet there are no hard edges anywhere.

Texture Tiling:
New in Particle2: you can assign “random particle textures”. You do so by creating a vertical strip of several textures and setting their number here. For an example see the image below. This can be used to make fire, smoke or whatever you like look a lot more realistic:

Rough example: with such a texture and Texture Tiling set to 2, you would get something like this.

Emission Duration:
Specifies the length of the cycle. If Loop is off, the emission stops after the time set here.

Start Delay:
From time = 0, the emission will be delayed by the number of seconds set here.

Precalculation at Start:
Use this to get rid of the annoying “come-into-a-room-and-the-fire-starts-burning” syndrome. It fast-forwards the particle system by the given number of seconds at creation.

Loop Emission:
If set, the Emission Duration will start over and over again.

Gravity:
Sets the gravity globally for this particle system. The effect can be manipulated over time by the “Particle Mass” attribute below.

Activity Distance:
The emission will be stopped if your distance to the emitting object is higher than the value set as Activity Distance, and will continue once you are within it.

Particle Stretch:
If higher than 0, this will cause the particles to aim towards their emitting direction and change their length according to their speed. Values above 0 influence the length effect and will turn off all rotation-specific attributes described below.

Stretch Detail:
If your stretched particles follow a curved path caused by gravity or wind, you can make the trail bend that way by setting a number of subdivisions here. Higher values may influence performance. This only takes effect if Particle Stretch is set.

Stretch To Start:
Checking this makes the stretched particle trails stay pinned to their emitting location, meaning only the tip of each particle flies around.

Avoid Errors by Fading:
Especially for the Stretch Detail feature, this is a workaround to get rid of a heavy flipping effect that occurs if the particle bows and you look along its moving direction. It fades away the surfaces that are at a shallow angle towards the viewer.

Render Oldest First:
Makes newly emitted particles appear behind the old ones. Whether to set this depends a lot on the usual viewing direction you see the effect from: e.g. in isometric views you’d want this checked on a particle system that creates smoke, to avoid a tunnel effect or an otherwise weird look.

View Aligned:
Sets the viewer alignment. If off, the particles are just planes aligned to the XY axis of the emitting object. To turn this, you have to make the object a standard hierarchy node.

Editing the Curves

Besides all these one-dimensional attributes, there are a lot of attributes embedded in curves that change either over the duration time or the lifetime of a particle.
To every curve the following applies: Selected Position marks one of the four keys in each curve, from 0 = left border to 1 = right border. This maps onto the different Emission Duration and Life Time attributes accordingly (see below). The first and last keys are locked to their positions; only the middle ones are completely free in position and value. Important note: only 4 keys will be recognized by the exporter. If you accidentally deleted or created a key, you had better undo so that only the default 4 keys are in the graph. More or fewer keys could result in crashing the preview.

Selected Value is the corresponding value from 0 to 1 on the line between Max and Min, i.e. the resulting value is Min + Selected Value * (Max - Min). For example, if your key has a Selected Value of 0.5 and the Max and Min values are 4.0 and 2.0, the resulting value is 3.0 at the key time.

Max represents the upper border in the graph.

Min represents the lower border in the graph.

Particle Shader 2 – The Emission Frequency collapsed and expanded

New to this toolkit version is a modified collapsed particle attribute. Everyday production showed that you usually don’t need to edit anything but the maximum value, so we decided to display this in the collapsed state. As with every field in Maya, you can slide the value with CTRL + left/middle/right mouse button. Here this also works with the realtime remote control.
The only thing left to say is that if you change the value in the collapsed state, the minimum value is modified relative to it, if other than 0.

With the following attributes you can modulate the drawn curve with a simple waveform:

Frequency sets the waveform frequency in Hz. 1 Hz is the number of complete waves in 1 second.

Amplitude sets the maximum amplitude modulus to which the modulation can rise or fall. 20 means a maximum of +20 and a minimum of -20. Note that a lifetime of < 0 is not valid!

Function is the waveform mode. Sine starts at waveform amplitude 0 and Cosine at the top of the amplitude.

Edit Curve… Button: If you hold the left mouse button over this, you get several options to modify the whole curve in a fast way:

  • Flip Curve Vertical – Mirrors all keys on the Y axis, left to right.
  • Flip Curve Horizontal – Mirrors all keys on the X axis, upside down.
  • Straighten Curve – Puts the two middle keys on a straight line between the first and last.
  • Copy Curve – Puts the position and values of the curve onto a clipboard.
  • Paste Curve – Takes the data from the curve clipboard and applies it onto the curve.
  • All Keys To … – Takes all keys onto one level so there is no more variation over time.
  • Curve Sharpness Tool – A small helper to create rough or soft envelopes with a slider.
Particle Shader 2 – the Edit Curve… Button

Emission Duration Attributes

The following 5 settings are evaluated over the Emission Duration time. This means the curve’s left border represents time = 0 on the Emission Duration timeline, and the right border represents its end at the specified time.

Emission Frequency

This allows you to set the number of particles that will be emitted per second, from duration time start to end.

Particle Lifetime

Allows you to set the lifetime of the particles. The units are measured in seconds.

Particle Initial Velocity

This is the speed applied to a particle when it is emitted, in units per second. Initial Velocity Randomize can be used to give particles different speeds: a random factor between 1 and the given value is multiplied onto the Initial Velocity of each particle. 1 means all particles will have the given Particle Initial Velocity; if set to 0, anything from the full Particle Initial Velocity down to absolutely no initial velocity may be applied to a particle.

Particle Shader 2 – Initial Velocity

Particle Spread Max / Min

Like in Nebula1, you are now able to set up 2 cones that represent the emitting angle: one for adding angle to the emitter and another one to subtract from it. Furthermore, you can animate these settings over duration time comfortably with a curve. An example is shown below the controls: there is a green cone that represents the MAX value and a red one for the MIN spread angle. As you can see, the emission occurs between these two cones. Note that the emission will not be outside of the MAX cone even if the Min exceeds the Max value at any time. Setting both angles to 90° will cause emission in a flat plane.

Particle Shader 2 – Spread Max / Min

Life Time Attributes

All other settings are related to the specified lifetime. This means the left border of the curves represents the birth of each particle while the right border marks its death.

Initial Rotationangle Min / Max

Not yet a lifetime attribute, but as it is rotation related it belongs here: as in Particle1, here you can give each particle a random rotation angle between 0 and the specified value. You are now also able to reduce the angle with the second setting. This is similar to the increasing/decreasing cone of the Particle Spread feature: you can limit the range, even down to a single value, by setting both values the same. Here is an example:

Particle Shader 2 – Initial Rotationangle Min / Max

Particle Rotation Velocity

This sets the amount of rotation for the particle over its lifetime. This is not an absolute value; it is more of a rotation speed. Rotation Randomize can be used to give particles different rotation speeds: a random factor between 1 and the given value is multiplied onto the Rotation Velocity of each particle. 1 means all particles will have the given Particle Rotation Velocity; if set to 0, anything from the full Particle Rotation Velocity down to absolutely no rotation velocity may be applied to a particle. Left-Right-Random is used to have the particles rotate clockwise or counterclockwise randomly.

Particle Shader 2 – Rotation Velocity, Rotation Randomize and Left-Right-Random

Particle Size

Allows you to set the size of the sprites over the particle lifetime. Size Randomize can be used to give particles different sizes: a random factor between 1 and the given value is multiplied onto the size of each particle. 1 means all particles will have the given Particle Size; if set to 0, anything from the full Particle Size down to absolutely no size may be applied to a particle.

Particle Color

Here you can specify overlay colors for the image file. The values can be set in the color bar and can be changed over time.

Note: the Nebula 2 particle system supports 4 and only 4 keys for the color ramp. Please do not try to delete or add keys, even though functions were put in to try to prevent you from doing so. Violating this may make your particle shader unusable.

Particle Shader 2 – Particle Color with the Edit Ramp… Button

Edit Ramp… Button: As Particle RGB displays the only color ramp in the particle system, the functions on this button are only shown here:

  • Flip Ramp – Mirrors all keys from left to right
  • All Keys Like The First – If you want the particles to keep their color over lifetime this saves some work.
  • Make Gradient – Like Straighten Curve this puts the middle two keys to a color between the outer ones.
  • Brighter +10% – Adds 0.1 to each color value of each key in a range between 0 and 1.
  • Darker -10% – Subtracts 0.1 from each color value of each key in a range between 0 and 1.
  • Copy Color Ramp – Puts the colors and position of the keys on the ramp onto a clipboard.
  • Paste Color Ramp – Takes the data from the clipboard and applies it onto the ramp.

Particle Alpha

Sets the transparency of the particles over their lifetime. This only works with alpha-channel particles. You can make additive particles fade by turning the color to black (see Alpha Source/Dest).

Particle Air Resistance

Defines the wind influence on the particles over their lifetime. Since this is a factor, you can set higher values to make the particles blow away. Note: “Wind” is a global vector in Nebula2 that this attribute listens to. By default it is quite unspectacular, but theoretically it could be programmed to blow in waves and turn randomly to make it appear more realistic.

Particle Velocity Factor

Allows you to set an additional factor that is multiplied onto the velocity of the particle over its lifetime. This means at 0 the particle stops completely, whereas values over 1 accelerate it further.

Particle Mass

Describes the weight of each particle over its lifetime. The resulting movement of course depends on the set gravity. If gravity is set to a terrestrial -9.81, a high value will make the particle fall, whereas a value below 0 will cause it to rise, which is important for e.g. fire.

Time Manipulator

This is used to take control of the global time of the particle system. Even though it is not used very often, it is able to create some crazy and interesting effects, not least because you can make the particles have negative time.

Scripting reference

See the Scripting page of the manual for an introduction to Nebula’s scripting system. The scripting language used is Lua.




run when the entity is created

function oninit(eid)


run every frame

function onframe(eid)


run upon a collision event (if you have activated feedback for them)

function oncollision(eid, otherid, point, normal)


run when an input event has happened (if you have activated feedback for them)

function oninput(eid)


run when your model has anim events (if you have activated feedback for them)

function onanimevent(eid,event)
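Put together, a minimal entity script implementing some of these callbacks might look like the following sketch. The callback names and parameters are those listed above; the bodies are illustrative placeholders only.

```lua
-- Minimal entity script sketch; the engine invokes these callbacks.

function oninit(eid)
    -- one-time setup when the entity is created
    print("entity created: " .. tostring(eid))
end

function onframe(eid)
    -- per-frame logic for this entity goes here
end

function oncollision(eid, otherid, point, normal)
    -- react to a collision with another entity
    print("collision between " .. tostring(eid) .. " and " .. tostring(otherid))
end
```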



For ScriptingTriggerProperties some additional callbacks are available:


runs when entering trigger

function onenter(eid,othereid)


runs every frame inside object

function oninside(eid, othereid)


runs when exiting trigger

function onexit(eid, othereid)
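A trigger script using these callbacks could be sketched as follows (callback names and parameters as listed above; the bodies are illustrative only):

```lua
-- Sketch of a script for a ScriptingTriggerProperty.

function onenter(eid, othereid)
    print("entity " .. tostring(othereid) .. " entered trigger " .. tostring(eid))
end

function oninside(eid, othereid)
    -- runs every frame while othereid remains inside the trigger
end

function onexit(eid, othereid)
    print("entity " .. tostring(othereid) .. " left trigger " .. tostring(eid))
end
```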




Registered Commands

Scripting Subsystem



Script Server class is: Scripting::LuaServer

Registered Commands


Activate a postfx preset
activatepostfxpreset(string Preset)


Adds an attachment onto a joint with duration (-1 for infinite)
addattachment(uint entityid, string Joint, string AttachmentModel, float Duration)


Add a child element to another element
addelement(string layout, string element, string id, string elementtype, string text, string clickevent)


Attaches one entity to another; a model will be loaded and attached to this entity.
addgraphicsattachment(uint ___entityId, string Resource, matrix44 Offset)


Attaches one entity to another; a model will be loaded and attached to this joint, and will follow along through animations.
addgraphicsattachmentonjoint(uint ___entityId, string Joint, int Rotation)


addgraphicsattachmentposaxis(uint ___entityId, string Resource, float4 Position, float4 Axis, float Angle)


Loads a layout file and gives it an internal name
addlayout(string identifier, string filename)
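
As a sketch, a layout might be loaded and shown like this; the identifier, filename, and element name are placeholder assumptions, and the commands used are all listed in this reference.

```lua
-- Hypothetical layout setup; names are placeholders.
addlayout("hud", "gui:hud.rml")            -- load the file under the name "hud"
showlayout("hud")                          -- make it visible
setelementtext("hud", "score_label", "0")  -- set some initial text
```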


animtrack(uint ___entityId, string Track)


Apply an impulse vector at a position in the global coordinate frame to the physics entity of the game entity.
applyimpulseatpos(uint ___entityId, float4 Impulse, float4 Position, bool MultiplyByMass)


Generate an audio effect
audioeffect(float Volume, string Resource, float Duration, float4 Position)


Sets the camera distance.
cameradistance(uint ___entityId, float RelativeDistanceChange)


Gives Camera focus to the game entity. The entity needs to have a cameraproperty on it which will be activated.
camerafocus(uint ___entityId, bool ObtainFocus)


cameraorbit(uint ___entityId, float HorizontalRotation, float VerticalRotation)


Resets the camera
camerareset(uint ___entityId)


Generate a camera shake effect
camshakeeffect(float4 Intensity, float4 Rotation, float Range, float Duration)


Removes all attachments from entity
clearattachments(uint entityid)


Clear all if any attachments currently active on this entity.
clearattachmentsonentity(uint ___entityId)


Clear all if any attachments currently active on a given joint.
clearattachmentsonjoint(uint ___entityId, string Joint)


Create an attachment effect which uses this entity and attaches an effect to a joint with a name.
Resource is the full resource name to attach, i.e. ‘mdl:system/placeholder.n3’.
Joint is a string matching the name of a joint.
Duration is the time during which the effect should be active.
Delay is the time it takes until the effect itself is applied.
RotationMode can be one of the three:
1 = Local.
2 = World.
3 = Entity.
createattachmenteffect(uint ___entityId, string Resource, string Joint, uint Duration, uint Delay, bool KeepLocal, int RotationMode)


Creates an entity by category and template
uint entityid = createentitybytemplate(string category, string templ)
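
For example, spawning an entity and tagging it might look like the sketch below; the category, template, and attribute values are placeholder assumptions.

```lua
-- Hypothetical spawn; "Props"/"Barrel" are placeholder names.
local barrel = createentitybytemplate("Props", "Barrel")
setattribute(barrel, "Name", "barrel_01")
-- the entity can later be found again by that name:
local found = getentitybyname("barrel_01")
```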


Create a graphics effect on a point with an incident up vector.
Resource is the full resource name to attach, i.e. ‘mdl:system/placeholder.n3’.
creategraphicseffectupvec(uint ___entityId, float4 Point, float4 UpVec, string Resource, float Duration)


Creates a model entity at a given position
createmodel(float4 Position, float4 Forward, string Resource)


Creates a trigger entity at given position and size
uint EntityId = createtrigger(float4 Position, float Radius, string Scripts)


Makes a character crouch.
crouch(uint ___entityId, bool Enable)


Exits the program


gets current mouse position in 3d
float4 foundEnt = get3dmouse()


Gets an entity attribute
string attr = getattribute(uint entityid, string attribute)


Retrieves the entity category
string Category = getcategory(uint EntityID)


gets the current level
string levels = getcurrentlevel()


Retrieves current postfx preset
string Preset = getcurrentpostfxpreset()


gets layout element visibility
bool value = getelementvisible(string layout, string element)


Retrieves an entity's unique id for use in scripts
uint EntityID = getentitybyname(string Name)


gets game entity at the current mouse position
uint foundEnt = getentityundermouse()


gets current frame time
float Time = getframetime()


Gets the current fullscreen setting
bool fullscreen = getfullscreen()


gets current game time
float Time = getgametime()


Retrieve available layouts
stringarray layouts = getlayouts()


gets an array of the available levels
stringarray levels = getlevels()


Retrieves an entity's position
float4 Position = getposition(uint entityid)


Retrieves an entity's transform matrix
matrix44 Transform = gettransform(uint entityid)


Generate a graphics effect
graphicseffect(string Resource, float Duration, matrix44 Transform)


Checks if there is a game entity under the current mouse position
bool foundEnt = hasgameentityundermouse()


Hides a layout
hidelayout(string identifier)


Sets input focus on specified game entity
inputfocus(uint ___entityId, bool ObtainFocus)


tests if entity is an active game entity
bool isentity = isgameentity(uint EntityID)


Checks active state of a trigger.
bool Active = istriggeractive(uint ___entityId)


Checks if a key was pressed this frame
bool result = keydown(string PressedKey)


Checks if a key is pressed
bool result = keypressed(string PressedKey)


Checks if a key got released this frame
bool result = keyup(string PressedKey)
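
The three key commands differ in when they report true: keydown and keyup fire only on the frame the key's state changes, while keypressed stays true for every frame the key is held. A sketch (the key names are assumptions):

```lua
function onframe(eid)
    if keydown("Space") then
        movejump(eid)  -- triggers once, on the frame Space goes down
    end
    if keypressed("Y") then
        -- held-key behaviour goes here; runs every frame while Y is down
    end
end
```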


load a level
loadlevel(string levelname)


Loads an ui font
loaduifont(string File, string family, string style, string weight)


A MoveDirection message. Expected behaviour is that the entity starts to move into the specified direction. The direction vector can be defined as camera relative or absolute. The velocity settings should be interpreted as a factor.
movedirection(uint ___entityId, float4 Direction, float MaxMovement, bool CameraRelative)


movefollow(uint ___entityId, uint TargetEntityId, float Distance)


movegoto(uint ___entityId, float4 Position, float Distance)


Makes a character jump.
movejump(uint ___entityId)


Commands an entity rotate around the y-axis for a new heading.
moverotate(uint ___entityId, float Angle)


Set the (relative) linear velocity of an entity between 0.0 and 1.0. The actual resulting velocity also depends on the MaxVelocity attribute attached to the entity.
movesetvelocity(uint ___entityId, float RelVelocity)


A MoveStop message. The expected behaviour is that an entity which receives this message stops immediately.
movestop(uint ___entityId)


Commands an entity to turn into the specified direction defined by a 3d vector. The direction vector can be absolute or camera relative.
moveturn(uint ___entityId, float4 Direction, bool CameraRelative)


Toggle pause on all animation clips on entity
pauseallanims(uint ___entityId)


Play animation clip
playanimclip(uint ___entityId, string ClipName, float LoopCount, bool Queue)


Play animation clip with all parameters, EnqueueMode can be Intercept, Append, IgnoreIfSame
playanimclipfull(uint ___entityId, string ClipName, int TrackIndex, float LoopCount, int StartTime, int FadeInTime, int FadeOutTime, int TimeOffset, float TimeFactor, float BlendWeight, string EnqueueMode)


Play animation clip on a given track
playanimcliptrack(uint ___entityId, string ClipName, int TrackIndex, float BlendWeight, float LoopCount, bool Queue)
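
A sketch of combining these calls is shown below; the clip names are placeholders, `eid` is assumed to be an entity id from a script callback, and treating a LoopCount of 0 as "loop forever" is an assumption.

```lua
-- start a looping walk clip, then cut it off with a one-shot death clip
playanimclip(eid, "walk", 0, false)
playanimclipfull(eid, "death", 0, 1, 0, 200, 200, 0, 1.0, 1.0, "Intercept")
```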


Player dies
playerdeath(uint ___entityId)


Enable/Disable Player, affects all processing
playerenable(uint ___entityId, bool Enabled)


Player use something
playeruse(uint ___entityId)


Plays an audio event
playsound(string event, float volume)


Plays an audio event with transform
playsound3d(string event, matrix44 transform, float volume)
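
A sketch of both calls follows; the event names are placeholders, and passing gettransform(eid) as the matrix44 argument is an assumption.

```lua
playsound("ui_click", 1.0)                        -- 2D sound, full volume
playsound3d("explosion", gettransform(eid), 0.8)  -- positioned at the entity
```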


Generate a postfx effect
postfxeffect(string Preset, float Duration)


Reloads current level
reload(bool keeptransform)


Unloads a layout
removelayout(string identifier)


rotates an object around the x axis
rotatex(uint EntityID, float angle)


rotates an object around the y axis
rotatey(uint EntityID, float angle)


rotates an object around the z axis
rotatez(uint EntityID, float angle)


Sets an attribute on an entity by name
setattribute(uint entityid, string attribute, string value)


Sets a bool material variable
setboolmatvariable(uint EntityID, string node, string name, bool value)


Sets the volume level of a bus
setbusvolume(string busid, float volume)


Sets the current display mode
setdisplaymode(uint width, uint height, bool fullscreen)


Sets a layout element text
setelementtext(string layout, string element, string value)


Sets layout element visibility
setelementvisible(string layout, string element, bool value)


Sets a float4 material variable
setfloat4matvariable(uint EntityID, string node, string name, float4 value)


Sets a float material variable
setfloatmatvariable(uint EntityID, string node, string name, float value)


sets fullscreen mode
setfullscreen(bool fullscreen)


Shows or hides all graphics entities of a game entity.
setgraphicsvisible(uint ___entityId, bool Visible)


Sets a layout input element value
setinputvalue(string layout, string element, string value)


Sets an int material variable
setintmatvariable(uint EntityID, string node, string name, int value)


setkinematic(uint ___entityId, bool Enabled)


Sets linear velocity of object
setlinearvelocity(uint ___entityId, float4 Velocity)


Sets an entity's orientation using a looking direction
setlookorientation(uint EntityID, float4 forward)


Sets the master volume
setmastervolume(float volume)


Sets an entity's orientation
setorientation(uint EntityID, float xangle, float yangle, float zangle)


setoverwritecolor(uint ___entityId, float4 Color, string NodeName)


Sets an entity's position
setposition(uint EntityID, float4 position)


Show a skin for a character model
setskinvisible(uint ___entityId, string Skin, bool Visible)


Sets active state of a trigger.
settriggeractive(uint ___entityId, bool Active)


Sets scale of trigger
settriggerscale(uint ___entityId, float4 Scale)


Selects a previously loaded cursor layout
setuicursor(string cursor)


Sets the volume level of a VCA
setvcavolume(string vcaid, float volume)


Shows a layout
showlayout(string identifier)


sets visibility of the system mouse pointer
showsystemcursor(bool visible)


Toggles a layout
togglelayout(string identifier)


translates an entity
translate(uint EntityID, float4 translation)
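
As a sketch, the rotate commands above can be combined with the onframe callback to animate an entity; whether the angle is in radians or degrees is not documented here.

```lua
function onframe(eid)
    -- spin slowly around the y axis, scaled by the frame time
    rotatey(eid, 0.5 * getframetime())
end
```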


useanimjointmask(uint ___entityId, string MaskName, int Track)



Character death script:

dead = false

function oncollision(eid, othereid, point, normal)
    if isgameentity(othereid) then
        local name = getattribute(othereid, "Name")
        if name == "trapdeath" and not dead then
            dead = true
            creategraphicseffectupvec(eid, point, normal, "mdl:particles/xx.n3", 2.0)
        end
    end
end


Multiple visible skins

The setskinvisible command enables you to show other skins from the same asset; you need to specify them by name. Simply copy-paste this code:

function oninit(eid)
    setskinvisible(eid, "skin_name1", bool)
    setskinvisible(eid, "skin_name2", bool)
end

(Replace skin_name with your skin's name, and bool with true or false to show or hide the skin.)




function onenter(eid, othereid)
    if getcategory(othereid) == "Player" then
        -- do something here
    end
end

Activate oninside trigger with a key

RandomBool = false

function oninside(eid, othereid)
    if getcategory(othereid) == "Player" then
        if keypressed("Y") and not RandomBool then
            -- do something here
        end
    end
end


Activate another trigger from this trigger

function onenter(eid, othereid)
    if getcategory(othereid) == "Player" then
        local othertrigger = getentitybyname("trigger2")
        settriggeractive(othertrigger, true)
    end
end


Viewport Window

This page provides documentation for the viewport used in the Content browser and Level editor.


Navigation – basic Maya navigation hotkeys are used. Press F to frame the current selection; with nothing selected, F frames everything in view.

Texture Import Window

The Texture Importer is used to configure the settings used when importing a texture.



The Texture Import window contains a small preview of the texture being imported, and to the right, various settings for compressing the data. While Nebula can import many different texture formats, they are all converted to .dds files on import.



Max Width/Height (clamp to) – if the imported file is larger than the settings here, it will be shrunk to fit

Filter –

Quality – sets the level of compression used. Low quality textures can lose color detail and have block artifacts; high quality textures take up a lot of space.


Contains two options that automatically pick one compression or the other, depending on whether the input texture has an alpha channel.

You can read about the DXT compression formats in more detail elsewhere. In general, use DXT1 for simple textures without alpha, DXT3 for textures with a sharp alpha transition, and DXT5 for textures with a gradient alpha. Use DXT5nm for normal maps. (If you don’t want any compression at all, use the U8888 setting.)

(The different compression types, and their uses, should be explained here. Perhaps in a table. For now, make do with the information above)


Generate Mip Maps? – .dds files store explicit mip maps for efficient rendering. If you’ve already created a .dds texture with mip maps, you want to uncheck this – otherwise, leave it on.

sRGB? – this currently does nothing. Nebula reads all textures as linear data.


FBX Import Window

This window is used to configure the way Nebula imports a FBX file.




The Import Window contains four tabs, with various options depending on the content to be imported, and a Save & Import button at the bottom which does the actual import – and saves the settings for later.



The Information tab shows Nebula’s analysis of the FBX to be imported. This can be helpful for debugging purposes.



If the FBX being imported is detected to contain animation, this tab becomes active. In this tab, you can define and name animation clips from the import, as well as define Events within an animation clip. Animation clips are entire animations, such as walk cycles or other acting, while Events can be inserted to mark specific times in the duration of a clip. These Events can then be used to, for instance, trigger effects or to mark a natural break in a longer animation where another animation can start.

Clip – dropdown list of all created clips

New/Remove clip

Name – the clip’s name is used throughout Nebula to refer to the clip, for instance in scripting

Take – the FBX format supports exporting multiple takes (for instance, one take for each animation), but not all programs (such as Maya) support that function. MotionBuilder does, however.

Start/Stop time – often, one will have multiple animations in a sequence on a timeline. Here you define the portion of the timeline that belongs to the current animation clip.

Pre/Post infinity – constant/cycle. Certain animations should repeat continuously (such as a walk cycle), while others should play only once (such as a death animation) and then hold the last frame or trigger another animation to follow instead. (This needs a better explanation with in-game examples.)


Event – (needs documentation for application in-game)




Import Options

The importer will detect whether there is a skeleton with a mesh skinned to it in the .fbx and set these options appropriately, but you can override it:

Import as static mesh – treats meshes as static meshes, doesn’t parse skins, skeletons and joints.

Skeletal mesh – treats meshes as skinned, generates rigid binds for unskinned objects parented to joints. Also writes joints to model if model creation is requested.


Mesh Options

Remove redundant vertices

Calculate normals

Flip UVs – should generally be on, and is on by default

Import vertex colors (floods to white if mesh is not colored) – vertex colors are used by certain shaders. To be able to use the shaders, even with a model without vertex colors, you can force this option, which floods all colors to white.

Import secondary UVs – secondary UVs are used by certain shaders, such as for lightmapping.


Scale Options

Presets – this interprets the FBX scale into Nebula units, set to Meter by default. To ensure consistency throughout all programs and Nebula’s game-world scale, physics, etc., build your assets to the correct scale in your DCC and set this preset according to the scale you built the model in.

Scale – you can optionally scale your model here



Physics Options

Use physics shape from file – this becomes available if you have meshes in a folder named “physics” when exporting the FBX

Create from bounding boxes – creates a collision mesh from the mesh’s bounding box size

Use graphics mesh – reuses the graphical mesh as physics mesh as well, which can be computationally expensive.


Mesh options

These are options for the physics shape:

Use as concave (slow)

Use as convex (make sure it is)

Create convex hull

Static Object

Split into convex
