One major feature a game engine needs is distribution. And by distribution I don’t only mean distributing to customers and users, but also to developers and artists. How annoying would it be to have to open every single scene in Maya, every shader project in Nody and every texture, and export them individually? Not only that, every artist and developer would have to do this just to get the engine running.
Batching solves this. Batching can be both a good and a bad thing. The good thing is that everyone can run the batcher once and have everything set up, nice and working. The bad thing is that batching may, and in most cases will, perform lots of redundant batches, which takes time. The primary reason why we’d want to redesign the way Nebula content is batched is the very fragile character batching, where everything had to be done in an exact, unintuitive order or the batcher would crash. But there are new reasons too: the Nody shading system with its header file, and the XML-formatted model files, mean that we need a new set of tools to batch the new content.
Right, I changed how models should be handled. Previously, models would be exported DIRECTLY from Maya, so Maya had to start in order to export the models. Not only that, but Maya also needed a plugin to set Nebula-specific stuff, such as the Nebula shader and the Nebula shader variables. To avoid this, Maya is now only used for modelling. When exporting from Maya there are two paths. If the model file exists, only the mesh, primitive groups and node order will be updated. If it doesn’t exist, the exporter, currently under development, will create a basic model file using the always-existing Solid material. Also, instead of saving this directly to the export folder, everything gets saved to work, so the model files, which previously only existed in export, can now be modified outside of Maya. Why, you may ask?
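The two export paths could be sketched roughly like this. This is a minimal illustration, not the actual exporter: the XML layout, element names and the `mesh_data` dictionary are all hypothetical stand-ins.

```python
import os
import xml.etree.ElementTree as ET

def export_model(model_path, mesh_data):
    """Sketch of the exporter's two paths: update if the model file
    exists, otherwise create a basic one (names are hypothetical)."""
    if os.path.exists(model_path):
        update_existing(model_path, mesh_data)
    else:
        create_default(model_path, mesh_data)

def create_default(model_path, mesh_data):
    # New models get the always-existing "Solid" material as a default.
    root = ET.Element("Model")
    node = ET.SubElement(root, "Node", name=mesh_data["node"], material="Solid")
    ET.SubElement(node, "Mesh", source=mesh_data["mesh"])
    ET.ElementTree(root).write(model_path)

def update_existing(model_path, mesh_data):
    # Only the mesh reference is touched; material settings made
    # outside of Maya survive a re-export.
    tree = ET.parse(model_path)
    node = tree.getroot().find("Node")
    node.find("Mesh").set("source", mesh_data["mesh"])
    tree.write(model_path)
```

The point of the split is exactly what the paragraph above describes: re-exporting a mesh from Maya never clobbers material work done elsewhere.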
The idea somewhat resembles the material editor found in UDK. A model is presented to the user in a real-time Nebula preview. The user can then change the material, set variables such as textures, integers, floats and vectors, and of course see the result in real-time. Whenever a change is made, the XML-formatted model file is altered. This means the modified model file, still in the work folder, can be committed and then batch-exported without needing the huge Maya binary file, and everyone updating and exporting will see the changes. Not only is this better because you don’t need Maya for anything other than UV-mapping and modelling, it’s also very easy to see how the model will look with the appropriate textures and settings, seeing as the preview is rendered directly in Nebula.
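Writing a variable change back to the work-folder model file could look something like the sketch below. Again, the element and attribute names are hypothetical; the real file format will differ.

```python
import xml.etree.ElementTree as ET

def set_variable(model_path, node_name, var, value):
    """Hypothetical sketch: update (or add) a shader-variable entry
    on a named node and rewrite the model file in place."""
    tree = ET.parse(model_path)
    for node in tree.getroot().iter("Node"):
        if node.get("name") != node_name:
            continue
        for v in node.findall("Variable"):
            if v.get("name") == var:
                v.set("value", value)   # variable exists, overwrite it
                break
        else:
            # variable not set yet, add a new entry
            ET.SubElement(node, "Variable", name=var, value=value)
    tree.write(model_path)
```

Because the change lands in the work folder rather than export, it can be committed and re-batched by everyone else, which is the whole point of the scheme.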
This editor will exist as a tool for the level editor, allowing you to edit the model and see it in your level, in real-time!
Models aside, there is also the matter of batching shaders. The easy part is compiling the source: traverse all files in folder A, compile, and write to a file in folder B. The hard part is writing the .sdh files without having to load the entire project every time. Since loading the graph network requires us to create all nodes, which in turn creates all the graphics, we want to avoid it for performance reasons. It also seems a bit redundant to create graphics if we’re only going to use a command-line interface to batch our shaders.
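The "easy part" is the classic batch loop. A minimal sketch, assuming a pluggable `compile_fn` and the .sdh extension for the output (both stand-ins for whatever the real toolchain uses):

```python
import os

def batch_shaders(src_dir, dst_dir, compile_fn):
    """Traverse folder A, compile each shader source, and write the
    result into folder B. compile_fn is a hypothetical placeholder
    for the actual shader compiler."""
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        src = os.path.join(src_dir, name)
        if not os.path.isfile(src):
            continue
        with open(src) as f:
            compiled = compile_fn(f.read())
        base, _ = os.path.splitext(name)
        with open(os.path.join(dst_dir, base + ".sdh"), "w") as f:
            f.write(compiled)
```

The hard part, generating correct headers without spinning up the whole graphics scene, is what the next paragraph is about, and no loop this simple will solve it.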
The project loader already works by loading specific objects, so I can choose to only load the settings manager and thus avoid creating the entire scene just to batch my shaders. The only problem is that my settings manager doesn’t have the information required for a complete header generation, seeing as it knows nothing about samplers, textures or class interfaces. These are found and generated when traversing the node graph, and what do we need for that? You guessed it, the graphics scene, which we didn’t want in the first place! I’m currently working on a solution to this problem, and once it’s found we will be able to batch the shaders with the push of a button.
When all the batchers are done, we will implement a program, much like the current batcher, that runs all of our batchers, or a subset of them, from a GUI application. The batchers themselves should also have a way to export only modified files, to avoid redundant exporting. It shouldn’t be more work than comparing modification dates: if the exported file exists and is newer than the source, it can be skipped.
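That up-to-date check really is just a timestamp comparison. A sketch of the idea:

```python
import os

def needs_export(src, dst):
    """A file needs (re-)export when no exported copy exists yet,
    or when the source was modified after the last export."""
    if not os.path.exists(dst):
        return True
    return os.path.getmtime(src) > os.path.getmtime(dst)
```

Each batcher would call this per file and simply skip anything that returns False, which is where most of the redundant batching time goes.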
So, to wrap this up: batching is good and evil, but it’s always vital for setting up a game environment from scratch. The key is to make batching as fast as possible, so it doesn’t slow down the development process whenever someone updates and starts working. It’s also very important to avoid batching content that is already up-to-date, because that takes unnecessary time better spent on actual work.