Programming in Unity
One of the game engines we are working in is Unity 3D. It is an engine that
has become very popular recently because of its user-friendliness. It is easy to
start working in Unity, as its visual interface is very powerful and is not restricted
to programmers. Just by dragging and dropping you can create a highly detailed
world and tweak almost all of its characteristics.
However, there is more to Unity than a friendly user interface. The logic of the
world and the interaction of the characters (be they humans, vehicles, or
monsters) must be programmed using a scripting language. This is what gives the
world life. JavaScript is a very popular scripting language, and there are thousands
of programmers familiar with it, but for our purposes we are using C#, a much
newer language with more potential. C# could be seen as a mixture of two of the
most successful programming languages, C++ and Java.
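To give a flavor of the scripting described above, here is a minimal sketch of a Unity C# script. It uses the standard UnityEngine API (`MonoBehaviour`, `Update`, `Time.deltaTime`); the class name and the speed field are illustrative, not part of our project code. Such a script only runs inside the Unity engine when attached to an object in the scene.

```csharp
using UnityEngine;

// Illustrative component: attach to any GameObject to make it spin in place.
public class Spinner : MonoBehaviour
{
    // Public fields appear in Unity's Inspector, so non-programmers
    // can tweak them from the visual interface.
    public float degreesPerSecond = 90f;

    // Update is called by the engine once per rendered frame.
    void Update()
    {
        // Scale by frame time so the rotation speed is framerate-independent.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}
```

This also shows how the scripting side and the visual side meet: the behavior is coded once, then tuned per object through the editor.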
For the past week, I’ve been researching higher-end features in the Unreal Development
Kit, such as Kismet, the Unreal Matinee Editor, PhAT, the Unreal Material
Editor, and the Fracture Tool. Kismet is Unreal’s scripting solution for the artist:
a visual interface where one plugs nodes into other nodes to form scripted events,
such as light switches. Unreal Matinee is housed entirely inside Kismet as an
animation editor for animating things such as an elevator or animated
materials. PhAT, the Physics Asset Tool, is used for applying physics properties
to various objects, while the Fracture Tool allows one to make objects crumble
to pieces when force, such as a gunshot, is applied to them. The Material Editor
is a visual node-graph system that allows one to edit materials in real time
with a visual preview.
For next week, I aim to develop some working prototypes in UDK to test some
of our theories and find whether they’d work for what we want in our simulation.
Applying Knowledge Learned
This week I have focused mostly on setting up a scene to
display examples. These examples will show the possible
routes we could take for the simulation. The purpose of
doing this is to see what would look best and be compatible
with other aspects of our project: for example, what would
work with the terrain-deformation system that was developed
last semester, or what could be done in real time and still
look good.
These demos will consist of collision, animated textures,
and particles. I have been looking into these three things
for a while now and wanted to display them in a more complete
manner. Other new demos I will be looking into, which were
suggested during quarters, are cloth and water simulation.
The hope is that these demos will give us clear illustrations
of what will be possible by halves.
Preparing for the Next Few Weeks
This week has been full of preparation, both for next week’s
trip to Tucson, AZ and for getting ready to texture the
MTL. In preparation for visiting the Tucson Proving
Grounds, I checked with Sam K to make sure they had everything
we would need for accurate measurements of material
changes (markers, flags, etc.). As for the MTL,
I’ve finished the high-poly model with the exception of the
cab, as I’m currently waiting to hear from Caterpillar about
how much detail they want inside the model. While I wait to
hear back, I will be texturing the rest of the MTL (the treads,
the bucket, etc.). We also had our client and representatives
from NREC come in to help discuss where our project is
going, which provided us with more theories and ideas on
how to proceed with the project’s development.
Next week I will spend time texturing the MTL before I go
to Tucson, as well as adding more features to the rig.