I spent the majority of my time working on and with our effect container, the GVEffect class.
GVEffects are abstract entities containing any number of basic building blocks called EffectTypes. The particle system and decal effect types are working; light and sound will be implemented later. A typical setup would look something like this:
[Particle system: Base explosion]
[Particle system: Black smoke]
[Particle system: Burning ground]
[Decal: Scorch mark]
[Light: Explosion flash]
[Sound: Explosion sound]
By combining different bits and pieces, a wide range of visuals can be created without authoring new assets. (Of course, having a decent library is recommended.) In the example above, the black billowing smoke could be quickly swapped for a gray smoke puff variant to experiment with the visuals.
There are many additional parameters for defining the behavior of effect types, such as start delay or lifetime. Position, rotation and scale can also be set and randomized for each item.
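To make the idea concrete, here is a minimal sketch of how such a container might be structured. The GVEffect and EffectType names come from the actual system; everything else (the EffectItem fields, the scale-randomization helper, the enum values) is my illustrative assumption, not the real implementation.

```cpp
#include <cassert>
#include <random>
#include <string>
#include <vector>

// Hypothetical sketch: the kinds of building blocks a GVEffect can hold.
enum class EffectKind { ParticleSystem, Decal, Light, Sound };

// One building block inside a GVEffect, with per-item parameters.
struct EffectItem {
    EffectKind kind;
    std::string asset;        // e.g. "Black smoke"
    float startDelay = 0.0f;  // seconds before this item triggers
    float lifeTime   = -1.0f; // -1 = let the asset decide
    float scaleMin   = 1.0f;  // scale is randomized per spawn
    float scaleMax   = 1.0f;
};

// A GVEffect is essentially an ordered collection of EffectItems.
class GVEffect {
public:
    void add(EffectItem item) { items_.push_back(std::move(item)); }

    // Roll the randomized scale for each item for one spawn of the effect.
    std::vector<float> rollScales(std::mt19937& rng) const {
        std::vector<float> scales;
        for (const auto& it : items_) {
            std::uniform_real_distribution<float> d(it.scaleMin, it.scaleMax);
            scales.push_back(d(rng));
        }
        return scales;
    }

    std::size_t size() const { return items_.size(); }

private:
    std::vector<EffectItem> items_;
};
```

With this shape, the explosion example above would just be six `add()` calls, one per particle system, decal, light and sound item.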
GVEffects are always embedded in other entities, which create them at runtime when appropriate. For example, a pawn might have a GVEffect set up for spawning, to make it look like the character was teleported onto the map.
Another example is physics-material-based interactions (which I implemented last week): when two objects come into contact, the right effects are created depending on the interacting materials. For instance, if a stone ball hits a reinforced glass surface, the following effects will be created:
- Puff of stone dust (particle effect)
- Crack on the stone ball (decal)
- Splinters of chipped off glass (particle effect)
- Cobweb like cracks on the glass (decal)
Each material has its own set of effects for the different types of interactions: impacts, sliding, footsteps for light/medium/heavy characters, bullet/pellet/energy weapon hits, etc. The effects can be dynamically adjusted based on properties of the interaction. For example, the force of an impact sets a certain parameter (called “Amount”) in the related particle systems and decal materials. What those assets actually do with that information is up to the author: a decal could change size or opacity; particle systems could vary particle count, size or speed, or even turn off emitters entirely for smaller impacts.
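The lookup behind this can be sketched as a table keyed by an unordered material pair plus an interaction type. This is only my rough sketch, assuming names and a simple linear force-to-Amount mapping; only the “Amount” parameter itself comes from the actual system.

```cpp
#include <algorithm>
#include <cassert>
#include <map>
#include <string>
#include <utility>

// Illustrative interaction types; the real set is larger (footsteps per
// character weight, several weapon classes, etc.).
enum class InteractionType { Impact, Slide, Footstep, BulletHit };

struct EffectHandle { std::string name; }; // stand-in for a real GVEffect reference

class MaterialEffectTable {
public:
    using MatPair = std::pair<std::string, std::string>;

    void set(std::string a, std::string b, InteractionType t, EffectHandle fx) {
        table_[{canonical(std::move(a), std::move(b)), t}] = std::move(fx);
    }

    // Look up the effect for two materials; the order of a/b does not matter.
    const EffectHandle* find(const std::string& a, const std::string& b,
                             InteractionType t) const {
        auto it = table_.find({canonical(a, b), t});
        return it == table_.end() ? nullptr : &it->second;
    }

    // Map impact force onto the 0..1 "Amount" parameter the assets consume.
    // The normalization by maxForce is an assumption for the sketch.
    static float amountFromForce(float force, float maxForce) {
        return std::clamp(force / maxForce, 0.0f, 1.0f);
    }

private:
    static MatPair canonical(std::string a, std::string b) {
        if (b < a) std::swap(a, b);
        return {std::move(a), std::move(b)};
    }

    std::map<std::pair<MatPair, InteractionType>, EffectHandle> table_;
};
```

With a table like this, the stone-on-glass impact above is two entries (one per material), each pointing at a GVEffect that bundles its dust/splinter particle systems and crack decals.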
While I had no trouble with impact effects, slide detection turned out to be a huge problem. Apparently I have no direct access to contact points in the physics scene, which makes it rather difficult to find out which actors are in contact. I did manage to get the necessary data, but the workaround is hacky, slow and not very reliable. I might revisit it in the future, but it seems I cannot do better without access to lower-level functions.
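For anyone curious, one common fallback when contact points are unavailable (not my exact workaround, just the general idea) is to infer sliding heuristically: treat a pair of actors as sliding when they stay in contact for several consecutive frames while keeping a relative speed above a threshold. The Vec3 type and all thresholds below are illustrative; using body velocities instead of velocities at the actual contact point is exactly the kind of approximation that makes this unreliable.

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <utility>

// Minimal vector type for the sketch (not engine code).
struct Vec3 {
    float x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    float length() const { return std::sqrt(x * x + y * y + z * z); }
};

class SlideDetector {
public:
    SlideDetector(float minSpeed, int minFrames)
        : minSpeed_(minSpeed), minFrames_(minFrames) {}

    // Called once per frame for every actor pair reported as touching.
    // Returns true once the pair has kept a relative speed above the
    // threshold for enough consecutive frames to count as a slide.
    bool update(int actorA, int actorB, const Vec3& velA, const Vec3& velB) {
        auto key = actorA < actorB ? std::make_pair(actorA, actorB)
                                   : std::make_pair(actorB, actorA);
        float relSpeed = (velA - velB).length();
        int& frames = contactFrames_[key];
        frames = (relSpeed >= minSpeed_) ? frames + 1 : 0; // reset when too slow
        return frames >= minFrames_;
    }

private:
    float minSpeed_;
    int minFrames_;
    std::map<std::pair<int, int>, int> contactFrames_;
};
```

The frame-count requirement filters out brief impacts so they don't also register as slides, at the cost of a short detection delay.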
To distract myself from Gavit, I took on a side project: MCEditor. It will be a proper configuration utility for the Razer Hydra controller, which to this day lacks such software. (Razer has a lot to learn from Logitech in this regard…) The programming is handled by a fellow Hydra owner, while I’m responsible for the UI. We’re using C# and WPF, and this is how it looks at the moment:
This is the second major iteration of the layout. I really enjoy that the challenges are so different from the ones I usually deal with; it’s somehow refreshing. So much so that I intend to start another side project when this one is done.
Next week I’ll work on motion replay for rigid bodies and try to finish my earlier pinball prototype.