Game Design, Programming and running a one-man games business…

Games look (or feel) like their owners.

One of the best things about the indie dev community is that you get to meet a lot of people from all over the place, people who wouldn’t otherwise necessarily meet each other. You can be an indie dev living anywhere (although it’s harder in some places than others), and more than ever before, indie devs come from different backgrounds. I find people pretty interesting (ok, I find 0.1% of people interesting), and I do find myself doing a bit of pop-sciency amateurish analysis of people. When you do this with creative people it gets rather fun.

I find it very amusing that when you meet Tim (Wicksteed, designer of Big Pharma) he basically *is* Big Pharma. It’s truly his creation. He is always happy and upbeat and smiling and positive, just like Big Pharma. From my POV, Tim is young and modern, just like Big Pharma feels. The music is the clearest indication of this. The music is like Tim. The game is bright and cheery. Tim made a game about curing people.


When you play Gratuitous Space Battles 2, that’s me. If you haven’t met me, just watch a few minutes of the battles and listen to the music. That’s me. That’s what my brain and my soul are like. It’s lots of explosions and power and drama and a voice shouting ARGHHHHHHHH at a billion decibels whilst smashing things. The game is dark, over-complex, ambitious and pushy. Cliff made a game about destroying everything.


It’s kinda funny how games made by just one person really are a window into their soul and their personality. Mike Bithell basically is Thomas Was Alone. If you introduced me to a random line-up of 100 game developers I’d never met, and told me one of them made a game about repressed rectangles with feelings, I’d know it was Mike. And after playing TWA, I know Mike probably reads the Guardian and is a vegetarian. Mike made a game about friendship.


And this is awesome. It means that entertainment really is connecting us to people. That gives us empathy, and opens our minds in a way that ‘committee design’ never can. I have no idea who the designer of Battlefield 4 is. I imagine it’s probably Tom Clancy, or more likely a committee of people with a Tom Clancy design manual. Who knows.

In any case, I think it is really good when games have personality. The absence of it definitely feels bad, like a Farmville clone (or for that matter the original) designed mostly by people in the PR and accountancy departments. The best artistic design is done by slightly crazy people who are drunk/drugged/suffering in some way, which drives them to think of things other people would not.

Don’t do your game design by committee, or with a spreadsheet. Let it just flow from your soul. If that means your games are downbeat, or dark, or whatever, then that’s fine too. Better to be dark than to be fake.

Coding the GSB2 Radiation effect (DirectX9 C++)

I was never 100% happy with the radiation effect in Gratuitous Space Battles 2, so I’ve coded a better version for the next patch. Here it is in very short silent video form:

I thought maybe some people might be interested in the theory of how it is done. Now I’m sure if this was 3D in Unity there would already be a 99c plug-in that does it without you even having to understand coding, but that’s not how it’s done here, so here we go…

The first thing to remember is that the spaceship is not a 3D mesh at all. It’s a series of sprites rendered to different render targets and then composited. That means that I’m not wrapping a texture around a mesh, but drawing one sprite on top of another, and cropping the final output to the alpha channel of the base (ship outline) sprite.

Before I learned much about shaders, I had a system that did this using mostly the fixed-function pipeline. I still use that system here, but build on it with better shaders to do more stuff, making it a hybrid approach and fairly complex :D. To simply draw one sprite on top of another, but use the alpha channel from the bottom sprite, I use a custom vertex format that has two sets of texture coordinates. One set is for the splatted ‘radiation’ image, and the other is for the alpha channel of the ship outline.
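A plausible sketch of such a dual-UV vertex (the struct name and member layout are my guesses, not the actual GSB2 code). In D3D9 FVF terms this would be roughly D3DFVF_XYZRHW | D3DFVF_DIFFUSE | D3DFVF_TEX2:

```cpp
// Illustrative dual-UV vertex for pre-transformed sprite quads.
struct SplatVertex
{
    float x, y, z, rhw;   // pre-transformed screen position
    unsigned int color;   // diffuse colour (D3DCOLOR is a 32-bit DWORD)
    float tu1, tv1;       // UV set 1: into the radiation splat texture
    float tu2, tv2;       // UV set 2: into the ship outline's alpha channel
};
```

All members are 4 bytes, so the struct packs to 36 bytes with no padding, which is what you'd declare as the stride in DrawPrimitiveUP or SetStreamSource.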

It gets a bit fiddly because where I’m drawing the splat sprite could be anywhere on the ship, so I need to work out the offset and dimensions of the splat in relation to the ship image, and store that in the second set of texture UVs, and pass that into a shader.
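The offset calculation might look something like this (a minimal sketch with my own names, assuming both rectangles are in the same screen space):

```cpp
// Map a splat quad's screen-space rectangle into the ship sprite's 0..1
// texture space, so the shader can sample the ship's alpha under the splat.
struct Rect { float left, top, right, bottom; };

inline void SplatToShipUVs(const Rect& ship, const Rect& splat,
                           float& u0, float& v0, float& u1, float& v1)
{
    const float w = ship.right - ship.left;
    const float h = ship.bottom - ship.top;
    u0 = (splat.left   - ship.left) / w;   // top-left corner of the splat...
    v0 = (splat.top    - ship.top)  / h;   // ...in ship-texture UV space
    u1 = (splat.right  - ship.left) / w;   // bottom-right corner
    v1 = (splat.bottom - ship.top)  / h;
}
```

These four values go into the second UV set of the quad's vertices, one corner per vertex.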

The shader then simply reads the color data and alpha data from the radiation splat, and multiplies the alpha by the alpha at that point on the base ship texture, and voila, the image is cropped to the ship.
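Restated on the CPU side for clarity (the real work is a pixel shader in HLSL; these names are illustrative), the crop is just an alpha multiply:

```cpp
// Per-pixel combine: sample the splat with UV set 1, sample the ship's
// alpha with UV set 2, and multiply the two alphas together.
struct Color { float r, g, b, a; };

inline Color CropToShip(const Color& splatTexel, float shipAlpha)
{
    Color out = splatTexel;
    out.a = splatTexel.a * shipAlpha; // ship alpha is 0 off the hull, clipping the splat
    return out;
}
```

Anywhere the ship outline's alpha is zero, the splat's alpha becomes zero too, so the radiation never spills past the hull.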

But that’s a very simplistic explanation, because there is more going on. To make things fast, all of my radiation gets drawn together (to minimize texture / render target / state change hassle), so it’s done towards the end of processing. In other words, there is no painter’s algorithm going on, and I need to check my depth buffer against this ship’s Z value to see if the whole splat is obscured by some nearby asteroid or debris. That requires me passing the screen dimensions into the shader, and also setting the depth buffer as a texture that the shader can read.
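The depth check amounts to something like this (a sketch under my own naming, with the depth buffer modelled as a plain float array rather than a bound D3D9 texture):

```cpp
// Convert a pixel's screen position into depth-texture UVs, fetch the
// scene depth there, and reject the splat where nearer geometry has
// already been drawn.
inline bool SplatVisible(float pixelX, float pixelY,
                         float screenW, float screenH,
                         float shipZ,
                         const float* depthTex, int depthW, int depthH)
{
    const float u = pixelX / screenW;              // screen -> 0..1 UV
    const float v = pixelY / screenH;
    const int tx = (int)(u * (float)(depthW - 1)); // UV -> texel (point sample)
    const int ty = (int)(v * (float)(depthH - 1));
    const float sceneDepth = depthTex[ty * depthW + tx];
    return shipZ <= sceneDepth; // visible unless occluded by nearer geometry
}
```

In the shader this is the same idea: divide the pixel position by the screen dimensions (passed in as constants), sample the depth texture, and compare against the ship's Z.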

So that gets us a nicely Z-cropped image that wraps around the ship at any position. It also fades in and out so we need to pass in a diffuse color for the splat sprite and calculate that too. We also have to do all this twice, once for the splat, and once for the slightly brighter and more intense lightmap version, allowing the radiation effect to slightly light up areas of the ship hull during final composition.

At this point the shader is a bit complex…and there is more…

I wanted the radiation to ‘spread’, and to do that, I need to splat the whole thing at a fixed size, but gently reveal it expanding outwards. To do this I create yet another texture (the spread mask) which also gets passed to the shader. There were various ways to achieve this next bit, but what I did was to ‘lie’ to the shader about the current UV positions when sampling this spread mask. Basically I calculate and pass in a ‘progress’ value for the spread, and I use that, inverted, to deflate the UVs of the source image (which is CLAMPed). So effectively my first UVs are -4,-4,4,4 and the spread splat is a small circle in the center of the splat, expanding outwards to full size.
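One way to get that behaviour (a sketch under my own naming, scaling the UVs around the quad centre rather than reproducing the exact -4..4 figures above):

```cpp
// Inflate the spread-mask UVs around the centre by 1/progress. With CLAMP
// addressing, the 0..1 mask then occupies only a small central region of
// the quad at low progress, and fills it completely at progress = 1.
inline void SpreadMaskUVs(float progress,           // in (0, 1]
                          float& u0, float& v0,
                          float& u1, float& v1)
{
    const float scale    = 1.0f / progress;  // inverted progress
    const float halfSpan = 0.5f * scale;     // half the total UV extent
    u0 = 0.5f - halfSpan;  v0 = 0.5f - halfSpan;
    u1 = 0.5f + halfSpan;  v1 = 0.5f + halfSpan;
}
```

At progress 1 this yields plain 0..1 UVs; at small progress values the UVs span far outside 0..1, so the clamped mask shrinks to a small central circle that grows outwards as progress rises.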

Because I’m only doing this when I sample the alpha channel from the spread mask, the color and alpha data from the base ‘splat’ texture remain where they are, so it looks like a static image being revealed by an expanding circular (but gaussian-blurred) mask.

I’m pretty pleased with it. The tough bit I’ll leave for you to work out is how that works when the source radiation image is actually a texture atlas of different radiation splats :D

Fun fun fun.