History

The making of MoonStrike… thus far

 

For those who want a peek into the life of indie game development on the edge of a new technology, here’s one such tale. It’s a bit lengthy perhaps, but if you enjoy the subject matter (actual game development, sans drama) we think you’ll find it an interesting ride. We hope you enjoy it!

 

A Shortish Setup

 

MoonStrike began in early 2015 as a bit of a fluke. Permit me to hit the not-so-way-back button…

For about a year and a half I had been on a quest to become self-sufficient as a developer. Over the previous two decades I had done nearly everything you could do professionally in making games. Art, direction, animation, texturing, UI, sound design, writing, visual scripting, cinematics… everything but actual text-based programming. Without that skill, I felt I would always be in a position of pitching ideas and hoping that someone out there would be able to implement those concepts.  (Cut to someone standing on a sidewalk imploring “won’t someone make my game for me?!”)

After a few months of online tutorials, I started doing some game jams and feeling pretty comfortable with creating small projects. This really helped with the confidence, and I can’t recommend doing game jams enough.

During the same time period, Virtual Reality headsets started becoming “a thing”. Oculus, of course, opened the floodgates and released the DK1. A close (and very generous) friend gave/loaned me a DK1 and I was hooked. Even before trying it, the appeal of VR was evident to me.

Just before this venture of self-improvement I had been working on mobile games with Bitmonster Games (which I cofounded). I had imagined that if I pulled off some solo games they would be either small indie Steam games or mobile titles, where a smaller-scope project would be expected. But once I tried on the DK1 in my kitchen and fired up the infamous Tuscan village scene… there was no way I could not be making something for this tech.  It was eye-opening in every way.

I spent a lot of time last year (2014) making what I called “Danger Rooms”. The thing with VR is that so much of the experience is the setting around you. I would build a rough environment in 3D and throw a VR camera setup in it to get an idea of what it “felt” like to be in the scene. When you hop into one of these little model setups, you can’t help but imagine what a game in the environment could feel like. There were little cockpit scenes in robots, top down buildings, interiors of bars, go-karts with follow cams, medieval steam armor in dungeons, the inside of a helmet, mountain climbing rigs, car interiors… just LOADS of little “moments” from games that could exist.

I was fortunate to visit Oculus in Irvine at one point (several founders were fans of Gears of War, and we had used the UI software Scaleform that they created before becoming Oculus), and Brendan Iribe gave me what was for the longest time referred to as “The Valve Room” demo (it was created by the awesome devs at Valve, as I understand it). This was basically the bleeding edge of Virtual Reality at the time. It was a small office with a very advanced prototype for a headset, and you were positionally tracked in the room. You didn’t just look around, you could move around in the room. That was huge… beyond huge. It was huge enough that Valve bet the future of VR on “Room Scale”.

One moment in that demo stuck with me (and I believe the same is true for several other developers I spoke to). There was a moment where a tiny scaled down office building interior was in front of you, teeming with tiny miniature life. Since you could move your head around the scene, you would naturally crane your head forward and orbit the office, as if it were this real life diorama in a museum; it just encouraged so much motion, and it felt so REAL in the room with you.  It was a really unexpected joy.

Suddenly VR didn’t HAVE to be about being first person, standing in some crazy world and looking all around you in awe. It turns out that it’s maybe even more impressive to have a very small scale setting in front of you, a self-contained little “playset” you’ve always imagined as a child, full of living, breathing little actors. (Not only that, it solved a load of issues with VR that we’re still wrestling with, like movement discomfort.)

 

Back In The Lab – It’s Alive!

When I returned, I started a whole new set of “Danger Rooms” based on these ideas about miniaturization. My Oculus DK2 (Developer Kit 2) arrived, and it had decent positional tracking as well. You could strafe your head around and get some of that awesome moment from the demos I loved so much. The DK2 was basically meant for what I call “Desk Scale”, a user sitting at their PC, but I was longing for that Valve Room so much that I would move the little webcam tracker for the DK2 to a shelf in my office and stand in front of it. I had a few cubic feet of freedom to explore; not much, but I hoped it was enough to get that scale experience.

Just to test that it was all working, I scattered some smallish 2” cubes around the small level, just floating in front of my face. The first time I tossed on the headset, it worked well enough that I jerked my head back… there was a cube right in front of my face, and my brain yelled “back up!” pretty clearly. Seconds later I’m crouching, looking up at the underside of it, kneeling and leaping to see the tops of these little cubes. I stuck a little 3D model of a Subaru Impreza into the scene, matchbox size on top of a cube… that totally worked. Little robots, etc… I was like a mad scientist in a playground of chemicals now.

I don’t know why, but I recall thinking, “Bubbles! I need bubbles! Bubbles everywhere!”

I wrote a little snip of code to randomly create scattered objects around where I was standing in the VR space. Moments later I tossed on the headset, and the room around me was filled with little spheres floating in the room, each sized randomly about 2 or 3 inches across. Again with the moving the head around, ducking under them, craning the neck around… just magic.

Even though these little objects were completely featureless spheres, there was this instinct to move my face close to them and look at the surface as though details would pop out. Looking around it became clear I wasn’t seeing bubbles… I was seeing planets.

I adjusted the script to spread them out a little and spawn fewer of them, and immediately it felt right. Oh yeah, these were definitely planets. The immediate next thought was, “I need little space fleets in all these gaps in space. Little ships, big ships, formations going back and forth between orbiting planets… little big battles. This whole setup demands tons of tiny life.”
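
(For fellow devs who are curious: the original script is long gone, but the logic amounts to a rejection-sampled scatter. Here’s a minimal Python sketch of that idea; the function name and every number in it are my own illustrative guesses, not the actual game code.)

```python
import random

def scatter_planets(count=12, min_radius=0.5, max_radius=2.5,
                    min_spacing=0.6, center=(0.0, 1.5, 0.0)):
    """Scatter planet positions in a shell around the player's head.

    Rejection-sample random points between min_radius and max_radius
    meters from `center`, discarding any point that lands closer than
    min_spacing to an already-placed planet; spreading them out and
    spawning fewer is what made the bubbles read as planets.
    """
    planets = []
    attempts = 0
    while len(planets) < count and attempts < count * 100:
        attempts += 1
        # Pick a random direction by sampling inside the unit sphere...
        x, y, z = [random.uniform(-1.0, 1.0) for _ in range(3)]
        length = (x * x + y * y + z * z) ** 0.5
        if length < 1e-6 or length > 1.0:
            continue
        # ...then push the point out to a random distance in the shell.
        dist = random.uniform(min_radius, max_radius)
        point = tuple(c + v / length * dist
                      for c, v in zip(center, (x, y, z)))
        far_enough = all(
            sum((a - b) ** 2 for a, b in zip(point, other)) >= min_spacing ** 2
            for other in planets)
        if far_enough:
            planets.append(point)
    return planets

if __name__ == "__main__":
    for p in scatter_planets():
        print("planet at (%.2f, %.2f, %.2f)" % p)
```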

I spent a day or so making a basic little ship that would thrust towards a target and go away when it reached the target. I started spawning those little ships randomly, and suddenly I was standing in the middle of this tiny intergalactic web of transit and conflict.
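
(That “thrust and vanish” behavior really is about as simple as it sounds. A hedged Python sketch of the per-frame update, with illustrative names and numbers rather than the shipping code:)

```python
import math

class Ship:
    """Seek-and-despawn: thrust toward a target point every frame,
    and disappear once close enough. That's roughly the entire
    behavior of those first little fleet ships."""

    def __init__(self, position, target, accel=2.0, arrive_dist=0.05):
        self.position = list(position)   # meters, room space
        self.velocity = [0.0, 0.0, 0.0]  # meters/second
        self.target = list(target)
        self.accel = accel               # thrust, m/s^2
        self.arrive_dist = arrive_dist
        self.alive = True

    def update(self, dt):
        if not self.alive:
            return
        offset = [t - p for t, p in zip(self.target, self.position)]
        dist = math.sqrt(sum(c * c for c in offset))
        if dist < self.arrive_dist:
            self.alive = False           # "go away" on arrival
            return
        # Accelerate along the direction to the target, then integrate.
        for i in range(3):
            self.velocity[i] += offset[i] / dist * self.accel * dt
            self.position[i] += self.velocity[i] * dt

# e.g. step one ship at 90 Hz until it reaches its planet:
ship = Ship(position=(0.0, 1.0, 0.0), target=(1.0, 1.5, -0.5))
while ship.alive:
    ship.update(1.0 / 90.0)
print("arrived near", ship.target)
```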

 

The Devil In The Details

Most devs would recognize this moment as the “glow”, where you’re just exploding with possibilities. Decisions made at this point dramatically steer the project in different directions. Am I making a space mining game, a space pirate game, a solar system engineer sim, a music visualizer? There are a lot of ways to take something like this.

The “glow” quickly turns into the “Oh God! How do I do any of this?!” stage.

To say there are things undiscovered about making VR games is… perhaps the greatest understatement you could make since “kids seem to dig Minecraft”. Allow me to highlight just one such issue. One VR unknown was that there really weren’t VR controllers to speak of at the moment; nobody knew how we’d be interacting with any of these little scenes.

I started working on a way to select the planets, highlight them, send fleets around, etc… basic interactions. Simplicity was the goal, so for a good while the game involved a subtle crosshair and the player looking at planets to select them. Turns out, making things at small scale and putting a crosshair/cursor in your view is just catastrophically challenging. It sounds so simple: put a dot on the screen, and if you line that dot up with something, you can select it. No. Oh man, no. I could literally write a 20 page document on why this was so complex to pull off.  Stereo vision with multiple offset camera setups is a harsh mistress.

Things like “hey, uh, what’s a cursor?” aren’t things you typically have to figure out when creating a game (but it’s also a reason why VR’s frontier is so interesting). In short, I did a lot of work polishing that interaction, adding poppy little sound effects, and generally making it feel really good. If you could look around, and you had a button to press… this would work. I had a basic control scheme and a general goal of “different teams trying to take over the galaxy with little fleets”.

 

Risk of Toppling

Typically this is where a game spirals out of control, turns into a mishmash of features, and tears itself apart like an old video of a suspension bridge in a storm. Our term back at Epic was “toppling a feature”. Someone has the core of a great idea, and people just pile more and more on top of that feature until the whole thing collapses, everyone throws their hands up, and they shout “Never mind! Trash the whole thing!”

I started working out a system of planets being either rocky or moonlike. You set up routes where you deploy miners on the rocky planets, they harvest materials, and transports ship those resources back to your home planets, where you build fleets and populations grow. You could set up a trade route with another neutral planet, amplifying both of your resources.

When you attacked someone, you set up waypoints that acted like “trade routes”; you were just exporting combat… sort of. You would sit back and stare at this web of ‘lines of power’, constantly adjusting the flow of ships, upgrading your planets, etc. It was pretty cool…

Right up until I started showing it to some people.

The main issue was that when I would give people the headset, they were still very much in the “COOL! LITTLE PLANETS!!!” stage. This whole miniature world thing is still really new, and they wanted to take that in. Explaining to them the interactions of resource gathering, trade routes, war routes, connecting paths… it was all far more than it needed to be to create the golden little moments I was trying to create. I couldn’t easily answer the “How do I make these little ships?” question without explaining that you set up a kind of persistent state of war between two planets, and it automatically happened from there, as long as you had resources, etc.

They wanted to see cool moments, spawn these little fleets, and ‘duke it out, in space’. My gameplay was getting in the way of that. This was a classic case of designing a game like it was mine, instead of the player’s, and that’s nearly always a problem (even in a case where I was making a very personal project.)

There were other issues, like how long a match was going to take to play out. In VR, people often don’t feel like having a headset on for more than 10-30 minutes; they’re not looking for an 8-hour game of Civilization (yet).

When all was done, I stripped out the concept of resources, tossed out war declarations, dropped trade routes, and made a much more visceral method of triggering fleets to spawn manually. What remained turned out to be about 50 times more fun, and a match takes closer to 4 minutes than 40.

 

Uh Oh, Familiar Territory

With the gameplay elements stripped down to “you have planets, you spawn fleets, you spread”, I had stumbled into another dilemma.

You see, there’s this extremely talented solo developer named Phil Hassey… and Phil made a game called Galcon. Galcon is, without doubt, one of the finest series of games to ever hit the mobile games market. Galcon is like the ultimate in simple RTS tactics games. It did so well that it spawned a load of clones, some of which are really well done. In fact it spawned so many clones that Phil gave a talk a couple years ago at GDC (Game Developers Conference) where he talked about this specifically. He was incredibly gracious about the whole thing, and I got to know him a bit during that convention and really respect the guy.

The more features I trimmed, and the more friendly and cartoony the game became, the more it unavoidably gathered Galcon comparisons. MoonStrike was becoming close enough to the Galcon games that I felt I needed to keep at the design phase to find ways to differentiate the game. Shipping something so close to an existing game just didn’t sit well with me, at all.

I looked at the things that made MoonStrike stand apart, and came away with the actual ship-to-ship fleet combat, the unique abilities of the different races I had planned, and the sheer presentation of the game in VR. I needed to double down on those attributes, while keeping the game as simple and accessible as possible.

It started to feel like the game might really turn into something, and above all, I was really enjoying playing it over and over (even knowing how the AI enemies worked since I wrote them).

 

A New Player Has Entered The Arena!

It was about this time that I had the game far enough along, even with placeholder art, that I could get the general idea of the game across. I made a build of the game to run on the DK2, and made it available to some friends on a forum.

The reaction from people I trusted was pretty compelling. There simply weren’t that many people making real game-games for VR yet; everything seemed to be interactive ‘experiences’ for people to look around and ooh-and-ahh in their new headsets. It was kind of refreshing to have something that really felt like a playable game on the device.

One of those people was Tiffany Smith. Tiffany was a veteran game programmer, and had shipped boatloads of respected games across many platforms, including consoles (which involves some truly hellish storms of technical issues for those who have never tried it). Tiffany wrote me about the game and had some great words of encouragement. Even more so, she was interested in working on a VR game, and just hadn’t found a project yet that really caught her eye.

I had some initial hesitation, basically thinking, “I’m making this game solo, proving to myself I can do it, and I’m not sure if I really want someone else involved”. But the more I thought it over, the more it made sense to have a partner involved. The early build had created some interest with various contacts at the major VR platforms, and it really set in that I might actually need to ship this thing! Hell, when the time came I wasn’t sure I’d be able to even set up a console dev kit, much less handle the loads of technical requirements these companies impose before you can ship a game on their devices. (You might not know it if you’re not a developer, but there are truly reams… binders… mountains of technical papers laying out things your game MUST do if you’re going to be on so-and-so’s device. They’re called TCRs, and they’re… nontrivial.)

But mostly it came down to Tiffany having a load of enthusiasm about the project. It was awesome to be able to hold up this playable game and say “this is what I’m making” and have that clarity of purpose, instead of starting something from scratch in a little design-by-committee situation. She was gung-ho about what the game already was, and that was really damn cool.

We formed Big Dorks Entertainment together and she has been integral to MoonStrike since.

One major benefit that she brought with her was a strong interest in network programming. Out of the gate she said we needed multiplayer, and that was the top of her wish list for the title. The ability to play against your friends online was a huge opportunity. Very few people were focusing on multiplayer games for VR, and I can guarantee you as a self-taught programmer I was in no position to tackle that complex task myself. She was absolutely right, MoonStrike was a game that practically begged to be a multiplayer experience. (I have always had this thing for 3-way multiplayer game dynamics, and that’s where we’ll focus our efforts for the game’s MP.)

Suddenly all those technical tasks became much more manageable… I had someone experienced to pelt endlessly with stupid coding questions, and the development power of the project doubled.

My personal quest to make a game completely solo would have to wait; we owed it to MoonStrike to make the most of a rare game at a great time in VR’s infancy, and great business partners don’t just fall out of the clouds every day.  I couldn’t be happier to have a partner on the project.

 

Get Control

If you recall, I talked earlier about how VR games didn’t really have a set “control scheme”; nobody really knew how users would be interacting with their environments. Valve rolled out their VR setup (the ‘Vive’), and it included two positionally tracked controllers. They were like wands that the player could see and move around in the world they were in, and they opened up the options for interactivity massively in VR worlds. Similarly, Oculus introduced the Touch controllers, which put something eerily similar to the player’s own hands in the world as well. Meanwhile, Sony’s PlayStation VR setup included the floating “Move” controllers, also something of a floating wand/hand in the game world. It was pretty clear as of E3 2015 that the “how do we interact with VR” question was being answered, and it involved tracking where the player’s hands are.

The endlessly supportive folks at Valve got us a Vive dev kit and it became immediately apparent that MoonStrike would simply never reach its potential without supporting the controllers. Waving around the wands, seeing them in your hands in the world, it was just too important to not support. Anything else was a complete waste of time.  With a little sniffle and a tear I disabled my beloved look-based controls, complete with all their neat little feedback features, and went back to the drawing board on how MoonStrike would be controlled.

At first I tried something like a magnifying-glass-shaped “scope” that you held up and looked through to aim and highlight planets. It was an inherently 2D interaction with a 3D world you were freely occupying, though, and it would never do. I created a sort of lightsaber (no developer could resist) and made the user wave it around like a teacher’s wand and touch planets to select them. It was actually pretty hard to select things with this method; it required Jedi-esque precision to wave at planets in the middle of a battle, and it immediately felt wrong (very cool! but wrong). I tried a very thick sort of “lightsaber club” to make it easier to ‘hit’ planets, but it still felt really wrong for the game.

I went back to the ‘magnifying glass’ and found myself waving it around at the ships as if I were holding a butterfly net and trying to capture them, and it really just clicked. I scaled the hoop up a bit to encompass the largest-sized planets, made anything inside the hoop become selected on contact, and it felt natural and intuitive in seconds. I added some cool little touches, like sound effects and making the controllers vibrate for a brief moment as a planet was contacted. Now I could wave this hoop around and easily select or avoid particular planets, and most importantly it was just truly fun!
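
(For fellow devs wondering what “inside the hoop” means as a test: one obvious way is to treat the hoop as a flat disc and select any planet whose sphere overlaps it. I won’t claim this is exactly the check MoonStrike runs; it’s a hypothetical Python sketch of the geometry, and the haptic buzz would fire on the first frame it returns True.)

```python
import math

def hoop_selects_planet(hoop_center, hoop_normal, hoop_radius,
                        planet_center, planet_radius):
    """Return True if a planet sphere overlaps the disc spanned by
    the hoop. Assumes hoop_normal is unit length.

    Split the planet-to-hoop offset into a component along the hoop's
    normal (how far the planet sits off the hoop's plane) and a
    component in the plane (how far off-axis it is), then compare
    against the radii.
    """
    offset = [p - h for p, h in zip(planet_center, hoop_center)]
    # Signed distance from the planet center to the hoop's plane.
    along = sum(o * n for o, n in zip(offset, hoop_normal))
    if abs(along) > planet_radius:
        return False  # sphere doesn't even touch the hoop's plane
    # Radial distance from the hoop's axis, within the plane.
    in_plane = [o - along * n for o, n in zip(offset, hoop_normal)]
    radial = math.sqrt(sum(c * c for c in in_plane))
    # The circle the plane slices out of the sphere has this radius:
    slice_radius = math.sqrt(max(planet_radius ** 2 - along ** 2, 0.0))
    # Overlap if the nearest point of that slice falls inside the hoop.
    return radial - slice_radius <= hoop_radius

# Example: hoop at the origin facing +Z, a 0.1 m planet just off-axis.
print(hoop_selects_planet((0, 0, 0), (0, 0, 1), 0.3,
                          (0.25, 0, 0.05), 0.1))  # True
```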

Watching other people use the hoop as intended without ever being told what it was for, that was just the clincher.

 

Reveal

As I write this I’m at Oculus Connect 2015, and Tiff just arrived at the show as well. Oculus dug our latest work, and the game will be playable, even in its early state, by anyone at the convention; it’s one of a few demos on hand to demonstrate their new Touch controllers and latest VR kits. There are 32 demo booths running through the whole show, and I can’t say enough how excited Tiff and I are to be on those kits.

The last week and a half has been a whirlwind of intense activity. We learned about our chance to show here with only a couple of days to take my old version of the game and polish it to a state where random people and press could go hands-on with the build. Making a web site, poster art, game and company logos, screen-printing our T-shirts in my basement, cutting together a trailer of the game in literally 6 hours from existing VR-captured b-roll footage along the way… it has been… intense.

The Oculus guys, Pruett, Jurney, Ross, Nate, Brendan, Lee Cooper, et al., have been incredibly supportive and helpful to Tiff and me in getting their new controller integrated with the game and making the demo happen. Also massive thanks to Aaron and Augusta at Valve for their support with the Vive.  And of course thanks to my lovely bride Gab, and Tiff’s husband Rob, for all the family support through the brief crunch.

For now I feel what most artists usually feel about their work. I love the game, but we artists are perpetually in a state of not being happy with our work. We see only errors. I want to disclaim the hell out of everything and scream to the wind that basically every piece of art in the game is placeholder except like two little ships, and tell people not to look at so-and-so, it’ll look way better in a couple months, so much more cool stuff is coming, etc…

Right now I’ll settle with a deep sense of satisfaction that we’re working on a game that doesn’t need to exist. We’re not making this game because we “had to work on something, so it might as well be this!” We don’t have a publisher and debt to pay back, we’re not ‘on the hook’ to a soul out there. MoonStrike exists purely because it’s really fun already, really promising in how it’s developing, and we wouldn’t have bothered entering production with it unless we knew from experience that it’s already something we love.

So! I hope you enjoyed that peek (ok, more like an intense glare) into how the game has come into being so far. Both Tiff and I will do our best to keep the blog updated with cool little nuggets as they come online.

 

 

-Lee
In the meantime, go get Galcon 2 on mobile! Seriously, it’s one of the best mobile games out there… Phil deserves your couple of bucks!

 


