Wednesday, May 23, 2012

World Building Through Game Context

I was excited to see this piece by Patricia Hernandez over at Kotaku because she really understood what we were trying to do with the world building in Super Monday Night Combat.  Her article outlines it well, but the basic idea is to describe the world without sticking it right in your face as an intro video.  You learn about the dystopia by inferring what's going on outside the game through the context of (mostly) the announcers.  Some novels take a similar approach, revealing a world through actions and dialog without long descriptions of background.  I'm glad people are appreciating this aspect of the game because I love the world of SMNC.  We do owe a bit of a debt to Idiocracy, one of the greatest movies ever made, IMHO.

For the record, the fantastic Ed Kuehnel does almost all of the writing in MNC and SuperMNC.  If you need some writing done, he's your guy.

Sunday, May 13, 2012

RTS UI Philosophy

When I get to talking about RTS games with either fans or other people in the industry, they often ask me what my favorite RTS is.  Now the first one that I put any real time into was the original C&C, which was just a genius of a game.  But C&C suffered from the same issue as Warcraft and its successors.  That issue is simply that the interface is designed to be part of the game experience.  Micro play and control of individual units are commonplace in these games, and the emphasis is more on tactics than huge armies.  Limitations on the number of units you can select create a command and control scaling problem.  I've heard designers of these games talk about these issues, and most of them consider the UI to be part of the game itself.  They want to limit the player's strategic options by making the UI more difficult to use in some cases.

To me this is the fundamental difference between games like TA and SupCom and most other RTS games.  The philosophy is to make the interface as powerful as possible so that the player can concentrate on strategy, not on having to quickly hit buttons.  Micro is discouraged by giving the player the ability to automate control as much as possible.  Huge battles are only possible if you can actually control huge numbers of units.  Things like factory queue automation, automatic orders when units are built, automatically ferrying carriers, sharing queues between factories and all kinds of other things made it into SupCom.  Shift-click in TA was the original simple version of all this stuff and by itself was pretty powerful.
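The shift-click mechanic is simple to state: a plain click replaces a unit's order queue, while a shift-click appends to it, so a player can chain up a whole patrol-and-attack plan in a few seconds and walk away.  A minimal sketch of the idea (class and method names are my own invention for illustration, not actual TA or SupCom code):

```python
# Sketch of TA-style order queuing: plain click replaces the queue,
# shift-click appends to it.  Illustrative only; names are made up.

class Unit:
    def __init__(self, name):
        self.name = name
        self.orders = []  # pending orders, executed front to back

    def issue(self, order, shift_held=False):
        if shift_held:
            self.orders.append(order)   # shift-click: queue it up
        else:
            self.orders = [order]       # plain click: replace everything

tank = Unit("tank")
tank.issue(("move", (10, 20)))
tank.issue(("attack", "enemy_turret"), shift_held=True)
tank.issue(("move", (50, 60)), shift_held=True)
print(tank.orders)  # three orders, executed in sequence
```

The automation features listed above are all elaborations of this same idea: instead of the player clicking out every order, the game fills the queue for them.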

Now that's not the only difference, but I think this is one of the major things that differentiates these games from one another.  I want huge fucking battles!  Let me build 10,000 units and throw them against the enemy's buzzsaw!

Anyway I've been thinking a lot about RTS stuff lately now that Super Monday Night Combat is out.  I have some very specific ideas about what the next generation of RTS looks like that I would like to put into motion.  More on that later...



Thursday, May 10, 2012

Analog was cool

I was thinking a little bit about how old photographs have really decent image quality.  In fact, other than the color issue, they can pretty much have as much detail as typical modern photos.  Why?  Because analog technology works really well for capturing sound and images.  It's really a great hack that "just works" for a lot of things.  Think about photos.  It's taken us until basically the last decade to get digital cameras that are good enough.  Do you have any idea how much processing is going on inside a typical digital camera?  To match analog pictures you basically need megapixels worth of data.  We had to invent good compression algorithms, high-resolution CCD sensors and decent microprocessors to match a few mixed-up chemicals soaked into some paper (or on glass).

Another thing to think about is sound.  Edison basically started with a piece of tinfoil and a needle.  The technology of analog sound is fairly trivial.  Extending it to electric analog is also fairly trivial.  Saving that sound to something like tape is a bit more work but in the scheme of things is also trivial.  These are easy-to-bootstrap technologies that, from a cost effectiveness standpoint, blow away digital.  Analog tech has huge bang for the buck.

Of course once you get your digital stuff bootstrapped to where we have it now, we start to see enormous advantages over analog.  Perfect copies sent instantly anywhere in the world are a good start.  We can store enormous amounts of information in a small space now.  Imagine lugging around your record collection!  I knew people who had walls of tapes so they could have a large music collection.  Those are the people who now have terabytes of MP3s.  Another advantage is that the cost of an individual piece of data is reduced to almost free.  Pictures used chemicals, paper and other real resources.  With digital you can take as many shots as you want and preview them right away.  Being able to transfer images to a computer is nice as well.  Backing up your photos online can save your family pictures from a disaster.  You can share your pictures with more people.  The list of advantages to digital is basically never ending.

So just think about and remember how far we've come with digital technology.  We've literally gone from Kilobytes to Megabytes to Gigabytes to Terabytes and almost beyond in the last 30 years.  A modern chip is basically the complexity of a city shrunk down to the size of your fingernail.  My first PC had 4K of RAM.  Nowadays a typical PC has gigabytes of RAM, multiple processors and a secondary GPU that may be pushing teraflops of processing power.

I had the entire source code and all of the assets of the first game I worked on emailed to me a while back.  It was 19MB.  More on Radix later...

Saturday, May 5, 2012

Computing as a commodity

The market for personal computing has been evolving rapidly in the last decade.  The modern era really began in the 70's with computers like the Apple ][ or the Commodore PET.  Well, the Altair as well, but that was true hobbyist-level stuff.  With the Apple, pretty much anyone who was a little geeky could do neat things.

My first computer was a TI-99/4A which was kind of a weird hybrid game console / PC.  It had a lot of expansion possibilities but mine was bare bones.  You could buy game cartridges for it but I was poor and only had a few, fairly lame, games.  Some kind of math asteroid blaster thing, a MASH game and their Pac-Man clone called Munch Man.

Anyway, not to digress too much, but I bring this up for a reason.  If you look at the computers of that era, including the PC when it arrived, you pretty much had to type things at a prompt to do anything.  In the case of the TI it booted up to a BASIC prompt.  If you weren't interested in programming, these machines would probably be kind of boring.  But if you could program the machine, you could do utterly astounding things!  It opened up an entire new world to me at the time.  But realistically only a small portion of the population is going to want to deal with that kind of thing.  It wasn't accessible in the way that something like an iPad is.

The reason I bring this up is to point out that nowadays grandma and grandpa are using computers.  They have very specific needs, like browsing the web and of course playing games.  Now when you think about the form factor of a computer, there are different tradeoffs in an iPad than in a regular PC.  For example, there is no hard keyboard on the iPad.  A hunt-and-peck typist may actually type faster on the iPad than on a keyboard.  For most of the population a real keyboard may simply be a waste of time, especially if they don't have a reason to type much.  Dictation, for example, has gotten a lot better and will improve more in the future, making typing redundant for Twitter updates and email.

Now consider the professional programmer like myself.  I often use command line tools and write small applications in Python for processing data.  I can type 100 WPM on a "real" keyboard.  I can't imagine trying to get serious programming work done on an iPad.  I would type a lot more slowly, and I also need a lot of screen real estate when coding and debugging.  The use case for coding is just insanely different from what 99% of people need to do on their computer.  Even for me, an iPad would work for almost everything else.

So what are the implications here?  To me it seems simple.  We are going to see a split between consumers of content and creators in terms of what kinds of computers we use.  Most content consumption will happen on something like an iPad.  Lots of content creation will happen there too, because you can create neat stuff on them.  But for hardcore application development you will see niche operating system environments that most people don't use.  It's entirely possible that the open desktop version of Windows eventually withers and programmers mostly move over to Linux.  Hell, at places like Google you have to look hard to find a Windows box.

I think there are some other interesting implications as well.  For a long time I thought that the generation after mine was going to have so many brilliant programmers that all of us vets would be outdated and out of a job.  Well, we surely do have some brilliant young talent around, but not in anywhere near the numbers I would expect.  I don't think it's a lack of actual talent; I think it's that the computing environment we have now isn't as conducive to easing into programming as it was back in the day.  Kids learn how to use Windows and browse the web, but finding a way to program something is kind of daunting.  BASIC programs that ran at a command line were easy to learn and experiment with.  Of course this stuff still exists, but you have to seek it out.  There are simply so many things to waste time on (reddit) that why would anyone get around to programming unless they really cared about it?  You could spend your entire life playing games that are free!  Or get sucked into WoW...

Anyway, I guess the conclusion here is that we are going to see a further separation of development tools and general computing.  Desktop-style power user computing will become more niche as most people use tablets or their cable box to get at the content they want.  Hopefully we will find ways to expose young people to programming so that we can have future generations of software creators.

One other thing I would like to point out is that all of this could change quickly.  For example, there are several projects underway now that are pushing wearable computing.  It's possible that new input methods, being connected to the net 24/7 and the ability to overlay AR-type stuff might allow programming environments significantly more powerful than we have now.  I fully expect that within a decade, AR-capable eye goggles will be as commonplace as the iPhone is now.

I'm not claiming any crystal ball, just calling it as I see it, as usual.