
Fixing / Improving game review systems, for gamers and developers.

A few thoughts I’ve been mulling over for the last few weeks concerning Steam, and most relevantly, the Steam review system.

I like the idea of product reviews. So many people try to sell you crap with a lot of marketing bullshit and vague promises. Reviews are a powerful weapon that allows little-known but great products to rise to the top, and punishes the superficial, poor-quality crap with big marketing budgets. Done correctly, reviews are a win for the consumer and for the developer. Consumers get unbiased purchasing advice (and lots of it) and a voice for their opinions, and developers get a free marketing department for good products, plus constructive feedback on why some customers are unhappy.

Of course, that is all theoretical; the real impact of reviews depends vastly on their implementation. In many ways, the Steam implementation is extremely good, and actually better than the implementation you see on some other websites. Firstly, you have to have actually bought the product to review it, which eliminates 95% of dubious reviews. I could easily go and review my neighbour’s B&B on Google and say it sucked if he is someone I don’t like, but with Steam, if I wanted to maliciously give bad reviews to every other developer’s strategy games, I’d have to buy them all, which acts as a powerful brake on people just acting like dorks.

Also, Steam lets you rate reviews as helpful or unhelpful, which is cool, but AFAIK this has no impact on the extent to which a review is counted towards the review score. This is a tough line to walk, because AFAIK anyone can rate a review without being a customer. If Steam allowed review ratings to influence review scores, then you are back to square one with the malicious review-manipulation issue. The review-rating system is presumably a nudge towards encouraging thoughtful reviews, which probably works to an extent, but you still have the problem that people may leave a bad review for the wrong reasons, such as ‘Developer is a woman/gay/nazi/non-white…’. How can this be combated?

I think the solution is pretty simple, and obvious when you go back to first principles and ask yourself what a review is supposed to be. Let me put forward this assertion:

“A review is an objective measure of the collective opinion of customers as to the quality of the product they have bought”.

That sounds pretty fair to me, and when you put it like that you realize that we try to collect such a measure all the time in the real world, with questions like this:

“If an election were held tomorrow, which candidate would you vote for?”

Yup, opinion polls are basically trying to do the same thing. They are trying to work out what people think of products, in this case politicians and parties. The key thing I’ve realized is that there is a wealth of expertise and knowledge gained from such systems about ‘how to do it right’, where ‘right’ means predicting the real opinion of everyone from a small subgroup. With this in mind, let’s look at everything game reviews on Steam do wrong:

Problem #1: A self-selecting electorate.

You don’t have to review a product on Steam, and you get NOTHING for it if you do. No Steam points, no gems, no chance of a discount coupon, nothing. It takes up your own time. As a result, Steam reviews are basically like holding an opinion poll where people have to choose to take part, and then spend their own time and effort to participate. Any pollster would laugh you out of the room if you tried to predict an election result by waiting for the public to come to you and tell you how they would vote. You get the activists, the extremists, the angry, and, most relevantly, you get people with time on their hands. You would have huge over-representation by the unemployed, the teenagers and the retired. The result is worthless.

Problem #2: A small sample size.

Ask 10 people who will win the next election and you will get a pretty useless result. Ask 100 and it’s closer, but for a really close election (48/52% style) you are going to need thousands, even assuming that you have carefully ensured it’s not self-selecting and that the gamers have been randomly polled.

Problem #3: People lie to themselves.

Some political opinions are widely held but publicly frowned upon. In the US, saying you supported Trump would be unpopular in some circles; in the UK, supporting UKIP can be seen as signifying racism. In working-class towns, saying you vote Conservative is downright dangerous in some places, and a Labour sticker in some Conservative villages will exclude you from dinner parties. Pollsters try to find out what people really think and will do, not what they claim to think and do. In gaming, we don’t have much in the way of ‘shame’, although it’s interesting that all reviews are public and non-anonymous. How many people don’t want to have a positive review of a gay-dating sim visible on their profile? How many gamers won’t post a glowing review of a game they love when the developer gets hate due to their political views? It’s probably not *that* many. However, we do have a problem where gamers routinely plough hundreds of hours into a game, then give a negative review. This seems… weird. To some extent, Steam should be able to factor this in. Maybe some fudge factor needs to take players’ median play time into account when computing a score? This is the trickiest area to fix.
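
I don’t know exactly what that fudge factor should look like, but here is a purely hypothetical sketch (none of these names or weights have anything to do with how Steam actually scores reviews): discount negative reviews from players whose playtime is far above the median for that game.

```python
# Hypothetical sketch only: weight each review by how the reviewer's
# playtime compares to the median playtime for the game. A thumbs-down
# from a 300-hour player counts for less than one from a 2-hour player.
from statistics import median

def weighted_review_score(reviews):
    """reviews: list of (is_positive, hours_played) tuples."""
    med = median(hours for _, hours in reviews)
    weighted_positive = 0.0
    weighted_total = 0.0
    for is_positive, hours in reviews:
        # Negative reviews from far above the median playtime are
        # discounted; positive reviews keep full weight.
        weight = 1.0
        if not is_positive and med > 0 and hours > 2 * med:
            weight = max(0.25, (2 * med) / hours)
        if is_positive:
            weighted_positive += weight
        weighted_total += weight
    return weighted_positive / weighted_total if weighted_total else 0.0

# e.g. weighted_review_score([(True, 5), (False, 300), (True, 12)])
```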

 

So the first two problems are EASILY fixed. You just get more people to review a game. Don’t leave it to the bored (mostly young) crowd, or the incredibly outgoing who are happy to write comments everywhere (again, mostly young), or the angry mob (people are more likely to leave a bad review when something goes wrong than a happy one when something goes right). Steam needs to do a simple thing… raise the percentage of gamers leaving reviews above 1%.

Proposal 1 (Meek): Make it easier to leave a review

You can see a big ‘write review’ box on the store page. So what’s the problem? NOBODY visits the store page after they have bought the game. Why on earth is that big box not on the game’s page within the Steam app itself? This would be easy to do. Also… on the page for a game right now, ‘write review’ is TINY. I couldn’t find it the last time I looked. Even a different color or a bigger font would help. The current UI design for this is incredibly meek. There is a big fat piece of prime real estate next to the play button where it could go instead!

Proposal 2 (Bold): Incentivize reviews

The minute you add any reward for anything on Steam, you get side effects, so for now let’s ignore the idea of giving out Steam points, or gems, or anything, and just keep it really simple. When a player quits a game session lasting more than 30 minutes and has not yet reviewed the game, pop up a dialog (like the screenshot uploader) asking if they want to leave one. 95% of them will hit escape, but even if the other 5% leave a review, we have immediately boosted the sample size behind Steam reviews by 500%. Concerned about the 30-minute hard limit? Fine, make it random for each player/game combination, somewhere between 30 minutes and 8 hours, so you get a random sampling of play-times.
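
For what it’s worth, the randomised threshold could be as simple as this. A minimal sketch, where the function names, the hashing scheme and the idea of hooking it to session end are all just illustrative assumptions, not anything Steam actually exposes:

```python
# Illustrative sketch of the randomised review-prompt threshold. Each
# player/game pair gets a stable pseudo-random cut-off between 30 minutes
# and 8 hours, so prompts end up sampling a spread of playtimes.
import hashlib

MIN_MINUTES = 30
MAX_MINUTES = 8 * 60

def prompt_threshold_minutes(player_id: str, app_id: str) -> int:
    """Deterministic 'random' threshold for this player/game combination."""
    digest = hashlib.sha256(f"{player_id}:{app_id}".encode()).digest()
    value = int.from_bytes(digest[:4], "big")
    return MIN_MINUTES + value % (MAX_MINUTES - MIN_MINUTES + 1)

def should_prompt_for_review(player_id: str, app_id: str,
                             session_minutes: int, already_reviewed: bool) -> bool:
    # Only nag players who finished a long enough session and have not
    # already left a review.
    if already_reviewed:
        return False
    return session_minutes >= prompt_threshold_minutes(player_id, app_id)
```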

I’ve been thinking about this a lot, and these are the solutions that I think are a) hard to cheat and b) easy to try. You could even A/B test Steam users and see what the effects are before rolling out to everyone. I’m interested to hear people’s opinions on this, and I think it’s always worth discussing this sort of stuff. It applies, of course, not just to Steam, but to GoG, Humble, Itch and everyone else. It’s in everyone’s interests that game reviews are fair and accurate for all.

 

 

More balance analysis with Production Line (build 1.33)

My Excel skills have levelled up since I last wrote about balancing Production Line using player statistics. As a result I now have more informative charts to look at when analysing play sessions from build 1.32. My intention with this balancing is to increase the long-term playability and balance of the game. Basically, player retention is good after 1 day, good after 7 days, but starts to tail off before 28 days, implying that the game is good initially but loses its challenge after a while. It may also suggest a lack of content, which is surprising given what’s in the game, but that will be naturally fixed over time as more is added (pickup trucks, quality control, branding, breakdowns).

Looking at the following chart, I can see the amount of cash players have at various milestones from 50 up to 500 hours. I’m quite happy with this. Clearly the amount climbs over time, but it is not exorbitant for the median player. I’d like the player to have the odd million dollars in cash, but anything beyond 10 million makes things a bit easy. Hopefully some expensive upgrades for luxury cars in the late game will push that down slightly.
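
In case anyone wants to build this sort of chart from their own telemetry, the aggregation is basically just ‘median cash per block of hours played’. A rough sketch follows; the CSV layout and column names are invented for illustration, and my actual process is nothing fancier than Excel:

```python
# Sketch of the aggregation behind a 'median cash over time' chart.
# Assumes a per-session CSV with invented columns 'hours_played' and 'cash'.
import csv
from collections import defaultdict
from statistics import median

def median_cash_by_hours(path, bucket_hours=50):
    """Median cash per bucket of hours played, from a per-session CSV."""
    buckets = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            bucket = int(float(row["hours_played"]) // bucket_hours) * bucket_hours
            buckets[bucket].append(float(row["cash"]))
    # One chart point per bucket: (hours bucket, median cash).
    return {bucket: median(values) for bucket, values in sorted(buckets.items())}

# e.g. median_cash_by_hours("production_line_sessions.csv")
```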

This second chart shows the intensity of AI competition, which is basically a measure of how well the player is doing, as perceived by the AI. I can see that I was absolutely right to do away with the 50-hour moratorium on AI competitors, as clearly some players race ahead and needed the AI to rein them in. The clear problem here is that the competition value is trending rapidly up to 100%. I feel that this is a strong indicator that the maximum competitive level of the AI just is not competitive enough. In other words, the metrics by which the AI judges the player are not being brought under control by the methods available to the AI. This needs fixing.
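
I obviously can’t paste the real AI code here, but the shape of the problem is roughly this. A hypothetical sketch, not how Production Line actually implements it:

```python
# Hypothetical sketch of the problem, not Production Line's actual AI code.
# The competition index is the player's performance relative to a target,
# clamped at 100%, and the AI's pressure on the player scales with it.
MAX_PRESSURE = 1.0  # the top end that currently is not harsh enough

def competition_index(player_score: float, target_score: float) -> float:
    if target_score <= 0:
        return 1.0  # degenerate case: no target means full competition
    return max(0.0, min(1.0, player_score / target_score))

def ai_pressure(player_score: float, target_score: float) -> float:
    # If strong players still race ahead while the index is pinned at 1.0,
    # the fix is to make maximum pressure harsher, while the bottom end
    # keeps backing off for struggling players as it does now.
    return competition_index(player_score, target_score) * MAX_PRESSURE
```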

This final graph shows the profitability (as a percentage margin) of the player’s business over time. It’s not unreasonable for this to be low, or even at a loss, during the start, as the player invests in equipment and ramps up production. Over time this is trending to slightly above zero, and my raw stats show an average value at 500 hours of 7.2%. This isn’t too bad, and certainly believable in an industry like car production. I don’t see that anything really needs to change in response to this graph.
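
To be clear about the metric: ‘percentage margin’ here just means net profit as a share of revenue. A trivial sketch, with made-up numbers:

```python
# 'Percentage margin' as used above: net profit as a share of revenue.
def margin_percent(revenue: float, costs: float) -> float:
    return (revenue - costs) / revenue * 100.0

# e.g. margin_percent(10_000_000, 9_280_000) -> roughly 7.2, an invented
# example matching the 7.2% average seen at the 500-hour mark.
```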

So my conclusion from the currently available data is that the competition index metric is too meek, and that the player should face potentially more challenging AI at the top end; at the bottom end, it should definitely continue to act as before, taking its foot off the metaphorical gas pedal of competition. The AI seems OK at not crushing the poor-performing player, but too weak to offer a decent challenge to the high-performing one.

Of course the important thing here is to work out what my ideal metrics are for improving the game. I’m assuming that people only continue to play games that they enjoy, and thus hours played should be a decent metric to show whether or not the game is getting more fun. Right now those stats look like this:

Which isn’t too shabby. I compared it with another one of my games and it holds up well, especially considering the much shorter time it’s been out and the fact that it is not content complete. Ideally you don’t just make a game for the hardcore who put in 20+ hours, but try to move everyone along that graph. I’d like to see the number of people playing past 2 hours go up a lot more. I think if you don’t like a game you find out before then, so that would be a sign I’ve made something enjoyable. To that end, I need to ensure the game remains challenging in the long run, so tweaking these figures should hopefully nudge it in the right direction.
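
The metric I’m watching is essentially a playtime funnel: what share of players get past 2, 5, 10, 20+ hours. A quick sketch of how that could be computed from raw session data; the column name and thresholds are illustrative, not from any real telemetry schema:

```python
# Sketch of the playtime funnel: what fraction of players get past each
# threshold. 'hours_played' and the thresholds are invented for this example.
import csv

THRESHOLDS = [2, 5, 10, 20]  # hours

def playtime_funnel(path):
    with open(path, newline="") as f:
        hours = [float(row["hours_played"]) for row in csv.DictReader(f)]
    total = len(hours)
    return {t: (sum(h >= t for h in hours) / total if total else 0.0)
            for t in THRESHOLDS}

# Comparing this dict between builds shows whether changes really are
# moving players further along the graph.
```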

I feel I should do some actual marketing fluff here, so if you like the sound of the game and haven’t bought it, here is a link :D