
I'm trying to calculate the variance of a game given a set of session outcomes.

I'll explain:

Imagine a game where you have an edge but don't know how much of an edge. You play the game and record the result of each session to find out your edge over the long run. After thousands of sessions you know your edge, and now you need to calculate the variance of the game. There is no way for you to know the probabilities of the outcomes in the game, so a combinatorial analysis is impossible; you are forced to work only with your recorded session outcomes.

Each recorded session contains 50 rounds, with bets ranging from $1 to $10 distributed uniformly, for an average bet of $5.

Take this set of outcomes as an example:

Session 1: $115

Session 2: -$50

Session 3: $210

Session 4: $25

Session 5: -$40

To start the calculations I divide each outcome by $5 to express them in betting units:

23

-10

42

5

-8

I calculate the mean, which is (23 - 10 + 42 + 5 - 8) / 5 = 10.40.

Now, to calculate the variance, I subtract the mean from each outcome (in units) and square each result:

(23 - 10.4)^2 = 12.60^2 = 158.76

(-10 - 10.4)^2 = (-20.40)^2 = 416.16

(42 - 10.4)^2 = 31.60^2 = 998.56

(5 - 10.4)^2 = (-5.40)^2 = 29.16

(-8 - 10.4)^2 = (-18.40)^2 = 338.56

Finally, I add all the squared deviations (= 1941.20) and divide by the number of outcomes: 1941.20 / 5 = 388.24. So the variance is 388.24.
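For anyone who wants to check my arithmetic, here is a short Python sketch of the steps above (the variable names are my own):

```python
# Convert session outcomes to betting units, then compute the mean
# and the (population) variance across sessions.
outcomes = [115, -50, 210, 25, -40]   # session results in dollars
avg_bet = 5                           # average bet size in dollars

units = [x / avg_bet for x in outcomes]       # 23, -10, 42, 5, -8
mean = sum(units) / len(units)                # 10.40
sq_devs = [(u - mean) ** 2 for u in units]    # squared deviations
variance = sum(sq_devs) / len(units)          # 388.24
print(mean, variance)
```

Note this divides by n (population variance); dividing by n - 1 (sample variance) would give a slightly larger number.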

Now, remember that each outcome registered is a whole session which contains 50 rounds of the game.

So, before I have my actual variance, do I have to correct for this by dividing by the number of outcomes multiplied by 50 (the rounds per session), i.e. 1941.20 / (5 * 50) = 7.7648?
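In code, the two candidate numbers I'm asking about would be computed like this (just restating my own arithmetic, not asserting which is right):

```python
# Same sum of squared deviations as above, divided two different ways:
# once by the number of sessions, once by the total number of rounds.
sum_sq_devs = 1941.20
sessions = 5
rounds_per_session = 50

per_session_variance = sum_sq_devs / sessions                        # 388.24
per_round_candidate = sum_sq_devs / (sessions * rounds_per_session)  # 7.7648
print(per_session_variance, per_round_candidate)
```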

Which one is my actual variance for the game: 388.24 or 7.7648?

Am I way off? Is this just the wrong method to get an accurate variance considering the betting fluctuations and number of rounds in each session?

Thank you
