Great post, thanks for sharing!
I don’t have good intuitions about the Gamma distribution, and I’d like to have good intuitions for computing your Rule’s outcomes in my head. Here’s a way of thinking about it—do you think it makes sense?
Let S∗ denote either S or S+1 (whichever your rule says is appropriate).
I notice that for t << T, your probability of zero events is (1 + t/T)^(−S*) = ((1 + t/T)^(T/t))^(−S*·t/T) ≈ e^(−S*·t/T) = e^(−λ*·t), where λ* is what I’d call the estimated event rate S*/T.
So one nice intuitive interpretation of your rule is that, if we assume event times are exponentially distributed, we should model the rate as λ* = S*/T. Does that sound right? It’s been a while since I’ve done a ton of math, so I wouldn’t be surprised if I’m missing something here.
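Here’s the quick numerical check I did, with made-up numbers (S* = 3 events observed over T = 100 time units, so these specific values are just for illustration):

```python
import math

# Made-up numbers: S* = 3 events observed over T = 100 time units.
S_star, T = 3, 100.0
lam = S_star / T                   # estimated event rate λ* = S*/T

t = 1.0                            # t << T
exact = (1 + t / T) ** (-S_star)   # the rule's P(no events in time t)
approx = math.exp(-lam * t)        # exponential approximation e^(−λ*·t)
print(exact, approx)               # 0.9705..., 0.9704...; differ by ~1e-4
```

So at t/T = 0.01 the two expressions agree to about three decimal places, which is what the algebra above suggests.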
That’s exactly right, and I think the approximation holds as long as T/t>>1.
This is quite intuitive—as the amount of data goes to infinity, the rate of events should equal the number of events so far divided by the time passed.
Thanks for the confirmation!
In addition to what you say, I would also guess that e^(−λ*·t) is a reasonable guess for P(no events in time t) when t > T, if it’s reasonable to assume that events are Poisson-distributed. (but again, open to pushback here :)
What’s r?
Oops, I meant lambda! edited :)
I still don’t understand—did you mean “when t/T is close to zero”?
Oops yes, sorry!
My intuition is that it’s not a great approximation in those cases, similar to how in regular Laplace the empirical approximation is not great when you have, e.g., N < 5.
I’d need to run some calculations to confirm that intuition, though.
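A quick back-of-the-envelope version of that calculation (made-up numbers again: S* = 3, T = 100) does seem to support the intuition that the exponential approximation drifts once t is no longer small relative to T:

```python
import math

# Made-up numbers again: S* = 3 events over T = 100, pushing t past T.
S_star, T = 3, 100.0
lam = S_star / T                       # λ* = S*/T

for t in (1.0, 10.0, 100.0, 300.0):
    exact = (1 + t / T) ** (-S_star)   # the rule's exact P(no events in time t)
    approx = math.exp(-lam * t)        # exponential approximation e^(−λ*·t)
    print(f"t = {t:>5}: rule = {exact:.6f}, exp approx = {approx:.6f}")
```

At t = T the rule gives 0.125 while the exponential gives about 0.050, and the gap only widens for t > T, so the approximation does look poor in exactly those cases.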