I keep seeing these articles about the introduction of artificial intelligence/data science to football and basketball strategy. What’s crazy to me is that it’s happening now instead of much, much earlier. The book Moneyball was published in 2003 (the movie in 2011), spreading the story of how the use of statistics changed every aspect of managing a baseball team. After reading it, I and many others thought to ourselves, “this would be cool to do in other sports”—using data would be interesting in every area of every sport (drafting, play calling, coaching, clock management, etc.). But I guess I assumed—if I thought of it, why wouldn’t other people?
It’s kind of a wild example of the idea that “if something works a little, you should do more of it and see if it works a lot, and keep doing that until you see evidence that it’s running out of incremental benefit.” My assumption that the “Moneyball” space was saturated back in 2011 was completely off, given that in the time between 2011 and now, one could have trained oneself from scratch in the relevant data science methods and pushed for such jobs (my intuition is that 8 years of training could get you there). So it’s not even a “right place, right time” story given the timeline. It’s just—when you saw the obvious trend, did you assume that everyone else was already thinking about it, or did you jump in yourself?
Part of the problem is that applying those insights in a way that actually beats trained humans is hard: until recently, the models couldn’t handle all the variables and data that humans could, and so they ignored many things that made a difference. Now that more data can be fed into the models, they can make predictions as good as or better than a human’s, and thus stand a chance of outperforming people rather than making “correct” but poorly informed decisions that, in the real world, would have lost games.
Interesting conclusion. It sounds like the bystander effect. I wonder how many big ideas don’t get the action they deserve because, upon hearing them, we assume they’re already getting the effort/energy they deserve and that there isn’t room for our contribution.
Another weird takeaway is the timeline. Whenever I hear about a good idea that’s currently happening, my intuition is that, because it’s happening right now, it’s probably too late for me to get in on it at all, since everyone already knows about it. I think that intuition is overweighted. If there’s a spectrum from ideas being fully saturated to completely empty of people working on them, when good ideas break in the news they are probably closer to the latter than I give them credit for being. At least, I need to update in that direction.
I think this is caused by the fact that we lack tooling to adequately assess the amount of free energy available in new markets sparked by new ideas. Currently, it seems the only gauges we have are media attention and investment announcements.
Taking the time to assess an opportunity is operationally expensive, and I think I’ve optimized toward accepting that there’s probably little opportunity left, given that everyone else is observing the same thing. However, I’m not sure it makes sense to adjust that optimization without first getting more efficient at assessing opportunities.
Yeah, it’s interesting because it was a “so clearly a good idea” idea. We tend to either dismiss ideas as bad because we found a fatal flaw, or think “this idea is so flawless it must’ve been the lowest-hanging fruit and thus already been picked.”
Another example that comes to mind is checklists in surgery. Gawande wrote The Checklist Manifesto back in 2009, sharing his findings that a simple checklist dramatically improved surgical outcomes. I wonder if the “maybe we should try to make some kind of checklist-ish modification to how we approach everything else in medicine” thought needs similar action.