I don’t really expect a huge change, because rationality is all about aiming the arrow better; it doesn’t change how hard you draw the bow or how many arrows you have in the quiver.
To continue your metaphor, a small improvement in aim can, in some situations, significantly increase the fraction of arrows that hit the target. That of course assumes precision was the problem, rather than e.g. distance or a lack of arrows. Returning from the metaphor, the benefits of (LW-style) rationality probably also depend on what kinds of problems you are solving, and what kinds of irrational things you were doing before.
What kind of situation gives you a lot of quick gains from rationality? It seems like one where people have to make decisions that later have a big impact on the outcome. (Not the kind where the smart decision is merely 10% more effective than the usual decision; unless that 10% lets you cross some critical threshold.)
Decisions like choosing a school, a career, or a life partner, whether to join a cult, whether to move to a different city, etc. Or possibly creating useful habits that give you a multiplier on what you were already doing, such as using spaced repetition or pomodoros, or avoiding the planning fallacy and other biases. In short: either making a few big good changes, or developing reliably good habits. (Also, avoiding big bad changes, and getting rid of bad habits.)
Almost the same for me (just replace Python with Android or Unity, and “playing video games I don’t really enjoy” with “reading websites I don’t really enjoy”).