I think the way LLMs work might not be well described as having key internal gears, or as having an at-all illuminating Python code sketch.
What motivates your believing that?