I’m afraid I couldn’t follow most of this, but do you actually mean ‘high energy’ brain states in terms of aggregate neural activity (i.e. the parentheticals which equate energy to ‘firing rates’ or ‘neural activity’)? If so, this seems relatively easy to assess for proposed ‘annealing prompts’ - whether psychedelics/meditation/music/etc. tend to provoke greater aggregate activity than not seems open to direct calorimetry, let alone proxy indicators.
Yet the steers on this seem very equivocal (e.g. the evidence on psychedelics looks facially ‘right’, things look a lot more uncertain for meditation and music, and identifying sleep as a possible ‘natural annealing process’ looks discordant with a ‘high energy state’ account, as brains seem to consume less energy when asleep than awake). Moreover, natural ‘positive controls’ don’t seem supportive: cognitively demanding tasks (e.g. learning an instrument, playing chess) seem to increase brain energy consumption, yet presumably aren’t promising candidates for this hypothesised neural annealing.
My guess from the rest of the document is that the proviso about semantically-neutral energy would rule out a lot of these supposed positive controls: the elevation needs to be general rather than well-localized. Yet this is a lot harder to use as an instrument with predictive power: meditation/music/etc. also have foci in the neural activity they provoke.
I think this post is referring to “high energy” not in terms of electrochemical neural activity but instead as a metaphor for optimization in machine learning.
Machine learning is the process of minimizing an error function. We can conceptualize this error function as a potential, such as a gravity well or an electrostatic field. Minimizing the energy of a particle in that potential is mathematically equivalent to minimizing the error function. The advantage of referring to this as “energy” instead of “error” is that it lets you borrow other terms, like kinetic energy (in both the classical and quantum sense), which makes search algorithms intuitively easy to understand. The post is referring to this kind of entropic energy.
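To make the metaphor concrete, here is a minimal simulated-annealing sketch in Python. This is my own illustration, not anything from the post: the toy loss function, the `neighbor` step, and the cooling schedule are all hypothetical choices. The point is that ‘temperature’ (the high-energy regime) lets the search accept uphill moves and escape local minima, and cooling settles it into a low-energy configuration:

```python
import math
import random

def anneal(energy, neighbor, x0, t0=1.0, cooling=0.995, steps=10_000):
    """Minimize `energy` by simulated annealing."""
    x, e = x0, energy(x0)
    t = t0
    for _ in range(steps):
        x_new = neighbor(x)
        e_new = energy(x_new)
        # Downhill moves are always accepted; uphill moves are accepted
        # with probability exp(-dE / T) (the Metropolis criterion), so a
        # "hot" system can escape local minima that would trap it when cold.
        if e_new < e or random.random() < math.exp((e - e_new) / t):
            x, e = x_new, e_new
        t *= cooling  # cool gradually: the annealing schedule
    return x, e

# Toy error landscape with many local minima (a hypothetical example).
loss = lambda x: x * x + 10 * math.sin(3 * x)
best_x, best_e = anneal(loss, neighbor=lambda x: x + random.gauss(0, 0.5), x0=5.0)
print(best_x, best_e)
```

On this reading, the ‘high energy state’ is high temperature in the search, not high metabolic expenditure.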