Highly Expected Events Provide Little Information and The Value of PR Statements
A quick review of information theory:
Entropy for a discrete random variable is given by H(X) = −∑_i p(x_i) log₂ p(x_i). This quantifies the amount of information that you gain on average by observing the value of the variable.
It is maximized when every possible outcome is equally likely. It gets smaller as the variable becomes more predictable and is zero when the “random” variable is 100% guaranteed to have a specific value.
You’ve learnt 1 bit of information when you learn that the outcome of a fair coin toss was heads. But you learn 0 bits when you learn the outcome was heads after tossing a coin with heads on both sides.
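The coin examples above can be checked directly with a minimal entropy function (a sketch, not from the original post):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(entropy([1.0]))       # double-headed coin: 0.0 bits
```

As the distribution becomes more predictable, the entropy shrinks toward zero, matching the claim that a guaranteed outcome carries no information.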
PR statements from politicians:
On your desk is a sealed envelope that you’ve been told contains a transcript of a speech that President-elect Trump gave on the campaign trail. You are told that it discusses the impact that his policies will have on the financial position of the average American.
How much additional information do you gain if I tell you that the statement says his policies will have a positive impact on the financial position of the average American?
The answer is very little. You know ahead of time that it is exceptionally unlikely for any politician to talk negatively about their own policies.
There is still plenty of information in the details that Trump mentions: how exactly he plans to improve the economy.
PR statements from leading AI Research Organizations:
Both Altman and Amodei have recently put out personal blog posts in which they present a vision of the future after AGI is safely developed.
How much additional information do you gain from learning that they present a positive view of this future?
I would argue simply learning that they’re optimistic tells you almost zero useful information about what such a future looks like.
There is plenty of useful information, particularly in Amodei’s essay, in how they justify this optimism and what topics they choose to discuss. But their optimism alone shouldn’t be used as evidence to update your beliefs.
Edit:
Fixed pretty major terminology blunder.
(This observation is not original, and a similar idea appears in The Sequences.)