From Wikipedia:

A main priority for the CRO is to ensure that the organisation is in full compliance with applicable regulations (chief compliance officer). They may also deal with topics regarding insurance, internal auditing, corporate investigations, fraud, and information security.
Unfortunately, this description misses the point. The main existential risks come from the inside: over-optimistic projections, decisions driven by sunk costs, not-invented-here (NIH) syndrome, a rotting corporate culture, and so on.
I see your point here, although I will say that decision science is ideally a major component of the skill set of anyone in a management position. That said, what’s being proposed in the article seems distinct from what you’re driving at.
Managing cognitive biases within an institution doesn’t necessarily overlap with the sort of measures being discussed. A wide array of statistical tools and metrics isn’t directly relevant to, e.g., battling the sunk-cost fallacy or NIH syndrome. More relevant to that problem set would be a strong knowledge of known biases and good training in decision science and psychology in general.
That isn’t to say these two approaches can’t overlap; they likely could. For example, stronger statistical analysis does seem relevant, in a very straightforward way, to the over-optimistic projections you bring up.
From what I gather, you’d want a CRO with a complementary knowledge base in relevant areas of psychology alongside the more standard risk-analysis tools. I definitely agree with that.
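To make the over-optimistic-projections point concrete, here is a minimal sketch in Python (the tasks and duration numbers are entirely made up) of how even basic Monte Carlo simulation exposes the optimism baked into a point-estimate project plan:

```python
import random

# Hypothetical project of five tasks, each with an
# (optimistic, most-likely, pessimistic) duration estimate in weeks.
# Durations are right-skewed, so summing the "most likely" values —
# the usual optimistic plan — understates the realistic completion time.
tasks = [(2, 3, 8), (1, 2, 6), (3, 4, 10), (2, 2, 7), (1, 3, 9)]

# Naive plan: sum of the most-likely durations (3+2+4+2+3 = 14 weeks).
point_estimate = sum(mode for _, mode, _ in tasks)

random.seed(0)
# Simulate 10,000 project runs, drawing each task from a triangular
# distribution, and sort the totals to read off percentiles.
runs = sorted(
    sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)
    for _ in range(10_000)
)
p50 = runs[len(runs) // 2]          # median completion time
p90 = runs[int(len(runs) * 0.9)]   # 90th-percentile completion time

print(f"point estimate: {point_estimate} weeks")
print(f"simulated median: {p50:.1f} weeks, 90th percentile: {p90:.1f} weeks")
```

With skewed inputs like these, the simulated median lands well above the 14-week point estimate — exactly the gap between an optimistic projection and what a CRO armed with basic statistics would report.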
Yes. A CEO is by nature an optimist, with a “can do” approach. A CRO reporting directly to the board would balance this optimism, so that the board, the CEO, and the company are not blindsided by the results of poor decisions, or by anything else short of black swans. Currently, only lip service is paid to this approach, in SEC filings under “possible risks” and the like.
Of course, this is a rather idealistic view. In most public companies the board members do not share in their company’s troubles, only in its benefits, so it would be easy for them to marginalize the CRO’s role and restrict it to checking for regulatory compliance only. No one likes hearing about potential problems. Besides, if the CRO brings an issue before the board and assigns it a high probability, but no action is taken and the risk comes to pass, the board members might be held responsible. They would never want that.