Are you talking about free rider problems with health care costs under a partly or fully socialized health care system or something else?
And also, if you take some risky action that increases your chances of getting infected, that also increases the chances of everyone else getting infected (causally, via yourself getting infected and then infecting others).
For most negative externalities I can think of, STDs seem to be less of a problem than more easily transmitted diseases like flu.
I’m not sure I get your point here. Whether it’s more or less of a problem doesn’t seem relevant to the original claim that spawned this subthread.
I’m not sure I get your point here. Whether it’s more or less of a problem doesn’t seem relevant to the original claim that spawned this subthread.
It’s relevant to using your negative externality argument to support the original claim. To be consistent you would have to argue that we should make even more effort to avoid spreading the idea that airborne diseases like flu have low transmission rates (if true) than the idea that STDs have low transmission rates. Are you advocating a general policy of deliberately misleading people about the risks of various activities in an effort to correct for negative externalities? I’m pretty sure more efficient and robust approaches could be found.
It would be consistent with Wei Dai’s claim just to argue that we should make an effort to not reveal how low the transmission rate of influenza is among people who don’t wash their hands; we know that hand-washing is a large factor in transmission, but actual transmission rate numbers are still low enough to fail to convince people to wash their hands.
From a brief study of those particular numbers (I worked on a team modeling the spread of H1N1), I feel like we already mislead the public about the numbers themselves by being truthful as to the societal benefits and somewhat optimistic about the individual benefits of hand washing. If you believe more robust methods are more efficient, by all means, advocate for them, but I’m reasonably happy with the current situation.
From another perspective, blood-borne pathogens are particularly worth focusing on because they are easier to control. If we could encourage the entire population of the world to behave safely (not reuse needles, use condoms for sex, etc.), it would be a fairly minor change for individuals, but could eradicate or nearly eliminate HIV over time. With the flu, safe behavior will limit the damage of seasonal infections, but it’s not realistic to actually eliminate the virus. Thus, over the long term, I think the negative externalities of HIV might outweigh those of influenza.
I’m pretty sure more efficient and robust approaches could be found.
I think government policy makers and public health authorities already use a variety of approaches to reduce negative externalities related to infectious diseases, including subtle misinformation, such as making efforts to correct people’s beliefs about transmission rates when they are too low, but not when they are too high (anything really obvious wouldn’t work in a free society like ours). But it seems clear that large negative externalities still exist. What other approaches do you have in mind, and why haven’t they thought of them already?
I think we’re starting from quite different assumptions about how society works. I don’t believe that government policy makers or public health authorities are very rational. Even to the extent that they are rational, I don’t believe that their incentives are such as to reliably lead them to decisions that maximize utility by the kind of utilitarian calculus you seem to be assuming. So to the extent that we agree negative externalities exist (and I suspect we differ a fair bit on what they are and to what extent they exist) I have very little expectation that government policy makers or public health authorities will tend to take actions that minimize them.
What did you mean when you said “I’m pretty sure more efficient and robust approaches could be found”? You’re not offering any concrete ideas yourself, and apparently you weren’t thinking of government health authorities when you wrote that, so who is supposed to find and apply these approaches?
Think ‘market based’. Internalize negative externalities. To a first approximation this usually means reducing government involvement rather than increasing it. This is straying into politics though so maybe we should avoid further discussion of this topic.
Compared to the rest of this open thread, I don’t think you have anything to worry about!
Seriously though, I think we’d both like to hear you elaborate upon your market-based idea. I don’t think I got any useful information out of your blurb.
Let me first clarify the points I was making in this thread (which were not intended to lead to a debate about healthcare or politics in general). If you still feel we have substantive disagreements that we might be able to resolve through more explicitly political discussion I’m willing to continue the conversation unless there are strong objections from others.
First my points were not intended to imply any particular opinion on AIDS transmission rates specifically. My initial post was simply intended to point out that the utility of spreading the idea that AIDS transmission rates are low is dependent on the truth of the claim.
This was intended to be a more general point that exaggerating the risks of any particular activity is not a good general policy. In the AIDS case there are genuine costs to taking precautions against transmission, even if they are in fact greatly outweighed by the benefits. In a hypothetical world where transmission rates are negligible, maintaining that they are high would have negative utility.
Wei Dai responded by claiming that because of negative externalities associated with infectious diseases, exaggerating the transmission rate can improve social welfare. Now this is not incompatible with my original point; it is rather a claim of a mechanism by which exaggerating the risks of an activity can have positive utility. It is probably worth noting at this point that I am not a utilitarian, so I am likely to disagree with utilitarians on what outcomes have positive utility, but we can probably agree that in general internalizing negative externalities is a good thing.
I concede that it is possible in theory to imagine a situation where deliberately exaggerating risks has positive utility. The fun thing about negative externalities though is that it is very easy for an intelligent person to think of some and to propose plausible mechanisms by which any given action can be justified. I could easily argue for example that the credibility of science and scientists is undermined when they are caught making false claims and that the negative utility resulting from this outweighs any positive utility from individual acts of well intentioned deception.
Ultimately though I have what you might call a deontological normative belief about science that it should always pursue the truth and leave the task of judging when to strategically lie to others. I also suspect that this is a winning strategy for agents with imperfect powers of prediction but that is mere supposition.
Regardless, if negative externalities associated with infectious diseases are the real concern I’m pretty confident that you’d start with much higher expected value actions than lying about the facts in an effort to influence individual choices. If individuals are not bearing the full costs of their actions then there are more direct ways of changing their incentives such that the costs are better reflected than trying to influence their beliefs away from the truth by spreading false facts. This is the point at which I’d start to get into the politics of healthcare however and I don’t particularly want to do that here.
My initial post was simply intended to point out that the utility of spreading the idea that AIDS transmission rates are low is dependent on the truth of the claim.
Agreed. I think Wei Dai and I also agree (without speaking for Wei Dai), but think that the transmission rate worth spreading is conceivably a function of the true transmission rate, rather than being locked to the true rate itself. It sounds like you agree with that too, but are a little more stringent about what makes that “conceivably” true.
I concede that it is possible in theory to imagine a situation where deliberately exaggerating risks has positive utility.
To my mind, one such situation is one where a super-majority of the population don’t understand probability, and 1% effectively means “never” to those people. In this case, specifically lying about the number is less helpful than phrasing it in alternate true ways. But because they don’t understand probability, if you asked them to estimate a probability based on the true facts you just gave, they might say 10%; I might consider them deceived, even if you weren’t doing anything that would deceive a rational agent.
Ultimately though I have what you might call a deontological normative belief about science that it should always pursue the truth and leave the task of judging when to strategically lie to others. I also suspect that this is a winning strategy for agents with imperfect powers of prediction but that is mere supposition.
This is a good point, much like Ends Don’t Justify Means (Among Humans). One small caveat I’d like to make is that I don’t think we are talking about science (or, more accurately, scientists) lying to the populace. I think what we are discussing are administrators, media outlets and policy makers who are informed of the numbers (it would probably be too generous to say informed of the science) choosing not to widely disseminate them, in favor of reporting non-numerical information that might lead the populace to adopt an inaccurate belief about those numbers.
If individuals are not bearing the full costs of their actions then there are more direct ways of changing their incentives such that the costs are better reflected than trying to influence their beliefs away from the truth …
Yes. Please elaborate; this seems to be the most interesting part of the conversation, and the way you’re conversing now, I don’t see why you’re worried about a backlash. It would probably be wise to be as general as possible and not make any specific reference to a given country’s system of healthcare (if that’s where you’re going), but if you have a real point I’d really like to hear it.
… by spreading false facts.
Again, I wasn’t purely speaking about false facts, more about choosing which true facts to announce, and in what way, to achieve the desired impact. One doesn’t have to lie to manipulate, and it’s generally not the best strategy. I’m also not claiming that manipulation should be a primary goal, but selective revelation of data (which will always happen because of time and attention constraints) is bound to be manipulative, even if accidentally, and it would be better to produce desired action than undesired action, so we should be aware of the manipulative potential of what we plan to say.
Ok, we’ve had two examples of negative externalities associated with infectious diseases. I brought up the free-rider / moral hazard problem of the costs of treating the disease not being fully borne by an individual who engages in high-risk activities. Wei Dai brought up the problem of an individual who gets infected as a result of high-risk activities increasing the risks of others getting infected by increasing the incidence of the disease.
Now if negative externalities are your true concern then you should address them as directly as possible. There are a variety of possible solutions for addressing negative externalities. Note that lying about the actual damage caused is not a standard solution.
The moral hazard / free-rider problem is a general problem that affects healthcare as it is organized in most developed nations. A significant number of people consider this a feature rather than a bug however. If you actually wanted to internalize these negative externalities the most direct way would be to allow insurers (or the government, though that would be less efficient) to set their healthcare premiums on the basis of any relevant health or lifestyle information.
While this happens to some extent in some cases (smokers or the obese may pay extra under various systems, for example), it would be controversial in others (charging homosexuals higher premiums if they were at greater risk of contracting STDs, for example). It would likely be even more controversial for conditions that are not generally perceived as due to bad personal choices (as smoking-related illnesses or obesity are by many, and homosexuality is by some on the Christian right). The suggestion that insurance companies might charge higher premiums based on genetic testing is widely regarded as unreasonable, for example, and I’m sure the same would apply if premiums for a government-run system were so determined. Why such discrimination is considered outrageous for healthcare but is routine in some other areas of life is left as an exercise for the reader.
As to the problem of an infected individual increasing the risk of others getting infected, criminalization, civil tort law and Pigovian taxes are all possible approaches to internalizing the externalities. My point that STDs present less of a problem in this regard than airborne infectious diseases like flu was that the parties put at risk are generally more able to control the risks themselves (easier to limit your exposure to STDs than to the flu) and that the source of the infection is generally easier to identify (most people have a much shorter list of candidates for infecting them with an STD than with the flu). There are fairly significant practical difficulties to prosecuting individuals who get infected with the flu and then spread it, to suing someone who has so infected you, or to targeting taxes at those who put themselves at high risk of flu infection. All of these are practical to some degree with STDs.
Fundamentally my point is that if negative externalities related to infectious diseases are the real problem you are concerned about, there are standard ways of internalizing negative externalities that could be applied. Trying to justify misleading the general public about the actual risks of certain activities and the actual benefits of certain precautions on the basis of negative externalities raises the question of why you are not focusing your efforts on these other more direct and efficient means of internalizing those negative externalities.
Thank you very much for your reply. I really did want you to specify more clearly what you were talking about. It seems obvious to me (now) that anyone following your line of thought would have understood what you were talking about from your earlier comments, but I didn’t, and I hope you can forgive me for not understanding without further clarification.
As an aside, smoking has an obvious externality in second-hand smoke, which is often directly regulated by outlawing smoking in certain areas. What are the negative externalities of obesity? If we are to believe some recent studies, fat people may make the people around them fatter, which is a non-obvious externality, but does obesity have any commonly-recognized effects on people other than the obese when not considering subsidized healthcare, or is it only considered to have that externality when healthy-weight individuals are contributing to the costs of obese individuals?
to suing someone who has so infected you
This is at least possible in some places regarding HIV, as well as pursuing criminal charges (examples).
Even more off topic, but on topic with the much more inflamed discussion in the adjacent thread: while looking for that reference I found this. I was thinking at first that a man infecting 13 women would be in contradiction with the extremely low transmission numbers for HIV. Here’s what my numbers look like: with (2006 estimates; I couldn’t find numbers for the late 1990s) 8e5 HIV+ men in the US, a 50-50 chance that one of them would infect 13 women requires that they each engage in an average of 2845 acts with a 0.08% tx rate (unprotected P/V), 762 acts with a 0.3% tx rate (low-income P/V), or a mere 139 acts with a 1.7% tx rate (unprotected P/A). The probabilities get much more complicated when dividing demographics, but while those are unrealistically high numbers, they aren’t off by an order of magnitude; at least, the low-income P/V figure probably isn’t. Actually pulling any information off that data point, given that it is singular and unreliable, is stupid, but if it were to be believed we should expect that HIV has a true aggregate transmission rate closer to 0.45%, given reasonable assumptions about average frequency of intercourse. Of course that aggregate can easily be composed of a few high-risk activities and lots and lots of low-risk activities.
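For anyone who wants to check my arithmetic, here’s a sketch of the model behind those 50-50 figures. It makes strong simplifying assumptions of my own (every act is independent, each act is with a fresh partner, and a Poisson approximation stands in for the exact binomial tail); the act counts, rates and the 8e5 population are the figures quoted above, everything else is assumption.

```python
import math

def p_infects_at_least(acts, p_act, n=13):
    # Poisson approximation to Binomial(acts, p_act):
    # probability that one man infects at least n partners.
    lam = acts * p_act
    cdf = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(n))
    return 1.0 - cdf

def p_any_man(acts, p_act, n_men=800_000):
    # Chance that at least one of n_men such men infects 13+ partners,
    # assuming the men are independent of one another.
    q = p_infects_at_least(acts, p_act)
    return 1.0 - (1.0 - q) ** n_men

# The quoted act counts land in the right ballpark of a 50-50 chance:
for acts, rate in [(2845, 0.0008), (762, 0.003), (139, 0.017)]:
    print(acts, round(p_any_man(acts, rate), 2))
```

Under these assumptions the first scenario comes out almost exactly even, and the others within a modest factor of it, so the quoted numbers are internally consistent even if the model is crude.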
EDIT: I got so distracted I forgot what the main point was
Trying to justify misleading the general public about the actual risks of certain activities and the actual benefits of certain precautions on the basis of negative externalities raises the question of why you are not focusing your efforts on these other more direct and efficient means of internalizing those negative externalities.
I point out only your use of the word “efficient”. Misleading people is so easy it’s almost impossible not to do it, so the cost / benefit ratio doesn’t have to be very high to make it an efficient activity. As far as effectiveness, I agree that other, more direct measures can be much better.
As I mentioned earlier, negative externalities are easy to dream up. Many people consider it legitimate to complain about negative externalities caused by ugly buildings (such as power plants, wind farms or architecture that doesn’t fit with its surroundings) but complaining about the aesthetic negative externalities associated with unattractive people in public places is not generally considered legitimate.
In practice in democratic society these issues are generally resolved by who can shout the loudest or wield most political influence and not by any direct rational accounting of costs. It is not clear for example that the relatively small risks associated with second hand smoke justify trampling on the rights of smokers to indulge outside of their own homes, especially given that smoking is already subject to large Pigovian taxes in most countries with such bans.
Misleading people is so easy it’s almost impossible not to do it, so the cost / benefit ratio doesn’t have to be very high to make it an efficient activity.
It should at least be positive. It is not clear that it is in practice. It seems plausible to me that the general public distrust of government advice on risk that underlies phenomena like anti-vaccination movements is a direct result of an ongoing pattern of deliberately misleading people about risks. Overall I don’t see a strong reason to suppose the net effect is beneficial.
It seems plausible to me that the general public distrust of government advice on risk that underlies phenomena like anti-vaccination movements is a direct result of an ongoing pattern of deliberately misleading people about risks.
Best point brought up yet. While to some extent I think that mistrust of authority is ineradicable, deliberately increasing that risk is probably much more costly.
How do you feel about the specific example I mentioned, where the true risk of transmission of something is 1%, but the media outlet or whatever decides to omit the number and instead say something like “over the course of a week, an individual can spread disease X to over a hundred people”, and while true, that convinces individuals that the specific risk is much higher than 1%?
I personally find it a little irritating when the media omits information that would be necessary to work out actual risk numbers for myself. I don’t object if they communicate the numbers in a way designed to have maximum impact on the typical human mind (it’s been suggested that using frequencies rather than probabilities may help for example) but I do object if they leave out crucial information required to figure out true risk estimates. Of course I don’t generally assume this is some grand conspiracy but rather reflective of the innumeracy of the media in general.
I don’t believe in grand conspiracies because they just require too many contingencies. All this discussion, from my perspective, is about the potential for a tacit agreement between most (not all) of those disseminating information in various ways that the best method of talking about public risks is not necessarily to directly discuss low numbers associated with them.
As I indicated earlier, I think that this agreement effectively already exists regarding influenza, and probably also HIV and other infections as well.
To be consistent you would have to argue that we should make even more effort to avoid spreading the idea that airborne diseases like flu have low transmission rates (if true) than the idea that STDs have low transmission rates. Are you advocating a general policy of deliberately misleading people about the risks of various activities in an effort to correct for negative externalities?
Strictly speaking, Wei never claimed anything about what we should do. (Even if everything he said is correct — and it seems obviously so to me — it’s plausible that we’re best off with a policy of authorities never lying about risks, due to unintended consequences, public trust, slippery slopes, &c.)
And also, if you take some risky action that increases your chances of get infected, that also increases the chances of everyone else getting infected (causally, via yourself getting infected and then infecting others).
I’m not sure I get your point here. Whether it’s more or less of a problem doesn’t seem relevant to the original claim that spawned this subthread.
It’s relevant to using your negative externality argument to support the original claim. To be consistent you would have to argue that we should make even more effort to avoid spreading the idea that airborne diseases like flu have low transmission rates (if true) than the idea that STDs have low transmission rates. Are you advocating a general policy of deliberately misleading people about the risks of various activities in an effort to correct for negative externalities? I’m pretty sure more efficient and robust approaches could be found.
It would be consistent with Wei Dai’s claim just to argue that we should make an effort to not reveal how low the transmission rate of influenza is among people who don’t wash their hands; we know that hand-washing is a large factor in transmission, but actual transmission rate numbers are still low enough to fail to convince people to wash their hands.
From a brief study of those particular numbers (I worked on a team modeling the spread of H1N1), I feel like we already mislead the public about the numbers themselves by being truthful as to the societal benefits and somewhat optimistic about the individual benefits of hand washing. If you believe more robust methods are more efficient, by all means, advocate for them, but I’m reasonably happy with the current situation.
From another perspective, blood-borne pathogens are particularly worth focusing on because they are easier to control. If we could encourage the entire population of the world to behave safely (not reuse needles, use condoms for sex, etc.), it would be a fairly minor change for individuals, but could eradicate or nearly eliminate HIV over time. With the flu, safe behavior will limit the damage of seasonal infections, but it’s not realistic to actually eliminate the virus. Thus, over the long term, I think the negative externalities of HIV might outweigh those of influenza.
I think government policy makers and public health authorities already use a variety of approaches to reduce negative externalities related to infectious diseases, including subtle misinformation, such as making efforts to correct people’s beliefs about transmission rates when they are too low, but not when they are too high (anything really obvious wouldn’t work in a free society like ours). But it seems clear that large negative externalities still exist. What other approaches do you have in mind, and why haven’t they thought of it already?
I think we’re starting from quite different assumptions about how society works. I don’t believe that government policy makers or public health authorities are very rational. Even to the extent that they are rational, I don’t believe that their incentives are such as to reliably lead them to decisions that maximize utility by the kind of utilitarian calculus you seem to be assuming. So to the extent that we agree negative externalities exist (and I suspect we differ a fair bit on what they are and to what extent they exist) I have very little expectation that government policy makers or public health authorities will tend to take actions that minimize them.
What did you mean when you said “I’m pretty sure more efficient and robust approaches could be found”? You’re not offering any concrete ideas yourself, and apparently you weren’t thinking of government health authorities when you wrote that, so who is supposed to find and apply these approaches?
Think ‘market based’. Internalize negative externalities. To a first approximation this usually means reducing government involvement rather than increasing it. This is straying into politics though so maybe we should avoid further discussion of this topic.
Compared to the rest of this open thread, I don’t think you have anything to worry about!
Seriously though, I think we’d both like to hear you elaborate upon your market-based idea. I don’t think I got any useful information out of your blurb.
Let me first clarify the points I was making in this thread (which were not intended to lead to a debate about healthcare or politics in general). If you still feel we have substantive disagreements that we might be able to resolve through more explicitly political discussion I’m willing to continue the conversation unless there are strong objections from others.
First my points were not intended to imply any particular opinion on AIDS transmission rates specifically. My initial post was simply intended to point out that the utility of spreading the idea that AIDS transmission rates are low is dependent on the truth of the claim.
This was intended to be a more general point that exaggerating the risks of any particular activity is not a good general policy. In the AIDS case there are genuine costs to taking precautions against transmission, even if they are in fact greatly outweighed by the benefits. In a hypothetical world where transmission rates are negligible, maintaining that they are high would have negative utility.
Wei Dai responded by claiming that because of negative externalities associated with infectious diseases, exaggerating the transmission rate can improve social welfare. Now this is not incompatible with my original point, it is rather a claim of a mechanism by which exaggerating the risks of an activity can have positive utility. It is probably worth noting at this point that I am not a utilitarian so I am likely to disagree with utilitarians on what outcomes have positive utility but we can probably agree that in general internalizing negative externalities is a good thing.
I concede that it is possible in theory to imagine a situation where deliberately exaggerating risks has positive utility. The fun thing about negative externalities though is that it is very easy for an intelligent person to think of some and to propose plausible mechanisms by which any given action can be justified. I could easily argue for example that the credibility of science and scientists is undermined when they are caught making false claims and that the negative utility resulting from this outweighs any positive utility from individual acts of well intentioned deception.
Ultimately though I have what you might call a deontological normative belief about science that it should always pursue the truth and leave the task of judging when to strategically lie to others. I also suspect that this is a winning strategy for agents with imperfect powers of prediction but that is mere supposition.
Regardless, if negative externalities associated with infectious diseases are the real concern I’m pretty confident that you’d start with much higher expected value actions than lying about the facts in an effort to influence individual choices. If individuals are not bearing the full costs of their actions then there are more direct ways of changing their incentives such that the costs are better reflected than trying to influence their beliefs away from the truth by spreading false facts. This is the point at which I’d start to get into the politics of healthcare however and I don’t particularly want to do that here.
Agreed. I think Wei Dai and I also agree (without speaking for Wei Dai), but think that the idea transmission rate to spread is conceivably a function of the true transmission rate, rather than locked to the true transmission rate itself. It sounds like you agree with that too, but are a little more stringent with what makes that “conceivably” true.
To my mind, one such situation is one where a super-majority of the population don’t understand probability, and 1% effectively means “never” to those people. In this case, specifically lying about the number is less helpful than phrasing it alternate true ways, but because they don’t understand probability, if you asked them to estimate a probability based on the true facts you just gave, they might say 10%; I might consider them deceived, even if you weren’t doing anything that would deceive a rational agent.
This is a good point, much like Ends Don’t Justify Means (Among Humans). One small caveat I’d like to make is that I don’t think we are talking about science (or more accurately) scientists lying to the populace. I think what we are discussing are administrators, media outlets and policy makers that are informed of the numbers (it would probably be too generous to say informed of the science) choosing not to widely disseminate them in favor of reporting non-numerical information that might lead the populace to adopt an inaccurate belief about those numbers.
Yes. Please elaborate, this seems to be the most interesting part of the conversation, and the way you’re conversing now, I don’t see why you’re worried about a backlash. It would probably be wise to be as general as possible and not make any specific reference to a given countries system of healthcare (if that’s where you’re going), but if you have a real point I’d really like to hear it.
Again, I wasn’t purely speaking about false facts, more about choosing which true facts to announce, and in what way to achieve the desired impact. One doesn’t have to lie to manipulate, and it’s generally not the best strategy. I’m also not claiming that manipulation should be a primary goal, but selective revelation of data (which will always happen because of time and attention constants) is bound to be manipulative, even if accidentally, and it would be better to produce desired action than undesired action, so we should be aware of the manipulative potential of what we plan to say.
Ok, we’ve had two examples of negative externalities associated with infectious diseases. I brought up the free-rider / moral hazard problem of the costs of treating the disease not being fully born by an individual who engages in high risk activities. Wei Dai brought up the problem of an individual who gets infected as a result of high risk activities increasing the risks of others getting infected by increasing the incidence of the disease.
Now if negative externalities are your true concern then you should address them as directly as possible. There are a variety of possible approaches to addressing negative externalities. Note that lying about the actual damage caused is not a standard solution.
The moral hazard / free-rider problem is a general problem that affects healthcare as it is organized in most developed nations. A significant number of people consider this a feature rather than a bug however. If you actually wanted to internalize these negative externalities the most direct way would be to allow insurers (or the government, though that would be less efficient) to set their healthcare premiums on the basis of any relevant health or lifestyle information.
While this happens to some extent in some cases (smokers or the obese may pay extra under various systems, for example), it would be controversial in others (charging homosexuals higher premiums if they were at greater risk of contracting STDs, for example). It would likely be even more controversial for conditions that are not generally perceived as due to bad personal choices (as smoking-related illnesses and obesity are by many, and homosexuality is by some on the Christian right). The suggestion that insurance companies might charge higher premiums based on genetic testing is widely regarded as unreasonable, for example, and I’m sure the same would apply if premiums for a government-run system were so determined. Why such discrimination is considered outrageous for healthcare but is routine in some other areas of life is left as an exercise for the reader.
As to the problem of an infected individual increasing the risk of others getting infected, criminalization, civil tort law and Pigovian taxes are all possible approaches to internalizing the externalities. My point that STDs present less of a problem in this regard than airborne infectious diseases like flu was that the parties put at risk are generally more able to control the risks themselves (it is easier to limit your exposure to STDs than to the flu) and that the source of the infection is generally easier to identify (most people have a much shorter list of candidates for infecting them with an STD than with the flu). There are fairly significant practical difficulties to prosecuting individuals who get infected with the flu and then spread it, to suing someone who has so infected you, or to targeting taxes at those who put themselves at high risk of flu infection. All of these are practical to some degree with STDs.
Fundamentally my point is that if negative externalities related to infectious diseases are the real problem you are concerned about, there are standard ways of internalizing negative externalities that could be applied. Trying to justify misleading the general public about the actual risks of certain activities and the actual benefits of certain precautions on the basis of negative externalities raises the question of why you are not focusing your efforts on these other more direct and efficient means of internalizing those negative externalities.
Thank you very much for your reply. I really did want you to specify more clearly what you were talking about. It seems obvious to me (now) that anyone following your line of thought would have understood what you were talking about from your earlier comments, but I didn’t, and I hope you can forgive me for not understanding without further clarification.
As an aside, smoking has an obvious externality in second-hand smoke, which is often directly regulated by outlawing smoking in certain areas. What are the negative externalities of obesity? If we are to believe some recent studies, fat people may make the people around them fatter, which is a non-obvious externality, but does obesity have any commonly-recognized effects on people other than the obese when not considering subsidized healthcare, or is it only considered to have that externality when healthy-weight individuals are contributing to the costs of obese individuals?
This is at least possible in some places regarding HIV, as well as pursuing criminal charges (examples).
Even more off topic, but on topic with the much more inflamed discussion in the adjacent thread: while looking for that reference I found this. I was thinking at first that a man infecting 13 women would contradict the extremely low transmission numbers for HIV. Here is what my numbers look like: with 8e5 HIV+ men in the US (2006 estimates; I couldn’t find numbers for the late 1990s), a 50-50 chance that one of them would infect 13 women requires that they each engage in an average of 2845 acts at a 0.08% transmission rate (unprotected P/V), 762 acts at a 0.3% rate (low-income P/V), or a mere 139 acts at a 1.7% rate (unprotected P/A). The probabilities get much more complicated when dividing by demographics, but while those are unrealistically high numbers, they aren’t off by an order of magnitude; at least, the low-income P/V figure probably isn’t. Actually pulling any information off that data point, given that it is singular and unreliable, is stupid, but if it were to be believed we should expect that HIV has a true aggregate transmission rate closer to 0.45%, given reasonable assumptions about the average frequency of intercourse. Of course that aggregate could easily be composed of a few high-risk activities and lots and lots of low-risk activities.
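As a sanity check on the 2845 / 762 / 139 figures above, here is a short Python sketch of the calculation as I understand it (my own reconstruction, not code from the original modeling work). It assumes 8e5 carriers, a threshold of 13 infections, independent acts, and a binomial model of per-act transmission, then searches for the smallest act count at which the chance that at least one carrier infects 13+ partners reaches 50%:

```python
from math import comb

def prob_at_least(n_acts, p, k=13):
    """P(Binomial(n_acts, p) >= k): chance one carrier causes >= k
    infections, treating each act as an independent transmission chance."""
    below = sum(comb(n_acts, i) * p**i * (1 - p)**(n_acts - i) for i in range(k))
    return 1.0 - below

def acts_needed(p, n_carriers=800_000, k=13):
    """Smallest per-carrier act count such that the chance that *some*
    carrier (out of n_carriers) infects >= k partners reaches 50%."""
    # Per-carrier probability required for a 50-50 chance overall:
    target = 1 - 0.5 ** (1 / n_carriers)  # ~ ln(2) / n_carriers
    n = k  # can't infect k people in fewer than k acts
    while prob_at_least(n, p, k) < target:
        n += 1
    return n

for rate in (0.0008, 0.003, 0.017):
    print(f"{rate:.2%} per act -> {acts_needed(rate)} acts per carrier")
```

Under those assumptions the search lands in the neighborhood of the figures quoted above; the residual differences come from the binomial-tail details rather than from any disagreement about the model.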
EDIT: I got so distracted I forgot what the main point was
I point out only your use of the word “efficient”. Misleading people is so easy it’s almost impossible not to do it, so the benefit doesn’t have to be very large to make it an efficient activity. As far as effectiveness goes, I agree that other, more direct measures can be much better.
As I mentioned earlier, negative externalities are easy to dream up. Many people consider it legitimate to complain about negative externalities caused by ugly buildings (such as power plants, wind farms or architecture that doesn’t fit with its surroundings) but complaining about the aesthetic negative externalities associated with unattractive people in public places is not generally considered legitimate.
In practice in democratic society these issues are generally resolved by who can shout the loudest or wield most political influence and not by any direct rational accounting of costs. It is not clear for example that the relatively small risks associated with second hand smoke justify trampling on the rights of smokers to indulge outside of their own homes, especially given that smoking is already subject to large Pigovian taxes in most countries with such bans.
It should at least be positive. It is not clear that it is in practice. It seems plausible to me that the general public distrust of government advice on risk that underlies phenomena like anti-vaccination movements is a direct result of an ongoing pattern of deliberately misleading people about risks. Overall I don’t see a strong reason to suppose the net effect is beneficial.
Best point brought up yet. While I think that some mistrust of authority is ineradicable, increasing the risk of it is probably much more costly.
How do you feel about the specific example I mentioned, where the true risk of transmission of something is 1%, but the media outlet or whatever decides to omit the number and instead say something like “over the course of a week, an individual can spread disease X to over a hundred people”, and while true, that convinces individuals that the specific risk is much higher than 1%?
I personally find it a little irritating when the media omits information that would be necessary to work out actual risk numbers for myself. I don’t object if they communicate the numbers in a way designed to have maximum impact on the typical human mind (it’s been suggested that using frequencies rather than probabilities may help for example) but I do object if they leave out crucial information required to figure out true risk estimates. Of course I don’t generally assume this is some grand conspiracy but rather reflective of the innumeracy of the media in general.
I don’t believe in grand conspiracies because they just require too many contingencies. All this discussion, from my perspective, is about the potential for a tacit agreement among most (not all) of those disseminating information in various ways that the best method of talking about public risks is not necessarily to directly discuss the low numbers associated with them.
As I indicated earlier, I think that this agreement effectively already exists regarding influenza, and probably also HIV and other infections as well.
Strictly speaking, Wei never claimed anything about what we should do. (Even if everything he said is correct — and it seems obviously so to me — it’s plausible that we’re best off with a policy of authorities never lying about risks, due to unintended consequences, public trust, slippery slopes, &c.)