Open question on a certain ‘hot’ global issue of importance to FAI
A question: why does anything about global warming get downvoted, even a popularly readable explanation of the fairly mainstream scientific consensus? edit: Okay, this is loaded. I should put it more carefully: why is warming discussion generally considered inappropriate here? That seems to be the case, and there are pretty good reasons for it. But why can’t the AGW debate be invoked as an example controversy? The disagreement on AGW is pretty damn unproductive, which makes it a good example of an argument whose productivity could be improved.
Global warming is a pretty damn good reason to build FAI. It’s quite seriously possible that we won’t be able to do anything else about it. Even a mildly superhuman intelligence, though, should be able to eat the problem for breakfast. Even practical sub-human AIs could massively help with space-based efforts to limit the issue (e.g. friendly space-worthy von Neumann machinery would allow us to solve the problem almost immediately). We would probably still have extra CO2 in the atmosphere, but that is overall probably not a bad thing: it is good for plants.
For that to be important, a 50⁄50 risk of global warming is sufficient. Even probabilities below 0.5 for the ‘strong’ warming scenarios are still a big factor in terms of ‘expected deaths’ and ‘expected suffering’, considering how many humans on this planet lack access to air conditioning. I am frankly surprised that a group of people fascinated with AI would have such trouble with the warming controversy as to make it too hot a topic even as an example of highly unproductive argument.
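The point about low-probability ‘strong’ scenarios can be made concrete with a toy expected-value calculation. All probabilities and death tolls below are hypothetical numbers chosen purely to illustrate the arithmetic, not projections from any actual climate study:

```python
# Toy expected-value sketch with made-up numbers: a 'strong' scenario
# with only 10% probability can still dominate the expected death toll.
scenarios = [
    ("mild",   0.60,    100_000),  # (name, probability, hypothetical deaths)
    ("medium", 0.30,  1_000_000),
    ("strong", 0.10, 20_000_000),
]

expected_deaths = sum(p * deaths for _, p, deaths in scenarios)
contributions = {name: p * deaths for name, p, deaths in scenarios}

print(expected_deaths)  # 2360000.0
print(contributions)    # 'strong' contributes 2,000,000 of that, more than the rest combined
```

The structure of the calculation, not the invented inputs, is the point: a scenario you assign well under 0.5 probability can still dominate the expectation if its stakes are large enough.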
I do understand that LW does not want political controversies. Politics is the mind-killer. But this stuff matters. And I trust it has been explained here that non-scientists are best off not trying to second-guess the science, but relying on expert opinion. Global warming is our first example of a man-made problem that is going to kill us if there is no AI. Engineered diseases, grey goo, and that sort of thing come later, and will likely be equally controversial. For now we have coal.
The uFAI risk is also going to be extremely controversial as soon as those with commercial interests in AI development take notice; way more controversial than AGW, for which we do have fairly solid science. If we cannot discuss AGW now, we won’t be able to discuss AI risks once Google, or any other player, deems those discussions a PR problem. Discussion at any given time will be restricted to issues about which no one really has to do anything at the time.
I question the generalisation “Of the comments by User:Dmytry that have been downvoted recently, some have been about global warming” --> “All discussion of global warming gets downvoted”. In fact, the claim can be trivially refuted by finding discussion relating to global warming that is not downvoted. As of the time of this comment, one can find examples of upvoted or neutral comments discussing global warming by following the link to Dmytry’s comment page, finding the heavily downvoted comments about global warming, and then following links to the (currently upvoted) parent and child comments.
Once again, not all instances of people downvoting are part of a conspiracy. Sometimes it just means people disagree with you or object to your style.
As for whether the downvoted comments in question are, in fact, “popularly readable explanation of the fairly mainstream scientific consensus”—I have no idea. I have very little interest in the subject and have not followed the conversation closely. Someone else would have to give their evaluation.
As far as I’m concerned, global warming is a known problem with known insurmountable political (cooperation) obstacles preventing us from taking the drastic measures needed to solve it.
Nah, I did a search, and there’s generally massive downvoting of discussions on AGW. No need to start personal attacks in public; it’s enough to tell me to f* off in private. edit: Also, I don’t care about having a big rating number, okay? I see the votes as representing likes/dislikes, a sort of one-bit commentary. Now, when the one-bit commentary is overwhelmingly negative while the more-than-one-bit commentary is not, that leaves me wondering why there is such a disparity between the two types of commentary.
Global warming is not an existential risk regardless of its truth value.
Yeah, in the Bostrom sense of something irreversibly destroying all Earth-originating intelligent life or irreversibly preventing it from colonizing other planets, it isn’t; but then neither is thermonuclear war: several hundred million, possibly billions, of people would likely survive one.
We don’t often discuss thermonuclear war on LessWrong either.
Yes, probably it’s not going to destroy all ape-originated intelligent life, provided it doesn’t set off
http://en.wikipedia.org/wiki/Anoxic_event
It is still going to kill a fairly significant number of us, and preventing it still requires giving something up; if you can’t discuss it reasonably, note that you are even less capable of discussing any issue that triggers more fear or requires a larger change.
It is an existential risk to the continued personal existence of many people who live without air conditioning. Things don’t need to be existential risks at the civilization level to have massive expected disutilities. There are many people who assign no utility to civilization itself beyond the combination of individual utilities.
I’m unclear as to what sort of cataclysm you think would be prevented by...air conditioning. You might want to form a more complete model of what various sources say will happen under different warming scenarios.
Death of a person. (times many). We don’t have uploading yet. Everyone’s memories are gone when they are dead. All that they know, all that they experienced, just gone. Only small shards remain.
In the short term, you can save many more lives with $100,000 worth of mosquito nets in malaria-infected areas than with $100,000 worth of air conditioners (or anything else). And in the long term, air conditioners will be pretty useless at fighting the worst-case scenario effects of AGW (say, rising sea levels) -- if anything, as long as fossil fuels are used to power them, they only make it worse.
Of course, but you can at the same time provide the mosquito nets, stop ongoing AGW, and the like, with only a rather modest decrease in quality of life in the West. The developed countries consume somewhere between 80% and 90% of natural resources (if you factor in the resources spent making your computer in China). I’m not even sure what you guys think you are going to get out of e.g. FAI. You might get more friendliness than you want.
The fact that it may not be worth fighting AGW doesn’t imply anything about the validity of AGW itself, by the way, or its disutility. Whatever the reason, there is a world of people living close to the equator without the means to cool themselves. That is a fact, and being unable to do anything about a fact doesn’t make it any less of a fact.
I do think we should reduce the crap out of our fossil fuel consumption as soon as possible, probably more than most people around here do (and I’m baffled by the apparent near-taboo-ness of AGW-related discussions on LW, too). I was just pointing out that ‘because people in warm countries would be more likely to die from hyperthermia since they don’t have air conditioners’ is nowhere near the main reason for that.
Well, it’s a complex problem. They pretty much don’t have anything for coping with climate change; forget the aircons, they don’t have clean water, et cetera, and nobody’s ever going to take them as refugees. The aircon was figurative, a sort of exaggerated understatement. The whole fossil issue is really 80–90% just the West burning fossil fuels like there’s no tomorrow; the rest of the world uses few resources. (That’s generally a taboo topic among Westerners, but I don’t care about that. The thing is, people are not only dying because you won’t share; people are dying directly because of your positive action. There is a strong bias to consider inaction the lesser way to commit evil. A few degrees can kill a lot of people via many mechanisms.)
So were you, like, strawmanning yourself? :-/
Still not seeing how it is even a strawman, by the way. I was listing a broader category of people (those without aircon), which includes the narrower categories (those who lack clean water, sufficient food, … and aircon, for example). Furthermore, those without aircon (or heating) themselves contribute rather little to the warming, both directly and indirectly, so it is a pretty damn bad deal for them. I have not, however, implied that the suggested course of action consists of giving them aircon, or that stopping AGW is the ‘cheapest’ thing you could do to help, or the like. That’s you guys making strawmen. Insofar as you understand that your leisure activities produce CO2, which results in warming, which kills off people somewhere else in the world (by direct action, akin to having fun throwing a glass bottle out of a moving vehicle and killing someone), you should think hard about somehow compensating them for this.
Maybe there’s so much fun in throwing a bottle out of a moving vehicle, and you’re so rich, that you don’t want to give this up and would rather pay. But compensate you must.
People here have a way of taking things more literally than any other online community I have interacted with. I don’t know if it is Asperger’s, or rationality, or a local rule to be extremely literal, or what.
If you don’t understand why this is (largely local rule/custom), I would recommend reading or rereading the Words sequence.
Well, a reasonable custom is to assume that the side you disagree with has something not completely idiotic to say; if you don’t do that, then no amount of clarity will ever help.
People here generally do know what is meant by “death”. But the ongoing global epidemic of, e.g., death by old age is far more widespread, far more lethal, and far more unnoticed relative to its disutility than the most likely dangers of AGW.
We already know of cheap ways to terraform and cool down our planet if we need to.
Fighting global warming would probably save some, perhaps many lives. But ask yourself:
Instead of what?
At what cost?
Have you read the core sequences? If not, I suggest you read up on optimal philanthropy. I also suggest you learn, reflexively and on the five-second level, to include or at least remind yourself of opportunity cost.
There’s a three-pronged answer to this, as I see it.
First: there’s a tacit moratorium on partisan-coded issues around here which do not directly concern the science of rationality or (to a lesser extent) AI. Even those on which a broad consensus exists: the reasoning most often given is that a vocally partisan position on such topics would position LW to attract like-minded partisans and thus dilute its rationality focus. The politics of religion is something of an exception; it’s essentially treated as a uniquely valuable example of certain biases, though I suspect that status in practice has more to do with the grandfather clause. Anthropogenic global warming is not a uniquely valuable example of any bias I can think of; it’s a salient one, but salience often comes with drawbacks.
Second: LW is not a debunking blog, nor a forum dedicated to cheering scientific consensus over folk wisdom, and it should not be except insofar as doing so serves the art and science of rational thinking. There’s considerable overlap between LW’s natural audience and that of sites which are devoted to those topics, which has on occasion misled (often ideologically opposed) newcomers into thinking it’s such a site, but even if a general consensus exists that LW’s theory and practice tends to lead to certain positions, it behooves us to guard against adopting those positions as markers of group identity. The easiest way to do that is not to talk about them.
Third, and probably most embarrassingly from the standpoint of healthy group epistemology: by the last census/survey LW is disproportionately politically libertarian, though adherents of that ideology are an absolute minority ([left-]liberalism is slightly more popular, socialism slightly less, other political theories much less). The severity of, proper response to, and to a lesser extent existence of anthropogenic global warming remains an active topic of debate in libertarian circles, though less so in recent years. Higher sensitivity to AGW than to other conservative-coded positions may in part be a response to these demographics.
I understand that, but would AI be able to stay an exception if any of its particular risks became as controversial as AGW?
With regard to global warming: if you provisionally grant that a rational person tends to hold a stance on AGW aligned with the scientific consensus, then the AGW supporters who join over the issue are on average better at rationality, especially applied rationality, not worse. If, however, you posit that a rational person tends to disagree with the scientific consensus, then okay, that is a very valid point that you don’t want those aligned with the scientific consensus to join. Furthermore, I don’t see what’s so special about religion.
I am a sort of atheist, but I see the support for atheism as much, much shakier than the support for AGW. I know many people who are theists of various kinds and are otherwise quite rational, while I do not know anyone even remotely rational who disagrees with a scientific consensus, other than a scientist whose own novel research disagrees with it.
If AI in general or uFAI in particular becomes a politicized issue (not quite identical to “controversial”) to the extent that AGW now is, I suspect it’ll be grandfathered in here by the same mechanism that religion now is; it’s too near and dear a topic to too many critical community members for it to ever be entirely dismissed. However, its relative prominence might go down a notch or two; moves to promote this may already be happening, given the Center for Modern Rationality’s upcoming differentiation from SIAI.
As to applied rationality and AGW: I view agreement with the mainstream climatology position as weak but positive evidence of sanity. However, I don’t view it as particularly significant to the LW mission in a direct sense, and I think taking a vocal position on the subject would likely lower the sanity waterline by way of scaring off ideologically biased folks who might be convinced to become less ideologically biased by a consciously nonpartisan approach. There’s a lot more to lose here, rationality-wise, than there is to gain.
Well, that’s too bad then. I came to post here after reading the Eliezer posts on the many-worlds interpretation, where he tried to debunk the SI (now that really polarizes people politically, even though it’s not linked to politics: trying to debunk a well-established method that works). He is somewhat sloppy at quantum mechanics and makes some technical errors, but it is very good content nonetheless that I really enjoyed. I don’t enjoy the meta-meta so much.
Er, what? What about diseases promoted by high-population densities in cities and spread via modern international travel? Surely those will kill many more—and are more important in practically every way.
Yes. But if you can’t reasonably talk about AGW without going into some form of denial, you won’t be able to talk about the diseases rationally either. The ability to talk rationally falls off rapidly with the scariness of the scenario; fear and the importance of an issue do not make you think more clearly, only less clearly. AGW is a preview of what such an issue looks like when there is high-quality science on it, telling us rather uncomfortable things, and when we have to give something up for prevention.
Road traffic accidents kill over a million people per year. I doubt global warming will ever kill people at anywhere near that rate. But for all that, we talk about global warming a lot more than road accidents.
Pneumonia kills over 100 million people each year (!?! actually more like 4 million per year).
I doubt global warming will ever kill people at anywhere near that rate.
In fact, since—as Freeman Dyson says—more people die from cold than from heat, global warming will probably reduce the overall death rate:
According to this report about the study “Causes for the recent changes in cold- and heat-related mortality in England and Wales”:
The study says that changes in heat-related mortality in the UK are smaller for warming than for cooling over the past 4 decades by two orders of magnitude.
However, one study produced different findings in the USA - here.
Where did you get that number? Wikipedia puts it at 4 million per year.
Oops—over 100 million people each year contract pneumonia.
There’s an extent to which I am totally willing to have a moratorium on global warming. After all, you don’t try to convince a religious fundamentalist that they’re wrong by going “you’re totally wrong, you have no sound basis for thinking the bible is inerrant”—anything that even assumes that is just going to raise their cognitive walls. You change their mind by helping them develop their critical thinking skills. Similarly, I wouldn’t say to a fundamentalist-equivalent skeptic “here, have a link to some IPCC report that shows that you’re wrong, I’m sure you can check it out yourself.” Counterproductive again. So even if theism or global warming are useful examples, I’m willing to avoid them, or certain statements about them, in many cases.
But at the same time I’d rather not just ignore the issue. Fortunately, I’ve found a certain forum that lets community members write posts to develop readers’ critical thinking skills :D So maybe I’ll do that.
Well, it is hard to discuss improving the productivity of arguments without any specific examples of arguing. It is hard to think correctly about abstract topics; people perform the Wason selection task better when it is described in concrete terms than with letters and numbers.
The main problem with global warming is not that people who can’t afford air conditioning will be less comfortable—the warming is a few degrees, so people wouldn’t even notice it without measurements. The problem is that the warming might lead to rising sea levels (and there are a helluva lot of people living near the sea), and maybe (I’m not sure these things are well-understood) more and stronger hurricanes, and stuff like that.
Nah, they simply won’t notice the death-rate increase without statistics; that doesn’t mean it isn’t increasing. When you have a population of camels under loads that already break a significant percentage of camels’ backs, adding extra weight has an approximately linear effect for small weights.
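The camel metaphor can be sketched numerically. Assuming, purely for illustration, that tolerance to heat load is normally distributed across a population and the current load already exceeds some tolerances, each small additional increment of load pushes a roughly equal extra fraction over the threshold, i.e. the marginal effect is approximately linear (all numbers below are arbitrary):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X <= x) for a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Hypothetical units: mean tolerance 40, spread 3, current load 38,
# so a small fraction of the population is already over its threshold.
mu, sigma = 40.0, 3.0
load = 38.0

base = normal_cdf(load, mu, sigma)                  # fraction already broken
step1 = normal_cdf(load + 0.5, mu, sigma) - base    # extra fraction from +0.5 load
step2 = normal_cdf(load + 1.0, mu, sigma) - normal_cdf(load + 0.5, mu, sigma)

# For small increments the two steps are nearly equal in size:
# the marginal effect of extra load is approximately linear,
# even though nobody notices it without statistics.
print(round(step1, 4), round(step2, 4))
```

The slope is just the local density of the tolerance distribution at the current load, which is why a "few degrees" shift has a steady, statistically visible effect long before any individual perceives it.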
The reason it is a fairly productive topic with regard to charity (in general) is that it is easy to rationalize inaction, but harder to rationalize a positive action that kills. Yes, theoretically biases are bad for giving; practically, eliminating biases in giving decreases the giving (I think a link to a study about that was even posted here). People are biased and imperfect, and are more likely to donate to better causes if they are aware they are committing an action that kills, rather than mere inaction.
Not saying anything directly about the immediate topic, but argument screens off authority, and we really do have examples of experts getting things terribly wrong. Even if that weren’t the case, “trust the experts” would still be a terrible heuristic, as it is likely to be gamed.
Interesting that this has no comments yet. I do not know why this subject is treated as ‘political’ or ‘controversial’. This group should not be anti-science or ‘head in the sand’, but it seems to be.