Has anyone else ever heard that a certain area of their city was “the bad part of town,” and that they should never go there, and then later realize (after perusing crime statistics and visiting that area) that the only “bad” thing about the area is that it’s predominantly Hispanic (for example)?
I thought areas like that are where poorer people live, which leads to a higher crime rate.
In some cases this is true, but the association between income and crime rate is not hard and fast, and people often overestimate it. Plus, people will often mistake “concentration of racial minorities” for “unsafe.”
The book I just finished discusses a community where a particular street has functioned for decades as both a geographical and a racial divide. People on the white side continue to believe that the black side is “the bad part of town” and avoid it as dangerous, with businesses even refusing to deliver there, even though crime statistics fail to bear out that it’s any more dangerous. There is, in fact, a dangerous part of town where crime rates are particularly elevated, but it’s only a small fraction of the allegedly dangerous area.
Is there a term for the way biases persist because the cost of updating them seems high compared to the cost of maintaining the bias?
http://en.wikipedia.org/wiki/Rational_ignorance
Red Holsteins are the result of a recessive gene in the more usual black-and-white Holsteins(1). Considerable and expensive efforts were made to eliminate the trait until it was discovered that they’re at least as productive as Holsteins with black spots.
This should be remembered when you think tradition means people are doing something reasonable. Tradition, like evolution, probably means that disastrously bad traits are eliminated, not that what exists is anywhere near excellence.
(1) In the US, a white cow with black spots is the default image for a milk cow, and for those of us less educated about cattle, for cows in general.
Sounds like a sort of double-counting. “If I stopped believing the bad side of town is the bad side, then I might go over there and get mugged!”
I don’t think that’s necessarily it; I think it’s more like “avoiding that section of town costs me so little that I don’t want to bother to think about whether avoiding it makes sense”.
Or higher crime leads to poorer people, or some other factor causes both (this is my bet).
Just because there’s a higher crime rate there doesn’t mean that each individual there is looking to mug you, though. (Football fans may be more prone to fistfights than baseball fans, but that doesn’t mean that every football fan is going to beat you up — or that no baseball fan will do so.)
No, but I’d also expect that people in lower-income areas would be more likely to mug someone in particular. You have to be pretty poor to be willing to attack someone for pocket change.
I was not disagreeing with you; I was clarifying a difference between the claim that you made and that which moridinamael made.
Also, people who are dangerous may not be dangerous to you specifically.
Or the other way around.
I suppose the point is that “higher crime rate” is a deceptive piece of language. When people use that phrase there tends to be an implicit assumption that “higher crime rate” = “exactly correspondingly higher danger to you, the average visitor.” This is rarely true. You may be subject to more risk in a high crime rate area than you would be in a nearby lower crime rate area, but that leads to the next point—the issue of unqualified relativity.
The word “higher” in “higher crime rate” is relative and allows you to attach whatever meaning you want. “Higher” relative to what? To adjacent neighborhoods? To neighborhoods in similar cities with similar demographics and/or socioeconomic profiles? How much risk do you subject yourself to by taking a stroll down the streets of that area? Is it more or less risk than you take by riding your bicycle a short distance without a helmet, or walking around outside in a rainstorm? I’m sure someone’s job is to quantify this information, but when the average citizen tells you “don’t go west of 43rd, it’s a high crime rate area,” you really have no idea what they’re trying to tell you and they probably have no concrete idea what they mean.
edit: Lest I confuse my own point: people are very often flat-out wrong about the crime statistics in their own cities. Local consensus about which areas are dangerous follows an availability cascade, where you tell people that an area is dangerous because that’s what people told you when you moved there, and nobody ever goes online and notices that the crime rate in that area is actually no higher. I encourage everyone to Google the crime rates in their own cities; you may be surprised.
Also, the difference in crime rate might amount to something like “if you walk through the areas once a day, you’ll be mugged on average once every ten years or once every thirty years.”
There may be a difference between the rate at which a resident (who’s probably at a similar income to other residents in the area, and is perceived as an insider) would be mugged, and the rate at which a visitor (who is seen as likely to be carrying more money, is perceived as an outsider, and is someone whom, along with their family, the residents won’t have to come face to face with again) would be mugged.
That said, as I noted in a previous comment, there are definitely cases where people are prone to designate a place as “dangerous” when it’s not actually statistically more dangerous than where they already live. The fact that visiting these neighborhoods may be more dangerous per time spent than living in them doesn’t make it likely that people who refuse to visit them are assessing risk realistically.
Either of those would be really high, though!
Getting mugged once every thirty years means that there’s a 3.3% chance that you will get mugged in any given year. According to this data, the robbery rate in Metropolitan Areas in 2009 was 133 per 100,000, meaning that each individual stood about a 0.133% chance of getting robbed that year. Note that this data likely includes instances of robbery that we wouldn’t think of as mugging, so the actual chance of getting mugged is probably lower.
Edit: apparently Metropolitan Areas include suburbs. To get a better picture of crime rates in urban areas, here are the robbery rates for selected big cities (all per 100,000 people): New York 221; LA 317; Chicago 557; Houston 500; Dallas 426; Detroit 661; San Francisco 423; Boston 365. So, somewhat higher than the 0.133% number I gave earlier, but still well below the numbers that the grandparent post implied.
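A rough sketch of that arithmetic, for anyone who wants to check it (this assumes one walk per day and independence between days, and just re-uses the 2009 rates quoted above):

```python
# Back-of-the-envelope comparison of the "mugged once every N years" framing
# with the robbery rates quoted above, assuming one walk per day and
# independence between days.

def annual_prob_from_interval(years_between_muggings: float) -> float:
    """P(at least one mugging in a year) if muggings average one per N years of daily walks."""
    p_per_walk = 1.0 / (years_between_muggings * 365)
    return 1.0 - (1.0 - p_per_walk) ** 365

def annual_prob_from_rate(robberies_per_100k: float) -> float:
    """Annual robbery probability implied by a rate per 100,000 residents."""
    return robberies_per_100k / 100_000

print(f"once every 10 years  -> {annual_prob_from_interval(10):.1%} per year")
print(f"once every 30 years  -> {annual_prob_from_interval(30):.1%} per year")
print(f"metro rate, 133/100k -> {annual_prob_from_rate(133):.3%} per year")
print(f"Detroit, 661/100k    -> {annual_prob_from_rate(661):.3%} per year")
```

Even the “once every thirty years” framing works out to roughly 3.3% per year, several times the annual robbery probability implied by the highest of those city rates.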
It would be nice if someone compiled statistics, with photos of various dangerous people, showing how many micromorts each type of person represents for you. Also micromort maps of different towns and parts of town.
Even better, compare it with the risk levels of other activities. For example: “it is better to jump from an airplane with a parachute to avoid this person, but not to avoid that one.”
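A toy sketch of what that comparison might look like (a micromort is a one-in-a-million chance of death; the per-jump figure is the commonly cited ballpark, and the per-encounter figures are made-up placeholders purely to illustrate the idea):

```python
# Toy version of the comparison: convert per-event death probabilities into
# micromorts (1 micromort = a one-in-a-million chance of dying) and compare.
# The encounter figures below are made-up placeholders, not real statistics.

MICROMORT = 1e-6  # probability of death corresponding to one micromort

def to_micromorts(probability_of_death: float) -> float:
    return probability_of_death / MICROMORT

risks = {
    "one parachute jump":     8e-6,   # commonly cited ballpark (~8 micromorts)
    "walking past person A":  2e-6,   # hypothetical
    "walking past person B": 20e-6,   # hypothetical
}

for activity, p in risks.items():
    print(f"{activity}: {to_micromorts(p):.0f} micromorts")
```

On placeholder numbers like these, avoiding “person B” would be worth a parachute jump, while avoiding “person A” would not.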