I suppose the point is that “higher crime rate” is a deceptive piece of language. When people use that phrase there tends to be an implicit assumption that “higher crime rate” = “exactly correspondingly higher danger to you, the average visitor.” This is rarely true. You may be subject to more risk in a high crime rate area than you would be in a nearby lower crime rate area, but that leads to the next point—the issue of unqualified relativity.
The word “higher” in “higher crime rate” is relative and allows you to attach whatever meaning you want. “Higher” relative to what? To adjacent neighborhoods? To neighborhoods in similar cities with similar demographics and/or socioeconomic profiles? How much risk do you subject yourself to by taking a stroll down the streets of that area? Is it more or less risk than you take by riding your bicycle a short distance without a helmet, or walking around outside in a rainstorm? I’m sure someone’s job is to quantify this information, but when the average citizen tells you “don’t go west of 43rd, it’s a high crime rate area,” you really have no idea what they’re trying to tell you and they probably have no concrete idea what they mean.
edit: Lest I confuse my own point, people are very often flat-out wrong about the crime statistics in their own cities. Local consensus about which areas are dangerous follows an availability cascade: you tell people an area is dangerous because that's what people told you when you moved there, and nobody ever goes online and notices that the crime rate in that area is actually no higher. I encourage everyone to Google the crime rates in their own cities; you may be surprised.
Also, the difference in crime rate might amount to something like: "if you walk through each area once a day, you'll be mugged on average once every ten years in one and once every thirty years in the other."
There may be a difference between the rate at which a resident (who is probably at a similar income level to other residents in the area, and perceived as an insider) would be mugged, and the rate at which a visitor (seen as likely to be carrying more money, perceived as an outsider, and someone the residents won't have to come face to face with again, or with their family) would be mugged.
That said, as I noted in a previous comment, there are definitely cases where people are prone to designate a place as “dangerous” when it’s not actually statistically more dangerous than where they already live. The fact that visiting these neighborhoods may be more dangerous per time spent than living in them doesn’t make it likely that people who refuse to visit them are assessing risk realistically.
Either of those would be really high, though!
Getting mugged once every thirty years means that there's about a 3.3% chance that you will get mugged in any given year. According to this data, the robbery rate in Metropolitan Areas in 2009 was 133 per 100,000, meaning that each individual stood roughly a 0.133% chance of getting robbed that year. Note that this data likely includes instances of robbery that we wouldn't think of as mugging, so the actual chance of getting mugged is probably lower.
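To make the gap concrete, here's a quick back-of-the-envelope sketch in Python, using only the figures quoted above and treating a small annual rate as approximately a probability:

    # Back-of-the-envelope comparison using the figures quoted above.
    # For small rates, rate per person per year is roughly the probability per year.

    mugging_interval_years = 30                       # "mugged once every thirty years"
    annual_prob_from_interval = 1 / mugging_interval_years
    print(f"Implied by the thirty-year figure: {annual_prob_from_interval:.1%} per year")   # ~3.3%

    robbery_rate_per_100k = 133                       # 2009 Metropolitan Area robbery rate
    annual_prob_from_rate = robbery_rate_per_100k / 100_000
    print(f"Implied by the 133-per-100,000 rate: {annual_prob_from_rate:.3%} per year")     # ~0.133%

    # Average wait between robberies at that rate:
    print(f"i.e. roughly one robbery every {1 / annual_prob_from_rate:.0f} years")          # ~752 years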
Edit: apparently Metropolitan Areas include suburbs. To get a better picture of what crime rates are in urban areas, here are the robbery rates for selected big cities (all per 100,000 people): New York 221; LA 317; Chicago 557; Houston 500; Dallas 426; Detroit 661; San Francisco 423; Boston 365. So, somewhat higher than the 0.133% number I gave earlier, but still well below the numbers that the grandparent post implied.
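The same conversion for the city figures above, again as a sketch with the numbers exactly as quoted and the rate treated as an approximate annual probability:

    # Robbery rates per 100,000 residents, as quoted above.
    city_rates = {
        "New York": 221, "LA": 317, "Chicago": 557, "Houston": 500,
        "Dallas": 426, "Detroit": 661, "San Francisco": 423, "Boston": 365,
    }

    for city, per_100k in sorted(city_rates.items(), key=lambda kv: kv[1]):
        annual_prob = per_100k / 100_000
        print(f"{city}: {annual_prob:.2%} per year, roughly once every {1 / annual_prob:.0f} years")

    # Even Detroit's 661 per 100,000 is about 0.66% per year, an average of
    # about 150 years between robberies, far from "once every ten or thirty years".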
It would be nice if someone compiled statistics, with photos of various dangerous people, showing how many micromorts each type of person represents for you. Also micromort maps of different towns and parts of town.
Even better, compare it with the risk levels of other activities. For example: "it is better to jump out of an airplane with a parachute to avoid this person, but not to avoid that one."
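As a sketch of what such a comparison might look like, assuming you somehow had credible per-encounter fatality estimates. The conversion itself is trivial; the numbers below are illustrative placeholders (or commonly cited ballparks), not real statistics:

    # A micromort is a one-in-a-million chance of death; converting an
    # estimated probability of death into micromorts is just a scaling.
    def micromorts(probability_of_death: float) -> float:
        return probability_of_death * 1_000_000

    # Illustrative placeholder estimates -- not real measurements.
    encounters = {
        "walk past 'scary-looking stranger' (hypothetical)": 0.05e-6,
        "one parachute jump (ballpark figure often cited)": 8e-6,
        "drive ~250 miles (ballpark figure often cited)": 1e-6,
    }
    for name, p in encounters.items():
        print(f"{name}: ~{micromorts(p):.2f} micromorts")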