But some have questioned the group’s report, as noted in this USA Today article.
But government and other statistical experts take issue with the methods used to compile the ranking. The group averaged three years of data (2002-2004) from the Centers for Disease Control and Prevention's Behavioral Risk Factor Surveillance System, a state-by-state telephone survey in which participants report their own weight and height.
Because people are believed to underestimate their weight and overestimate their height, some experts say actual obesity rates are higher than self-reported surveys of this kind suggest.
CDC officials say the ranking is misleading for a more technical reason. "This is not a valid statistical comparison," says Michael Link, a senior survey methodologist at the agency.
The percentage of obese adults in each state actually could be several points higher or lower than the numbers indicate, Link says. Because the sample size varies from state to state, each one has a different margin of error, which means the states can't be compared without giving that range, he says. [Emphasis added]
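Link's point is easy to see with a little arithmetic. Here is a minimal sketch (the 25% estimate and both sample sizes are hypothetical, not CDC figures) of how the 95% margin of error for a sample proportion depends on sample size:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion p from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Two states report the same 25% obesity estimate, but from
# very different (hypothetical) sample sizes:
small_state = margin_of_error(0.25, 400)    # roughly +/- 4.2 points
large_state = margin_of_error(0.25, 6400)   # roughly +/- 1.1 points
```

With intervals that wide, a one- or two-point difference between two states' point estimates tells you essentially nothing about which is "fatter."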
Advocacy groups often release studies and reports with no real intrinsic value other than the sensational media attention they generate (which, I realize, is the point with such groups). But it is disingenuous to try to influence public opinion with bad statistics. Or no statistics.
And that brings me to my annual moment of tooth-grinding frustration, when Men’s Fitness reveals its annual ranking of the fattest cities in America.
The Men’s Fitness survey is hopelessly flawed. It postures as a genuine statistical report, which is risible because its methods are lazy and, as far as I can tell, not grounded in any legitimate statistical practice. Granted, I am not a professional statistician, but I question their choice of factors, and their blind reliance on an Internet telephone book for data strikes me as a poor basis for statistical analysis.
Let me explain why I feel this way.
Men’s Fitness has graciously explained their method on their Web site (here), and I will address each portion as I see fit to explain why I look on the survey with contempt.
[The block quotes are from the Men’s Fitness Web site, and the emphases have been added by me.]
How We Did It - The 50 largest U.S. cities were selected using the most recent United States Census Bureau statistics available at the time of the survey, which was conducted from August 2004 through October 2004. Cities were assessed in 14 equally weighted categories, using data specific to each city, except as noted when data was available only for a metropolitan statistical area or for a state. (When no data was available, an average score was assigned.) The categories were selected as indicators, risk factors or relevant environmental determinants affecting fitness, obesity and health.
Indeed, there are a lot of categories, and I do not see why they should all be weighted equally. Should hard data and rough assumptions, which is surely what some of the categories amount to, really be given equal consideration?
Also, the survey makes up for a lack of data with an average score. Admittedly, I am not sure if this practice is common among statisticians and auditors, but I would think the report would just reflect that no data is available for certain markets.
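Assigning the average where data is missing is a crude form of mean imputation, and nothing in the final ranking flags the guess. A minimal sketch (made-up city names and scores) of what that does:

```python
# Hypothetical category scores; None means no data was available.
scores = {"CityA": 82.0, "CityB": 61.0, "CityC": 47.0, "CityD": None}

known = [v for v in scores.values() if v is not None]
fill = sum(known) / len(known)  # average of the cities we DO know about

# The city with no data is silently pulled into the middle of the pack,
# and the published ranking treats that guess like real data.
imputed = {city: (fill if v is None else v) for city, v in scores.items()}
ranking = sorted(imputed, key=imputed.get, reverse=True)
```

Here CityD lands second overall on the strength of a number that was never measured, which is exactly the problem.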
The cities were ranked first to last and assigned numerical grades based on a relative curve. The scores were then translated into letter grades, which, while a more familiar point of reference, eliminated some of the scoring nuances. Since the survey is based on a comparative scale, with cities ranked solely in relation to each other, some positions and grades may have shifted from last year without necessarily indicating significant statistical changes.
Why was this done? Was something wrong with the numerical grades? How are letter grades more familiar than numerical grades?
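For what it's worth, the "scoring nuances" the magazine admits to eliminating are easy to demonstrate. A sketch using assumed 10-point cutoffs (the magazine does not publish its actual ones):

```python
def letter_grade(score):
    """Map a numerical score to a letter using assumed 10-point cutoffs."""
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return grade
    return "F"

# Two cities nearly ten points apart collapse into the same letter,
# while a 0.2-point gap that straddles a cutoff looks dramatic.
print(letter_grade(89.9), letter_grade(80.1))  # B B
print(letter_grade(80.1), letter_grade(79.9))  # B C
```

So the letter grades can both hide large differences and manufacture small ones.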
Gyms/Sporting Goods - Composite score, equally weighing (a) total number of clubs, gyms and fitness studios ranked per 100,000 population, from YellowPages.com; and (b) total number of sporting-goods retailers ranked per 100,000 population, from YellowPages.com.
Does YellowPages.com include all clubs, gyms, fitness studios, and sporting-goods retailers in any given market? I doubt it. What about those clubs and gyms that aren’t listed? Without them, the survey is less accurate and less reliable.
Besides, counting fitness clubs does not take into account the people who work out at home or on the road. My wife and I work out together, but we do so in our house and not at an expensive gym.
Nutrition - Composite score, equally weighing (a) average frequency of fruit and vegetable consumption (percent that consumes five or more servings per day) in state-level data from the Centers for Disease Control and Prevention's Behavioral Risk Factor Surveillance System; and (b) total number of health-food stores ranked per 100,000 population, from YellowPages.com.
Again, I doubt all health-food stores in the nation are listed on YellowPages.com.
Also, the Men’s Fitness survey is relying on data from the same source as the scorned Trust for America’s Health report, i.e., the CDC surveillance system that relies on people to be honest about how tall they are and how much they weigh.
Exercise/Sports - Total participation in 103 sports and fitness-related activities. Measured by participants per 100 residents for the top 30 metropolitan statistical areas and by state. State-level data used when no metropolitan data available. Honolulu and Wichita, not surveyed, were given average scores. Data from the Superstudy of Sports Participation Geographic Supplement, from American Sports Data Inc.
A “superstudy” is the source of the data, but does it take into account people who work out at home?
Also, Honolulu and Wichita get the short end of the stick. They were assigned average scores. I wonder if that includes surfers?
Overweight/Sedentary - Composite score according to the Centers for Disease Control and Prevention's Behavioral Risk Factor Surveillance System, equally weighing (a) percentage of population that is obese; (b) percentage of population at risk for health problems related to being overweight; (c) percentage of population at risk for health problems related to lack of exercise; and (d) percentage of population not participating in physical activity. SMART (selected metropolitan-micropolitan area risk trends) data used for specific cities. State data used where city data unavailable.
Again, Men’s Fitness is relying on the CDC surveillance system, which (according to Michael Link, a survey methodologist at the CDC) doesn’t reflect the actual number of obese adults in any given state.
Junk Food - Total number of fast-food outlets, pizza parlors, ice cream shops and doughnut stores ranked per 100,000 population, from YellowPages.com.
Again with the YellowPages.com. Is that gospel?
Besides, a strict cumulative count of fast-food outlets does not take into account eating habits. Should McDonald’s be given equal ranking with Subway, which offers several low-fat menu items? What about new menu items that are much healthier? I went to Jack in the Box today, but I did not have a cheeseburger. I ordered an Asian chicken salad with a Diet Dr Pepper. Might others in the land of fast-food chains be doing the same?
Alcohol - Composite score, equally weighing (a) total number of bars/taverns ranked per 100,000 population, from YellowPages.com; and (b) apparent alcohol consumption by state, from the surveillance report of the National Institute on Alcohol Abuse and Alcoholism.
OK. "Apparent alcohol consumption" is the NIAAA's term for a sales-based estimate, not a measurement of what anyone actually drinks. How statistically meaningful can that be for comparing individual cities?
And, again, the gospel according to YellowPages.com.
TV - Metered Market HUT (Homes Using Television) Analysis, Primetime, June 1, 2003-May 31, 2004, from Nielsen Media Research. Average or regional scores assigned to cities where specific data unavailable.
Does anyone still believe the Nielsen numbers? Have they ever? I never have.
Air Quality - The air-quality index is based on annual reports from the Environmental Protection Agency. The number of ozone-alert days is used as an indicator of air quality, as are the amounts of pollutants, including particulates, carbon monoxide, sulfur dioxide, lead, and volatile organic chemicals. From Sperling's Best Places.
What does this have to do with fatness? Don’t lean and hefty people all breathe the same air?
Climate - The climate index is based on National Weather Service data combining estimated annual days above 32 degrees and below 90 degrees, amounts of precipitation and sunshine, and the August heat/humidity index.
Wow. Cities in Texas and Florida get bad marks right from the start because of the weather. As does Alaska. That seems lame. Good thing Canada was not included in the survey.
Geography - Accessible recreational forests, lakes, rivers, waterways, mountains, and ocean beaches, compiled from almanacs and additional sources.
I don’t know about you or the researchers at Men’s Fitness, but I don’t need a mountain to do pushups. Besides, fatties can float on lakes, rivers, and waterways.
Commute - Based on the Travel Time Index, which measures traffic delays due to congestion, according to the Urban Mobility Report from the Texas Transportation Institute at Texas A&M University. Average score for small cities assigned to Tulsa and Wichita.
I’ll concede this category. Long commutes can discourage activity, which can lead to generally worse health. We all could walk more.
Parks/Open Space - Composite score, equally weighing (a) total acreage per 10,000 population of federal and state recreation areas plus all listed water areas, from the Places Rated Almanac; (b) number of city parks per 10,000 population, according to a 2004 Men's Fitness custom survey; and (c) acres of city parks and recreational open space per 10,000 population, according to a 2004 Men's Fitness custom survey.
See my remarks to the “Geography” category above.
Recreation Facilities - Composite score based on totals per 10,000 population, from a 2004 Men's Fitness custom survey, equally weighing (a) number of public basketball courts; (b) number of public swimming pools; (c) number of public tennis courts; and (d) number of public golf courses.
What about roadways for runners and bicyclists? Why weren’t they taken into consideration? Are golfers inherently more fit than bicycle riders? Do you think John Daly could take Lance Armstrong?
Health Care - Based on city-by-city ranking of health resources, access, cost of hospital stay, and cost of doctors' visits, as measured by Sperling's Best Places.
How is this a direct indicator of fitness? My last doctor visits had more to do with colds and allergies than anything else.
OK, I’m done. Needless to say, I think the rankings released by Men’s Fitness every year are bogus, and the media and local officials give the whole thing way too much consideration.
To which I give a healthy “faugh”!