Always good to see this in the news.....
14-09-2015, 02:32 PM
RE: Always good to see this in the news.....
(14-09-2015 02:05 PM)Imathinker Wrote:  
(14-09-2015 01:13 PM)cjlr Wrote:  They absolutely - and provably - can.

There is room for methodological errors and bias, but population statistics are quite sound.

Exactly correct: as long as the sample covers the various demographics, the American public can be accurately represented by fewer people than you'd think. The 2.8-7.8 million range is a wide interval, but it probably comes with a very high (99%) confidence level. They could tell you it's 5-5.5 million at 70% confidence, but that doesn't do anyone any good. It's the same way they poll people at elections to predict the results. Stats are cool if done right.

Indeed. It always rather bothers me to see statistical data reported on without the interval and confidences, but since most people don't understand statistics anyway...
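The interval-versus-confidence tradeoff described above can be sketched with a plain normal-approximation interval. All of the numbers below (sample size, observed share, population) are invented for illustration and are not Pew's actual figures:

```python
from statistics import NormalDist

n = 35_000                 # hypothetical sample size
p_hat = 0.022              # hypothetical observed share of respondents
population = 245_000_000   # hypothetical adult population to project onto

# Standard error of a sample proportion under simple random sampling.
se = (p_hat * (1 - p_hat) / n) ** 0.5

widths = {}
for conf in (0.70, 0.95, 0.99):
    z = NormalDist().inv_cdf(0.5 + conf / 2)   # two-sided critical value
    lo, hi = p_hat - z * se, p_hat + z * se
    widths[conf] = hi - lo
    print(f"{conf:.0%} CI: {lo * population / 1e6:.1f} to "
          f"{hi * population / 1e6:.1f} million")
```

Same data, three honest answers: the 99% interval is the widest, which is exactly why quoting a narrow interval at low confidence "doesn't do anyone any good".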

... this is my signature!
[+] 1 user Likes cjlr's post
14-09-2015, 02:49 PM (This post was last modified: 14-09-2015 02:59 PM by goodwithoutgod.)
RE: Always good to see this in the news.....
(14-09-2015 02:05 PM)Imathinker Wrote:  
(14-09-2015 01:13 PM)cjlr Wrote:  They absolutely - and provably - can.

There is room for methodological errors and bias, but population statistics are quite sound.

Exactly correct: as long as the sample covers the various demographics, the American public can be accurately represented by fewer people than you'd think. The 2.8-7.8 million range is a wide interval, but it probably comes with a very high (99%) confidence level. They could tell you it's 5-5.5 million at 70% confidence, but that doesn't do anyone any good. It's the same way they poll people at elections to predict the results. Stats are cool if done right.

Yeah, I took statistics and am familiar with the concepts, but there are variables like who gets polled, for example. People who decide to participate tend to be more outspoken about their views. It's like comment cards at a business: most people who comment had an issue with the business, and if you read the comment cards and use them as a basis for judging the quality of the business, you get a skewed view. People who are content with the business just pay and leave. People who participate in surveys wish to do so for a reason. Also, how the question is presented, and by whom, can sway the results. I agree that overall, surveys and polls can give you a collective idea of how the majority may feel, but I have never felt comfortable with the accuracy. Contributory factors like demographics, income, education, personal agendas, politics, religious views, etc. can all sway the answers, which may not necessarily pass the litmus test for accuracy with broad statements like "The majority of Americans think blah blah blah"....no, the majority of the 1,400 people you polled, who had agreed to be available for polls, said blah blah blah.

Pew Research recently did a study of their own studies; a meta-study, if you will. They found that only nine percent of the people they tried to reach for their public opinion surveys actually respond. Back in 1997, when Pew ran this same methodology survey, 36 percent of the people they were trying to reach responded. So one must ask: why the decline in willingness to respond? And why did those who agreed to respond agree? Most likely they felt strongly enough about stating their opinion to do so. Those with strong opinions may not actually be the voice of millions, but rather a slice of the small percentage of people asked to do a poll, with only 9% of them willing to take it. See my point? Again, I took statistics, as I am sure you have, and am well aware of the overall accuracy rates and expected +/- error rates, but I still insist the results are suspect based on the points above.
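The worry that willingness to respond correlates with strength of opinion can be made concrete with a toy simulation. The 30% opinion share and the response propensities below are invented purely to show the mechanism:

```python
import random

random.seed(0)

# Toy population: 30% hold opinion X (coded 1), 70% do not (coded 0).
population = [1] * 30_000 + [0] * 70_000

# Suppose people who hold opinion X are three times as likely to answer.
respond_prob = {1: 0.15, 0: 0.05}

respondents = [x for x in population if random.random() < respond_prob[x]]

true_share = sum(population) / len(population)
polled_share = sum(respondents) / len(respondents)
print(f"true share: {true_share:.1%}   naive poll estimate: {polled_share:.1%}")
```

With these made-up propensities the naive estimate lands near 56% against a true 30%: exactly the kind of nonresponse bias pollsters try to detect and weight away.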

At my work (US Navy) we do command-wide surveys to ascertain whether we have issues like fraternization, sexual harassment, equal-opportunity problems, etc. As an officer, I get to sit down with the Captain as he shows the Wardroom the results so we can discuss them and craft a plan of action to rectify any issue. What we find is that a lot of those who bother to answer have agendas, issues, and personal problems. Those who don't answer can't be bothered because they don't have any issues to complain about. So we have to factor in that those who reply skew negative and do not represent the command as a whole. We still conduct training and quietly investigate all allegations, but with rare exceptions, this is the case. Those who provide answers do so because they have strong feelings, or an agenda. Thus, their input does not reflect the overall command as a whole....

Perhaps Pew Research center says it best, "The accuracy of a poll depends on how it was conducted. Most of Pew Research’s polling is done by telephone. By contrast, most online polls that use participants who volunteer to take part do not have a proven record of accuracy. There are at least two reasons for this. One is that not everyone in the U.S. uses the internet, and those who do not are demographically different from the rest of the public. Another reason is that people who volunteer for polls may be different from other people in ways that could make the poll unrepresentative. At worst, online polls can be seriously biased if people who hold a particular point of view are more motivated to participate than those with a different point of view."

http://www.pewresearch.org/2010/12/29/ho...ine-polls/

Just a different perspective from a guy who overthinks things Drooling

"Belief is so often the death of reason" - Qyburn, Game of Thrones

"The Christian community continues to exist because the conclusions of the critical study of the Bible are largely withheld from them." -Hans Conzelmann (1915-1989)
14-09-2015, 04:02 PM (This post was last modified: 14-09-2015 04:09 PM by GirlyMan.)
RE: Always good to see this in the news.....
(14-09-2015 01:13 PM)cjlr Wrote:  
(14-09-2015 12:38 PM)goodwithoutgod Wrote:  Rolleyes yeah because 1143 people can accurately depict the opinion of 300 million people.

They absolutely - and provably - can.

There is room for methodological errors and bias, but population statistics are quite sound.

I think the individual random variables have to be independent and identically distributed before the central limit theorem lets you approximate the distribution of the sample mean with a Gaussian. Apparently they're close enough, because the predictive value is strong. And I think sampling is itself a difficult problem. I usually go to Nate Silver for my stats and polling, but he identifies several other polling firms he respects.
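That's the textbook condition: for independent, identically distributed responses, the central limit theorem makes the sample mean approximately Gaussian, however lumpy the individual responses are. A quick sketch (the 10%-yes variable below is arbitrary):

```python
import random
import statistics

random.seed(1)

def sample_mean(n):
    """Mean of n draws from a very non-normal variable (10% yes, 90% no)."""
    return sum(random.choices([0, 1], weights=[0.9, 0.1], k=n)) / n

sds = {}
for n in (10, 100, 1000):
    means = [sample_mean(n) for _ in range(2000)]
    sds[n] = statistics.stdev(means)
    print(f"n={n:5d}  mean of sample means={statistics.mean(means):.3f}  "
          f"sd of sample means={sds[n]:.4f}")
```

The spread of the sample means shrinks roughly like 1/sqrt(n), which is what lets a few thousand respondents pin down a population proportion.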

#sigh
[+] 2 users Like GirlyMan's post
14-09-2015, 04:31 PM
RE: Always good to see this in the news.....
Goodwithoutgod, I think we are in agreement. Voluntary response samples, such as internet surveys where you choose whether to take part, are horribly biased. Internet surveys are worth next to nothing. There is also bias in the wording of a question if the surveyor wants a specific answer. Phone surveys have nonresponse issues, so you're not reaching that portion of the population. I guess the point I was trying to make was that a survey of 35,000 is more than powerful enough to represent the population in a mathematical sense, even if Pew can't eliminate nonresponse bias.

GirlyMan, I don't understand exactly what you're getting at. Are you saying the variables they're looking at have to be normally distributed throughout the population? I thought almost everything is going to be normal when you look at a huge population. Yes, it is essentially impossible to get a perfect simple random sample, but with a large enough sample and efforts to weight the responses, it can at least give you a pretty good idea. Hence the wide interval in this case; it's not like they're claiming to know anything exactly.
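The "mathematical sense" here is just the margin-of-error formula: for a simple random sample the width depends on the sample size, not on the population size, which is why roughly 1,400 or 35,000 respondents can stand in for 300 million. A back-of-envelope sketch at 95% confidence (z of about 1.96), taking the worst case p = 0.5:

```python
def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion under simple random sampling."""
    return z * (p * (1 - p) / n) ** 0.5

for n in (400, 1_400, 35_000):
    print(f"n={n:6d}  margin of error = +/-{margin_of_error(n):.1%}")
```

A 1,400-person sample comes in around +/-2.6 points; 35,000 tightens that to about half a point. Nonresponse and design effects widen these in practice, which is the caveat being argued over in this thread.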

A man should not believe in an ism, he should believe in himself. -Ferris Bueller

That's what a ship is, you know. It's not just a keel and a hull and a deck and sails, that's what a ship needs but what a ship is... what the Black Pearl really is... is freedom. -Jack Sparrow
[+] 1 user Likes Imathinker's post
14-09-2015, 04:43 PM
RE: Always good to see this in the news.....
Well, let's hope so. I see American religiosity as a great danger to peace on earth, and as such consider the country, although a great ally, volatile. I think most Australians would agree. America is the reason China is arming up.

NOTE: Member, Tomasia uses this site to slander other individuals. He then later proclaims it a joke, but not in public.
I will call him a liar and a dog here and now.
Banjo.
14-09-2015, 04:57 PM
RE: Always good to see this in the news.....
(14-09-2015 02:49 PM)goodwithoutgod Wrote:  Yeah I took statistics and am familiar with the concepts, but there are the variables like who gets polled for example. People who decide to participate tend to be more outspoken about their views, it is like comment cards at a business, most people who comment had an issue with the business, and if you read the comment cards and use that as a basis for the quality of the business you get a skewed view. people that are content with the business just pay and leave. People who participate in surveys wish to do so for a reason. Also how the question is presented and by whom can sway the results. But I agree that overall surveys and polls can give you a collective idea about how the majority may feel, but have never felt comfortable with the accuracy.

Are there quantifiable methodological factors you can point to, or is it just a matter of feels? What confounding factors, above and beyond what the polling firms recognise and account for, do you think might be being neglected?
(and do you mean the accuracy, or the precision?)

(14-09-2015 02:49 PM)goodwithoutgod Wrote:  Contributory factors like demographics, income, education, personal agendas, politics, religious views etc all can sway the answers which may not necessary be the litmus test for accuracy with broad statements like, "Majority of americans think blah blah blah"....no the majority of the 1400 people you polled, who had agreed to be available for polls said blah blah blah.

If you accept the validity of representative statistics and assume even basic competency on behalf of the polling agency, then I'm entirely unsure as to where the skepticism arises...

Mass media headlines never contain the nuance and methodology, which I did just say was annoying, to be sure, but that certainly doesn't mean that information isn't available.

(14-09-2015 02:49 PM)goodwithoutgod Wrote:  Pew Research did a study of their own studies recently. It was sort of a meta study, if you will. And they found that only nine percent of the people they tried to reach for their public opinion surveys actually respond. Just in 1997, when Pew did this same methodology survey, they found that 36 percent of the people they were trying to reach actually responded. So one must question why the decline in willingness to respond? Why did those that agreed to respond agree? Most likely they felt strong enough about stating their opinion to do so, those with strong opinions may not actually be the voice of millions..but rather a slice of the small percentage of people asked to do a poll....resulting in only 9% of them willing to do the poll. See my point? Again, I took statistics as I am sure you have and am well aware of the overall "accuracy rate" and "expected +/- errancy" rates, but I still insist the results are suspect based on my above points.

But why? Do you think there are factors unaccounted for in the analysis?

Can you demonstrate - or even hypothesise - why willingness to respond would be correlated with any specific set of opinions or beliefs? Many questions nonetheless generate large numbers of moderate or even "don't know" answers - which contradicts your supposition.

(14-09-2015 02:49 PM)goodwithoutgod Wrote:  At my work (US NAVY) we do command wide surveys to ascertain if we have issues like fraternization, sexual harrassment, Equal Opportunity issues etc...as an Officer I get to sit down with the Captain as he shows the Wardroom the results so we can discuss and craft a plan of action to rectify the issue. What we find is a lot of those who bother to answer have agendas, issues, and personal problems. Those that don't answer can't be bothered because they dont have any issues to complain about. So we have to factor in that those that reply have a negative outlook, and do not represent the command as a whole. We still have to conduct training, and perform quiet investigations to all allegations, but with rare exception, this is the case. Those that provide answers do so because they have strong feelings, or an agenda. Thus, their input does not reflect the overall command as a whole....

Yes, but that's just the sort of thing actual surveys account for. I mean, "I just don't think polling firms know how to do their jobs" isn't a very good objection, is it? Really?

Though, there is also a clear difference between sampling bias and response bias. And ways to treat both.

(14-09-2015 02:49 PM)goodwithoutgod Wrote:  Perhaps Pew Research center says it best, "The accuracy of a poll depends on how it was conducted. Most of Pew Research’s polling is done by telephone. By contrast, most online polls that use participants who volunteer to take part do not have a proven record of accuracy. There are at least two reasons for this. One is that not everyone in the U.S. uses the internet, and those who do not are demographically different from the rest of the public. Another reason is that people who volunteer for polls may be different from other people in ways that could make the poll unrepresentative. At worst, online polls can be seriously biased if people who hold a particular point of view are more motivated to participate than those with a different point of view."

http://www.pewresearch.org/2010/12/29/ho...ine-polls/

Just a different perspective from a guy who overthinks things Drooling

A direct statement from Pew acknowledging the difficulties and uncertainties they know of and try to account for makes you... less confident in their results? They're pretty clear on what they believe to be potential difficulties.

What do you think would substantially affect the accuracy and precision of the results, above and beyond what is already known and accounted for?

... this is my signature!
[+] 2 users Like cjlr's post
14-09-2015, 04:59 PM
RE: Always good to see this in the news.....
(14-09-2015 04:02 PM)GirlyMan Wrote:  
(14-09-2015 01:13 PM)cjlr Wrote:  They absolutely - and provably - can.

There is room for methodological errors and bias, but population statistics are quite sound.

I think that the individual random variables have to be identically distributed before the central limit theorem can approximate any distribution with a Gaussian. Apparently they are because its predictive value is strong. And I think sampling is itself a difficult problem. I usually go to Nate Silver for my stats and polling but he identifies several other polling firms he respects.

Indeed, it's that sort of meta-analysis that rather demonstrates the validity. How often are results outside the low-confidence intervals of individual polls, let alone their aggregate?

... this is my signature!
14-09-2015, 05:09 PM (This post was last modified: 14-09-2015 05:17 PM by goodwithoutgod.)
RE: Always good to see this in the news.....
(14-09-2015 04:57 PM)cjlr Wrote:  
(14-09-2015 02:49 PM)goodwithoutgod Wrote:  Yeah I took statistics and am familiar with the concepts, but there are the variables like who gets polled for example. People who decide to participate tend to be more outspoken about their views, it is like comment cards at a business, most people who comment had an issue with the business, and if you read the comment cards and use that as a basis for the quality of the business you get a skewed view. people that are content with the business just pay and leave. People who participate in surveys wish to do so for a reason. Also how the question is presented and by whom can sway the results. But I agree that overall surveys and polls can give you a collective idea about how the majority may feel, but have never felt comfortable with the accuracy.

Are there quantifiable methodological factors you can point to, or is it just a matter of feels? What confounding factors, above and beyond what the polling firms recognize and account for, do you think might be being neglected?
(and do you mean the accuracy, or the precision?)

(14-09-2015 02:49 PM)goodwithoutgod Wrote:  Contributory factors like demographics, income, education, personal agendas, politics, religious views etc all can sway the answers which may not necessary be the litmus test for accuracy with broad statements like, "Majority of americans think blah blah blah"....no the majority of the 1400 people you polled, who had agreed to be available for polls said blah blah blah.

If you accept the validity of representative statistics and assume even basic competency on behalf of the polling agency, then I'm entirely unsure as to where the skepticism arises...

Mass media headlines never contain the nuance and methodology, which I did just say was annoying, to be sure, but that certainly doesn't mean that information isn't available.

(14-09-2015 02:49 PM)goodwithoutgod Wrote:  Pew Research did a study of their own studies recently. It was sort of a meta study, if you will. And they found that only nine percent of the people they tried to reach for their public opinion surveys actually respond. Just in 1997, when Pew did this same methodology survey, they found that 36 percent of the people they were trying to reach actually responded. So one must question why the decline in willingness to respond? Why did those that agreed to respond agree? Most likely they felt strong enough about stating their opinion to do so, those with strong opinions may not actually be the voice of millions..but rather a slice of the small percentage of people asked to do a poll....resulting in only 9% of them willing to do the poll. See my point? Again, I took statistics as I am sure you have and am well aware of the overall "accuracy rate" and "expected +/- errancy" rates, but I still insist the results are suspect based on my above points.

But why? Do you think there are factors unaccounted for in the analysis?

Can you demonstrate - or even hypothesise - why willingness to respond would be correlated with any specific set of opinions or beliefs? Many questions nonetheless generate large numbers of moderate or even "don't know" answers - which contradicts your supposition.

(14-09-2015 02:49 PM)goodwithoutgod Wrote:  At my work (US NAVY) we do command wide surveys to ascertain if we have issues like fraternization, sexual harrassment, Equal Opportunity issues etc...as an Officer I get to sit down with the Captain as he shows the Wardroom the results so we can discuss and craft a plan of action to rectify the issue. What we find is a lot of those who bother to answer have agendas, issues, and personal problems. Those that don't answer can't be bothered because they dont have any issues to complain about. So we have to factor in that those that reply have a negative outlook, and do not represent the command as a whole. We still have to conduct training, and perform quiet investigations to all allegations, but with rare exception, this is the case. Those that provide answers do so because they have strong feelings, or an agenda. Thus, their input does not reflect the overall command as a whole....

Yes, but that's just the sort of thing actual surveys account for. I mean, "I just don't think polling firms know how to do their jobs" isn't a very good objection, is it? Really?

Though, there is also a clear difference between sampling bias and response bias. And ways to treat both.

(14-09-2015 02:49 PM)goodwithoutgod Wrote:  Perhaps Pew Research center says it best, "The accuracy of a poll depends on how it was conducted. Most of Pew Research’s polling is done by telephone. By contrast, most online polls that use participants who volunteer to take part do not have a proven record of accuracy. There are at least two reasons for this. One is that not everyone in the U.S. uses the internet, and those who do not are demographically different from the rest of the public. Another reason is that people who volunteer for polls may be different from other people in ways that could make the poll unrepresentative. At worst, online polls can be seriously biased if people who hold a particular point of view are more motivated to participate than those with a different point of view."

http://www.pewresearch.org/2010/12/29/ho...ine-polls/

Just a different perspective from a guy who overthinks things Drooling

A direct statement from Pew acknowledging the difficulties and uncertainties they know of and try to account for makes you... less confident in their results? They're pretty clear on what they believe to be potential difficulties.

What do you think would substantially affect the accuracy and precision of the results, above and beyond what is already known and accounted for?

It wasn't my intention to suggest they are complete bullshit, as they aren't. I also understand they are overall usually indicative of average opinion. I'm too tired and disinterested to enter a huge statistics debate. My point is that I have always (just my opinion, and we all know what that means...zip) cast a skeptical eye at how polls and surveys are presented as "the majority of Americans think blah blah blah," when I am smart enough to realize most people don't respond to polls and surveys. Those who do, I would suspect, have specific personality traits, or are more apt to take the time to make their opinion known, and may even have similar outlooks to other responders...thus the data is potentially skewed, because it is an indication of who bothered to respond, not a slice of every demographic grouping across every state....I am a numbers guy, I love numbers, but I give polls and surveys a sideways look, as I don't always trust their results...too many factors come into play....just my humble and skeptical opinion Big Grin

The link you provided was very helpful. This is not my area of expertise by ANY stretch of the imagination, I was merely proffering my opinion in these matters as a layman with a couple IQ points rattling around in my tired old brain. I found this part of your link interesting:

"A majority of Pew Research Center surveys are conducted among the U.S. general public by telephone using a sampling method known as random digit dialing or “RDD.” This method ensures that all telephone numbers in the U.S – whether landline or cellphone – have a known chance of being included. As a result, samples based on RDD should be unbiased, and a margin of sampling error and a confidence level can be computed for them."

Again, the problem for me comes down to WHO accepts that random call and sits down for a phone survey. I would submit to you that while those who respond could cover a wide spectrum of demographics, my suspicion is that people with similar personality traits, and perhaps worldview commonalities, are the ones who would sit down to answer the poll/survey....and thus the data is NOT representative of average American opinion, but representative of those who chose to answer....does that make any sense? Perhaps not; I do tend to wander down paths sometimes lol.
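For what it's worth, the "who picks up the phone" problem is exactly what post-stratification weighting is aimed at: if the trait that drives response is measurable (age, region, education), respondents can be reweighted to known population shares. A toy sketch with entirely invented numbers; note it only fixes bias along the measured trait, which is the honest limit of the technique:

```python
# Known population shares (e.g. from the census) for a single trait.
pop_share   = {"young": 0.40, "old": 0.60}
# Who actually answered the phone (young people responded far less here).
resp_counts = {"young": 200, "old": 800}
# Opinion within each group.
yes_rate    = {"young": 0.70, "old": 0.30}

n_resp = sum(resp_counts.values())

# Naive estimate: just average over whoever responded.
naive = sum(resp_counts[g] * yes_rate[g] for g in pop_share) / n_resp

# Weighted estimate: reweight each group's answers to its census share.
weighted = sum(pop_share[g] * yes_rate[g] for g in pop_share)

print(f"naive: {naive:.1%}   weighted: {weighted:.1%}")
```

The naive figure (38%) understates the true 46% because the young are underrepresented among respondents; weighting recovers it. If the relevant trait were an unmeasured "personality type", weighting could not correct for it, which is the residual worry raised above.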

DISCLAIMER: Again, NOT my area of expertise so don't skewer me to the wall with facts Tongue

"Belief is so often the death of reason" - Qyburn, Game of Thrones

"The Christian community continues to exist because the conclusions of the critical study of the Bible are largely withheld from them." -Hans Conzelmann (1915-1989)
[+] 1 user Likes goodwithoutgod's post
14-09-2015, 06:22 PM
RE: Always good to see this in the news.....
(14-09-2015 05:09 PM)goodwithoutgod Wrote:  Wasn't my intention to suggest they are complete bullshit, as they aren't. I also comprehend they are "overall" usually indicative of the average opinion. Too tired and disinterested to enter a huge statistics debate, my point is I have always (just my opinion, and we all know what that means...zip) cast a skeptical eye towards how they present polls and surveys as "majority of Americans think blah blah blah" when I am smart enough to realize most people dont respond to polls and surveys, and those that do I would suspect have specific personality traits, or more apt to take the time to make their opinion known, may even have similar outlooks as other responders...thus the data is potentially skewed because it is an indication of who bothered to respond, not a slice of every demographic grouping across every state....I am a numbers guy, I love numbers, but I just give polls and surveys a sideways looks as I dont always trust their results...too many factors come into play....just my humble and skeptical opinion Big Grin

But there is a world of difference between,
"57% of Americans believe that..." and
"57% of Americans, based on a representative phone sample of 30,000, accurate to within 4% 19 times out of 20, and ignoring responses of 'don't know', believe that..."
Right?
(headlines might give the former; competent pollsters, such as Pew based on their record, will always publish the latter, if not necessarily in a press release)

The points you raise are all valid and all potential errors. But you're not the first person to worry about how to compensate for them!
Tongue

(14-09-2015 05:09 PM)goodwithoutgod Wrote:  The link you provided was very helpful. This is not my area of expertise by ANY stretch of the imagination, I was merely proffering my opinion in these matters as a layman with a couple IQ points rattling around in my tired old brain. I found this part of your link interesting:

"A majority of Pew Research Center surveys are conducted among the U.S. general public by telephone using a sampling method known as random digit dialing or “RDD.” This method ensures that all telephone numbers in the U.S – whether landline or cellphone – have a known chance of being included. As a result, samples based on RDD should be unbiased, and a margin of sampling error and a confidence level can be computed for them."

Again, the problem for me comes to WHO accepts that random call, and sits down for a phone survey? I would submit to you while those that respond could cover a wide spectrum of demographics, my suspicion is people with similar personality traits, and perhaps worldview commonalities would sit down to answer the poll/survey....and thus the data is NOT representative of average american opinion, but representative of those who chose to answer....does that make any sense? perhaps not, I do tend to wander down paths sometimes lol.

DISCLAIMER: Again, NOT my area of expertise so don't skewer me to the wall with facts Tongue

In the end the only thing to be done is to try to compare survey results to more "empirical" measures. This may be easier or harder, depending on the topic...

Political inclination and affiliation, at least, are generally easy to gauge, because every couple years people vote based on them. Trivially - and ignoring that turnout, and correlations between belief, demographics, and turnout, also exist and are robustly investigated on their own terms - we can compare opinion polling with the actual election results. And by and large any large polling agency is within their margin of error for almost all elections, especially when aggregated. Exceptions tend to be where polling isn't available in crucial periods - take the most recent UK general election, or past provincial elections in BC or Alberta up here; if the election is on the 20th (made up date) and the last in-field rolling poll ended the 16th, there could be swings outside the margin of error within that timeframe. But then, that's more a matter of the difficulties of turning snapshots into predictions.

I apologise for being kinda snarky. I'm hardly a statistician by trade, but it rubs me the wrong way to see the field accused of neglecting precisely the problems it is essentially peoples' full time job to account for.

... this is my signature!
[+] 3 users Like cjlr's post
14-09-2015, 07:39 PM
RE: Always good to see this in the news.....
(14-09-2015 02:32 PM)cjlr Wrote:  ...
most people don't understand statistics anyway...

Most?

What was your sample size?

Shocking

[+] 2 users Like DLJ's post