False Arguments Against Evidence of Vote Fraud


By James D. Agresti
July 10, 2017

In the wake of a new study by Just Facts that found 594,000 to 5.7 million non-citizens illegally voted in the 2008 presidential election, several publications—namely Snopes, PolitiFact, and the Huffington Post—have claimed the study is wrong. These conclusions are rooted in falsehoods that are exposed below.

Sample Sizes

Just Facts’ study is primarily based on a 2014 paper in the journal Electoral Studies, which was authored by two professors and a researcher from Old Dominion University. These individuals analyzed voting and registration data on 339 self-declared non-citizens among 32,800 people who participated in a 2008 Harvard/YouGov survey.

The “fact checking” organization Snopes argues that this peer-reviewed paper and Just Facts’ study are “false,” mainly because the number of non-citizens in the survey is too small to provide meaningful results. In the words of Snopes:

  • “These numbers rest on the assumption that a subset of 38 (possible) non-citizen votes out of 339 non-citizens can be used to extrapolate countrywide voting behavior.”
  • “If extrapolating to a number based from Internet survey response data from a pool of 339 non-citizens into the millions sounds problematic to you, you are not alone.”
  • “The JustFacts.com post also does very little to address the fact that the … non-citizen dataset was so limited.”
  • Just Facts’ study is “based on applying broad estimates of behavior from an exceedingly small subpopulation.”

PolitiFact’s analysis of this issue makes a similar argument, declaring that Just Facts’ study:

  • uses “39 respondents out of 32,800 people who are now being used to extrapolate millions of illegal voters.”
  • “is based on an extrapolation of a controversial study that relied on a very small number of responses.”

Both Snopes and PolitiFact are making a simplistic math error, for as the textbook Statistics for K–8 Educators explains:

It is remarkable that the margin of error when you estimate a percentage depends only on p [the proportion of people who answer a poll question in a certain way] and n [the number of respondents]. This explains why a national random sample of 1,000 people can accurately represent 200 million people. The p from the sample of 1,000 people is not likely to be more than 3% off from what you would find if you did include all 200 million.

Likewise, the textbook Statistics: Concepts and Controversies states:

The variability of a statistic from a random sample does not depend on the size of the population as long as the population is at least 100 times larger than the sample.

Why does the size of the population have little influence on the behavior of statistics from random samples? Imagine sampling harvested corn by thrusting a scoop into a lot of corn kernels. The scoop doesn’t know whether it is surrounded by a bag of corn or by an entire truckload. As long as the corn is well mixed (so that the scoop selects a random sample), the variability of the result depends only on the size of the scoop.

To be specific, for a sample of 339 people, the difference in the margin of sampling error (with at least 95% confidence) for a population of 2,000 people versus 20 million people is less than a single percentage point (±4.8 versus ±5.3).

Just Facts’ study also includes the margin of sampling error for a group of 140 non-citizens who appear in a database of consumer and voting data. With this sample size, the difference in sampling error for a population of 2,000 people versus 20 million people is under half a percentage point (±8.0 versus ±8.3).

To provide added confidence, Just Facts uses a conservative formula to calculate the sampling error, which yields at least 95% confidence in the results.
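For readers who want to check the arithmetic, the sketch below applies the standard conservative margin-of-error formula (z = 1.96 with an assumed 50% proportion) together with the usual finite-population correction. It is a textbook illustration, not Just Facts’ actual code, and small rounding differences from the figures above are to be expected.

```python
import math

def conservative_moe(n, population=None, z=1.96):
    """Conservative margin of error (percentage points) for a sample of size n.

    Assumes p = 0.5, which maximizes p*(1-p), so the interval carries at least
    95% confidence for any true proportion. If a finite population size is
    given, the standard finite-population correction is applied.
    """
    moe = z * math.sqrt(0.25 / n)  # equivalent to 0.98 / sqrt(n)
    if population is not None:
        moe *= math.sqrt((population - n) / (population - 1))
    return 100 * moe

# Sample of 339 non-citizens, population of 2,000 vs. 20 million:
print(conservative_moe(339, 2_000))        # ~4.85 points (the ±4.8 cited above)
print(conservative_moe(339, 20_000_000))   # ~5.32 points (the ±5.3 cited above)

# Sample of 140 non-citizens in the consumer/voting database:
print(conservative_moe(140, 2_000))        # ~7.99 points (±8.0)
print(conservative_moe(140, 20_000_000))   # ~8.28 points (±8.3)
```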

PolitiFact and Snopes also misconstrue the implications of the number of voters in the survey. For a given sample size and margin of error, confidence in the results actually increases when the portion of people who answer a poll question in a certain way is well below or above 50%, as is the case with this poll data. Per the textbook Mind on Statistics, in such instances the “conservative approximation of the margin of error … usually overestimates the actual size of the 95% margin of error, and thus leads to an underestimate of how confident we can be that our interval estimate covers the truth.”

In this sample of 339 non-citizens, the ±5.3 sampling error shown above would have 95% confidence if the data indicated that 170 people voted, but since it indicates that 38 people voted, it has 99.8% confidence. Yet, Snopes and PolitiFact mislead their readers to believe that 38 voters in the survey makes the data pointless.
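The sketch below reproduces that reasoning under the normal approximation: it computes the confidence that a ±5.3 percentage-point interval covers the true proportion, first for a hypothetical 50% result and then for the observed 38 of 339. It illustrates the textbook principle rather than the study’s own calculation.

```python
import math
from statistics import NormalDist

def confidence_within_margin(successes, n, margin):
    """Approximate two-sided confidence that the true proportion lies
    within +/- margin of the observed proportion (normal approximation)."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return 2 * NormalDist().cdf(margin / se) - 1

n, margin = 339, 0.053  # sample size; ±5.3 percentage points

# If 170 of 339 (about 50%) had indicated voting, ±5.3 points is roughly the 95% interval:
print(f"{confidence_within_margin(170, n, margin):.3f}")  # ≈ 0.95

# With 38 of 339 (about 11%) indicating they voted, the same interval carries far more confidence:
print(f"{confidence_within_margin(38, n, margin):.3f}")   # ≈ 0.998
```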

In accord with scientific research methods, Just Facts fully accounts for the sampling error in its study, and this is why the results span a wide range of 594,000 to 5.7 million illegal voters.

Snopes declares that the “absurdly large” range of results in the studies by Just Facts and Electoral Studies is evidence of “problems,” but this is the polar opposite of the truth. Honest researchers convey such ranges, and they are a mark of integrity, not problems.

Given the poor math skills of many adults, the arguments of Snopes and PolitiFact may sound convincing to some people, but these so-called fact checkers are using mathematically illiterate notions instead of concrete, quantifiable facts.

Representative Samples

In order to draw sound conclusions from a poll, the respondents must accurately represent the population of interest, which in this case, is non-citizens. Another way of saying this is that the survey should provide a random, unbiased sample of non-citizens. Per a book about polling published by Pennsylvania State University Press, “Scientific polls use sampling procedures where random samples are used, that is, where each individual in the group has an equal chance of being selected into the sample, or where some variation on this pattern is used to account” for any differences.

PolitiFact’s article uncritically parrots Brian Schaffner of UMass Amherst, the co-principal investigator of the Harvard/YouGov survey, who claims that the Old Dominion scholars “assume this is a random sample of all noncitizens in the country, which it probably isn’t.” Without additional context (which PolitiFact fails to provide), that statement is deceptive. This is because it ignores what the Old Dominion scholars state in their 2014 paper:

It is impossible to tell for certain whether the non-citizens who responded to the survey were representative of the broader population of non-citizens, but some clues can be gained by examining education levels. … We confront this issue primarily by weighting the data. … Weighting produces a non-citizen sample that appears to be a better match with Census estimates of the population.

Weighting is a common method of adjusting survey data to make it representative of a population. As explained in the academic book Designing and Conducting Survey Research: A Comprehensive Guide, weighting “is one of the most common approaches” that researchers use to “present results that are representative of the target population….”

The book goes on to explain that weighting is far from foolproof, and both Just Facts and the Old Dominion scholars state this in their studies. Nonetheless, weighting is a generally accepted means of making polling data representative, and PolitiFact’s omission of this fact is misleading given its quote of Schaffner.
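As a rough illustration of how weighting works, the sketch below post-stratifies a sample on a single variable (education) so that each group counts in proportion to its population share. The counts and shares are hypothetical, and the actual Harvard/YouGov weighting scheme is more elaborate.

```python
# Hypothetical post-stratification on one variable (education).
# These counts and population shares are illustrative, not the survey's actual figures.
sample = [
    # (education level, respondents in sample, respondents saying they are registered)
    ("no_college",   120, 22),
    ("some_college", 150, 18),
    ("college_grad",  69,  8),
]
population_share = {"no_college": 0.55, "some_college": 0.30, "college_grad": 0.15}

n_total = sum(n for _, n, _ in sample)                # 339 respondents

weighted_registered = 0.0
for level, n, registered in sample:
    weight = population_share[level] / (n / n_total)  # up- or down-weight each group
    weighted_registered += weight * registered

unweighted_rate = sum(r for _, _, r in sample) / n_total
weighted_rate = weighted_registered / n_total         # weights sum to n_total by construction
print(f"unweighted: {unweighted_rate:.1%}, weighted: {weighted_rate:.1%}")
```

Because each group’s weight is its population share divided by its sample share, groups the survey over-represents count for less and under-represented groups count for more, pulling the estimate toward what a representative sample would show.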

Furthermore, the accuracy of this weighted survey data is corroborated by a scientific bilingual poll of 800 Latinos conducted in 2013. In this survey, 264 respondents stated they were non-citizens, and 13% of these non-citizens said they were registered to vote. These results have a margin of sampling error of ±6 percentage points with at least 95% confidence. Similarly, the weighted 2008 YouGov/Harvard data shows that 15% ±5 percentage points of non-citizens stated they were registered to vote. In short, these polls support one another.
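Applying the same conservative formula to both polls shows the sense in which they support one another: their confidence intervals overlap substantially. The margins below are recomputed for illustration; the studies’ own reported margins may be rounded slightly differently.

```python
import math

def moe_pct(n):
    # Conservative 95% margin of error in percentage points: 1.96 * sqrt(0.25 / n)
    return 100 * 1.96 * math.sqrt(0.25 / n)

polls = [
    ("2013 bilingual Latino poll", 13.0, moe_pct(264)),       # ~±6.0 points
    ("2008 Harvard/YouGov (weighted)", 15.0, moe_pct(339)),   # ~±5.3 points
]
for name, p, moe in polls:
    print(f"{name}: {p - moe:.1f}% to {p + moe:.1f}% registered")
# The intervals (roughly 7%–19% and 9.7%–20.3%) overlap heavily.
```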

PolitiFact’s point also rings with hypocrisy, because PolitiFact repeatedly cites an article in the journal Eos that purports to “assess the scientific consensus on climate change through an unbiased survey of a large and broad group of Earth scientists.” In reality, this study is based on an unweighted internet survey with a 31% response rate. Thus, it can hardly be described as unbiased or representative, but PolitiFact cites it without mentioning these caveats.

Misrepresentations and Assumptions

Compounding the misinformation, Snopes reports that a December 2016 article from Just Facts “serves mainly as an effort to debunk the claim made by Schaffner and his colleagues in their 2015 paper that ‘zero’ non-citizen votes were cast in the 2007 [sic, 2008] presidential election.” Snopes then quotes Schaffner:

What we are saying […] is that once you account for measurement, the best estimate of the number of non-citizen voters is zero. That doesn’t mean we actually think there are zero non-citizen voters.

This is a strawman, as Just Facts never made such an argument. The article in question analyzes the research of the Old Dominion scholars and their critics, and as a part of this, debunks Schaffner and his coauthors’ explicit claim that “the rate of non-citizen voting in the United States is likely 0.” The key words are “likely” and “rate,” meaning probably 0%, not definitely zero. At no point does Just Facts state or imply otherwise.

Instead, the article details how Schaffner and his coauthors arrive at their conclusion by using unrealistic assumptions. To summarize, they assume that poll respondents who:

  • said “I definitely voted” and specifically identified the person they voted for (mostly Obama) actually did not vote unless a database of consumer and voting data shows they voted—even though:
    • nearly all illegal immigrants use false identifications to interact with government.
    • about 60% of non-citizens in this survey are not in this database.
    • a 2016 paper in the journal Public Opinion Quarterly found that “several apparently viable methods of matching survey respondents to government records severely underestimate the proportion of Americans who were registered to vote.”
  • are shown by voting records to have voted actually did not vote if they denied being registered to vote—even though:
    • Schaffner and company treat this database as the only valid evidence of voting throughout the rest of their paper.
    • non-citizens have incentive to lie about being registered to vote, because this is an illegal act that is punishable by imprisonment, fines, and/or deportation.
  • said they were non-citizens in one year of the survey but said they were citizens in another year actually are citizens—even though:
    • illegal immigrants have incentive to lie about their immigration status to avoid expulsion from the U.S.
    • a 2013 study published in the journal Demographic Research found that certain non-citizens—including Mexican men of all ages, Mexican women aged 40 and older, and immigrants who have been in the U.S. for less than five years—frequently misrepresent themselves as citizens.
    • in their original 2014 paper and two follow-up working papers, the Old Dominion scholars detail four lines of evidence indicating that most survey respondents who identified themselves as non-citizens and said they voted or appeared in the voting database were, in fact, non-citizens.

Worse still, Schaffner and his colleagues employ these irrational assumptions without even spelling them out, much less proving they are warranted. This is in direct contradiction to the RAND Corporation’s Standards for High-Quality Research and Analysis, which require that assumptions be “explicit and justified.”

Likewise, the Handbook of Social Research Ethics states, “Basic and often unexamined assumptions” have the effect of “blinding the reader to alternative interpretations.” Therefore, “a fully ethical presentation of scientific data requires an understanding on the part of the readers and the author alike of the assumptions that gird the research and findings.”

In contrast to Schaffner and company, the Old Dominion scholars and Just Facts itemize the assumptions in their studies, and both document why some of these assumptions are justified.

Turning reality on its head, Schaffner tells reporter Sam Levine of the Huffington Post that Just Facts and the Old Dominion scholars are guilty of “ignoring measurement error” on the citizenship question in the survey. Actually, Just Facts addresses this issue in its study, and the Old Dominion scholars devote 1,300+ words to it in their original 2014 paper. Yet, Schaffner and his colleagues authored a rebuttal to that paper that completely ignores the fact that the Old Dominion scholars already tackled their primary argument. Furthermore, the two follow-up working papers of the Old Dominion scholars are almost exclusively about this issue.

Nonetheless, Schaffner repeats this canard twice to the Huffington Post and then attempts to refute Just Facts by saying:

In addition to ignoring the major issue with the original study, they also claim that we should take any supposed non-citizen at their word if they claim to have voted even if we can’t match them to a vote record because they probably used a fraudulent identity. However, the issue here is why would a non-citizen who is going through the trouble of using a fraudulent identity to vote then admit to voting in a survey and give us their actual name and address?

The obvious answer to Schaffner’s question is that non-citizens don’t need false identification to take an internet poll, but they do need false identification to register to vote. And again, virtually all illegal immigrants use fraudulent identities to interact with government.

Such identity fraud is also evidenced by the fact that about 60% of non-citizens in this survey don’t appear in the database of consumer and voting data, even though Schaffner wrote in 2015 that this database contains “information for nearly every American adult.”

Moreover, even if Just Facts disregarded all non-citizens who said they voted and limited its analysis only to votes confirmed by voting records, the data would show that 590,000 to 3.9 million non-citizens voted in the 2008 election.

Twisting Words

Snopes declares that the 2014 study by the Old Dominion scholars provides “material evidence for five (not even six!) non-citizen votes” in 2008. In truth, the study provides material evidence for 38 non-citizen votes, including 27 respondents who said “I definitely voted” and 16 respondents shown by voting records to have voted. The overlap between these groups is five people, which leaves 38 votes (27 + 16 – 5).

However, Snopes invents its own definition of material evidence, which is limited only to the overlap group. This is at odds with the actual definition of material evidence, which is “evidence that is likely to affect the determination of a matter or issue” (Merriam-Webster’s Dictionary of Law, 1996).

Also, Snopes, PolitiFact, and the Huffington Post repeatedly warn that Just Facts’ study is based on an “extrapolation.” In reality, the study is based on a straightforward application of survey data—and these same organizations along with every major media outlet routinely cite similar figures without calling them “extrapolations.”

Because it is often impractical to collect information from every person in the United States, governments frequently obtain key data through surveys. This is true of official government data on crime, education, employment, the economy, and an enormous array of Census data. These surveys typically use much larger samples than the one used by Just Facts, and thus, the margins of sampling error are smaller, but the basic method and principle are the same: survey a representative sample and apply the results to the U.S. population.
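To make the basic method concrete, the sketch below applies a sample proportion and its margin of error to a population. Every input is a placeholder; the actual study weights the data, distinguishes registration from voting, and draws on several data sources, so its 594,000-to-5.7-million range does not come from this simple calculation.

```python
import math

# Illustrative projection of a survey proportion onto a population.
# All inputs are placeholders, not the actual figures used in Just Facts' study.
population = 20_000_000        # hypothetical adult non-citizen population
n, voted = 339, 38             # sample size; respondents indicating they voted

p = voted / n
moe = 1.96 * math.sqrt(0.25 / n)   # conservative 95% margin of error

low = max(p - moe, 0.0) * population
high = min(p + moe, 1.0) * population
print(f"best estimate: {p * population:,.0f} voters")
print(f"95% range:     {low:,.0f} to {high:,.0f} voters")
```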

An extrapolation, on the other hand, is estimating “the value (of a function or quantity) outside a range in which some values are known” (American Heritage Dictionary of Science, 1986). This conveys a form of uncertainty that is not present in Just Facts’ study. One could broadly parse this definition to include Just Facts’ study, but then it would also apply to reams of government data that are rarely (if ever) called extrapolations.

The Old Dominion scholars use the word “extrapolation” to refer to a calculation they perform to find a “best estimate” of illegal voting, and this word is appropriate here, because the calculation does not account for sampling error. Thus, it stretches the accuracy of this particular data beyond what is known. However, the upper and lower bounds of their findings do account for sampling error, and Just Facts accounts for sampling error at every step of the way. This is one of the strengths of Just Facts’ study.

Too Complex for Snopes

After misrepresenting several aspects of this issue, Snopes criticizes the Washington Times for reporting that Just Facts’ study involves “a series of complicated calculations,” calling this a “surefire way to make it sound like something carries authority without actually understanding any aspect of the topic.” Snopes continues:

Outside of the fact that these calculations are found in the 1,010th footnote of the JustFacts.com report, the calculations (shown below) don’t involve much more complicated mathematics than multiplication, subtraction, and addition (no division, thankfully).

The remark about the 1,010th footnote is baffling, because it proves nothing except that the research contains a lot of documentation. Incidentally, this research covers broad aspects of immigration, not just illegal voting by non-citizens.

Claims that the calculations appear in a single footnote and don’t involve much more than “multiplication, subtraction, and addition” are false. The calculations appear in multiple footnotes and also involve division and square roots. More importantly, it is not the math that is complex but the conceptual framework for the calculations, including everything discussed above and more.

Discrediting the Messenger

Throughout its analysis, Snopes portrays Just Facts as a mere “web site.” In reality, Just Facts is a research and educational institute that has been widely cited by major media outlets, university professors, government entities at local, state and national levels, peer-reviewed journals, and think-tanks from across the political spectrum.

The Huffington Post labels Just Facts as a “conservative think tank,” and PolitiFact calls Just Facts “conservative/libertarian” twice. These statements are untrue. Just Facts is a non-ideological, independent organization that has published hundreds of facts that are indifferent or challenging to conservative and libertarian views. In the spirit of transparency, Just Facts states that its staff and board members are generally conservative/libertarian but emphasizes that “we do not favor facts that support our viewpoints.”

In stark contrast to their partisan mislabeling of Just Facts, Snopes, the Huffington Post, and PolitiFact rely heavily on the claims of Schaffner while failing to reveal that he donated to Hillary Clinton’s 2016 presidential campaign and also to America Coming Together, a liberal organization “heavily funded by billionaire George Soros.”

PolitiFact summons other scholars to contest Just Facts while neglecting to mention that two of them donated to Barack Obama, another donated to the Democratic Congressional Campaign Committee, and another wrote an op-ed warning Millennials not to vote for third-party candidates because this may help Donald Trump win the election.

Summary

Using textbook scientific procedures, Just Facts’ study of survey data and voting records finds that 594,000 to 5.7 million non-citizens voted illegally in the 2008 presidential election. These results are subject to uncertainties that Just Facts candidly enumerates.

Attempts to refute this study by Snopes, PolitiFact, and the Huffington Post are littered with sophomoric inconsistencies, irrationalities, and outright falsehoods. And unless these organizations are misreporting what UMass Amherst professor Brian Schaffner says, he is using them to spread misinformation that happens to align with his political donations.

Comments

  • July 10, 2017 at 2:49 PM

    As far as I am concerned, ONE illegal vote is too many. And we know without a doubt that there are many more than simply ONE!! There can be no valid excuse for not, at the very least, investigating the phenomenon!!!

  • July 10, 2017 at 7:38 PM

    Great refutation!

  • July 10, 2017 at 10:47 PM

    I’m not sure how, or even IF, this could be done, but it would be interesting to investigate the possibility of whether illegal votes had a material effect on the actual Electoral College votes in any of the past few presidential elections.
    In other words, is there a possibility that enough illegal votes were counted in a state that swung that state for a candidate that would not otherwise have won that state?

    • August 7, 2017 at 1:47 PM

      Not only does illegal aliens’ voting affect the Electoral College, but even if they don’t vote, they are counted as residents, increasing the number of electoral votes a state has.

  • July 11, 2017 at 3:37 PM

    Sad that snopes and the rest of these liberal pontificators are so sold out to their political ideology, that they are willing to sacrifice their reputations and integrity, to refute facts from studies that don’t align with those ideologies.

    It’s just more rainbows and unicorns while the country burns down around these bubble heads. God help us!

  • July 11, 2017 at 5:08 PM

    Please accept the gratitude of many of us citizens who know intuitively that the allegations of illegal aliens voting in our elections are fact. We know, because we see many of the other underhanded ways in which illegals conduct business, and because survival is an instinct that justifies nearly any action in the minds of those who feel threatened.
    Grateful also for the exposure of Snopes’ bias, along with that of other so-called “disinterested” or “unbiased” fact-checker sites, which clearly are not at all true self-descriptions. We must refute the efforts of those who manipulate media to confuse or deceive Americans with every resource available.

  • July 11, 2017 at 10:06 PM

    An activist friend of mine tells me that about 10% of the people selected for jury duty decline to serve because they are not citizens. To serve, they must sign a statement stating, under oath, that they are citizens. Their names were selected from the voter rolls.

    • July 12, 2017 at 3:17 PM

      Robb K, I don’t know what state you live in, but in Arizona, juror names are also taken from the driver’s license rolls, and one need not be a citizen to legally obtain a driver’s license. A driver’s license is only permission to operate a motor vehicle on public roads, and any legal resident can obtain that. I suspect that many non-legal residents obtain that with fraudulent documentation as well.
      Using the driver’s license rolls was done because it was believed that some citizens were not registering to vote in order to avoid jury duty.

      • December 18, 2017 at 5:27 PM

        Motor voter laws: in many states, when you get a D.L. you are placed on the voter rolls. If you’re a non-citizen it’s not supposed to happen; unfortunately, in many cases it does.

  • July 13, 2017 at 3:31 PM

    Thank you so much for this timely, scholarly article. There are far too many ignorant, naïve people in our country who will never undertake anything that involves in-depth research or thought. The masses are so easily swayed by popular, and convenient, pseudo-sources such as Snopes. We truly are in the fight of our lives, and America is undergoing a COUP right before our eyes. I’m afraid that our society, as we have come to know it, is rapidly giving way to media-driven, corrupt, bought-and-paid-for demagogues, and hapless, brainwashed students who appear to be taking control. You are the “voice of one crying in the wilderness,” but never doubt that there are many more thinking, patriotic, Americans who are alive and well! I, for one, will NOT go down quietly. As Jefferson once said, ” One man with courage is a majority!” Keep the faith!

    • July 14, 2017 at 12:01 AM

      Thank you for your kind words.

  • August 19, 2017 at 1:31 PM

    Thank God and the Just Facts organization for accurately determining truth. There are so many that just write their opinions or simply search for internet ideas that support their present beliefs instead of actually looking for facts!
    It used to be that the media reported the NEWS, not their own VIEWS! And when they did report their own views, it was made clear that this was the opinion of one man or woman. Eric Sevareid, on CBS in the 1960s and ’70s, would be a prime example of that, I believe. Now the media simply state their mix of opinion and news and never declare what part is news and what part is opinion!

  • December 31, 2017 at 9:37 AM

    This is the number one issue for me. Under the Constitution, if my vote is not counted or is cancelled out by an illegal vote, then to me that election is UNCONSTITUTIONAL.

  • July 1, 2018 at 2:34 AM

    After Investors Business Daily published an op-ed citing the paper in Electoral Studies, my attention was called to a piece from the Brennan Center. One comment made about the Electoral Studies paper was that it was based on a survey where a number of respondents mistakenly identified themselves as noncitizens. If this is true, and if the false positive rate on survey answers is as high as claimed, arguments regarding population size and representativeness of the sample become somewhat moot.
    Link: https://www.brennancenter.org/publication/truth-about-voter-fraud

  • August 21, 2018 at 3:28 PM

    If only 339 people took part in this survey and less than 40 actually did what they claimed (i.e. voted), shouldn’t this be pretty easy to verify? Voting is public record and I would like verification rather than arguing over the validity of possible validity (if that makes sense). In other words, these sweeping claims do not have a lot of merit unless proven true, and for something this important, it should be more than “your” words or any researcher’s for that matter. What state allowed this? Have they addressed it? To say that 8% of all illegals vote is not a claim that should be made without more proof.
