Abstract

Policymakers may wish to take public opinion on climate change into account as they craft legislation, but if public opinion shifts substantially in response to seemingly trivial changes in survey questionnaire design, such reliance would perhaps be unwise. This paper examines 110 experiments implemented in surveys of truly random samples of American adults between 2012 and 2018 (N = 4414), exploring the extent to which answers to questions were influenced by order and wording manipulations. Of 144 tests, 31 (22%) yielded statistically significant effects. Adjustments for multiple hypothesis testing reduced this percentage to between 7 and 9%, and the effect sizes were routinely small. These results are consistent with the conclusion that survey results on climate change issues are relatively robust, so policymakers can take them seriously if they wish to do so.