Place; date:
Milan, Italy, and Ljubljana, Slovenia; 2016
Funded from the authors' own resources.
no information
Keywords ADP: references relating to online panels, selection of studies on online panels, online panels in survey methodology, systematic review of online panel research
Keywords ELSST:
METHODOLOGY
Topic Classification CESSDA
OTHER
Topic Classification CERIF
Research methodology in science
Topic Classification ADP
REFERENCE
PANEL
PANEL DATA QUALITY
STUDY
PANEL RESEARCH QUESTIONS
The study is a systematic review whose aim is to assess (i) the characteristics of online panels used in survey methodology, (ii) the quality of these online panels, (iii) the characteristics of individual panel studies, and (iv) the use of online panels as a sample source for research on survey data quality. Empirical studies in the field of survey methodology relating to online panels were selected from the WebSM bibliographic database. To address the research questions, specific coding categories were defined to describe the characteristics of the online panels as well as the purposes for which the panels were used. In line with the aims of the study, various aspects of panel data quality were coded for studies on the quality of the panels themselves, and methodological questions were coded for studies that use panels as a sample source for survey methodology research.
Collection date: | 6 October 2016, 8 November 2016 |
---|---|
Date of production: | 2016 |
Country: | no information |
Geographic coverage: | no information |
Unit of analysis: |
Other: reference; online panel; individual online panel study |
Universe: |
All articles and book chapters recorded in the WebSM bibliographic database that were published between January 2012 and June 2016, are written in English, and contain at least one of the following keywords in the title: panel, probability, non-probability, nonprobability, weight, result, and representativeness. |
Excluded: | no information |
Data collected by: |
Respi, Chiara |
Sampling procedure: |
Entire population covered. References were selected from the WebSM bibliographic database and organised into three newly created data files containing the coded characteristics of the online panels reported in the retrieved references. In the WebSM database, the general inclusion criteria were entered into the search first: article / book chapter, published from January 2012 to June 2016, and full text in English. This yielded a collection of 847 recently published references in the field of web survey methodology. Nine further references were added to the collection: six chapters from the book by Das et al. (2011), which does not meet the publication-date criterion but whose chapters deal explicitly with online panels, and three published articles identified through citations in the retrieved references or through an internet search. The final number of identified references was thus 856. After removing 4 duplicate records (from the WebSM selection), a specific inclusion criterion was applied: the title of the reference had to contain at least one of the seven keywords panel, probability, non-probability, nonprobability, weight, result, and representativeness (a sketch of this title-screening step follows the table below). This criterion led to the exclusion of 762 references. In the final step, the remaining 90 references were checked for eligibility by assessing the abstract and the full text. The abstract was reviewed first to establish whether the study addressed the chosen topic; if it did not, the reference was excluded. The full text was then reviewed and a decision on inclusion or exclusion was made. The main reasons for ineligibility were web surveys conducted without panel members, the use of non-web panels, and theoretical contributions that examined online panels without presenting empirical evidence on their use and quality. Sixteen references were excluded, and 74 references were included in the systematic review. |
Mode of data collection: |
Content coding |
Weighting: |
No weighting. |
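The title-keyword screening step described under "Sampling procedure" above can be illustrated with a short script. This is a minimal sketch only: the file name websm_export.csv, the column name title, and the English keyword stems are assumptions for illustration and are not part of the archived materials.

```python
import csv

# Assumed English stems of the seven title keywords listed above (panel,
# probability, non-probability, nonprobability, weight, result,
# representativeness); the exact search strings used in the review may differ.
KEYWORDS = ("panel", "probability", "non-probability", "nonprobability",
            "weight", "result", "representativeness")

def title_matches(title: str) -> bool:
    """True if the title contains at least one of the keywords."""
    t = title.lower()
    return any(k in t for k in KEYWORDS)

def screen_titles(path: str):
    """Split an exported reference list into retained records and an exclusion count."""
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    retained = [r for r in rows if title_matches(r["title"])]
    return retained, len(rows) - len(retained)

# Example with a hypothetical export file:
#   retained, n_excluded = screen_titles("websm_export.csv")
# In the review itself this step excluded 762 of the 852 de-duplicated
# references, leaving 90 for abstract and full-text screening.
```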
The data file is available to users without registration under the international Creative Commons CC0 licence. Copyright reserved. The Archive releases the data to users only for a purpose they specifically state, subject to compliance with professional codes of ethics. The user undertakes to protect the confidentiality of the data, to carry out analyses without attempting to identify individuals, and to observe professional codes of ethics.
Contact: Arhiv družboslovnih podatkov
Any publications based on the data must fully cite the author and the Archive.
Each user is obliged to point out any shortcomings of the material and to send the Archive two copies of any texts produced from it.
Before use, users should read the accompanying documentation carefully and, in case of any ambiguity, contact the study authors or the Archive.
Title of Data file: Usage of online panels in survey methodology field, 2016: A systematic review - reference [Data file]
File ID: F1
Author of Data file: Respi, Chiara; Lozar Manfreda, Katja
Format: *.txt - TEXT
License: cc0
Version: 12 September 2019
Title of Data file: Usage of online panels in survey methodology field, 2016: A systematic review - panel [Data file]
File ID: F2
Author of Data file: Respi, Chiara; Lozar Manfreda, Katja
Format: *.txt - TEXT
License: cc0
Version: 12 September 2019
Title of Data file: Usage of online panels in survey methodology field, 2016: A systematic review - individual panel study [Data file]
File ID: F3
Author of Data file: Respi, Chiara; Lozar Manfreda, Katja
Format: *.txt - TEXT
License: cc0
Version: 12 September 2019
Title of Data file: Usage of online panels in survey methodology field, 2016: A systematic review. Complete answers [Data file]
File ID: F4
Author of Data file: Respi, Chiara; Lozar Manfreda, Katja
Format: *.xlsx - TABLES
License: cc0
Version: 16 September 2019
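For users who want to work with the four files programmatically, the following is a minimal loading sketch. The file names and the delimiter of the *.txt files are assumptions and should be checked against the downloaded materials; reading the identifier columns as strings preserves the leading zeros of codes such as 001 and 073.001.

```python
import pandas as pd

# F1-F3 are plain-text tables (tab-delimited is assumed here); F4 is an Excel workbook.
references = pd.read_csv("reference.txt", sep="\t", dtype={"ID_REF": str})
panels = pd.read_csv("panel.txt", sep="\t",
                     dtype={"ID_REF": str, "ID_PANEL": str})
studies = pd.read_csv("individual_panel_study.txt", sep="\t", dtype={"ID_REF": str})
complete = pd.read_excel("complete_answers.xlsx")  # requires the openpyxl package

# Per the codebook below, the reference file holds 74 rows and the panel file 113.
print(len(references), len(panels), len(studies))
```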
ID_REF Identification code of the reference
Identification code of the reference
Value | Frequency | |
---|---|---|
001 | 1 | |
002 | 1 | |
003 | 1 | |
004 | 1 | |
005 | 1 | |
006 | 1 | |
007 | 1 | |
008 | 1 | |
009 | 1 | |
010 | 1 | |
011 | 1 | |
012 | 1 | |
013 | 1 | |
014 | 1 | |
015 | 1 | |
016 | 1 | |
017 | 1 | |
018 | 1 | |
019 | 1 | |
020 | 1 | |
021 | 1 | |
022 | 1 | |
023 | 1 | |
024 | 1 | |
025 | 1 | |
026 | 1 | |
027 | 1 | |
028 | 1 | |
029 | 1 | |
030 | 1 | |
031 | 1 | |
032 | 1 | |
033 | 1 | |
034 | 1 | |
035 | 1 | |
036 | 1 | |
037 | 1 | |
038 | 1 | |
039 | 1 | |
040 | 1 | |
041 | 1 | |
042 | 1 | |
043 | 1 | |
044 | 1 | |
045 | 1 | |
046 | 1 | |
047 | 1 | |
048 | 1 | |
049 | 1 | |
050 | 1 | |
051 | 1 | |
052 | 1 | |
053 | 1 | |
054 | 1 | |
055 | 1 | |
056 | 1 | |
057 | 1 | |
058 | 1 | |
059 | 1 | |
060 | 1 | |
061 | 1 | |
062 | 1 | |
063 | 1 | |
064 | 1 | |
065 | 1 | |
066 | 1 | |
067 | 1 | |
068 | 1 | |
069 | 1 | |
070 | 1 | |
071 | 1 | |
072 | 1 | |
073 | 1 | |
074 | 1 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 | 0 |
TITLE Title of the reference
Title of the reference
Value | Frequency | |
---|---|---|
A Comparison of Different Online Sampling Approaches for Generating National Samples | 1 | |
A Comparison of Four Probability-Based Online and Mixed-Mode Panels in Europe | 1 | |
A Comparison of the Quality of Questions in a Face-to-face and a Web Survey | 1 | |
A multi-group analysis of online survey respondent data quality: Comparing a regular USA consumer panel to MTurk samples | 1 | |
Accuracy of Estimates in Access Panel Based Surveys (in Improving survey methods) | 1 | |
An empirical test of the impact of smartphones on panel-based online data collection (in Online Panel Research: A Data Quality Perspective) | 1 | |
Assessing representativeness of a probability-based online panel in Germany (in Online Panel Research: A Data Quality Perspective) | 1 | |
Attention and Usability in Internet Surveys: Effects of Visual Feedback in Grid Questions (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Attitudes Toward Risk and Informed Consent for Research on Medical Practices: A Cross-Sectional Survey | 1 | |
Can Biomarkers Be Collected in an Internet Survey? A Pilot Study in the LISS Panel (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Can a non-probabilistic online panel achieve question quality similar to that of the European Social Survey? | 1 | |
Challenges in Reaching Hard-to-Reach Groups in Internet Panel Research (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Comparing Survey Results Obtained via Mobile Devices and Computers: An Experiment With a Mobile Web Survey on a Heterogeneous Group of Mobile Devices Versus a Computer-Assisted Web Survey | 1 | |
Comparison of Smartphone and Online Computer Survey Administration | 1 | |
Comparison of US Panel Vendors for Online Surveys | 1 | |
Comparison of telephone RDD and online panel survey modes on CPGI scores and co-morbidities | 1 | |
Correcting for non-response bias in contingent valuation surveys concerning environmental non-market goods: an empirical investigation using an online panel | 1 | |
Data Quality in PC and Mobile Web Surveys | 1 | |
Determinants of the starting rate and the completion rate in online panel studies (in Online Panel Research: A Data Quality Perspective) | 1 | |
Does It Pay Off to Include Non-Internet Households in an Internet Panel? | 1 | |
Does the Inclusion of Non-Internet Households in a Web Panel Reduce Coverage Bias? | 1 | |
Effects of Lotteries on Response Behavior in Online Panels | 1 | |
Efficiency of Different Recruitment Strategies for Web Panels | 1 | |
Estimating the effects of nonresponses in online panels through imputation (in Online Panel Research: A Data Quality Perspective) | 1 | |
Evaluation of an Adapted Design in a Multi-device Online Panel: A DemoSCOPE Case Study | 1 | |
Evaluation of an online (opt-in) panel for public participation geographic information systems surveys | 1 | |
How Do Lotteries and Study Results Influence Response Behavior in Online Panels? | 1 | |
How Representative Are Online Panels? Problems of Coverage and Selection and Possible Solutions (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Improving Response Rates and Questionnaire Design for Mobile Web Surveys | 1 | |
Improving Survey Response Rates in Online Panels: Effects of Low-Cost Incentives and Cost-Free Text Appeal Interventions | 1 | |
Improving web survey quality: Potentials and constraints of propensity score adjustments (in Online Panel Research: A Data Quality Perspective) | 1 | |
Informing panel members about study results: Effects of traditional and innovative forms of feedback on participation (in Online Panel Research: A Data Quality Perspective) | 1 | |
Internet panels, professional respondents, and data quality | 1 | |
Lotteries and study results in market research online panels | 1 | |
Making Mobile Browser Surveys Smarter Results from a Randomized Experiment Comparing Online Surveys Completed via Computer or Smartphone | 1 | |
Measurement invariance and quality of composite scores in a face-to-face and a web survey | 1 | |
Measuring Attitudes Toward Controversial Issues in Internet Surveys: Order Effects of Open and Closed Questioning (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Mobile Response in Web Panels | 1 | |
Mode System Effects in an Online Panel Study: Comparing a Probability-based Online Panel with two Face-to-Face Reference Surveys | 1 | |
Motives for joining nonprobability online panels and their association with survey participation behavior (in Online Panel Research: A Data Quality Perspective) | 1 | |
Nonprobability Web Surveys to Measure Sexual Behaviors and Attitudes in the General Population: A Comparison With a Probability Sample Interview Survey | 1 | |
Nonresponse and attrition in a probability-based online panel for the general population (in Online Panel Research: A Data Quality Perspective) | 1 | |
Nonresponse and measurement error in an online panel: Does additional effort to recruit reluctant respondents result in poorer quality data? (in Online Panel Research: A Data Quality Perspective) | 1 | |
Online panels and validity: Representativeness and attrition in the Finnish eOpinion panel (in Online Panel Research: A Data Quality Perspective) | 1 | |
Panel Attrition - Separating Stayers, Fast Attriters, Gradual Attriters, and Lurkers | 1 | |
Panel Conditioning in Difficult Attitudinal Questions | 1 | |
Professional respondents in nonprobability online panels (in Online Panel Research: A Data Quality Perspective) | 1 | |
Recruiting A Probability Sample For An Online Panel: Effects Of Contact Mode, Incentives, And Information | 1 | |
Recruiting an Internet Panel Using Respondent-Driven Sampling | 1 | |
Respondent Conditioning in Online Panel Surveys: Results of Two Field Experiments | 1 | |
Respondent Screening and Revealed Preference Axioms: Testing Quarantining Methods for Enhanced Data Quality in Web Panel Survey | 1 | |
Response Behavior in an Adaptive Survey Design for the Setting-Up Stage of a Probability-Based Access Panel in Germany (in Improving survey methods) | 1 | |
Sample composition discrepancies in different stages of a probability-based online panel | 1 | |
Selection bias of internet panel surveys: A comparison with a paper-based survey and national governmental statistics in Japan | 1 | |
Sensitive topics in PC Web and mobile web surveys: Is there a difference? | 1 | |
Setting Up an Online Panel Representative of the General Population The German Internet Panel | 1 | |
Straightlining in Web survey panels over time | 1 | |
Survey Mode Effects on Data Quality: Comparison of Web and Mail Modes in a U.S. National Panel Survey | 1 | |
Survey Participation in a Probability-Based Internet Panel in the Netherlands (in Improving survey methods) | 1 | |
Surveying Rare Populations Using a Probability based Online Panel | 1 | |
The Access Panel of German Official Statistics as a Selection Frame (in Improving survey methods) | 1 | |
The Design of Grids in Web Surveys | 1 | |
The Effects of Questionnaire Completion Using Mobile Devices on Data Quality. Evidence from a Probability-based General Population Panel | 1 | |
The Utility of an Online Convenience Panel for Reaching Rare and Dispersed Populations | 1 | |
The comparison of road safety survey answers between web-panel and face-to-face; Dutch results of SARTRE-4 survey | 1 | |
The impact of speeding on data quality in nonprobability and freshly recruited probability-based online panels (in Online Panel Research: A Data Quality Perspective) | 1 | |
The relationship between nonresponse strategies and measurement error: Comparing online panel surveys to traditional surveys (in Online Panel Research: A Data Quality Perspective) | 1 | |
The role of topic interest and topic salience in online panel web surveys. | 1 | |
The untold story of multi-mode (online and mail) consumer panels: From optimal recruitment to retention and attrition (in Online Panel Research: A Data Quality Perspective) | 1 | |
The use of Pcs, smartphones and tablets in a probability based panel survey. Effects on survey measurement error. | 1 | |
Using Interactive Features to Motivate and Probe Responses to Open-Ended Questions (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Validating respondents’ identity in online samples: The impact of efforts to eliminate fraudulent respondents (in Online Panel Research: A Data Quality Perspective) | 1 | |
What Happens if You Offer a Mobile Option to Your Web Panel? Evidence From a Probability-Based Panel of Internet Users | 1 | |
What do web survey panel respondents answer when asked "Do you have any other comment?" | 1 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 |
EDITOR Editor of the reference
Editor of the reference
Value | Frequency | |
---|---|---|
Annals of Internal Medicine 162 (10) | 1 | |
Asia-Pacific Journal of Public Health | 1 | |
Center for Crime and Justice Policy, CCJP 1 | 1 | |
Field Methods, 26, 4 | 1 | |
Field Methods, 27, 4. pp. 391-408 | 1 | |
Field Methods, Published online before print February 21, 2013 | 1 | |
Field Methods, Published online before print January 29, 2013 | 1 | |
International Gambling Studies | 1 | |
International Journal of Internet Science, 8, 1, p. 17-29 | 1 | |
International Journal of Market Research, 55, 1, pp. 59-80 | 1 | |
International Journal of Market Research, 55, 5, pp. 611-616 | 1 | |
International Journal of Market Research, 57, 3, pp. 395-412 | 1 | |
International Journal of Public Opinion Research, 24, 2, pp. 238-249 | 1 | |
International Journal of Public Opinion Research, 24, 4, pp. 534-545 | 1 | |
International Journal of Public Opinion Research, 25, 2, pp. 242-253 | 1 | |
JMIR Publications, 15, 11 | 1 | |
Journal of Business Research, 69, 8, pp. 3139-3148 | 1 | |
Journal of Environmental Planning and Management | 1 | |
Journal of Medical Internet Research, 16, 12, e276 | 1 | |
Journal of Official Statistics, 30, 2, pp. 291-310 | 1 | |
Journal of Safety Research, 46, pp. 13-20 | 1 | |
Methodology, 11, 3, pp. 81-88 | 1 | |
PLOS one, 10, 12 | 1 | |
Public Opinion Quarterly (POQ), 76, 3, pp. 470-490 | 1 | |
Public Opinion Quarterly (POQ), 79, 3, pp. 687-709 | 1 | |
Public Opinion Quarterly (POQ), First published online: November 14, 2014 | 1 | |
Public Opinion Quarterly (POQ), First published online: September 16, 2013 (77, 3, pp. 783-797) | 1 | |
Routledge | 10 | |
Social Science Computer Review, 30, 2, pp. 212-228 | 1 | |
Social Science Computer Review, 31, 3, p. 322-345 | 1 | |
Social Science Computer Review, 31, 3, pp. 371-385 | 1 | |
Social Science Computer Review, 31, 4, p. 482-504 | 1 | |
Social Science Computer Review, 31, 6, p. 725-743 | 1 | |
Social Science Computer Review, 32 no. 2, pp. 238-255 | 1 | |
Social Science Computer Review, 32, 4, 544-560 | 1 | |
Social Science Computer Review, 32, 6, pp. 728-742 | 1 | |
Social Science Computer Review, 34, 1, 2016, pp. 8-25, Published online before print March 31, 2015 | 1 | |
Social Science Computer Review, 34, 1, pp. 41-58 | 1 | |
Social Science Computer Review, 34, 1, pp. 99-115 | 1 | |
Social Science Computer Review, 34, 2, pp. 229-243, Published online before print December 17, 2014 | 1 | |
Social Science Computer Review, vol. 34, 1: pp. 78-94. First Published February 26, 2015. | 1 | |
Sociological Methods & Research | 1 | |
Survey Methods: Insights from the field | 1 | |
Survey Practice, 5, 3 | 1 | |
Survey Research Methods, 7, 1, pp. 17-28 | 1 | |
Survey Research Methods, 7, 3 | 1 | |
Survey Research Methods, 9, 2, pp. 125-137 | 1 | |
Wiley | 15 | |
methods, data, analyses | 1 | |
methods, data, analyses, 9, 1, pp.87-110 | 1 | |
methods, data, analyses, 9, 2, pp. 185-212 | 1 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 | 0 |
ID_REF Identification code of the reference
Identification code of the reference
Value | Frequency | |
---|---|---|
001 | 1 | |
002 | 1 | |
003 | 1 | |
004 | 1 | |
005 | 1 | |
006 | 1 | |
007 | 1 | |
008 | 1 | |
009 | 1 | |
010 | 1 | |
011 | 1 | |
012 | 1 | |
013 | 1 | |
014 | 1 | |
015 | 1 | |
016 | 1 | |
017 | 1 | |
018 | 1 | |
019 | 1 | |
020 | 1 | |
021 | 1 | |
022 | 1 | |
023 | 1 | |
024 | 1 | |
025 | 1 | |
026 | 1 | |
027 | 1 | |
028 | 1 | |
029 | 1 | |
030 | 1 | |
031 | 1 | |
032 | 1 | |
033 | 1 | |
034 | 1 | |
035 | 1 | |
036 | 1 | |
037 | 1 | |
038 | 1 | |
039 | 1 | |
040 | 1 | |
041 | 1 | |
042 | 1 | |
043 | 1 | |
044 | 1 | |
045 | 1 | |
046 | 1 | |
047 | 1 | |
048 | 1 | |
049 | 1 | |
050 | 1 | |
051 | 1 | |
052 | 1 | |
053 | 1 | |
054 | 1 | |
055 | 1 | |
056 | 1 | |
057 | 1 | |
058 | 1 | |
059 | 1 | |
060 | 1 | |
061 | 1 | |
062 | 1 | |
063 | 1 | |
064 | 1 | |
065 | 1 | |
066 | 1 | |
067 | 1 | |
068 | 1 | |
069 | 1 | |
070 | 1 | |
071 | 1 | |
072 | 1 | |
073 | 1 | |
074 | 1 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 | 0 |
TITLE Title of the reference
Title of the reference
Value | Frequency | |
---|---|---|
A Comparison of Different Online Sampling Approaches for Generating National Samples | 1 | |
A Comparison of Four Probability-Based Online and Mixed-Mode Panels in Europe | 1 | |
A Comparison of the Quality of Questions in a Face-to-face and a Web Survey | 1 | |
A multi-group analysis of online survey respondent data quality: Comparing a regular USA consumer panel to MTurk samples | 1 | |
Accuracy of Estimates in Access Panel Based Surveys (in Improving survey methods) | 1 | |
An empirical test of the impact of smartphones on panel-based online data collection (in Online Panel Research: A Data Quality Perspective) | 1 | |
Assessing representativeness of a probability-based online panel in Germany (in Online Panel Research: A Data Quality Perspective) | 1 | |
Attention and Usability in Internet Surveys: Effects of Visual Feedback in Grid Questions (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Attitudes Toward Risk and Informed Consent for Research on Medical Practices: A Cross-Sectional Survey | 1 | |
Can Biomarkers Be Collected in an Internet Survey? A Pilot Study in the LISS Panel (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Can a non-probabilistic online panel achieve question quality similar to that of the European Social Survey? | 1 | |
Challenges in Reaching Hard-to-Reach Groups in Internet Panel Research (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Comparing Survey Results Obtained via Mobile Devices and Computers: An Experiment With a Mobile Web Survey on a Heterogeneous Group of Mobile Devices Versus a Computer-Assisted Web Survey | 1 | |
Comparison of Smartphone and Online Computer Survey Administration | 1 | |
Comparison of US Panel Vendors for Online Surveys | 1 | |
Comparison of telephone RDD and online panel survey modes on CPGI scores and co-morbidities | 1 | |
Correcting for non-response bias in contingent valuation surveys concerning environmental non-market goods: an empirical investigation using an online panel | 1 | |
Data Quality in PC and Mobile Web Surveys | 1 | |
Determinants of the starting rate and the completion rate in online panel studies (in Online Panel Research: A Data Quality Perspective) | 1 | |
Does It Pay Off to Include Non-Internet Households in an Internet Panel? | 1 | |
Does the Inclusion of Non-Internet Households in a Web Panel Reduce Coverage Bias? | 1 | |
Effects of Lotteries on Response Behavior in Online Panels | 1 | |
Efficiency of Different Recruitment Strategies for Web Panels | 1 | |
Estimating the effects of nonresponses in online panels through imputation (in Online Panel Research: A Data Quality Perspective) | 1 | |
Evaluation of an Adapted Design in a Multi-device Online Panel: A DemoSCOPE Case Study | 1 | |
Evaluation of an online (opt-in) panel for public participation geographic information systems surveys | 1 | |
How Do Lotteries and Study Results Influence Response Behavior in Online Panels? | 1 | |
How Representative Are Online Panels? Problems of Coverage and Selection and Possible Solutions (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Improving Response Rates and Questionnaire Design for Mobile Web Surveys | 1 | |
Improving Survey Response Rates in Online Panels: Effects of Low-Cost Incentives and Cost-Free Text Appeal Interventions | 1 | |
Improving web survey quality: Potentials and constraints of propensity score adjustments (in Online Panel Research: A Data Quality Perspective) | 1 | |
Informing panel members about study results: Effects of traditional and innovative forms of feedback on participation (in Online Panel Research: A Data Quality Perspective) | 1 | |
Internet panels, professional respondents, and data quality | 1 | |
Lotteries and study results in market research online panels | 1 | |
Making Mobile Browser Surveys Smarter Results from a Randomized Experiment Comparing Online Surveys Completed via Computer or Smartphone | 1 | |
Measurement invariance and quality of composite scores in a face-to-face and a web survey | 1 | |
Measuring Attitudes Toward Controversial Issues in Internet Surveys: Order Effects of Open and Closed Questioning (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Mobile Response in Web Panels | 1 | |
Mode System Effects in an Online Panel Study: Comparing a Probability-based Online Panel with two Face-to-Face Reference Surveys | 1 | |
Motives for joining nonprobability online panels and their association with survey participation behavior (in Online Panel Research: A Data Quality Perspective) | 1 | |
Nonprobability Web Surveys to Measure Sexual Behaviors and Attitudes in the General Population: A Comparison With a Probability Sample Interview Survey | 1 | |
Nonresponse and attrition in a probability-based online panel for the general population (in Online Panel Research: A Data Quality Perspective) | 1 | |
Nonresponse and measurement error in an online panel: Does additional effort to recruit reluctant respondents result in poorer quality data? (in Online Panel Research: A Data Quality Perspective) | 1 | |
Online panels and validity: Representativeness and attrition in the Finnish eOpinion panel (in Online Panel Research: A Data Quality Perspective) | 1 | |
Panel Attrition - Separating Stayers, Fast Attriters, Gradual Attriters, and Lurkers | 1 | |
Panel Conditioning in Difficult Attitudinal Questions | 1 | |
Professional respondents in nonprobability online panels (in Online Panel Research: A Data Quality Perspective) | 1 | |
Recruiting A Probability Sample For An Online Panel: Effects Of Contact Mode, Incentives, And Information | 1 | |
Recruiting an Internet Panel Using Respondent-Driven Sampling | 1 | |
Respondent Conditioning in Online Panel Surveys: Results of Two Field Experiments | 1 | |
Respondent Screening and Revealed Preference Axioms: Testing Quarantining Methods for Enhanced Data Quality in Web Panel Survey | 1 | |
Response Behavior in an Adaptive Survey Design for the Setting-Up Stage of a Probability-Based Access Panel in Germany (in Improving survey methods) | 1 | |
Sample composition discrepancies in different stages of a probability-based online panel | 1 | |
Selection bias of internet panel surveys: A comparison with a paper-based survey and national governmental statistics in Japan | 1 | |
Sensitive topics in PC Web and mobile web surveys: Is there a difference? | 1 | |
Setting Up an Online Panel Representative of the General Population The German Internet Panel | 1 | |
Straightlining in Web survey panels over time | 1 | |
Survey Mode Effects on Data Quality: Comparison of Web and Mail Modes in a U.S. National Panel Survey | 1 | |
Survey Participation in a Probability-Based Internet Panel in the Netherlands (in Improving survey methods) | 1 | |
Surveying Rare Populations Using a Probability based Online Panel | 1 | |
The Access Panel of German Official Statistics as a Selection Frame (in Improving survey methods) | 1 | |
The Design of Grids in Web Surveys | 1 | |
The Effects of Questionnaire Completion Using Mobile Devices on Data Quality. Evidence from a Probability-based General Population Panel | 1 | |
The Utility of an Online Convenience Panel for Reaching Rare and Dispersed Populations | 1 | |
The comparison of road safety survey answers between web-panel and face-to-face; Dutch results of SARTRE-4 survey | 1 | |
The impact of speeding on data quality in nonprobability and freshly recruited probability-based online panels (in Online Panel Research: A Data Quality Perspective) | 1 | |
The relationship between nonresponse strategies and measurement error: Comparing online panel surveys to traditional surveys (in Online Panel Research: A Data Quality Perspective) | 1 | |
The role of topic interest and topic salience in online panel web surveys. | 1 | |
The untold story of multi-mode (online and mail) consumer panels: From optimal recruitment to retention and attrition (in Online Panel Research: A Data Quality Perspective) | 1 | |
The use of Pcs, smartphones and tablets in a probability based panel survey. Effects on survey measurement error. | 1 | |
Using Interactive Features to Motivate and Probe Responses to Open-Ended Questions (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Validating respondents’ identity in online samples: The impact of efforts to eliminate fraudulent respondents (in Online Panel Research: A Data Quality Perspective) | 1 | |
What Happens if You Offer a Mobile Option to Your Web Panel? Evidence From a Probability-Based Panel of Internet Users | 1 | |
What do web survey panel respondents answer when asked "Do you have any other comment?" | 1 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 |
EDITOR Editor of the reference
Editor of the reference
Value | Frequency | |
---|---|---|
Annals of Internal Medicine 162 (10) | 1 | |
Asia-Pacific Journal of Public Health | 1 | |
Center for Crime and Justice Policy, CCJP 1 | 1 | |
Field Methods, 26, 4 | 1 | |
Field Methods, 27, 4. pp. 391-408 | 1 | |
Field Methods, Published online before print February 21, 2013 | 1 | |
Field Methods, Published online before print January 29, 2013 | 1 | |
International Gambling Studies | 1 | |
International Journal of Internet Science, 8, 1, p. 17-29 | 1 | |
International Journal of Market Research, 55, 1, pp. 59-80 | 1 | |
International Journal of Market Research, 55, 5, pp. 611-616 | 1 | |
International Journal of Market Research, 57, 3, pp. 395-412 | 1 | |
International Journal of Public Opinion Research, 24, 2, pp. 238-249 | 1 | |
International Journal of Public Opinion Research, 24, 4, pp. 534-545 | 1 | |
International Journal of Public Opinion Research, 25, 2, pp. 242-253 | 1 | |
JMIR Publications, 15, 11 | 1 | |
Journal of Business Research, 69, 8, pp. 3139-3148 | 1 | |
Journal of Environmental Planning and Management | 1 | |
Journal of Medical Internet Research, 16, 12, e276 | 1 | |
Journal of Official Statistics, 30, 2, pp. 291-310 | 1 | |
Journal of Safety Research, 46, pp. 13-20 | 1 | |
Methodology, 11, 3, pp. 81-88 | 1 | |
PLOS one, 10, 12 | 1 | |
Public Opinion Quarterly (POQ), 76, 3, pp. 470-490 | 1 | |
Public Opinion Quarterly (POQ), 79, 3, pp. 687-709 | 1 | |
Public Opinion Quarterly (POQ), First published online: November 14, 2014 | 1 | |
Public Opinion Quarterly (POQ), First published online: September 16, 2013 (77, 3, pp. 783-797) | 1 | |
Routledge | 10 | |
Social Science Computer Review, 30, 2, pp. 212-228 | 1 | |
Social Science Computer Review, 31, 3, p. 322-345 | 1 | |
Social Science Computer Review, 31, 3, pp. 371-385 | 1 | |
Social Science Computer Review, 31, 4, p. 482-504 | 1 | |
Social Science Computer Review, 31, 6, p. 725-743 | 1 | |
Social Science Computer Review, 32 no. 2, pp. 238-255 | 1 | |
Social Science Computer Review, 32, 4, 544-560 | 1 | |
Social Science Computer Review, 32, 6, pp. 728-742 | 1 | |
Social Science Computer Review, 34, 1, 2016, pp. 8-25, Published online before print March 31, 2015 | 1 | |
Social Science Computer Review, 34, 1, pp. 41-58 | 1 | |
Social Science Computer Review, 34, 1, pp. 99-115 | 1 | |
Social Science Computer Review, 34, 2, pp. 229-243, Published online before print December 17, 2014 | 1 | |
Social Science Computer Review, vol. 34, 1: pp. 78-94. First Published February 26, 2015. | 1 | |
Sociological Methods & Research | 1 | |
Survey Methods: Insights from the field | 1 | |
Survey Practice, 5, 3 | 1 | |
Survey Research Methods, 7, 1, pp. 17-28 | 1 | |
Survey Research Methods, 7, 3 | 1 | |
Survey Research Methods, 9, 2, pp. 125-137 | 1 | |
Wiley | 15 | |
methods, data, analyses | 1 | |
methods, data, analyses, 9, 1, pp.87-110 | 1 | |
methods, data, analyses, 9, 2, pp. 185-212 | 1 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 | 0 |
AUTHOR Author of the reference
Author of the reference
Value | Frequency | |
---|---|---|
Arn, B.; Klug, S.; Kolodziejski, J. | 1 | |
Avendano, M., Scherpenzeel, A. C., Mackenbach, J. P. | 1 | |
Baker, R., Miller, C., Kachhi, D., Lange, K., Wilding-Brown, L., Tucker, J. | 1 | |
Binswanger, J., Schunk, D., Toepoel, V. | 1 | |
Blom, A. G., Bosnjak, M., Cornilleau, A., Cousteaux, A-S., Das, M., Douhou, S., Krieger, U. | 1 | |
Blom, A. G.; Gathmann, C.; Krieger, U. | 1 | |
Bonnichsen, O.; Boye Olsen, S. | 1 | |
Bosnjak, M., Haas, I., Galesic, M., Kaczmirek, L., Bandilla, W., Couper, M. P. | 1 | |
Bosnjak, M.; Struminskaya, B.; Weyandt, K. | 1 | |
Brown, G., Weber, D., Zanon, D., de Bie, K. | 1 | |
Buskirk, T. D., Andrus, C. | 1 | |
Cella, D., Craig, B. M., Hays, R. D., Pickard, A. S., Reeve, B. B., Revicki, D. A. | 1 | |
Cho, Mildred K., David Magnus, Melissa Constantine, Sandra Soo-Jin Lee, Maureen Kelley, Stephanie Alessi, Diane Korngiebel, et al. | 1 | |
Couper, M. P., Tourangeau, R., Conrad, F. G., Zhang, C. | 1 | |
Drewes, F. | 1 | |
Eckman, S. | 1 | |
Enderle, T., and Münnich, R. | 1 | |
Engel, U. | 1 | |
Erens, B.; Burkill, S.; Couper, M. P.; Conrad, F.; Clifton, S.; Tanton, C.; Phelps, A.; Datta, J.; Mercer, C. H.; Sonnenberg, P.; Prah, P.; Mitchell, K. R.; Wellings, K.; Johnson, A. M.; Copas, A. | 1 | 
Ester, P., Vinken, H. | 1 | |
Goeritz, A., Luthe, S. C. | 3 | |
Golden, L.; Albaum, G.; Roster, C. A.; Smith, S. M. | 1 | |
Goldenbeld, C., de Craen, S. | 1 | |
Greszki, R., Meyer, M., Schoen, H. | 1 | |
Grönlund, K., Strandberg, K. | 1 | |
Göritz, A. S. | 1 | |
Hansen, K. M., Pedersen, R. T. | 1 | |
Heen, Miliaikeala SJ, Joel D. Lieberman, and Terance D. Miethe | 1 | |
Hillygus, D. S., Jackson, N., Young, M. | 1 | |
Jones, M. S.; House, L. A.; Zhifeng, G. | 1 | |
Kaczmirek, L. | 1 | |
Keusch, F. | 1 | |
Keusch, F., Batinic, B., Mayerhofer, W. | 1 | |
Lee, C.-K.; Back, K.-J.; Williams, Ro. J.; Ahn, S.-S. | 1 | |
Leenheer, J., Scherpenzeel, A. | 1 | |
Lugtig, P. J. | 1 | |
Lugtig, P., Das, M., Scherpenzeel, A. | 1 | |
Malhotra, N., Miller, J. M., Wedeking, J. | 1 | |
Matthijsse, S.M., de Leeuw, E.D., & Hox, J.J. | 1 | |
Mavletova, A. M. | 1 | |
Mavletova, A. M., Couper, M. P. | 1 | |
McCutcheon, A. L., Rao, K., Kaminska, O. | 1 | |
Oudejans, M., Christian, L. M. | 1 | 
Pedersen, M. J.; Nielsen, C. V. | 1 | |
Peugh, J., Wright, G. | 1 | |
Rendtel, U., and Amarov, B. | 1 | |
Revilla, M. | 1 | |
Revilla, M., Saris, W. E. | 1 | |
Revilla, M.; Saris, W. E.; Loewe, G.; Ochoa, C. | 1 | |
Roberts, C., Allum, N., Sturgis, P. | 1 | |
Scherpenzeel, A. C. | 1 | |
Scherpenzeel, A. C., Bethlehem, J. G. | 1 | |
Scherpenzeel, A., Toepoel, V. | 2 | |
Schonlau, M. | 1 | |
Schonlau, M., Weidmer, B., Kapteyn, A. | 1 | |
Sell, R.; Goldberg, S.; Conron, K. | 1 | |
Shin, E., Johnson, T. P., Rao, K. | 1 | |
Steinmetz, S., Bianchi, A., Tijdens, K., Biffignandi, S. | 1 | |
Struminskaya, B. | 1 | |
Struminskaya, B., Kaczmirek, L., Schaurer, I., Bandilla W. | 1 | |
Struminskaya, B.; de Leeuw, E. D.; Kaczmirek, L. | 1 | |
Toepoel, V., Lugtig, P. J. | 1 | |
Toepoel, V.; Lugtig, P. J. | 1 | |
Toepoel, V.; Schonlau, M. | 1 | |
Tsuboi, S., Yoshida, H., Ae, R., Kojo, T., Nakamura, Y., Kitamura, K. | 1 | |
Vis, C. M., Marchand, M. A. G. | 1 | |
Wells, T., Bailey, J., Link, M. W. | 1 | |
Zhang, W. | 1 | |
de Bruijne, M., Wijnant, A. | 3 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 |
YEAR Year of the reference
Year of the reference
Value | Frequency | |
---|---|---|
2011 | 6 | |
2012 | 5 | |
2013 | 15 | |
2014 | 28 | |
2015 | 14 | |
2016 | 6 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 | 0 |
Valid range from 2011 to 2016
COUNT_ST Country of the study
Country of the study
Value | Frequency | |
---|---|---|
Australia | 1 | |
Austria | 2 | |
Denmark | 3 | |
Finland | 1 | |
Germany | 14 | |
Germany and USA | 1 | |
Japan | 1 | |
Russia | 2 | |
South Korea | 1 | |
Spain | 1 | |
Switzerland | 1 | |
The Netherlands | 26 | |
The Netherlands, Germany and France | 1 | |
USA | 18 | |
United Kingdom | 1 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 | 0 |
TYPE_RES Type of resource
Type of resource
Value | Frequency | |
---|---|---|
Book section, Edited book | 25 | |
Journal article | 49 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 | 0 |
CP_POP Target population
Target population
Value | Frequency | |
---|---|---|
American Jewish population (=rare population) | 1 | |
Internet users aged 18 and older | 3 | |
US adults | 3 | |
car drivers, motorcyclists or other road users | 1 | |
different walks of life (i.e. employees, unknown employment status, temporary workers, and students) | 1 | |
general population | 36 | |
grocery shoppers who have purchased fresh blueberries in the last year | 1 | |
immigrants | 1 | |
market research company's clients | 1 | |
mobile web population | 2 | |
people aged 14+ and users of smartphones and tablets | 1 | |
people aged 16+ who use a smartphone with an Internet connection | 1 | |
people aged 18 and older | 6 | |
people aged 18 and older with access to the Internet and in possession of a smartphone | 1 | |
people aged 18+ and entitled to vote for the German Federal Parliament | 1 | |
people aged between 18 and 65 | 1 | |
people from all walks of life | 3 | |
residents in regional Victoria or Melbourne, who visited at least one of the nine specific state or national parks in the study region within the last 12 months | 1 | |
smartphone owners | 1 | |
target population of the tested area and a panel sample | 1 | |
users of many websites | 1 | |
users of the Google Opinion Rewards application who have smartphones operated by Google’s Android operating system | 1 | |
visitors of a large variety of websites | 1 | |
visitors of popular websites | 1 | |
visitors of the agency web page and users of affiliated programs | 1 | |
na | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
72 | 2 |
PANEL_NO Number of panels referred to
Number of panels referred to
Value | Frequency | |
---|---|---|
1 | panel | 62 |
2 | panels | 7 |
3 | panels | 1 |
4 | panels | 2 |
7 | panels | 1 |
19 | panels | 1 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 | 0 |
Valid range from 1 to 19
STUDY_NO Number of studies on panel members reported by the reference
Number of studies on panel members reported by the reference
Value | Frequency | |
---|---|---|
1 | study | 68 |
2 | studies | 4 |
3 | studies | 1 |
4 | studies | 1 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 | 0 |
Valid range from 1 to 4
RE_TOPIC Topic of the reference
Topic of the reference
Value | Frequency | |
---|---|---|
1 | panel itself | 46 |
2 | panel as a sample source | 8 |
3 | both panel itself and as a sample source | 20 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
74 | 0 |
Valid range from 1 to 3
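Assuming RE_TOPIC is stored as the numeric codes 1-3 documented above, a small recode makes the variable readable in analyses; the DataFrame name `references` follows the hypothetical loading sketch given earlier.

```python
# Value labels taken from the RE_TOPIC entry above.
RE_TOPIC_LABELS = {
    1: "panel itself",
    2: "panel as a sample source",
    3: "both panel itself and as a sample source",
}

# astype(int) guards against the codes being stored as text ("1", "2", "3").
references["re_topic_label"] = references["RE_TOPIC"].astype(int).map(RE_TOPIC_LABELS)
print(references["re_topic_label"].value_counts())
# Expected distribution per the codebook: 46, 8 and 20 references for codes 1-3.
```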
ID_REF Identification code of the reference
Identification code of the reference
Value | Frequency | |
---|---|---|
001 | 1 | |
002 | 1 | |
003 | 1 | |
004 | 1 | |
005 | 1 | |
006 | 1 | |
007 | 1 | |
008 | 1 | |
009 | 1 | |
010 | 2 | |
011 | 1 | |
012 | 1 | |
013 | 1 | |
014 | 1 | |
015 | 1 | |
016 | 1 | |
017 | 1 | |
018 | 1 | |
019 | 2 | |
020 | 1 | |
021 | 1 | |
022 | 1 | |
023 | 7 | |
024 | 1 | |
025 | 1 | |
026 | 1 | |
027 | 2 | |
028 | 4 | |
029 | 4 | |
030 | 1 | |
031 | 1 | |
032 | 1 | |
033 | 1 | |
034 | 1 | |
035 | 1 | |
036 | 1 | |
037 | 1 | |
038 | 1 | |
039 | 2 | |
040 | 1 | |
041 | 2 | |
042 | 1 | |
043 | 1 | |
044 | 1 | |
045 | 1 | |
046 | 1 | |
047 | 1 | |
048 | 1 | |
049 | 1 | |
050 | 1 | |
051 | 1 | |
052 | 1 | |
053 | 2 | |
054 | 1 | |
055 | 1 | |
056 | 1 | |
057 | 1 | |
058 | 1 | |
059 | 1 | |
060 | 1 | |
061 | 1 | |
062 | 1 | |
063 | 1 | |
064 | 2 | |
065 | 1 | |
066 | 3 | |
067 | 1 | |
068 | 1 | |
069 | 1 | |
070 | 1 | |
071 | 1 | |
072 | 1 | |
073 | 19 | |
074 | 1 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
ID_PANEL Identification code of the panel
Identification code of the panel
Value | Frequency | |
---|---|---|
001.001 | 1 | |
002.001 | 1 | |
003.001 | 1 | |
004.001 | 1 | |
005.001 | 1 | |
006.001 | 1 | |
007.001 | 1 | |
008.001 | 1 | |
009.001 | 1 | |
010.001 | 1 | |
010.002 | 1 | |
011.001 | 1 | |
012.001 | 1 | |
013.001 | 1 | |
014.001 | 1 | |
015.001 | 1 | |
016.001 | 1 | |
017.001 | 1 | |
018.001 | 1 | |
019.001 | 1 | |
019.002 | 1 | |
020.001 | 1 | |
021.001 | 1 | |
022.001 | 1 | |
023.001 | 1 | |
023.002 | 1 | |
023.003 | 1 | |
023.004 | 1 | |
023.005 | 1 | |
023.006 | 1 | |
023.007 | 1 | |
024.001 | 1 | |
025.001 | 1 | |
026.001 | 1 | |
027.001 | 1 | |
027.002 | 1 | |
028.001 | 1 | |
028.002 | 1 | |
028.003 | 1 | |
028.004 | 1 | |
029.001 | 1 | |
029.002 | 1 | |
029.003 | 1 | |
029.004 | 1 | |
030.001 | 1 | |
031.001 | 1 | |
032.001 | 1 | |
033.001 | 1 | |
034.001 | 1 | |
035.001 | 1 | |
036.001 | 1 | |
037.001 | 1 | |
038.001 | 1 | |
039.001 | 1 | |
039.002 | 1 | |
040.001 | 1 | |
041.001 | 1 | |
041.002 | 1 | |
042.001 | 1 | |
043.001 | 1 | |
044.001 | 1 | |
045.001 | 1 | |
046.001 | 1 | |
047.001 | 1 | |
048.001 | 1 | |
049.001 | 1 | |
050.001 | 1 | |
051.001 | 1 | |
052.001 | 1 | |
053.001 | 1 | |
053.002 | 1 | |
054.001 | 1 | |
055.001 | 1 | |
056.001 | 1 | |
057.001 | 1 | |
058.001 | 1 | |
059.001 | 1 | |
060.001 | 1 | |
061.001 | 1 | |
062.001 | 1 | |
063.001 | 1 | |
064.001 | 1 | |
064.002 | 1 | |
065.001 | 1 | |
066.001 | 1 | |
066.002 | 1 | |
066.003 | 1 | |
067.001 | 1 | |
068.001 | 1 | |
069.001 | 1 | |
070.001 | 1 | |
071.001 | 1 | |
072.001 | 1 | |
073.001 | 1 | |
073.002 | 1 | |
073.003 | 1 | |
073.004 | 1 | |
073.005 | 1 | |
073.006 | 1 | |
073.007 | 1 | |
073.008 | 1 | |
073.009 | 1 | |
073.010 | 1 | |
073.011 | 1 | |
073.012 | 1 | |
073.013 | 1 | |
073.014 | 1 | |
073.015 | 1 | |
073.016 | 1 | |
073.017 | 1 | |
073.018 | 1 | |
073.019 | 1 | |
074.001 | 1 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
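The ID_PANEL codes listed above combine the reference code with a panel sequence number (e.g. 073.019 is the 19th panel reported by reference 073). The sketch below, which again builds on the hypothetical `references` and `panels` DataFrames from the loading example, splits that code and counts the panels reported by each reference.

```python
# Split the composite code "RRR.PPP" into the reference code and the panel
# sequence within that reference (e.g. "073.019" -> "073" and "019").
panels[["ref_part", "panel_seq"]] = panels["ID_PANEL"].str.split(".", n=1, expand=True)

# Sanity check: the reference part of ID_PANEL should equal ID_REF.
mismatches = (panels["ref_part"] != panels["ID_REF"]).sum()
print(f"{mismatches} panel records with inconsistent codes")

# Number of panels reported by each reference (19 for reference 073, etc.),
# attached back to the reference-level file.
panel_counts = panels.groupby("ID_REF").size().rename("n_panels").reset_index()
references = references.merge(panel_counts, on="ID_REF", how="left")
print(references[["ID_REF", "n_panels"]].sort_values("n_panels", ascending=False).head())
```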
TITLE Title of the reference
Title of the reference
Value | Frequency | |
---|---|---|
A Comparison of Different Online Sampling Approaches for Generating National Samples | 3 | |
A Comparison of Four Probability-Based Online and Mixed-Mode Panels in Europe | 4 | |
A Comparison of the Quality of Questions in a Face-to-face and a Web Survey | 1 | |
A multi-group analysis of online survey respondent data quality: Comparing a regular USA consumer panel to MTurk samples | 2 | |
Accuracy of Estimates in Access Panel Based Surveys (in Improving survey methods) | 1 | |
An empirical test of the impact of smartphones on panel-based online data collection (in Online Panel Research: A Data Quality Perspective) | 1 | |
Assessing representativeness of a probability-based online panel in Germany (in Online Panel Research: A Data Quality Perspective) | 1 | |
Attention and Usability in Internet Surveys: Effects of Visual Feedback in Grid Questions (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Attitudes Toward Risk and Informed Consent for Research on Medical Practices: A Cross-Sectional Survey | 1 | |
Can Biomarkers Be Collected in an Internet Survey? A Pilot Study in the LISS Panel (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Can a non-probabilistic online panel achieve question quality similar to that of the European Social Survey? | 1 | |
Challenges in Reaching Hard-to-Reach Groups in Internet Panel Research (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Comparing Survey Results Obtained via Mobile Devices and Computers: An Experiment With a Mobile Web Survey on a Heterogeneous Group of Mobile Devices Versus a Computer-Assisted Web Survey | 1 | |
Comparison of Smartphone and Online Computer Survey Administration | 1 | |
Comparison of US Panel Vendors for Online Surveys | 7 | |
Comparison of telephone RDD and online panel survey modes on CPGI scores and co-morbidities | 1 | |
Correcting for non-response bias in contingent valuation surveys concerning environmental non-market goods: an empirical investigation using an online panel | 1 | |
Data Quality in PC and Mobile Web Surveys | 1 | |
Determinants of the starting rate and the completion rate in online panel studies (in Online Panel Research: A Data Quality Perspective) | 1 | |
Does It Pay Off to Include Non-Internet Households in an Internet Panel? | 1 | |
Does the Inclusion of Non-Internet Households in a Web Panel Reduce Coverage Bias? | 1 | |
Effects of Lotteries on Response Behavior in Online Panels | 1 | |
Efficiency of Different Recruitment Strategies for Web Panels | 1 | |
Estimating the effects of nonresponses in online panels through imputation (in Online Panel Research: A Data Quality Perspective) | 1 | |
Evaluation of an Adapted Design in a Multi-device Online Panel: A DemoSCOPE Case Study | 1 | |
Evaluation of an online (opt-in) panel for public participation geographic information systems surveys | 1 | |
How Do Lotteries and Study Results Influence Response Behavior in Online Panels? | 1 | |
How Representative Are Online Panels? Problems of Coverage and Selection and Possible Solutions (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Improving Response Rates and Questionnaire Design for Mobile Web Surveys | 1 | |
Improving Survey Response Rates in Online Panels: Effects of Low-Cost Incentives and Cost-Free Text Appeal Interventions | 1 | |
Improving web survey quality: Potentials and constraints of propensity score adjustments (in Online Panel Research: A Data Quality Perspective) | 1 | |
Informing panel members about study results: Effects of traditional and innovative forms of feedback on participation (in Online Panel Research: A Data Quality Perspective) | 1 | |
Internet panels, professional respondents, and data quality | 19 | |
Lotteries and study results in market research online panels | 1 | |
Making Mobile Browser Surveys Smarter Results from a Randomized Experiment Comparing Online Surveys Completed via Computer or Smartphone | 1 | |
Measurement invariance and quality of composite scores in a face-to-face and a web survey | 1 | |
Measuring Attitudes Toward Controversial Issues in Internet Surveys: Order Effects of Open and Closed Questioning (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Mobile Response in Web Panels | 2 | |
Mode System Effects in an Online Panel Study: Comparing a Probability-based Online Panel with two Face-to-Face Reference Surveys | 1 | |
Motives for joining nonprobability online panels and their association with survey participation behavior (in Online Panel Research: A Data Quality Perspective) | 1 | |
Nonprobability Web Surveys to Measure Sexual Behaviors and Attitudes in the General Population: A Comparison With a Probability Sample Interview Survey | 4 | |
Nonresponse and attrition in a probability-based online panel for the general population (in Online Panel Research: A Data Quality Perspective) | 1 | |
Nonresponse and measurement error in an online panel: Does additional effort to recruit reluctant respondents result in poorer quality data? (in Online Panel Research: A Data Quality Perspective) | 1 | |
Online panels and validity: Representativeness and attrition in the Finnish eOpinion panel (in Online Panel Research: A Data Quality Perspective) | 1 | |
Panel Attrition - Separating Stayers, Fast Attriters, Gradual Attriters, and Lurkers | 1 | |
Panel Conditioning in Difficult Attitudinal Questions | 2 | |
Professional respondents in nonprobability online panels (in Online Panel Research: A Data Quality Perspective) | 1 | |
Recruiting A Probability Sample For An Online Panel: Effects Of Contact Mode, Incentives, And Information | 1 | |
Recruiting an Internet Panel Using Respondent-Driven Sampling | 1 | |
Respondent Conditioning in Online Panel Surveys: Results of Two Field Experiments | 1 | |
Respondent Screening and Revealed Preference Axioms: Testing Quarantining Methods for Enhanced Data Quality in Web Panel Survey | 1 | |
Response Behavior in an Adaptive Survey Design for the Setting-Up Stage of a Probability-Based Access Panel in Germany (in Improving survey methods) | 1 | |
Sample composition discrepancies in different stages of a probability-based online panel | 1 | |
Selection bias of internet panel surveys: A comparison with a paper-based survey and national governmental statistics in Japan | 1 | |
Sensitive topics in PC Web and mobile web surveys: Is there a difference? | 1 | |
Setting Up an Online Panel Representative of the General Population The German Internet Panel | 1 | |
Straightlining in Web survey panels over time | 1 | |
Survey Mode Effects on Data Quality: Comparison of Web and Mail Modes in a U.S. National Panel Survey | 1 | |
Survey Participation in a Probability-Based Internet Panel in the Netherlands (in Improving survey methods) | 1 | |
Surveying Rare Populations Using a Probability based Online Panel | 1 | |
The Access Panel of German Official Statistics as a Selection Frame (in Improving survey methods) | 1 | |
The Design of Grids in Web Surveys | 2 | |
The Effects of Questionnaire Completion Using Mobile Devices on Data Quality. Evidence from a Probability-based General Population Panel | 1 | |
The Utility of an Online Convenience Panel for Reaching Rare and Dispersed Populations | 1 | |
The comparison of road safety survey answers between web-panel and face-to-face; Dutch results of SARTRE-4 survey | 1 | |
The impact of speeding on data quality in nonprobability and freshly recruited probability-based online panels (in Online Panel Research: A Data Quality Perspective) | 2 | |
The relationship between nonresponse strategies and measurement error: Comparing online panel surveys to traditional surveys (in Online Panel Research: A Data Quality Perspective) | 2 | |
The role of topic interest and topic salience in online panel web surveys. | 1 | |
The untold story of multi-mode (online and mail) consumer panels: From optimal recruitment to retention and attrition (in Online Panel Research: A Data Quality Perspective) | 1 | |
The use of Pcs, smartphones and tablets in a probability based panel survey. Effects on survey measurement error. | 1 | |
Using Interactive Features to Motivate and Probe Responses to Open-Ended Questions (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Validating respondents’ identity in online samples: The impact of efforts to eliminate fraudulent respondents (in Online Panel Research: A Data Quality Perspective) | 1 | |
What Happens if You Offer a Mobile Option to Your Web Panel? Evidence From a Probability-Based Panel of Internet Users | 1 | |
What do web survey panel respondents answer when asked "Do you have any other comment?" | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 |
EDITOR Editor of the reference
Editor of the reference
Value | Frequency | |
---|---|---|
Annals of Internal Medicine 162 (10) | 1 | |
Asia-Pacific Journal of Public Health | 1 | |
Center for Crime and Justice Policy, CCJP 1 | 3 | |
Field Methods, 26, 4 | 1 | |
Field Methods, 27, 4. pp. 391-408 | 1 | |
Field Methods, Published online before print February 21, 2013 | 1 | |
Field Methods, Published online before print January 29, 2013 | 1 | |
International Gambling Studies | 1 | |
International Journal of Internet Science, 8, 1, p. 17-29 | 1 | |
International Journal of Market Research, 55, 1, pp. 59-80 | 1 | |
International Journal of Market Research, 55, 5, pp. 611-616 | 1 | |
International Journal of Market Research, 57, 3, pp. 395-412 | 1 | |
International Journal of Public Opinion Research, 24, 2, pp. 238-249 | 1 | |
International Journal of Public Opinion Research, 24, 4, pp. 534-545 | 1 | |
International Journal of Public Opinion Research, 25, 2, pp. 242-253 | 1 | |
JMIR Publications, 15, 11 | 7 | |
Journal of Business Research, 69, 8, pp. 3139-3148 | 2 | |
Journal of Environmental Planning and Management | 1 | |
Journal of Medical Internet Research, 16, 12, e276 | 4 | |
Journal of Official Statistics, 30, 2, pp. 291-310 | 1 | |
Journal of Safety Research, 46, pp. 13-20 | 1 | |
Methodology, 11, 3, pp. 81-88 | 19 | |
PLOS one, 10, 12 | 1 | |
Public Opinion Quarterly (POQ), 76, 3, pp. 470-490 | 1 | |
Public Opinion Quarterly (POQ), 79, 3, pp. 687-709 | 1 | |
Public Opinion Quarterly (POQ), First published online: November 14, 2014 | 1 | |
Public Opinion Quarterly (POQ), First published online: September 16, 2013 (77, 3, pp. 783-797) | 2 | |
Routledge | 10 | |
Social Science Computer Review, 30, 2, pp. 212-228 | 1 | |
Social Science Computer Review, 31, 3, p. 322-345 | 2 | |
Social Science Computer Review, 31, 3, pp. 371-385 | 1 | |
Social Science Computer Review, 31, 4, p. 482-504 | 1 | |
Social Science Computer Review, 31, 6, p. 725-743 | 1 | |
Social Science Computer Review, 32 no. 2, pp. 238-255 | 1 | |
Social Science Computer Review, 32, 4, 544-560 | 1 | |
Social Science Computer Review, 32, 6, pp. 728-742 | 2 | |
Social Science Computer Review, 34, 1, 2016, pp. 8-25, Published online before print March 31, 2015 | 4 | |
Social Science Computer Review, 34, 1, pp. 41-58 | 1 | |
Social Science Computer Review, 34, 1, pp. 99-115 | 1 | |
Social Science Computer Review, 34, 2, pp. 229-243, Published online before print December 17, 2014 | 1 | |
Social Science Computer Review, vol. 34, 1: pp. 78-94. First Published February 26, 2015. | 1 | |
Sociological Methods & Research | 1 | |
Survey Methods: Insights from the field | 2 | |
Survey Practice, 5, 3 | 1 | |
Survey Research Methods, 7, 1, pp. 17-28 | 1 | |
Survey Research Methods, 7, 3 | 1 | |
Survey Research Methods, 9, 2, pp. 125-137 | 1 | |
Wiley | 17 | |
methods, data, analyses | 1 | |
methods, data, analyses, 9, 1, pp.87-110 | 1 | |
methods, data, analyses, 9, 2, pp. 185-212 | 1 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
AUTHOR Author of the reference
Author of the reference
Value | Frequency | |
---|---|---|
Arn, B.; Klug, S.; Kolodziejski, J. | 1 | |
Avendano, M., Scherpenzeel, A. C., Mackenbach, J. P. | 1 | |
Baker, R., Miller, C., Kachhi, D., Lange, K., Wilding-Brown, L., Tucker, J. | 1 | |
Binswanger, J., Schunk, D., Toepoel, V. | 2 | |
Blom, A. G., Bosnjak, M., Cornilleau, A., Cousteaux, A-S., Das, M., Douhou, S., Krieger, U. | 4 | |
Blom, A. G.; Gathmann, C.; Krieger, U. | 1 | |
Bonnichsen, O.; Boye Olsen, S. | 1 | |
Bosnjak, M., Haas, I., Galesic, M., Kaczmirek, L., Bandilla, W., Couper, M. P. | 1 | |
Bosnjak, M.; Struminskaya, B.; Weyandt, K. | 1 | |
Brown, G., Weber, D., Zanon, D., de Bie, K. | 1 | |
Buskirk, T. D., Andrus, C. | 1 | |
Cella, D., Craig, B. M., Hays, R. D., Pickard, A. S., Reeve, B. B., Revicki, D. A. | 7 | |
Cho, Mildred K., David Magnus, Melissa Constantine, Sandra Soo-Jin Lee, Maureen Kelley, Stephanie Alessi, Diane Korngiebel, et al. | 1 | |
Couper, M. P., Tourangeau, R., Conrad, F. G., Zhang, C. | 2 | |
Drewes, F. | 1 | |
Eckman, S. | 1 | |
Enderle, T., and Münnich, R. | 1 | |
Engel, U. | 1 | |
Erens, B.; Burkill, S.; Couper, M. P.; Conrad, F.; Clifton, S.; Tanton, C.; Phelps, A.; Datta, J.; Mercer, C. H.; Sonnenberg, P.; Prah, P.; Mitchell, K. R.; Wellings, K.; Johnson, A. M.; Copas, A. | 4 | 
Ester, P., Vinken, H. | 1 | |
Goeritz, A., Luthe, S. C. | 3 | |
Golden, L.; Albaum, G.; Roster, C. A.; Smith, S. M. | 2 | |
Goldenbeld, C., de Craen, S. | 1 | |
Greszki, R., Meyer, M., Schoen, H. | 2 | |
Grönlund, K., Strandberg, K. | 1 | |
Göritz, A. S. | 1 | |
Hansen, K. M., Pedersen, R. T. | 1 | |
Heen, Miliaikeala SJ, Joel D. Lieberman, and Terance D. Miethe | 3 | |
Hillygus, D. S., Jackson, N., Young, M. | 1 | |
Jones, M. S.; House, L. A.; Zhifeng, G. | 1 | |
Kaczmirek, L. | 1 | |
Keusch, F. | 1 | |
Keusch, F., Batinic, B., Mayerhofer, W. | 1 | |
Lee, C.-K.; Back, K.-J.; Williams, Ro. J.; Ahn, S.-S. | 1 | |
Leenheer, J., Scherpenzeel, A. | 1 | |
Lugtig, P. J. | 1 | |
Lugtig, P., Das, M., Scherpenzeel, A. | 1 | |
Malhotra, N., Miller, J. M., Wedeking, J. | 2 | |
Matthijsse, S.M., de Leeuw, E.D., & Hox, J.J. | 19 | |
Mavletova, A. M. | 1 | |
Mavletova, A. M., Couper, M. P. | 1 | |
McCutcheon, A. L., Rao, K., Kaminska, O. | 1 | |
Oudejans, M., Christian, L. M. | 1 | 
Pedersen, M. J.; Nielsen, C. V. | 1 | |
Peugh, J., Wright, G. | 1 | |
Rendtel, U., and Amarov, B. | 1 | |
Revilla, M. | 1 | |
Revilla, M., Saris, W. E. | 1 | |
Revilla, M.; Saris, W. E.; Loewe, G.; Ochoa, C. | 1 | |
Roberts, C., Allum, N., Sturgis, P. | 1 | |
Scherpenzeel, A. C. | 1 | |
Scherpenzeel, A. C., Bethlehem, J. G. | 1 | |
Scherpenzeel, A., Toepoel, V. | 2 | |
Schonlau, M. | 2 | |
Schonlau, M., Weidmer, B., Kapteyn, A. | 1 | |
Sell, R.; Goldberg, S.; Conron, K. | 1 | |
Shin, E., Johnson, T. P., Rao, K. | 1 | |
Steinmetz, S., Bianchi, A., Tijdens, K., Biffignandi, S. | 1 | |
Struminskaya, B. | 1 | |
Struminskaya, B., Kaczmirek, L., Schaurer, I., Bandilla W. | 1 | |
Struminskaya, B.; de Leeuw, E. D.; Kaczmirek, L. | 1 | |
Toepoel, V., Lugtig, P. J. | 1 | |
Toepoel, V.; Lugtig, P. J. | 1 | |
Toepoel, V.; Schonlau, M. | 1 | |
Tsuboi, S., Yoshida, H., Ae, R., Kojo, T., Nakamura, Y., Kitamura, K. | 1 | |
Vis, C. M., Marchand, M. A. G. | 1 | |
Wells, T., Bailey, J., Link, M. W. | 1 | |
Zhang, W. | 1 | |
de Bruijne, M., Wijnant, A. | 4 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 |
YEAR Year of the reference
Year of the reference
Value 1772 | Frequency | |
---|---|---|
2011 | 6 | |
2012 | 7 | |
2013 | 22 | |
2014 | 36 | |
2015 | 33 | |
2016 | 9 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 2011 to 2016
COUNT_ST Country of the study
Country of the study
Value 1871 | Frequency | |
---|---|---|
Australia | 1 | |
Austria | 2 | |
Denmark | 3 | |
Finland | 1 | |
Germany | 14 | |
Germany and USA | 2 | |
Japan | 1 | |
Russia | 2 | |
South Korea | 1 | |
Spain | 1 | |
Switzerland | 1 | |
The Netherlands | 47 | |
The Netherlands, Germany and France | 4 | |
USA | 29 | |
United Kingdom | 4 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
TYPE_RES Type of resource
Type of resource
Value 1970 | Frequency | |
---|---|---|
Book section, Edited book | 27 | |
Journal article | 86 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
PAN_NAME Name of the panel
Name of the panel
Value 2069 | Frequency | |
---|---|---|
ANES Panel Study of the National Science Foundation | 3 | |
American Life Panel (ALP) | 1 | |
Authentic Response panel | 1 | |
CentERpanel | 4 | |
DemoSCOPE panel | 1 | |
ELIPSS panel | 1 | |
Embrain online panel | 1 | |
GESIS Online Panel Pilot (GOPP) | 3 | |
GESIS panel | 3 | |
GLES panel of the German National Science Foundation | 1 | |
Gallup Panel | 2 | |
German Internet Panel (GIP) | 2 | |
Google Android Panel | 1 | |
Harris Interactive AG (German section) | 1 | |
Immigrant panel | 1 | |
Knowledge Networks Panel | 4 | |
Kompas Kommunikation panel | 1 | |
LISS panel | 21 | |
MarketResponse (SAMR) panel | 1 | |
Mechanical Turk panel | 2 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 1 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 10 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 11 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 12 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 13 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 14 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 15 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 16 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 17 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 18 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 19 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 2 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 3 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 4 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 5 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 6 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 7 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 8 | 1 | |
Netherlands Online Panel Comparison study (NOPVO) - commercial panel 9 | 1 | |
Netquest panel | 1 | |
Online Market Intelligence (OMI) volunteer online access panel | 2 | |
Opinions Online panel | 2 | |
Priority Programme on Survey Methodology access panel (PPSM) of the German Research Foundation | 1 | |
Qualtrics panel | 2 | |
Survey Monkey panel | 1 | |
Survey Sampling International (SSI) Panel | 2 | |
TNS Gallup A/S | 1 | |
Turner Research Network panel | 1 | |
WiSo-Panel | 3 | |
YouGov panel | 1 | |
a German commercial market research panel | 1 | |
a German household access panel recruited by the German Federal Statistical Office (Destatis) | 2 | |
a Japanese commercial research agency panel | 1 | |
a US web panel firm | 1 | |
an Australian panel by Newspoll Market Research and Lightspeed Research Australia | 1 | |
eOpinion panel of the Abo Akademi University | 1 | |
panel vendor 1 | 1 | |
panel vendor 2 | 1 | |
panel vendor 3 | 1 | |
panel vendor 4 | 1 | |
panel vendor 5 | 1 | |
panel vendor 6 | 1 | |
panel vendor 7 | 1 | |
the online panel of the performing research bureau Motivaction | 1 | |
uSamp's panel + other panels and river samples | 1 | |
volunteer web panel 1 | 1 | |
volunteer web panel 2 | 1 | |
volunteer web panel 3 | 1 | |
volunteer web panel 4 | 1 | |
na | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
111 | 2 |
UNIQ_PAN Dummy variable that identifies unique/duplicated panels
Dummy variable that identifies unique/duplicated panels
Value 2168 | Frequency | |
---|---|---|
0 | no | 42 |
1 | yes | 69 |
99 | not applicable | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
111 | 2 |
Valid range from 0 to 1
TP_COVER Type of panel - geographic coverage of the panel
Type of panel - geographic coverage of the panel
Value 2267 | Frequency | |
---|---|---|
1 | national | 98 |
2 | international | 13 |
99 | not applicable | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
111 | 2 |
Valid range from 1 to 2
TP_COMP Type of panel - membership composition
Type of panel - membership composition
Value 2366 | Frequency | |
---|---|---|
general population | 72 | |
general population (LISS panel) | 1 | |
general population (access panel) | 4 | |
general population (commercial panel) | 6 | |
general population (commercial panel) + other panels and river samples | 1 | |
general population (commercial panel) + river sample of Internet users | 1 | |
general population (consumer panel) | 4 | |
general population (national opt-in commercial panel implemented by a statutory authority responsible for national park planning and management) | 1 | |
general population (opt-in panel) | 3 | |
general population (the CentERpanel) | 1 | |
general population (the LISS panel) | 1 | |
general population (university-based nonprofit panel) | 3 | |
general population (volunteer access panel) | 2 | |
general population (volunteer panel maintained by a communication agency) | 1 | |
general population (volunteer panel) | 7 | |
proprietary | 2 | |
specialty | 1 | |
na | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
111 | 2 |
TP_COMPN Type of panel - membership composition (recode)
Type of panel - membership composition (recode)
Value 2465 | Frequency | |
---|---|---|
1 | general population | 108 |
2 | specialty | 1 |
3 | proprietary | 2 |
99 | not applicable | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
111 | 2 |
Valid range from 1 to 3
TP_FIELD Type of panel - field in which the panel is established
Type of panel - field in which the panel is established
Value 2564 | Frequency | |
---|---|---|
1 | commercial | 66 |
2 | research, non-commercial | 6 |
3 | academic | 39 |
99 | not applicable | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
111 | 2 |
Valid range from 1 to 3
TP_RECR Type of panel - recruitment
Type of panel - recruitment
Value 2663 | Frequency | |
---|---|---|
non-probability | 59 | |
non-probability (a sample generated using Mechanical Turk) | 1 | |
probability | 50 | |
probability (three-stage area sample) | 1 | |
na | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
111 | 2 |
TP_RECRN Type of panel - recruitment (recode)
Type of panel - recruitment (recode)
Value 2762 | Frequency | |
---|---|---|
1 | probability | 51 |
2 | non-probability | 60 |
99 | not applicable | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
111 | 2 |
Valid range from 1 to 2
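As an illustration of how the TP_RECRN recode relates to the free-text TP_RECR values listed above, the following is a minimal sketch in Python/pandas (an assumption; the tool actually used for coding is not documented here). The DataFrame `df`, the helper `recode_recruitment`, and the miniature example values are hypothetical.

```python
import pandas as pd

# Hypothetical miniature of the panel file: TP_RECR holds the free-text
# recruitment description, TP_RECRN the numeric recode documented above.
df = pd.DataFrame({
    "TP_RECR": [
        "probability",
        "non-probability",
        "probability (three-stage area sample)",
        "na",
    ],
})

def recode_recruitment(value: str) -> int:
    """Map a TP_RECR string onto the TP_RECRN codes used in this codebook."""
    if value == "na":
        return 99                     # not applicable
    if value.startswith("non-probability"):
        return 2                      # non-probability recruitment
    if value.startswith("probability"):
        return 1                      # probability recruitment
    raise ValueError(f"unexpected TP_RECR value: {value!r}")

df["TP_RECRN"] = df["TP_RECR"].map(recode_recruitment)
print(df[["TP_RECR", "TP_RECRN"]])
```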
TP_SAMP Type of panel - sampling
Type of panel - sampling
Value 2861 | Frequency | |
---|---|---|
non-probability sampling | 54 | |
probability sampling | 49 | |
probability sampling (ANES Panel Study) | 1 | |
probability sampling (GfK/Knowledge Networks - address based) | 1 | |
probability sampling (GfK/Knowledge Networks) | 1 | |
probability sampling (online panel - ANES Panel Study) | 1 | |
propensity sampling | 1 | |
quota sampling (a German non-probability online panel - GLES) | 1 | |
stratified sampling (by gender and age profile of the mobile web population) | 2 | |
na | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
111 | 2 |
TP_SAMPN Type of panel - sampling (recode)
Type of panel - sampling (recode)
Value 2960 | Frequency | |
---|---|---|
1 | probability sampling | 55 |
2 | non-probability sampling | 56 |
99 | not applicable | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
111 | 2 |
Valid range from 1 to 2
CP_SIZE Size of the panel
Size of the panel
Value 3059 | Frequency | |
---|---|---|
1026 (ELIPSS) | 1 | |
106878 (panel size=valid e-mails) | 1 | |
1142 (all panelists from the CentERpanel) | 1 | |
11599 (recruitment stage) | 1 | |
13000 | 1 | |
135000 | 1 | |
144411 | 1 | |
1602 (GIP) | 1 | |
1603 | 1 | |
1665 | 2 | |
17677 households | 2 | |
2000 (all panelists from the CentERpanel) | 1 | |
2164 (ANES Panel Study) | 1 | |
25221 | 1 | |
2722 | 2 | |
2892 | 1 | |
30000 | 1 | |
3045 | 1 | |
35000 | 1 | |
4888 (GESIS) | 1 | |
490000 | 1 | |
5000 households | 1 | |
50000 | 1 | |
5142 households | 1 | |
567591 (survey requests across all 204 studies) | 1 | |
6000 | 1 | |
6162 | 1 | |
65000 (GLES) | 1 | |
8000 | 7 | |
80000 | 1 | |
8089 | 1 | |
8148 | 1 | |
8781 | 1 | |
8849 (LISS) | 1 | |
9761 | 1 | |
between 6000 and 10000 | 1 | |
more than 450000 (panel size for all the countries involved, not only Spain) | 1 | |
- | 64 | |
na | 3 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
46 | 67 |
CP_SIZER Size of the panel (classes of values)
Size of the panel (classes of values)
Value 3158 | Frequency | |
---|---|---|
1 | 1000-3500 | 12 |
2 | 3501-5999 | 1 |
3 | 6000-10000 | 17 |
4 | 10001-65000 | 7 |
5 | 65001-145000 | 4 |
6 | 145001-449999 | 0 |
7 | 450000-490000 | 2 |
99 | missing | 70 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
43 | 70 |
Valid range from 1 to 7
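The CP_SIZER classes can be reproduced from numeric panel sizes with a simple binning step. Below is a minimal sketch assuming pandas and the class boundaries documented above; the series `size` and the prior parsing of CP_SIZE into numbers are hypothetical, since CP_SIZE is stored as annotated text in this file.

```python
import numpy as np
import pandas as pd

# Hypothetical numeric panel sizes; in this file CP_SIZE is stored as annotated
# text (e.g. "8849 (LISS)"), so the numbers would first have to be parsed out.
size = pd.Series([1026, 8000, 144411, 490000, np.nan])

# Class boundaries exactly as documented for CP_SIZER (codes 1-7, 99 = missing).
bins = [999, 3500, 5999, 10000, 65000, 145000, 449999, 490000]
cp_sizer = pd.cut(size, bins=bins, labels=[1, 2, 3, 4, 5, 6, 7])
cp_sizer = cp_sizer.cat.add_categories([99]).fillna(99).astype(int)
print(cp_sizer.tolist())   # [1, 3, 5, 7, 99]
```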
CP_POP Target population
Target population
Value 3257 | Frequency | |
---|---|---|
American Jewish population (=rare population) | 1 | |
Internet users aged 18 and older | 3 | |
US adults | 3 | |
car drivers, motorcyclists or other road users | 1 | |
different walks of life (i.e. employees, unknown employment status, temporary workers, and students) | 1 | |
general population | 54 | |
grocery shoppers who have purchased fresh blueberries in the last year | 1 | |
market research company's clients | 1 | |
mobile web population | 2 | |
people aged 14+ and users of smartphones and tablets | 1 | |
people aged 16+ who use a smartphone with an Internet connection | 1 | |
people aged 18 and older | 10 | |
people aged 18 and older with access to the Internet and in possession of a smartphone | 1 | |
people aged 18+ and entitled to vote for the German Federal Parliament | 1 | |
people aged between 18 and 65 | 19 | |
people from all walks of life | 3 | |
residents in regional Victoria or Melbourne, who visited at least one of the nine specific state or national parks in the study region within the last 12 months | 1 | |
smartphone owners | 1 | |
target population of the tested area and a panel sample | 1 | |
users of many websites | 1 | |
users of the Google Opinion Rewards application who have smartphones operated by Google’s Android operating system | 1 | |
visitors of a large variety of websites | 1 | |
visitors of popular websites | 1 | |
visitors of the agency web page and users of affiliated programs | 1 | |
na | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
111 | 2 |
PDQ_CGS Panel data quality - comparison of point estimates with the gold standard (variables)
Panel data quality - comparison of point estimates with the gold standard (variables)
Value 3356 | Frequency | |
---|---|---|
WageIndicator Survey (LW) data, i.e., mean wage, socio-demographics, and wage-related covariates compared with LISS panel data and Statistics Netherlands | 1 | |
age, gender, education, and personality traits compared with ALLBUS national CAPI survey | 1 | |
attitudinal (respondents’ assessment of the current economic situation in Germany and the economic situation in one year, the assessment of respondents’ own financial situation and prospective financial situation in one year, general health, religios | 1 | |
date of birth, gender, race/ethnicity, socioeconomic status, health status, and geographic location compared with US Census | 7 | |
demographic (gender, age, education, legal marital status, employment, and immigration background) and attitudinal variables (political interest, satisfaction with the government, generalized trust, self-rated health status, assessment of the state o | 1 | |
demographics, and residential characteristics compared with US Census | 3 | |
demographics, attitudinal variables (political interest, satisfaction with democracy, and social and institutional trust), and voting behavior compared with a face-to-face national survey (FNES) and a telephone survey; political participation and act | 1 | |
demographics, voting behavior, and Internet access of the LISS panel members compared with the Dutch population | 1 | |
gender identity and sex questions, ethnicity, and race compared with two population-based surveys (NHIS and NESARC) | 1 | |
male, median age, over 65 years old, household composition and size, home-owner, urbanicity, and voting behavior compared with register data from Statistics Netherlands | 1 | |
satisfaction, and trust in institutions compared with ESS using MTMM approach | 1 | 
sex, age, education, marriage, living region, and living place compared with National Statistics (GS) and with a paper-based survey | 1 | |
socio-demographic variables, urbanization region, ethnical background, and voting behavior compared with the Dutch population | 1 | |
socio-demographics, and opinion and behavior questions compared with Natsal-3 and external benchmark data (the UK population census, the ONS Integrated Household Survey - IHS, and the National Travel Survey) | 4 | 
socio-economic variables compared with the target population of the tested area | 1 | |
time spent on different media, satisfaction, political orientation, social and political trust, and left-right orientation compared with ESS using a Split-Ballot-MTMM approach | 1 | |
urbanicity, region, sex, age, household type, and unemployment rate, level of education, purchasing power, and immigration compared with street-level data from a commercial provider and population statistics from the 2011 census | 1 | |
na | 85 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
28 | 85 |
PDQ_CGSV Panel data quality - comparison of point estimates with the gold standard (recoded variables)
Panel data quality - comparison of point estimates with the gold standard (recoded variables)
Value 3455 | Frequency | |
---|---|---|
attitudinal variables | 1 | |
attitudinal variables, and use of media | 1 | |
gender identity and sex questions, ethnicity, and race | 1 | |
socio-demographics, and attitudinal and behavioral variables | 4 | 
socio-demographics, Internet access, and voting behavior | 1 | |
socio-demographics, and attitudinal variables | 1 | |
socio-demographics, and geographic location | 1 | |
socio-demographics, and personality traits | 1 | |
socio-demographics, and residential characteristics | 3 | |
socio-demographics, and wage-related variables | 1 | |
socio-demographics, attitudinal variables, Internet access and use, political participation, and voting behavior | 1 | |
socio-demographics, attitudinal variables, religious confession, and residential characteristics | 1 | |
socio-demographics, health status, and geographic location | 7 | |
socio-demographics, residential characteristics, urbanicity, and voting behavior | 1 | |
socio-demographics, urbanicity, and geographic location | 1 | |
socio-demographics, urbanicity, and voting behavior | 1 | |
socio-economic variables | 1 | |
na | 85 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
28 | 85 |
PDQ_CGSN Panel data quality - comparison of point estimates with the gold standard (dummy)
Panel data quality - comparison of point estimates with the gold standard (dummy)
Value 3554 | Frequency | |
---|---|---|
0 | no | 85 |
1 | yes | 28 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
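PDQ_CGSN and the analogous PDQ_* dummy variables that follow are all constructed on the same pattern: 1 where the detailed text variable contains a coded description, 0 where it is "na". A minimal sketch, assuming pandas; the series `pdq_cgs` holds hypothetical excerpts of the documented values.

```python
import pandas as pd

# Hypothetical excerpt of PDQ_CGS: a detailed description is present for studies
# that compared point estimates with a gold standard, and "na" otherwise.
pdq_cgs = pd.Series([
    "age, gender, education, and personality traits compared with ALLBUS national CAPI survey",
    "na",
    "demographics, and residential characteristics compared with US Census",
])

# PDQ_CGSN as documented: 1 = yes (a comparison was coded), 0 = no ("na").
pdq_cgsn = (pdq_cgs != "na").astype(int)
print(pdq_cgsn.tolist())   # [1, 0, 1]
```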
PDQ_COM Panel data quality - comparison of point estimates with another mode of data collection (variables)
Panel data quality - comparison of point estimates with another mode of data collection (variables)
Value 3653 | Frequency | |
---|---|---|
GLES data, i.e., distribution of speeders, speeding*age, speeding*education, speeding*evaluation (and its determinants) of Merkel's or Bush's handling of the economy, and speeding*turnout intention compared with ANES data | 2 | |
age, level of formal education, income, general reason for visiting parks, and self-reported knowledge of parks in the region compared with self-selected public; indicators of mapping effort and data usability compared with self-selected public, rand | 1 | |
demographics, socio-economic variables, and co-morbidities compared with a RDD survey | 1 | |
demographics, voting behavior, and Internet access of the LISS panel members compared with a traditional national survey (the Dutch Parliamentary Electoral Study), with an online survey (self-selected sample), and with samples from 19 online panels | 1 | |
gender, age, date of birth, education, State, ZIP code, annual household income, and response time between the 7 panel vendors surveys | 7 | |
gender, age, race, home, urbanity, education, income, employment, marital status, and length of stay in panel compared with the mail survey mode; the likelihood of completing the survey by mode (logistic models); the effects of mode on item nonrespon | 1 | |
proportion of Jews with Jewish denominational affiliations compared with an RDD survey and two opt-in surveys | 1 | |
questions about traffic behaviors, about agreement with existing road safety measures, about agreement with new car safety provisions, and about personal willingness to reduce car usage for a cleaner environment compared with a probability-based face- | 1 | 
respondent quality (median annual household income, demographics, mean number of panels belonged to, average number of surveys completed per week), opinions about economy/well-being of the country/quality of life/personal relationships/use of social | 2 | |
to compare the effect of nonresponse strategies on satisficing between the Internet and telephone modes | 1 | |
trust, and attitude toward immigration compared with a face-to-face survey (ESS) | 1 | |
na | 94 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
19 | 94 |
PDQ_COMN Panel data quality - comparison of point estimates with another mode of data collection (dummy)
Panel data quality - comparison of point estimates with another mode of data collection (dummy)
Value 3752 | Frequency | |
---|---|---|
0 | no | 94 |
1 | yes | 19 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
PDQ_REL Panel data quality - relationships among variables
Panel data quality - relationships among variables
Value 3851 | Frequency | |
---|---|---|
age*education*gender*urbanization*income*social class*household composition*household size*use of mobile for online survey completion | 1 | |
age*ethnicity*hours a week online*privacy online*agree/disagree questions related to privacy*provided/refused personally identifying information | 1 | |
age*gender*race*education*income*metrics for fraudulence identification (low-probability screening questions)*inattentiveness (straightlining, speeding, inconsistency of age reporting, and trap questions)*survey version (standard probit model); trap | 1 | |
age*gender*smartphone ownership; (the same first two variables)*survey access via mobile devices; (the same first two variables)*private Internet access via smartphone; (the same first two variables)*location for private Internet access via smartphon | 1 | |
age*origin*education*housing*need for a simPC and a broadband connection; (the same first 4 variables)*device on loan or not*gender*position in the household*terminated panel participation (=percentage of fully completed questionnaires among those on | 1 | |
demographics*opinion and policy preference (38 items) | 1 | |
demographics*self-rated health status*life satisfaction (regression model for each of the 3 surveys) | 1 | |
design effect*device type; age*online survey components ("Appealing visual design", and "That smartphones and tablets can be used to take part in the study")*smartphone user or not | 1 | |
device type*age*gender*education*nationality*living alone*in paid work*online survey experience*indicators of NRE and ME | 1 | |
device used in month*device used in subsequent month (=device switch) | 1 | |
distribution of speeders; speeding*age; speeding*education; speeding*evaluation (and its determinants) of Merkel's or Bush's handling of the economy (also excluding page-specific speeders from the analysis); speeding*turnout intention (also excluding | 2 | |
experimental condition*gender*education*age*employed or not*married or not*Internet use*4 questions*questionnaire was interesting*response; (the same 9 variables)*number of words provided to the 4 open-ended questions | 1 | |
experimental conditions*item nonresponse; experimental conditions*interactive terms (among each condition)*item nonresponse; experimental conditions*interactive terms (among each condition)*age*education*SIMpc*item nonresponse; the same three models | 1 | |
gender*age*Region*5 treatments*response rate | 1 | |
gender*age*education*race/ethnicity*family income*region*marital status*arm 4 and arm 5 (seeds and recruits) | 1 | |
gender*age*education*working status*trust*political interest*self-assessed health*survey participation previous 12 months*evaluation questionnaire 1*incentive*unconditioned/conditioned group; knowledge questions about nuclear power plants ("don't kno | 1 | |
gender*age*education*working status*urbanization*household composition*type of device; (the same first 6 variables)*early/late adopter of new technology (to predict unintended mobile responding); (the same first 6 variables)*smartphone usage charact | 2 | |
low/high-frequency scale (3 questions)*survey mode (experiment 1); closed-ended/half-open "Other" category (3 questions)*survey mode (experiment 2); small/large text box (3 questions)*survey mode (experiment 3); alphabetized/randomized response li | 1 | 
mean age*gender*political attitude*political interest*percentage of final recruitment*respondents who have completed the entire recruitment questionnaire | 1 | |
mean responses to 26 items*experimental condition (ANOVA); subjective and objective length of the questionnaire*experimental condition (ANOVA); questionnaire evaluation (5 questions)*experimental condition (ANOVA); location at the time of survey comp | 1 | 
notification and permission preferences for research on medical practices (no notification, general information, discussion plus verbal permission, discussion plus written permission)*research scenario (medical record review, randomization - hypertensi | 1 | 
one-person household*average age*ethnical background*income*urbanization region*highest educational degree*Internet/non-Internet household | 1 | |
open-ended question (coded reasons for optimism/pessimism)*degree of optimism/pessimism about the future (5-point scale); open-ended question (self-mentioned societal issues)*offered societal issues (6 of the 21 items) | 1 | |
sensitive indices (positive attitude towards deviant practices, deviant behavior, monthly alcohol-related behavior, daily alcohol consumption, and household income)*survey mode; context variables (place where the respondent filled in the questionnair | 1 | |
sex*age*education*invitation mode*response rate | 1 | 
sex*age*education*frequency of Internet use*panel tenure*openness for experience*lottery/no lottery*splitting the payout*outcome variables (nondifferentiation, item nonresponse, and completion) | 1 | |
sex*age*highly educated individuals; sex*age*heavy drinkers; sex*age*habitual smokers | 1 | |
sex*age*living with partner or not*education*income*biomarkers (blood cholesterol, salivary cortisol, and waist circumference) | 1 | |
sexual orientation*gender*ethnicity; sexual orientation*gender*race; transgender status*ethnicity; transgender status*race | 1 | |
socio-demographics*willingness-to-pay (model with and without correlation parameter) | 1 | |
survey topic*topic salience*personal interest*membership tenure*accepted survey invitations during the last year*number of other online panels enrolled*gender*education*age*full participation; (the same variables)*speeding; (the same variables)*strai | 1 | |
na | 80 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
33 | 80 |
PDQ_RELN Panel data quality - relationships among variables (dummy)
Panel data quality - relationships among variables (dummy)
Value 3950 | Frequency | |
---|---|---|
0 | no | 80 |
1 | yes | 33 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
PDQ_WEI Panel data quality - weighting techniques
Panel data quality - weighting techniques
Value 4049 | Frequency | |
---|---|---|
design weights, post-stratification weights (age, gender, and education based on unweighted data from ALLBUS) | 1 | |
multiple weights (i.e., Microcensus sample probability, participation and continuation in the access panel, DE-SILC sample probability, and participation in DE-SILC); large-scale Monte Carlo simulation study: comparison of two estimation techniques ( | 1 | |
post hoc weighting for age*gender | 1 | |
post-hoc weights according to language, gender, and age | 1 | |
post-stratification using 2011 Census: age within sex | 4 | |
post-stratification weights | 1 | |
post-stratification weights (to correct for nonresponse bias)*design weights (to correct for selection bias) | 1 | |
propensity score weights | 1 | |
propensity scores to estimate nonresponse bias: mean values of three survey attitudes estimated using propensity scores based on 1. information about the contact course and “concern/refusal conversion” patterns, and 2. district-level data (the mean s | 1 | |
regression models calculate predicted values as imputation of missing responses | 1 | |
sampling weights applied to 18 demographic quotas, each defined by age, gender, and race/ethnicity | 7 | |
sampling weights based on past "trap questions" research | 1 | |
unweighted composite scores and composite scores based on regression weights | 1 | |
weights (gender-age*education) | 1 | |
weights for the LISS sample (working time, age, type of contract, occupation, and education); propensity score adjustment weights (individual propensity weights, average propensity weights, and propensity post-stratified weights) for the LW sample co | 1 | |
na | 89 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
24 | 89 |
PDQ_WEIR Panel data quality - weighting techniques (recode)
Panel data quality - weighting techniques (recode)
Value 4148 | Frequency | |
---|---|---|
1 | design weights | 9 |
2 | post-stratification weights | 8 |
3 | propensity scores | 3 |
4 | imputation of missing responses | 1 |
5 | a combination of different types of weights | 3 |
99 | not applicable | 89 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
24 | 89 |
Valid range from 1 to 5
PDQ_WEIN Panel data quality - weighting techniques (dummy)
Panel data quality - weighting techniques (dummy)
Value 4247 | Frequency | |
---|---|---|
0 | no | 89 |
1 | yes | 24 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
PDQ_PRE Panel data quality - professional respondents
Panel data quality - professional respondents
Value 4346 | Frequency | |
---|---|---|
do survey answers of trained respondents differ systematically from answers of novice respondents | 2 | |
mean number of panel memberships, mean number of completed questionnaires in the past 4 weeks, frequency of checking email accounts, incentive as motivation, and fun as motivation by 4 Latent Classes of respondents (altruistic nonprofessional, semi-a | 19 | |
mean number of self-reported surveys in the past four weeks; mean number of self-reported online panel memberships; number of surveys*age*gender*race*income*marital status*education*full-time work status*political knowledge/interest/activity*turnout* | 1 | |
number of panels, and number of surveys taken per week | 2 | |
na | 89 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
24 | 89 |
PDQ_PREN Panel data quality - professional respondents (dummy)
Panel data quality - professional respondents (dummy)
Value 4445 | Frequency | |
---|---|---|
0 | no | 89 |
1 | yes | 24 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
PDQ_SPE Panel data quality - speeders
Panel data quality - speeders
Value 4544 | Frequency | |
---|---|---|
6 minutes (half the announced estimated time to complete) to fill in the questionnaire is the threshold defined to classify respondents as speeders (<6 minutes) or non-speeders (>6 minutes) | 1 | |
GLES and ANES data: distribution of speeders using 3 thresholds (responses more than 30% or 40% or 50% faster than the median response time); speeding*age; speeding*education;speeding*evaluation (and its determinants) of Merkel's or Bush's handling o | 2 | |
alternatives" | 1 | |
difference in average time to complete each question block (in seconds) for 3 sample groups (USA panel Regular, USA Mturk, and MTurk non-USA) | 2 | |
difference in mean duration for questionnaires completed with no interruption between experienced (=received only one questionnaire prior to the experimental questionnaire) and not experienced (=received from 4 to 5 questionnaire prior to the experim | 1 | |
positive speeding identification occurs when a respondent took less than four seconds on average to answer each of the 13 choices between optimized sets of product | 1 | |
na | 105 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
8 | 105 |
PDQ_SPEN Panel data quality - speeders (dummy)
Panel data quality - speeders (dummy)
Value 4643 | Frequency | |
---|---|---|
0 | no | 106 |
1 | yes | 7 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
PDQ_FRA Panel data quality - fraudulent or inattentive respondents
Panel data quality - fraudulent or inattentive respondents
Value 4742 | Frequency | |
---|---|---|
"trap questions" (=attention filter questions), straightlining, low-probability screening questions, and inconsistency of age reporting to identify inattentive respondents | 1 | |
cheating (attention filter questions and straightlining), test-retest reliability (duplicate questions), and duplicated IP addresses | 2 | |
the full personally identifying information (name, address, date of birth, and e-mail address) for respondents who provided it was sent to four validation services | 1 | |
na | 109 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
4 | 109 |
PDQ_FRAN Panel data quality - fraudulent or inattentive respondents (dummy)
Panel data quality - fraudulent or inattentive respondents (dummy)
Value 4841 | Frequency | |
---|---|---|
0 | no | 109 |
1 | yes | 4 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
PDQ_PCO Panel data quality - panel conditioning effect
Panel data quality - panel conditioning effect
Value 4940 | Frequency | |
---|---|---|
comparison of the use of Internet applications between Internet households and non-Internet households (after providing them Internet access) | 1 | |
explorative analysis of whether the lottery condition in Study 1 influenced response behavior 5 months later in Study 2 (general longitudinal effects) | 1 | |
percentage of straightlining across 10 grid questions (about personality, politics, health, leisure, religion, and income) in core modules by wave (from 1 to 7); straightlining*number of months in panel; straightlining*number of previous surveys | 1 | |
regressions for the effect of panel conditioning on respondents' preferences regarding pension income (risky pensions, life-cycle spending, risk attitude, and minimum spending) using socio-demographics and interaction terms (LISS*survey completion ti | 2 | |
two field experiments to study panel conditioning due to learning the surveying process: experiment 1 - advantageous conditioning (choice of ‘‘don’t know’’ answers and social desirability reduction), experiment 2 - disadvantageous conditioning (infor | 1 | |
na | 107 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
6 | 107 |
PDQ_PCON Panel data quality - panel conditioning effect (dummy)
Panel data quality - panel conditioning effect (dummy)
Value 5039 | Frequency | |
---|---|---|
0 | no | 107 |
1 | yes | 6 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
PDQ_REC Panel data quality - recruitment strategies for setting up the panel
Panel data quality - recruitment strategies for setting up the panel
Value 5138 | Frequency | |
---|---|---|
7 reasons for joining the online panel (by gender, age, and education) and 7 materialism items (factor analysis: first factor is "materialism toward money", and second factor is "materialism toward possessions") | 1 | |
GLES data: self-recruitment, and recruitment through extern link*response time | 1 | |
face-to-face; unconditional and conditional monetary incentives (GESIS) | 1 | |
face-to-face; unconditional and conditional monetary incentives (GIP) | 1 | |
multi-mode (phone and mail) recruitment experiment with different combinations (8 recruitment-assignment groups) of 3 response inducements (advance letter, prepaid monetary incentive, and phone follow-up) | 1 | |
recruitment experiment (after asking for the number of friends): 1. respondents are asked to give a list of friends (5 arms: 1. to list one person; 2. to list 10 persons; 3. to list 10 persons but if they are fewer than those indicated in the precedi | 1 | |
recruitment experiment (with 500 individuals aged 18-69 from the Danish Civil Registration System divided into 5 samples) for a web panel; pre-recruitment online questionnaire answered by all 5 samples' members; 5 recruitment strategies (=treatment g | 1 | |
recruitment of 1200 addresses from the address frame of Statistics Netherlands; 8 experiments that are a combination of contact mode (CATI/CAPI), content of the advance letter (standard/special), incentive payment (none/prepaid/promised), incentive a | 1 | |
telephone and face-to-face; unconditional monetary incentives (LISS) | 1 | |
two incentives experiments: unconditional/conditional cash incentive in the F2F phase; unconditional/no incentive in the first e-mail reminder to register for the online survey | 1 | |
unconditional monetary incentives, tablet PCs and 3G Internet (ELIPSS) | 1 | |
na | 102 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
11 | 102 |
PDQ_RECN Panel data quality - recruitment strategies for setting up the panel (dummy)
Panel data quality - recruitment strategies for setting up the panel (dummy)
Value 5237 | Frequency | |
---|---|---|
0 | no | 102 |
1 | yes | 11 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
PDQ_MAI Panel data quality - retention strategies for maintaining the panel
Panel data quality - retention strategies for maintaining the panel
Value 5336 | Frequency | |
---|---|---|
2 experiments: traditional forms of information (cards, ring binder, advance letter and incentive, e-cards, and newsletter), and innovative forms of feedback information (videos with interviews and graphs showing panel results); effect of the feedbac | 1 | |
incentive*type of sleeper*sleeper reactivation (percentages, and logistic regression coefficients) | 1 | 
reminders, monetary incentives, hotline, e-mail and messages, presentation of study results and research teams on website, feedback possibilities in each questionnaire (GESIS) | 1 | |
reminders, monetary incentives, payout, toll-free hotline, e-mail and messages, presentation of study results and research teams on website, feedback possibilities in each questionnaire, greetings (GIP) | 1 | |
reminders, monetary incentives, payout, toll-free hotline, e-mail and messages, presentation of study results on website, newsletter, feedback possibilities in each questionnaire, greetings (LISS) | 1 | |
reminders, personal use of tablet and 3G Internet connection, hotline, e-mail and push messages, presentation of study results on applet, feedback possibilities in each questionnaire (ELIPSS) | 1 | |
to make contact: two routes to access the questionnaire (e-mail invitation and study's website); to gain cooperation: feedback study results on the website as a way to make the survey salient to panel members, e-mail and phone reminders, monetary inc | 1 | |
na | 106 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
7 | 106 |
PDQ_MAIN Panel data quality - retention strategies for maintaining the panel (dummy)
Panel data quality - retention strategies for maintaining the panel (dummy)
Value 5435 | Frequency | |
---|---|---|
0 | no | 106 |
1 | yes | 7 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
PDQ_LOY Panel data quality - participants loyalty to the panel (vs. attrition) and membership tenure
Panel data quality - participants loyalty to the panel (vs. attrition) and membership tenure
Value 5534 | Frequency | |
---|---|---|
GLES regular members: duration of panel membership, and number of completed surveys in the previous four weeks*response time | 1 | |
attrition (and demographics) of incentivized/non-incentivized panel members; member-related (demographics) and panel-related (number of surveys assigned and completed, token gift sent, and number of non-response reminder postcards sent) predictors of | 1 | |
attrition rate after 7 surveys in the 3 months of data collection, and socio-demographics of the participants dropping off during the panel (no systematic patterns among participants dropping out of the panel) | 1 | |
attrition rate by treatment group (conditioned and unconditioned) | 1 | |
demographics*2 experiments*response in month following mailing or response in second month after mailing or participated all months or still active at end of period (logistic regression models) | 1 | |
members who are staying longer in the panel have higher response rates than more recent members; comparing long stay panelists by survey mode, web survey members participate less than mail survey ones | 1 | |
non-Internet households show a high degree of loyalty (average monthly response rate, and attrition rate); likelihood that a household (socio-demographics) leaves the panel | 1 | |
panel cooperativeness (completed the profile survey, mean number of complete waves, completed from 1 to 10 waves, completed from 11 to 20 waves, completed all waves, skipped last two waves, and answered no waves)*number of calls*initial refusals*prot | 1 | |
posterior response probabilities for the 9 Latent Classes of respondents in each wave; socio-demographics*psychological variables*survey attitude*9 class memberships (regression model) | 1 | 
retention of individual panel members in 2008 and 2012 | 1 | |
retention rates of households from 2005 to 2009 (for both the access panel recruitment and the DE-SILC respondents sample); recruitment (for the access panel) and response (for the DE-SILC sample) success*federal state*year (logit model); household s | 1 | |
sample composition (over 65 years old, household composition and size, home-owner, urbanicity, and voting behavior) of the surviving panel members; response propensity of the 9 Latent Classes of respondents in each wave; sample composition (male, med | 1 | |
splitting the lottery deters more recently registered panelists from finishing the survey; retainees in the split lottery are more open for experience than retainees in the lump sum lottery (tenure in years) | 1 | |
straightlining*number of months in panel; straightlining*number of previous surveys | 1 | |
the main effect of membership tenure is not significant, but members with shorter membership tenure, who have been enrolled in the online panel for six months or less, participated to a higher degree if the topic was made highly salient | 1 | |
na | 98 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
15 | 98 |
PDQ_LOYN Panel data quality - participants loyalty to the panel (vs. attrition) and membership tenure (dummy)
Panel data quality - participants loyalty to the panel (vs. attrition) and membership tenure (dummy)
Value 5633 | Frequency | |
---|---|---|
0 | no | 98 |
1 | yes | 15 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
PDQ_NRE Panel data quality - nonresponse error
Panel data quality - nonresponse error
Value 5732 | Frequency | |
---|---|---|
30 predictors (primary and secondary characteristics) of starting (started/invited panelists) and completion (finished/started questionnaires) rates (2 models) | 1 | |
age, gender, education, and personality traits are used to study sample composition in the stages of willingness to participate in the panel and actual participation to the online surveys among the sample members (second and third selection stages) | 1 | |
comparison (based on over 65 years old, household composition and size, home-owner, and urbanicity) between initial nonrespondents (recruitment phase) and 4 Latent Classes of attriters | 1 | |
completion rate, and eligibility rate | 1 | |
completion rates (absorption rate=percentage of the invitations delivered, start rate, completion rate, screened out rate, breakoff rate, and number of completed questionnaire)*wave (1 and 2)*survey mode (PC web, and mobile web) | 1 | |
demographics*response propensity (non-acceptance=refusal or non-response to the invitation in the screening survey, non-completion, and missing opinion from the policy favorability items) | 1 | |
gender*age*education*method of recruitment*panel tenure*number of other online panels in which a member was enrolled*reasons for joining the online panel*materialism*starting rate; (the same variables)*break-off rate | 1 | |
initial response rate (household registered as panel member, and participating persons in panel households) | 1 | |
males, older respondents, very high income households, low educated people, and households without children are more prone to participate (binary probit model) | 1 | |
members of online panels highly interested in the survey topic have a higher participation rate than panel members with low personal interest; the level of topic salience in the email invitation has no influence on participation behaviour | 1 | 
nonresponse rates at the agree-to-panel stage and at the panel stage (=final stage of the recruitment process) and reasons; providing people with a simPC leads to more panel participation among the groups often hard to reach in survey research | 1 | |
number invited*number of respondents*response rate*number of recruits*ratio recruits per recruiter*5 arms (experiment 2); waves*recruitment (arms 4 and 5) | 1 | |
number of response in recruitment experiment (noncontacts, fully completed survey before reminder, response rate before reminder, final number of fully completed survey, final response rate, final number of incomplete interviews and of explicit refus | 1 | |
only mentioned | 1 | |
only mentioned as NOT significant | 1 | |
only mentioned to correct it with weights | 1 | |
participation rate, screening participation rate, and dropout rate | 1 | |
response in recruitment (not usable, not reached, refusals, central questions only, complete recruitment interview, willing to participate in panel, and registered panel member)*interview modes (CATI/CAPI households with/without phone number); experi | 1 | |
response metrics (recruitment rate X completion rate = cumulative response rate) for the telephone recruitment and online participation | 1 | |
response propensity models at recruitment stage; effect of “concerns/refusal conversion” patterns on the odds of obtaining a full interview (recruitment stage); incentive experiment carried out within the “call announced” subsample | 1 | 
response rate, and proportion of consented respondents who failed the screener | 7 | |
response rates at recruitment interview and at panel registration (ELIPSS) | 1 | |
response rates at recruitment interview and at panel registration (GESIS) | 1 | |
response rates at recruitment interview and at panel registration (GIP) | 1 | |
response rates at recruitment interview and at panel registration (LISS) | 1 | |
response rates of the 3 samples | 1 | |
starting rate, completion rate, and cumulative response rate by treatment group (conditioned and unconditioned) | 1 | |
unintended (=spontaneous) and intended mobile response rates | 2 | |
na | 78 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
35 | 78 |
PDQ_NREN Panel data quality - nonresponse error (dummy)
Panel data quality - nonresponse error (dummy)
Value 5831 | Frequency | |
---|---|---|
0 | no | 78 |
1 | yes | 35 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
PDQ_ME Panel data quality - measurement error
Panel data quality - measurement error
Value 5930 | Frequency | |
---|---|---|
bias between true parliamentary results and estimates from respondents in the 9 classes is measured by computing the correlation between voting behavior and attrition classes | 1 | 
conditioning on measured preferences, don't know answers, question order | 2 | |
demographic distributions and data quality indicators (e.g., inconsistent/conflicting answers) were compared between those who validated and those who did not; the impact of the validation process on the final sample (in terms of demographics, behavi | 1 | 
mapping effort (=the exertion of physical and mental power to complete the mapping activity, measured by the total number of markers placed in the survey process, the total elapsed clock time in placing the markers, and the mean elapsed time in placing | 1 | 
means of the response style indicators (acquiescence, extreme responding, neutral middle, and differentiation) per grid (political, and neighborhood)*4 Latent Classes of respondents*effect sizes (Cohen’s f) of all the differences (multivariate analys | 19 | 
members of online panels highly interested in the survey topic show less satisficing behaviour than panel members with low personal interest; the level of topic salience in the email invitation has no influence on data quality in the online panel | 1 | 
number of surveys*age*gender*race*income*marital status*education*full-time work status*survey effort*interview duration*attrited*straight-line*percent missing*percent "don't know"*junk responses to the open-ended questions; panel memberships*(the sa | 1 | |
quality (=the strength of the relationship between the latent and the observed variables) estimates for each experiment, trait and method are slightly different between the online panel and the face-to-face survey in half of the cases and are in fav | 1 | |
quality (=the strength of the relationship between the latent variable one is really interested in and the observed answer to a specific question asked in a given survey) estimates for each experiment, trait and method are sometimes higher in the onl | 1 | |
respondent identity (the proportion of consented respondents who failed to meet the technical criteria, failed to complete the screener questions, and provided discordant responses); discordance (=survey responses were compared with each other, with | 7 | |
satisficing: panel surveys and Natsal-3 CASI show more neutral points (i.e., “don’t know,” “depends,” or “neither agree nor disagree”) when compared with the same (or similar) opinion questions in Natsal-3 CAPI | 4 | |
social desirability bias | 1 | |
straightlining*waves; waves*gender*age*marital status*immigration*education*10 grid questions*implausible/plausible straightlining (generalized estimating equations logistic regression) | 1 | |
straightlining, and answers in scale questions | 2 | |
strong axiom of revealed preference violations as data quality metrics (i. e., low-probability screening questions, failing "trap questions", straight-lining, speeding, and inconsistent answers); response differences when passing and failing quaranti | 1 | |
na | 69 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
44 | 69 |
PDQ_ME_R Panel data quality - measurement error (recode)
Panel data quality - measurement error (recode)
Value 6029 | Frequency | |
---|---|---|
1 | bias between true values and estimates | 1 |
2 | discordant/inconsistent answers | 8 |
3 | satisficing behavior | 31 |
4 | quality estimates obtained adopting a Multitrait-Multimethod (MTMM) matrix | 2 |
5 | social desirability bias | 1 |
6 | strong axiom of revealed preference (SARP) violations | 1 |
99 | not applicable | 69 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
44 | 69 |
Valid range from 1 to 6
PDQ_ME_N Panel data quality - measurement error (dummy)
Panel data quality - measurement error (dummy)
Value 6128 | Frequency | |
---|---|---|
0 | no | 69 |
1 | yes | 44 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
PDQ_QDE Panel data quality - questionnaire design
Panel data quality - questionnaire design
Value 6227 | Frequency | |
---|---|---|
attention filter questions | 2 | |
na | 111 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
2 | 111 |
PDQ_QDEN Panel data quality - questionnaire design (dummy)
Panel data quality - questionnaire design (dummy)
Value 6326 | Frequency | |
---|---|---|
0 | no | 111 |
1 | yes | 2 |
Valid cases | Invalid cases | Minimum | Maximum | Arithmetic mean | Standard deviation |
---|---|---|---|---|---|
113 | 0 |
Valid range from 0 to 1
ID_PANST Id panel study
Id panel study
Value 6425 | Frequency | |
---|---|---|
001.001.001 | 1 | |
002.001.001 | 1 | |
003.001.001 | 1 | |
004.001.001 | 1 | |
004.001.002 | 1 | |
005.001.001 | 1 | |
006.001.001 | 1 | |
007.001.001 | 1 | |
008.001.001 | 1 | |
009.001.001 | 1 | |
010.001.001 | 1 | |
010.001.002 | 1 | |
010.002.001 | 1 | |
010.002.002 | 1 | |
011.001.001 | 1 | |
012.001.001 | 1 | |
013.001.001 | 1 | |
014.001.001 | 1 | |
015.001.001 | 1 | |
016.001.001 | 1 | |
017.001.001 | 1 | |
018.001.001 | 1 | |
019.001.001 | 1 | |
019.002.001 | 1 | |
020.001.001 | 1 | |
021.001.001 | 1 | |
022.001.001 | 1 | |
023.001.001 | 1 | |
023.002.001 | 1 | |
023.003.001 | 1 | |
023.004.001 | 1 | |
023.005.001 | 1 | |
023.006.001 | 1 | |
023.007.001 | 1 | |
024.001.001 | 1 | |
025.001.001 | 1 | |
026.001.001 | 1 | |
027.001.001 | 1 | |
027.002.001 | 1 | |
028.001.001 | 1 | |
028.002.001 | 1 | |
028.003.001 | 1 | |
028.004.001 | 1 | |
029.001.001 | 1 | |
029.002.001 | 1 | |
029.003.001 | 1 | |
029.004.001 | 1 | |
030.001.001 | 1 | |
031.001.001 | 1 | |
032.001.001 | 1 | |
033.001.001 | 1 | |
034.001.001 | 1 | |
035.001.001 | 1 | |
036.001.001 | 1 | |
037.001.001 | 1 | |
038.001.001 | 1 | |
039.001.001 | 1 | |
039.002.001 | 1 | |
040.001.001 | 1 | |
041.001.001 | 1 | |
041.002.001 | 1 | |
042.001.001 | 1 | |
042.001.002 | 1 | |
043.001.001 | 1 | |
044.001.001 | 1 | |
045.001.001 | 1 | |
046.001.001 | 1 | |
047.001.001 | 1 | |
048.001.001 | 1 | |
049.001.001 | 1 | |
050.001.001 | 1 | |
051.001.001 | 1 | |
052.001.001 | 1 | |
053.001.001 | 1 | |
053.001.002 | 1 | |
053.002.001 | 1 | |
054.001.001 | 1 | |
055.001.001 | 1 | |
056.001.001 | 1 | |
057.001.001 | 1 | |
058.001.001 | 1 | |
059.001.001 | 1 | |
060.001.001 | 1 | |
061.001.001 | 1 | |
062.001.001 | 1 | |
063.001.001 | 1 | |
064.001.001 | 1 | |
064.002.001 | 1 | |
065.001.001 | 1 | |
066.001.001 | 1 | |
066.002.001 | 1 | |
066.003.001 | 1 | |
067.001.001 | 1 | |
068.001.001 | 1 | |
069.001.001 | 1 | |
070.001.001 | 1 | |
071.001.001 | 1 | |
072.001.001 | 1 | |
073.001.001 | 1 | |
073.002.001 | 1 | |
073.003.001 | 1 | |
073.004.001 | 1 | |
073.005.001 | 1 | |
073.006.001 | 1 | |
073.007.001 | 1 | |
073.008.001 | 1 | |
073.009.001 | 1 | |
073.010.001 | 1 | |
073.011.001 | 1 | |
073.012.001 | 1 | |
073.013.001 | 1 | |
073.014.001 | 1 | |
073.015.001 | 1 | |
073.016.001 | 1 | |
073.017.001 | 1 | |
073.018.001 | 1 | |
073.019.001 | 1 | |
074.001.001 | 1 |
Valid cases | Invalid cases
---|---
118 | 0
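The ID_PANST values appear to encode the three units of analysis as a dotted triple: reference, web panel within that reference, and individual study of that panel (for example, 073.019.001 would be a single study of the 19th panel reported in reference 073). A short sketch, under that assumption, splitting the identifier into its components:

```python
import pandas as pd

# Illustrative panel-study identifiers taken from the listing above.
ids = pd.Series(["001.001.001", "010.002.002", "073.019.001"], name="ID_PANST")

# Assumed structure: <reference>.<panel within reference>.<study within panel>
parts = ids.str.split(".", expand=True)
parts.columns = ["reference_no", "panel_no", "study_no"]
print(parts)

# Counting panel studies per reference then reduces to a frequency count.
print(parts["reference_no"].value_counts().sort_index())
```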
TITLE Title of the reference
Title of the reference
Value | Frequency | |
---|---|---|
A Comparison of Different Online Sampling Approaches for Generating National Samples | 3 | |
A Comparison of Four Probability-Based Online and Mixed-Mode Panels in Europe | 4 | |
A Comparison of the Quality of Questions in a Face-to-face and a Web Survey | 1 | |
A multi-group analysis of online survey respondent data quality: Comparing a regular USA consumer panel to MTurk samples | 2 | |
Accuracy of Estimates in Access Panel Based Surveys (in Improving survey methods) | 1 | |
An empirical test of the impact of smartphones on panel-based online data collection (in Online Panel Research: A Data Quality Perspective) | 1 | |
Assessing representativeness of a probability-based online panel in Germany (in Online Panel Research: A Data Quality Perspective) | 1 | |
Attention and Usability in Internet Surveys: Effects of Visual Feedback in Grid Questions (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Attitudes Toward Risk and Informed Consent for Research on Medical Practices: A Cross-Sectional Survey | 1 | |
Can Biomarkers Be Collected in an Internet Survey? A Pilot Study in the LISS Panel (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Can a non-probabilistic online panel achieve question quality similar to that of the European Social Survey? | 1 | |
Challenges in Reaching Hard-to-Reach Groups in Internet Panel Research (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Comparing Survey Results Obtained via Mobile Devices and Computers: An Experiment With a Mobile Web Survey on a Heterogeneous Group of Mobile Devices Versus a Computer-Assisted Web Survey | 1 | |
Comparison of Smartphone and Online Computer Survey Administration | 1 | |
Comparison of US Panel Vendors for Online Surveys | 7 | |
Comparison of telephone RDD and online panel survey modes on CPGI scores and co-morbidities | 1 | |
Correcting for non-response bias in contingent valuation surveys concerning environmental non-market goods: an empirical investigation using an online panel | 1 | |
Data Quality in PC and Mobile Web Surveys | 1 | |
Determinants of the starting rate and the completion rate in online panel studies (in Online Panel Research: A Data Quality Perspective) | 1 | |
Does It Pay Off to Include Non-Internet Households in an Internet Panel? | 1 | |
Does the Inclusion of Non-Internet Households in a Web Panel Reduce Coverage Bias? | 1 | |
Effects of Lotteries on Response Behavior in Online Panels | 1 | |
Efficiency of Different Recruitment Strategies for Web Panels | 1 | |
Estimating the effects of nonresponses in online panels through imputation (in Online Panel Research: A Data Quality Perspective) | 1 | |
Evaluation of an Adapted Design in a Multi-device Online Panel: A DemoSCOPE Case Study | 2 | |
Evaluation of an online (opt-in) panel for public participation geographic information systems surveys | 1 | |
How Do Lotteries and Study Results Influence Response Behavior in Online Panels? | 2 | |
How Representative Are Online Panels? Problems of Coverage and Selection and Possible Solutions (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Improving Response Rates and Questionnaire Design for Mobile Web Surveys | 1 | |
Improving Survey Response Rates in Online Panels: Effects of Low-Cost Incentives and Cost-Free Text Appeal Interventions | 1 | |
Improving web survey quality: Potentials and constraints of propensity score adjustments (in Online Panel Research: A Data Quality Perspective) | 1 | |
Informing panel members about study results: Effects of traditional and innovative forms of feedback on participation (in Online Panel Research: A Data Quality Perspective) | 1 | |
Internet panels, professional respondents, and data quality | 19 | |
Lotteries and study results in market research online panels | 1 | |
Making Mobile Browser Surveys Smarter Results from a Randomized Experiment Comparing Online Surveys Completed via Computer or Smartphone | 1 | |
Measurement invariance and quality of composite scores in a face-to-face and a web survey | 1 | |
Measuring Attitudes Toward Controversial Issues in Internet Surveys: Order Effects of Open and Closed Questioning (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Mobile Response in Web Panels | 2 | |
Mode System Effects in an Online Panel Study: Comparing a Probability-based Online Panel with two Face-to-Face Reference Surveys | 1 | |
Motives for joining nonprobability online panels and their association with survey participation behavior (in Online Panel Research: A Data Quality Perspective) | 1 | |
Nonprobability Web Surveys to Measure Sexual Behaviors and Attitudes in the General Population: A Comparison With a Probability Sample Interview Survey | 4 | |
Nonresponse and attrition in a probability-based online panel for the general population (in Online Panel Research: A Data Quality Perspective) | 1 | |
Nonresponse and measurement error in an online panel: Does additional effort to recruit reluctant respondents result in poorer quality data? (in Online Panel Research: A Data Quality Perspective) | 1 | |
Online panels and validity: Representativeness and attrition in the Finnish eOpinion panel (in Online Panel Research: A Data Quality Perspective) | 1 | |
Panel Attrition - Separating Stayers, Fast Attriters, Gradual Attriters, and Lurkers | 1 | |
Panel Conditioning in Difficult Attitudinal Questions | 2 | |
Professional respondents in nonprobability online panels (in Online Panel Research: A Data Quality Perspective) | 1 | |
Recruiting A Probability Sample For An Online Panel: Effects Of Contact Mode, Incentives, And Information | 1 | |
Recruiting an Internet Panel Using Respondent-Driven Sampling | 1 | |
Respondent Conditioning in Online Panel Surveys: Results of Two Field Experiments | 1 | |
Respondent Screening and Revealed Preference Axioms: Testing Quarantining Methods for Enhanced Data Quality in Web Panel Survey | 1 | |
Response Behavior in an Adaptive Survey Design for the Setting-Up Stage of a Probability-Based Access Panel in Germany (in Improving survey methods) | 1 | |
Sample composition discrepancies in different stages of a probability-based online panel | 1 | |
Selection bias of internet panel surveys: A comparison with a paper-based survey and national governmental statistics in Japan | 1 | |
Sensitive topics in PC Web and mobile web surveys: Is there a difference? | 1 | |
Setting Up an Online Panel Representative of the General Population The German Internet Panel | 1 | |
Straightlining in Web survey panels over time | 1 | |
Survey Mode Effects on Data Quality: Comparison of Web and Mail Modes in a U.S. National Panel Survey | 1 | |
Survey Participation in a Probability-Based Internet Panel in the Netherlands (in Improving survey methods) | 1 | |
Surveying Rare Populations Using a Probability based Online Panel | 1 | |
The Access Panel of German Official Statistics as a Selection Frame (in Improving survey methods) | 1 | |
The Design of Grids in Web Surveys | 4 | |
The Effects of Questionnaire Completion Using Mobile Devices on Data Quality. Evidence from a Probability-based General Population Panel | 1 | |
The Utility of an Online Convenience Panel for Reaching Rare and Dispersed Populations | 1 | |
The comparison of road safety survey answers between web-panel and face-to-face; Dutch results of SARTRE-4 survey | 1 | |
The impact of speeding on data quality in nonprobability and freshly recruited probability-based online panels (in Online Panel Research: A Data Quality Perspective) | 2 | |
The relationship between nonresponse strategies and measurement error: Comparing online panel surveys to traditional surveys (in Online Panel Research: A Data Quality Perspective) | 3 | |
The role of topic interest and topic salience in online panel web surveys. | 1 | |
The untold story of multi-mode (online and mail) consumer panels: From optimal recruitment to retention and attrition (in Online Panel Research: A Data Quality Perspective) | 1 | |
The use of Pcs, smartphones and tablets in a probability based panel survey. Effects on survey measurement error. | 1 | |
Using Interactive Features to Motivate and Probe Responses to Open-Ended Questions (in Social and Behavioral Research and the Internet: Advances in Applied Methods and Research Strategies) | 1 | |
Validating respondents’ identity in online samples: The impact of efforts to eliminate fraudulent respondents (in Online Panel Research: A Data Quality Perspective) | 1 | |
What Happens if You Offer a Mobile Option to Your Web Panel? Evidence From a Probability-Based Panel of Internet Users | 1 | |
What do web survey panel respondents answer when asked "Do you have any other comment?" | 2 |
Valid cases | Invalid cases
---|---
118 |
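Because one reference can contribute several panel-study records, the title frequencies above range from 1 to 19 and sum to the 118 valid cases. A frequency table of this kind is a plain tabulation; a minimal sketch with illustrative values:

```python
import pandas as pd

# Illustrative: one TITLE value per panel-study record; repeated titles mean
# that a single reference contributes several panel studies.
titles = pd.Series([
    "The Design of Grids in Web Surveys",
    "The Design of Grids in Web Surveys",
    "Mobile Response in Web Panels",
], name="TITLE")

print(titles.value_counts())  # in the archived file these counts sum to 118
```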
EDITOR Editor of the reference
Editor of the reference
Value | Frequency | |
---|---|---|
Annals of Internal Medicine 162 (10) | 1 | |
Asia-Pacific Journal of Public Health | 1 | |
Center for Crime and Justice Policy, CCJP 1 | 3 | |
Field Methods, 26, 4 | 1 | |
Field Methods, 27, 4. pp. 391-408 | 1 | |
Field Methods, Published online before print February 21, 2013 | 1 | |
Field Methods, Published online before print January 29, 2013 | 1 | |
International Gambling Studies | 1 | |
International Journal of Internet Science, 8, 1, p. 17-29 | 1 | |
International Journal of Market Research, 55, 1, pp. 59-80 | 1 | |
International Journal of Market Research, 55, 5, pp. 611-616 | 1 | |
International Journal of Market Research, 57, 3, pp. 395-412 | 1 | |
International Journal of Public Opinion Research, 24, 2, pp. 238-249 | 1 | |
International Journal of Public Opinion Research, 24, 4, pp. 534-545 | 1 | |
International Journal of Public Opinion Research, 25, 2, pp. 242-253 | 1 | |
JMIR Publications, 15, 11 | 7 | |
Journal of Business Research, 69, 8, pp. 3139-3148 | 2 | |
Journal of Environmental Planning and Management | 1 | |
Journal of Medical Internet Research, 16, 12, e276 | 4 | |
Journal of Official Statistics, 30, 2, pp. 291-310 | 1 | |
Journal of Safety Research, 46, pp. 13-20 | 1 | |
Methodology, 11, 3, pp. 81-88 | 19 | |
PLOS one, 10, 12 | 1 | |
Public Opinion Quarterly (POQ), 76, 3, pp. 470-490 | 1 | |
Public Opinion Quarterly (POQ), 79, 3, pp. 687-709 | 1 | |
Public Opinion Quarterly (POQ), First published online: November 14, 2014 | 1 | |
Public Opinion Quarterly (POQ), First published online: September 16, 2013 (77, 3, pp. 783-797) | 2 | |
Routledge | 10 | |
Social Science Computer Review, 30, 2, pp. 212-228 | 1 | |
Social Science Computer Review, 31, 3, p. 322-345 | 4 | |
Social Science Computer Review, 31, 3, pp. 371-385 | 2 | |
Social Science Computer Review, 31, 4, p. 482-504 | 1 | |
Social Science Computer Review, 31, 6, p. 725-743 | 1 | |
Social Science Computer Review, 32 no. 2, pp. 238-255 | 1 | |
Social Science Computer Review, 32, 4, 544-560 | 1 | |
Social Science Computer Review, 32, 6, pp. 728-742 | 2 | |
Social Science Computer Review, 34, 1, 2016, pp. 8-25, Published online before print March 31, 2015 |