To get things rolling with my bite-size chunks of survey data analysis, I’ll begin with a description of the survey itself, how it was distributed, and the sampling.
The survey (which was approved by my university’s IRB and run through Qualtrics software) was open from 3.25.2019 through 1.20.2020 – about 10 months total. I sent four waves of email solicitation and, as noted in my last post, I also distributed invitations and spread the word at NAHBS 2019 (in Sacramento) and publicized it through a couple of podcast interviews. The survey was open, meaning anyone could access it if they found the link (put differently: respondents did not need an individualized invitation to access the survey).
The survey itself comprised 67 items, some of which contained multiple responses and follow-up questions, producing ~130 variables for analysis. Respondents were given the option of total anonymity (in which case I don’t know their identity) or confidentiality (in which case I know who they are, but no identifying information will ever be made public). 88% of the final sample responded confidentially, which opens the possibility of a follow-up survey in the future and gives a better sense of just how representative this sample is of the larger population of U.S. framebuilders.
The original email solicitation was sent to 330 addresses. I cast this net as widely as possible, sending it to anyone I could find who appeared to be in the U.S. and offering bikes of their own fabrication for sale. About 40 of these bounced back as dead emails, leaving about 290 possible active recipients.
When I closed the survey, there were 128 responses with some amount of data. This included a couple of double responses (people who completed the survey twice… probably because they forgot they’d done it), as well as partial completions and a few responses from people outside of the U.S. Dozens more visitors looked at the survey and either completed only the single gatekeeping confidential/anonymous question and browsed the survey, or answered just a single question; these were all dropped from the analysis. The lowest item-completion percentage among those whose data were saved was 21%, and the lowest among those included in the analyses was 55% (there are 3 cases with 55% completion).
After cleaning and filtering, I have 123 responses total – yielding a 42% response rate (123 of the ~290 active email recipients).
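As a quick sanity check on that arithmetic, here is a minimal sketch using only the figures reported above (note that because the survey was open, the ~290 denominator is itself an approximation):

```python
# Response-rate arithmetic from the figures reported in the post.
emails_sent = 330
bounced = 40           # approximate number of dead addresses
valid_responses = 123  # final count after cleaning and filtering

active_recipients = emails_sent - bounced  # ~290 possible active recipients
response_rate = valid_responses / active_recipients

print(f"{response_rate:.0%}")  # → 42%
```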
I think this is an excellent response rate, especially given the population of builders itself, which has high rates of turnover, is often resistant to this kind of solicitation, and mainly consists of single-person shops with little spare time. What is more, there was no incentive or direct/immediate benefit to framebuilders for completing the survey, apart from the promise of a free report of the data analysis to be made available after completion of the work.
By way of comparison, a similar electronic survey (indeed, some items in my survey were modeled after theirs) of the “Portland Made Collective” of makers/manufacturers in Portland, OR received a 25% response rate, which the authors found to be good. And that survey was targeted specifically at members of a formal organization, so those respondents were a self-selected population with, presumably, more incentive to complete the survey.