
A randomized trial of mail and email recruitment strategies for a physician survey on clinical trial accrual

Abstract

Background

Patient participation in cancer clinical trials is suboptimal. A challenge to capturing physicians’ insights about trials has been low response to surveys. We conducted a study using varying combinations of mail and email to recruit a nationally representative sample of medical, surgical, and radiation oncologists to complete a survey on trial accrual.

Methods

We randomly assigned eligible physicians identified from the American Medical Association MasterFile (n = 13,251) to mail- or email-based recruitment strategies. Mail-based recruitment included a survey packet with: (1) cover letter describing the survey and inviting participation; (2) paper copy of the survey and postage-paid return envelope; and (3) a web link for completing the survey online. Email-based recruitment included an email describing the survey and inviting participation, along with the web link to the survey, and a reminder postcard 2 weeks later.

Results

Response was higher for mail-based (11.8%, 95% CI 11.0–12.6%) vs. email-based (4.5%, 95% CI 4.0–5.0%) recruitment. In email-based recruitment, only one-quarter of recipients opened the email, and even fewer clicked on the link to complete the survey. Most physicians in mail-based recruitment responded after the first invitation (362 of 770 responders, 47.0%). A higher proportion of responders vs. non-responders were young (ages 25–44 years), men, and radiation or surgical (vs. medical) oncologists.

Conclusions

Most physicians assigned to mail-based recruitment actually completed the survey online via the link provided in the cover letter, and those in email-based recruitment did not respond until they received a reminder postcard by mail. Providing the option to return a paper survey or complete it online may have further increased participation in the mail-based group, and future studies should examine how combinations of delivery mode and return options affect physicians’ response to surveys.


Background

Patient participation in cancer clinical trials is suboptimal [1], and fewer than half of National Cancer Institute-sponsored trials meet accrual targets [2]. Prior studies have surveyed oncologists about patient barriers to trial participation but have overlooked practice-level barriers that may impede capacity to successfully enroll and care for patients on trials.

Historically, a challenge to capturing oncologists’ insights about trials has been low response rates to surveys [3]. Physician surveys are an important tool for capturing information about the organization and delivery of care, as well as physician knowledge and attitudes [4], often from a representative sample. Response rates vary by survey mode, incentive, and physician characteristics. Mail-based surveys generally yield higher response rates compared to email-based surveys [5, 6], but mail may be costly and have a slower return. More recently, studies have used mixed-modes, or varying combinations of mail and email, to recruit physicians to complete surveys. Mixed-mode recruitment strategies appear to elicit higher response compared to email alone [7, 8]. Few have evaluated the effect of recruitment strategies on response among oncologists.

We conducted a study using varying combinations of mail and email to recruit a national sample of medical, surgical, and radiation oncologists to complete a survey on practice-level barriers to trial accrual. To understand how recruitment strategies affected response, as well as characteristics of the sample, we addressed two research questions:

  1. Does survey response rate differ by recruitment strategy (mail- vs. email-based)?

  2. Are there differences in characteristics of responders vs. non-responders?

Methods

Sampling frame

We identified physicians from the American Medical Association (AMA) Physician Masterfile – the standard sampling frame used in national physician surveys because it provides the most complete coverage of the U.S. physician population. We restricted sampling to hematology/oncology specialties (including medical oncology, radiation oncology, and surgical oncology), office-based physicians, age ≤ 75 years, e-mail address on file, and with patient care listed as primary activity (n = 13,251).

Our goal was to obtain 1500 survey responses, and we expected a 30% response rate based on comparable reports in the literature [9,10,11,12,13]. Therefore, we stratified eligible physicians by specialty (n = 9177 medical oncologists; n = 3720 radiation oncologists; n = 354 surgical oncologists) to generate a list of 5000 physicians. Because the number of surgical oncologists was much smaller than other specialties, we included all 354 surgical oncologists in the list of 5000 physicians. To obtain the remaining 4646, we used simple random sampling (via PROC SURVEYSELECT without replacement in SAS) to randomly select 2323 medical oncologists and 2323 radiation oncologists. We then randomly assigned all 5000 physicians to a mail- or email-based recruitment strategy. A randomization log was generated by a biostatistician (CA) and sent directly to research staff not involved in the analysis. Study investigators were blind to randomized assignment.
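As a sketch, the stratified sampling and 1:1 randomization described above can be reproduced in Python (the study itself used SAS PROC SURVEYSELECT; the seed, record layout, and alternating assignment here are illustrative assumptions, not the authors' code):

```python
import random

random.seed(20170501)  # arbitrary seed, for reproducibility of this sketch only

# Hypothetical frame: one record per physician with a specialty label,
# matching the stratum sizes reported in the paper.
frame = (
    [{"id": i, "specialty": "medical"} for i in range(9177)]
    + [{"id": 9177 + i, "specialty": "radiation"} for i in range(3720)]
    + [{"id": 12897 + i, "specialty": "surgical"} for i in range(354)]
)

# Take all 354 surgical oncologists, then simple random samples (without
# replacement) of 2323 each from the medical and radiation strata.
by_spec = {s: [p for p in frame if p["specialty"] == s]
           for s in ("medical", "radiation", "surgical")}
sample = (by_spec["surgical"]
          + random.sample(by_spec["medical"], 2323)
          + random.sample(by_spec["radiation"], 2323))

# Randomize the 5000 sampled physicians 1:1 to mail- or email-based recruitment.
random.shuffle(sample)
for i, p in enumerate(sample):
    p["arm"] = "mail" if i % 2 == 0 else "email"

print(len(sample))  # 5000
```

In the actual study, the randomization log was generated separately by a biostatistician and withheld from investigators; the sketch above only illustrates the sampling arithmetic.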

Recruitment strategies

From May – July 2017, we sent invitations to 5000 physicians to complete a 15-min survey, with questions about: (1) characteristics of physicians’ practices; (2) referral and recruitment of patients to clinical trials; and (3) barriers to trial accrual.

The mail-based recruitment strategy included a survey packet with a: (1) cover letter, signed by the Principal Investigator (CSS), describing the survey and inviting participation; (2) telephone number to call for more information or opt-out; (3) paper copy of the survey and postage-paid return envelope; and (4) web link for completing the survey online [14]. To those who had not responded 2 weeks after the survey packet mailing, we sent a reminder postcard with the web link to the online survey. Finally, 2 weeks after the reminder postcard, we mailed a second survey packet to remaining non-responders.

The email-based recruitment strategy included an email describing the survey and inviting participation, along with the web link and a unique code to access and complete the survey online [14]. We worked with Medical Marketing Services (MMS, Schaumburg, IL) to deliver emails using the Principal Investigator as the sender name and with the subject line, “Help NCI identify challenges & incentives for oncologists to recruit for trials.” One week after the email, we sent a second email identical to the first. We received reports from MMS after each email with the number of messages delivered, as well as the number of recipients who opened the email and clicked the web link. Finally, 1 week after the second email, we mailed a reminder postcard to physicians who had not yet responded.

Due to low response from the random sample of 5000 physicians, we modified recruitment strategies to invite the remaining 8251 eligible physicians in the sampling frame. We randomly assigned physicians, stratified by specialty (medical oncology and radiation oncology), to a mail- or email-based recruitment strategy using simple random sampling, as above. We recruited these 8251 additional physicians from August – September 2017.

For mail-based recruitment, we shortened the cover letter text and changed the signature to the Medical Director of Oncology at our healthcare system (JVC). Because we noted most responders completed the survey online (vs. via paper), we mailed only a second reminder postcard instead of a second survey packet. For email-based recruitment, we shortened the subject line and, as we had done previously, sent a second email 1 week after the first and a reminder postcard to non-responders another week later. No other changes were made.

We allowed responses up to 6 months from the first invitation. All survey completers were able to claim a $50 Amazon gift card.

Statistical analysis

Our primary outcome was response rate, defined as the proportion of invited physicians who returned a survey. In the entire sampling frame (i.e., random sample of 5000 physicians plus remaining 8251), we compared response rate by recruitment strategy (mail- vs. email-based) using a Chi-square test.
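Using the responder counts reported in the Results (770 of 6526 in mail-based recruitment; 302 of 6725 in email-based), the chi-square comparison can be reproduced from scratch; this sketch implements a plain Pearson chi-square without continuity correction, which is an assumption, since the paper does not state the exact variant used:

```python
import math

def pearson_chi2_2x2(a, b, c, d):
    """Pearson chi-square test (df = 1, no continuity correction)
    for a 2x2 table [[a, b], [c, d]]; returns (statistic, p-value)."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    observed = [a, b, c, d]
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    # Upper tail of chi-square with 1 df: P(X > x) = erfc(sqrt(x/2))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Responders vs. non-responders: 770/6526 (mail) and 302/6725 (email).
chi2, p = pearson_chi2_2x2(770, 6526 - 770, 302, 6725 - 302)
print(f"chi2 = {chi2:.1f}, p = {p:.2g}")
```

The resulting p-value is far below any conventional threshold, consistent with the large gap between the 11.8% and 4.5% response rates.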

We also used a Chi-square test to compare demographic characteristics by recruitment strategy and response (responders vs. non-responders). Demographic characteristics included age, sex, specialty (medical, radiation, surgical oncology), and geographic region (West, Midwest, Northeast, South).

The Institutional Review Board at the University of Texas Southwestern Medical Center determined that this research study (protocol# 092016–096) is exempt in accordance with 45 CFR 46.101(b). In the case of an anonymous survey, completion of the survey is indication of consent to participate. Both the cover letter and email indicated that: participation was voluntary and participants may refuse to answer any questions or stop participation at any time; the survey was anonymous and no personally identifying information would be obtained; and responses would not be traceable back to any participant.

Results

We invited 13,251 physicians to complete the survey; 6526 were randomized to mail-based recruitment and 6725 to email-based recruitment. There were no differences in demographic characteristics by recruitment strategy (Table 1).

Table 1 Demographic characteristics and response rate by recruitment strategy

In mail-based recruitment, 383 (5.9% of 6526 mailed) survey packets were undeliverable (i.e., returned to our study office). In email-based recruitment, messages were delivered to more than 98% of physicians; across all emails delivered, 9.3–13.3% of recipients opened the email, and 0.5–4.7% clicked the web link. Most opens and clicks occurred within 2 days of email delivery.

Overall, 1072 (8.1%) physicians responded (Fig. 1). Median time from first invitation to response was 18 days for the mail-based and 23 days for the email-based recruitment strategies. Response rate was higher in mail-based (11.8%, 95% CI 11.0–12.6%) compared to email-based (4.5%, 95% CI 4.0–5.0%) recruitment (Table 1). Nearly all responders (96.1%, n = 1030) completed the survey through to the last question.
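The reported intervals are consistent with a simple Wald (normal-approximation) confidence interval; the interval method is our assumption, as the paper does not name it, but the following sketch reproduces the published bounds exactly from the responder counts:

```python
import math

def wald_ci(responders, invited, z=1.96):
    """Normal-approximation (Wald) 95% CI for a response proportion.
    Returns (point estimate, lower bound, upper bound)."""
    p = responders / invited
    half = z * math.sqrt(p * (1 - p) / invited)
    return p, p - half, p + half

for arm, r, n in [("mail-based", 770, 6526), ("email-based", 302, 6725)]:
    p, lo, hi = wald_ci(r, n)
    print(f"{arm}: {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
# mail-based: 11.8% (95% CI 11.0%-12.6%)
# email-based: 4.5% (95% CI 4.0%-5.0%)
```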

Fig. 1 Response to the survey by recruitment strategy

As shown in Fig. 1, more physicians in mail-based recruitment responded after the first invitation (362 of 770 responders, 47.0%) compared to the second (200 of 770, 26.0%) and third (208 of 770, 27.0%) invitations. About three-quarters (n = 576) used the web link included in the cover letter or postcard to complete the survey online instead of completing and returning the paper version (Table 1). For email-based recruitment, the largest group of responders (198 of 302 responders, 65.6%) did so after the reminder postcard, mailed 2 weeks after the first email invitation.

Compared to non-responders, a higher proportion of responders were young (ages 25–44 years), male, and radiation or surgical (vs. medical) oncologists (all p < 0.05) (Table 2). There was no difference in response by geographic region (p = 0.10).

Table 2 Differences in demographic characteristics of responders and non-responders, overall and by recruitment strategy (n = 13,251)

Discussion

Response to the survey on clinical trial accrual was low overall, but our randomized trial yielded several interesting findings. We compared physicians initially recruited to complete the survey by email to those recruited by mail. Response in email-based recruitment was 4.5% compared with 11.8% in mail-based recruitment. Conventional wisdom suggests email elicits higher response rates because physicians can simply click a web link in an email to access and complete the survey. However, we found the overwhelming majority of physicians assigned to email-based recruitment (~ 90%) never opened the email message, and among those who did, only about a quarter clicked the link to open the survey. Even though they were not able to simply click a link to access the survey, most responders in mail-based recruitment typed in the web address to complete the online version of the survey (rather than filling out and mailing back the paper copy). Finally, whereas in mail-based recruitment most responses were from the first survey packet (i.e., not reminder postcard), in email-based recruitment, most responses came after the mailed reminder postcard.

Our finding that response was higher among physicians recruited by mail is consistent with previous studies showing surveys delivered by mail vs. email [7, 8], or by a combination of mail and email [5, 6], generally elicit better response than email alone. Most physicians who received mailed invitations responded by typing in the link to complete the survey online. The value of mailed invitations was further underscored by the fact that, in the email-based recruitment group, most physicians who completed the survey did so after receiving a mailed postcard reminder.

After low response among the initial 5000 physicians invited to complete the survey, response to email invitations did not improve after we shortened the message and subject line, although we did not formally test for differences in response before and after these changes. The low response in email-based recruitment is likely because very few recipients (~10%) opened the message at all, and only about a quarter of those who did clicked through to the survey. It may be easier to overlook or ignore an email than paper that arrives in one’s physical mailbox. Alternatively, an email from an unfamiliar sender may have been delivered to a spam inbox, giving intended recipients no chance to open it. Email recruitment strategies may promise faster response, elicit longer answers to open-ended questions [8], and appear to be a low-cost alternative to postal mail, but the risk of non-response – especially when the message remains unopened – seems to outweigh these conveniences [15]. An important avenue for future research is to understand the non-response bias and cost-effectiveness associated with email recruitment.

A strength of this study was our use of the AMA Physician Masterfile as the sampling frame. The Masterfile offers the most complete coverage of the U.S. physician population because physicians are added to it when they enroll in a U.S. medical school or begin training in the U.S. Nearly all mail and email invitations in our study were delivered, reflecting the accuracy and currency of contact information listed in the Masterfile. By inviting all eligible physicians from the Masterfile, we were also able to compare responders vs. non-responders and describe selection bias arising from these differences. Responders were more likely to be younger, male, and surgical (vs. medical or radiation) oncologists. Additionally, the large sampling frame allowed us to achieve a response of more than 1000 surveys, with robust responses to open-ended questions and sufficient sample size to facilitate comparison.

A limitation is that we did not design our study to compare response by survey mode (i.e., online or paper). Specifically, physicians assigned to mail-based recruitment were offered a choice to complete the survey online or return by mail, whereas physicians in email-based recruitment were provided only a web link to complete the survey online. It was not practical to test two survey modes in email-based recruitment unless physicians were instructed to print a survey and return it by mail. We also could not examine the effect on response of making adjustments to invitation length and content.

Conclusions

In summary, a mail-based recruitment strategy increased physicians’ response to a survey on clinical trial accrual, compared to email-based recruitment. Ironically, most physicians assigned to mail-based recruitment actually completed the survey online via the link provided in the cover letter, and those in email-based recruitment did not respond until they received a reminder postcard by mail. Providing the option to return a paper survey or complete it online may have further increased participation for those recruited by mail, and future studies should examine how combinations of delivery mode and return options affect physicians’ response to surveys.

Availability of data and materials

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

AMA: American Medical Association

References

  1. Murthy VH, Krumholz HM, Gross CP. Participation in cancer clinical trials: race-, sex-, and age-based disparities. JAMA. 2004;291(22):2720–6.


  2. Mendelsohn J, Moses HL, Nass SJ. A national cancer clinical trials system for the 21st century: Reinvigorating the NCI Cooperative Group Program. Washington, D.C.: National Academies Press; 2010.

  3. Kirkwood MK, Hanley A, Bruinooge SS, Garrett-Mayer E, Levit LA, Schenkel C, Seid JE, Polite BN, Schilsky RL. The state of oncology practice in America, 2018: results of the ASCO practice census survey. J Oncol Pract Am Soc Clin Oncol. 2018;14(7):e412–20.


  4. Klabunde CN, Willis GB, Casalino LP. Facilitators and barriers to survey participation by physicians: a call to action for researchers. Eval Health Prof. 2013;36(3):279–95.


  5. Martins Y, Lederman RI, Lowenstein CL, Joffe S, Neville BA, Hastings BT, Abel GA. Increasing response rates from physicians in oncology research: a structured literature review and data from a recent physician survey. Br J Cancer. 2012;106(6):1021–6.


  6. VanGeest JB, Johnson TP, Welch VL. Methodologies for improving response rates in surveys of physicians: a systematic review. Eval Health Prof. 2007;30(4):303–21.


  7. Scott A, Jeon SH, Joyce CM, Humphreys JS, Kalb G, Witt J, Leahy A. A randomised trial and economic evaluation of the effect of response mode on response rate, response bias, and item non-response in a survey of doctors. BMC Med Res Methodol. 2011;11:126.


  8. Seguin R, Godwin M, MacDonald S, McCall M. E-mail or snail mail? Randomized controlled trial on which works better for surveys. Can Fam Physician. 2004;50:414–9.


  9. Blanch-Hartigan D, Forsythe LP, Alfano CM, Smith T, Nekhlyudov L, Ganz PA, Rowland JH. Provision and discussion of survivorship care plans among cancer survivors: results of a nationally representative survey of oncologists and primary care physicians. J Clin Oncol. 2014;32(15):1578–85.


  10. Klabunde CN, Keating NL, Potosky AL, Ambs A, He Y, Hornbrook MC, Ganz PA. A population-based assessment of specialty physician involvement in cancer clinical trials. J Natl Cancer Inst. 2011;103(5):384–97.


  11. Lee RT, Barbo A, Lopez G, Melhem-Bertrandt A, Lin H, Olopade OI, Curlin FA. National survey of US oncologists' knowledge, attitudes, and practice patterns regarding herb and supplement use by patients with cancer. J Clin Oncol. 2014;32(36):4095–101.


  12. Mori M, Shimizu C, Ogawa A, Okusaka T, Yoshida S, Morita T. A National Survey to systematically identify factors associated with Oncologists' attitudes toward end-of-life discussions: what determines timing of end-of-life discussions? Oncologist. 2015;20(11):1304–11.


  13. Hanley A, Hagerty K, Towle EL, Neuss MN, Mulvey TM, Acheson AK. Results of the 2013 American Society of Clinical Oncology National Oncology Census. J Oncol Pract Am Soc Clin Oncol. 2014;10(2):143–8.


  14. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)--a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42(2):377–81.


  15. Dykema J, Jones NR, Piche T, Stevenson J. Surveying clinicians by web: current issues in design and administration. Eval Health Prof. 2013;36(3):352–81.



Acknowledgements

Not applicable.

Funding

This work was supported by the National Cancer Institute (P30CA142543, U54 CA163308-05S1) and National Center for Advancing Translational Sciences (KL2TR001103, UL1TR001105) at the National Institutes of Health. The sponsor played no role in the design of the study; collection, analysis, and interpretation of data; and in writing the manuscript.

Author information

Authors and Affiliations

Authors

Contributions

CCM contributed to the study design, led analysis of study data, contributed to the collection and assembly of data, and was a major contributor to writing the manuscript. SJCL contributed to the study design and interpretation of data. AMG contributed to the study design and interpretation of data. JVC contributed to the collection and assembly of data. CA contributed to the analysis of study data and performed randomization. RN contributed to writing the manuscript. DEG contributed to the collection and assembly of data and interpretation of data. EAH contributed to the interpretation of data. KM led collection and assembly of data. CSS contributed to the study design, collection and assembly of data, and interpretation of data, and was a major contributor to writing the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Caitlin C. Murphy.

Ethics declarations

Ethics approval and consent to participate

The Institutional Review Board at the University of Texas Southwestern Medical Center determined that this research study (protocol# 092016–096) is exempt in accordance with 45 CFR 46.101(b).

In the case of an anonymous survey, completion of the survey is indication of consent to participate in research. Further, the invitation letter and email indicated that: participation was voluntary and participants could refuse to answer any questions or stop participation at any time; the survey was anonymous and no personally identifying information would be obtained; and responses would not be traceable back to any participant. Contact information was provided for the study coordinator and for the IRB.

The invitation letter and email also indicated that, upon completion, participants would be directed to a separate website to provide contact information to receive a gift card, if desired. Contact information would not be linked to survey responses and would be destroyed after receipt of the gift card.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests. This article was produced by employees of the U.S. government as part of their official duties and, as such, is in the public domain. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Cancer Institute or the National Institutes of Health.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Murphy, C.C., Craddock Lee, S.J., Geiger, A.M. et al. A randomized trial of mail and email recruitment strategies for a physician survey on clinical trial accrual. BMC Med Res Methodol 20, 123 (2020). https://doi.org/10.1186/s12874-020-01014-x
