Clinical Trial Details
— Status: Completed
Administrative data
NCT number | NCT04131062
Other study ID # | 2017-0334
Secondary ID | R01CA211723-03S1
Status | Completed
Phase | N/A
First received |
Last updated |
Start date | November 1, 2019
Est. completion date | May 19, 2022
Study information
Verified date | December 2022
Source | Geisinger Clinic
Contact | n/a
Is FDA regulated | No
Health authority |
Study type | Interventional
Clinical Trial Summary
The goal of this trial is to determine whether the Sage eConsent framework (presented using
an electronic application) is non-inferior to traditional, paper-based, human-mediated
consent, and could therefore be part of an acceptable population screening approach to
identifying patients and others with actionable hereditary syndromes. A second goal is to
increase basic knowledge about patients' informational needs regarding different aspects of
genetic/omic screening. After participants receive either 1) the traditional consenting
approach or 2) a consenting approach presented on an electronic tablet, the investigators
will test for differences between the two arms on a variety of outcome measures, including
objective and perceived comprehension, time spent, informational needs, and enrollment
decision, among others.
Description:
The usual consent process for MyCode proceeds as follows: each day, trained MyCode consenters
receive a list of patients who are eligible to be approached about MyCode and are scheduled
to be seen that day in certain clinics in Geisinger's two-state catchment area. (Any
Geisinger patient is eligible who has not previously enrolled in, or declined to enroll in,
MyCode.) When an eligible patient arrives at the clinic, the consenter approaches them,
confirms their identity, and then asks them if they would like to hear about MyCode. If they
decline, the consenter thanks the patient for their time and the encounter is over. If the
patient agrees, the consenter goes through a script that the MyCode team has developed from
the written consent form that highlights the most important aspects of MyCode, including
return of actionable results to participants and their primary care physicians, genetic
privacy, and data sharing for research purposes. At the end of the script, the consenter
invites and answers questions from the patient. Next, the consenter hands the patient the
7-page written consent form and asks if they would like a few minutes to review it. Finally,
the consenter asks the patient whether they wish to enroll in MyCode or not and records their
answer, Yes, No, or Thinking (i.e., the patient needs more time to consider), into the
patient's electronic health record.
In the present trial, patients are randomized at the individual level to receive either this
usual consent or eConsent via iPad app. During the pilot phase of this trial, 11 Research
Assistants (RAs) were trained on both MyCode consenting and on this trial's protocol. As per
usual care, the RAs receive a daily list of MyCode-eligible patients scheduled to appear in
clinic. And, as per usual care, the RAs approach the patient, confirm their identity, and ask
if they wish to learn about MyCode. Those who do are then randomized to the usual care
(paper) or eConsent (iPad) arm of the trial, according to whether the current time, as
indicated by digital stopwatches, ends in an even or odd number. In the paper arm, the
consent process proceeds as usual, with only two minor changes: 1) the RA uses the stopwatch
to time the duration of the consent encounter, beginning from the moment the patient is
randomized to the paper arm and ending either when the consent process is interrupted (e.g.,
because the patient is called back to the examination room) or when it terminates with an
enrollment decision (Yes, No, or Thinking); and 2) the RA uses a tracking sheet to record
MyCode response rate (i.e., patients approached who did not want to hear about MyCode) and
study attrition (e.g., consent process was interrupted). In the iPad arm, the RA hands the
patient the iPad and explains that the interactive app will tell them all about MyCode.
Patients reluctant to use an iPad are encouraged once to try, with the RA showing them that
all that is involved is tapping, but patients who continue to resist are switched to the
paper arm and this is noted on the tracking sheet. In the iPad arm, the RA also records
whether the patient asks the RA any questions about MyCode and, as with the paper arm, when a
patient declines to hear about MyCode and when the consent process is interrupted.
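For concreteness, the following is a minimal sketch of the even/odd stopwatch rule in Python.
It is illustrative only: in the trial the rule is applied by the RA reading a physical
stopwatch, not by software, and the mapping of even digits to the paper arm is an assumption,
since the protocol text does not state which parity corresponds to which arm.

    # Illustrative sketch only; the even -> paper mapping is an assumption.
    def assign_arm(stopwatch_reading_seconds: int) -> str:
        """Assign an arm from the last digit of the stopwatch reading."""
        last_digit = stopwatch_reading_seconds % 10
        return "paper (usual consent)" if last_digit % 2 == 0 else "iPad (eConsent)"

    # Example: a reading ending in 7 (odd) would send the patient to the iPad arm.
    print(assign_arm(1387))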
In both arms, patients are then asked to complete a survey, which serves as the primary
source of data for the study. The survey is administered on paper in the paper arm and on
iPad (via the Qualtrics platform) in the iPad arm. The eConsent app generates a random study
ID number that is sent to Qualtrics, where the user's click behavior during the consent
process (e.g., time spent on each screen and in total, whether the user clicked "learn more"
on each page, (in)correct answers to teach-back questions) is anonymously combined with their
survey responses. Survey questions are closed-ended (true/false, multiple choice, Likert scale)
and based on the Quality of Informed Consent and All of Us participant-provided information
surveys.
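The linkage described above amounts to a key-based join: click telemetry keyed by the
app-generated study ID is merged with the survey responses carrying the same ID. A minimal
sketch of that join appears below; the column names and the use of pandas are assumptions
made for illustration, since the actual linkage is performed within Qualtrics.

    # Hypothetical column names; shown only to illustrate the anonymous join.
    import pandas as pd

    click_log = pd.DataFrame({
        "study_id": ["A1B2", "C3D4"],
        "seconds_on_consent": [412, 655],
        "learn_more_clicks": [3, 0],
        "teachback_correct": [4, 5],
    })

    survey = pd.DataFrame({
        "study_id": ["A1B2", "C3D4"],
        "perceived_comprehension": [6, 7],
        "enrollment_decision": ["Yes", "Thinking"],
    })

    # Join on the anonymous study ID; no direct identifiers are needed.
    combined = click_log.merge(survey, on="study_id", how="inner")
    print(combined)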
This study is designed to be powered at 99% to detect an effect of modest size (half a point
on the comprehension quiz), requiring 526 participants. Very high levels of power (here, 95%
or 99%), as opposed to the more standard benchmark power level of 80%, are desirable in tests
of non-inferiority so that investigators can be as certain as possible that an inference of
"no effect" is not a Type II error. In the very unlikely event that data collection proceeds
much more slowly than it has in the pilot, the study retains 95% power to detect a one-half
question effect with only 372 participants.
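As a rough check of the stated sample sizes, the sketch below runs a standard two-sample
power calculation with statsmodels. The standard deviation of the comprehension quiz is not
reported in this record, so the value used here is an assumption chosen only to show how a
half-point difference, 99% or 95% power, and totals near 526 or 372 participants fit together.

    # Back-of-the-envelope sample-size check for a half-point difference in mean
    # comprehension scores. The quiz SD is an ASSUMPTION, not taken from the protocol.
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    assumed_sd = 1.34                   # assumed quiz SD, in points
    effect_size = 0.5 / assumed_sd      # half a point, expressed as Cohen's d

    for power in (0.99, 0.95):
        n_per_arm = analysis.solve_power(effect_size=effect_size, power=power,
                                         alpha=0.05, alternative="two-sided")
        print(f"power {power:.0%}: about {2 * n_per_arm:.0f} participants in total")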