Clinical trials related to Competency-Based Education
A three-arm, randomized, single-blinded clinical trial will be conducted, with a control sequence and an intervention sequence of three subgroups with different levels of exposure to the simulation program: Group 1, open abdominal aortic repair (AAOR); Group 2, vascular anastomosis (VA) plus AAOR; and Group 3, specific microsurgical skills, VA, and AAOR. Surgical residents of general, vascular, or cardiovascular surgery programs will be included. The sample size calculation yielded 45 participants, 15 per group. Blinding will be single, applied to the external evaluators, and allocation will use simple randomization.
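The allocation step above can be sketched in a few lines. This is an illustrative sketch only, not the study's actual procedure: the group labels are paraphrased from the description, and a shuffled fixed-size list is used so that each arm receives exactly 15 residents (strictly speaking, a balanced allocation like this is restricted rather than pure simple randomization, which would assign each participant independently).

```python
import random

# Illustrative three-arm allocation: 45 residents, 15 per arm.
# Labels paraphrase the trial description; they are not official arm names.
GROUPS = ["Group 1: AAOR", "Group 2: VA + AAOR", "Group 3: microsurgery + VA + AAOR"]

def allocate(n_per_group=15, seed=None):
    """Return a randomly ordered allocation list with exactly n_per_group per arm."""
    rng = random.Random(seed)
    slots = [g for g in GROUPS for _ in range(n_per_group)]
    rng.shuffle(slots)  # random permutation of the 45 pre-balanced slots
    return slots

allocation = allocate(seed=42)
```

In practice, allocation for a trial would be generated once by an independent statistician and concealed from recruiters; the seed here only makes the sketch reproducible.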
The hypothesis of our work is that, with the simulation techniques applied at the Medical School of the Autonomous University of Barcelona (UAB), students accelerate their learning curve for clinical skills, acquire transversal competencies in medicine, and achieve higher-quality learning.
This study aims to evaluate the effectiveness of a user- and expert-centric, mobile phone-based career intervention program for building career competencies among Malaysian public managers.
This study evaluates whether implementation of a competency-based medical education program called CoBaTrICE (Competency Based Training program in Intensive Care in Europe) yields higher levels of competency than the current official time-based program in Intensive Care Medicine (ICM) in Spain. The hypothesis will be tested through a multicenter cluster randomized trial of 14 ICU departments from 14 academic referral hospitals in Spain. A total of 38 trainees in the 3rd year of specialization will be followed during the three years of their specific training period in Intensive Care Medicine. CoBaTrICE (seven hospitals) will be compared with the current official model of ICM training in Spain (seven hospitals), which is based on exposure to experiences through clinical rotations. The implementation of CoBaTrICE will include the three following essential elements: 1) training the trainers; 2) workplace-based assessments; and 3) the use of an electronic portfolio. The level of competency achieved by each participant will be determined by a simulation-based Objective Structured Clinical Examination (OSCE) performed at the end of the third year of training (baseline) and at the end of the 5th year of training.
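In a cluster randomized design like the one above, whole ICU departments, not individual trainees, are randomized. A minimal sketch of that step, using placeholder site names rather than the actual hospitals:

```python
import random

def randomize_clusters(sites, seed=None):
    """Split a list of sites evenly between the two arms by random permutation."""
    rng = random.Random(seed)
    shuffled = list(sites)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"CoBaTrICE": shuffled[:half], "time-based": shuffled[half:]}

# 14 placeholder ICU departments split 7/7 between the two training models.
arms = randomize_clusters([f"ICU-{i:02d}" for i in range(1, 15)], seed=7)
```

All trainees at a given site then receive that site's assigned training model, which is what makes the hospital, rather than the trainee, the unit of randomization.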
The establishment of training programs in therapeutic endoscopy, standardization of the performance of endoscopic ultrasound (EUS) and endoscopic retrograde cholangiopancreatography (ERCP), and the definition of competence are of paramount importance. The length of training, the minimum number of procedures, the requisite theoretical learning, and the methodology for defining competence in EUS and ERCP are not well defined. The investigators' research has demonstrated that trainees acquire skills at different rates and that the number of procedures completed is, by itself, a suboptimal marker of competency in a given procedure. Hence, emphasis needs to shift away from the number of procedures performed toward performance metrics with well-defined, validated thresholds. Multicenter prospective data are needed to guide the development of competency-based medical education that defines learning curves in EUS and ERCP and sets evidence-based benchmarks for competence using a validated assessment tool. Hypothesis: the central hypothesis is that a validated EUS and ERCP competency assessment tool will allow for reliable, generalizable, standardized learning curves, competency benchmarks, and the creation of a centralized national database that compares a trainee's performance with that of peers.
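The abstract above does not name its statistical method, but one widely used way to turn serial pass/fail procedure assessments into a learning curve is a Bernoulli CUSUM chart. The sketch below is illustrative only; the failure-rate thresholds are arbitrary example values, not the study's benchmarks.

```python
import math

def cusum(outcomes, p0=0.10, p1=0.20):
    """One-sided Bernoulli CUSUM learning curve.

    outcomes: sequence of procedure results, 1 = failure, 0 = success.
    p0: acceptable failure rate; p1: unacceptable failure rate (both illustrative).
    Returns the running CUSUM score after each procedure.
    """
    # Per-observation increments from the log-likelihood ratio under p1 vs p0.
    s_fail = math.log(p1 / p0)              # positive: failures push the score up
    s_success = math.log((1 - p1) / (1 - p0))  # negative: successes pull it down
    score, curve = 0.0, []
    for x in outcomes:
        score += s_fail if x else s_success
        score = max(score, 0.0)  # one-sided chart: reset at zero
        curve.append(score)
    return curve
```

A trainee whose score stays near zero is performing at or better than the acceptable rate; a sustained climb past a pre-set decision limit would flag performance worse than p1.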