Background: Internet resources play an important role in how medical students access information about residency programs. Evaluating program websites is necessary to provide accurate information for applicants and to identify areas where programs can improve their websites. To date, dermatology residency websites (DRWS) have not been evaluated. This paper evaluates dermatology residency websites based on the availability of predefined measures.

Methods: Using the FREIDA (Fellowship and Residency Electronic Interactive Database) Online database, the authors searched for all accredited dermatology program websites. Eligible programs were identified through the FREIDA Online database and had a functioning website. Two authors independently extracted data, with differences resolved by consensus or by a third researcher. The data were accessed and archived from July 15 to July 17, 2015. Primary outcomes measured were the presence of content on education, resident and faculty information, program environment, applicant recruitment, schedule, and salary, as well as website quality evaluated using an online tool (WooRank.com).

Results: Of 117 accredited dermatology residencies, 115 had functioning webpages. Of these, 76.5% (75) had direct links on the FREIDA Online database. Most programs contained information on education, faculty, program environment, and applicant recruitment. However, website quality and marketing effectiveness were highly variable, and most programs were deemed to need improvements in the functioning of their webpages. Information on current residents and on potential away rotations was also lacking from most websites, with only 52.2% (60) and 41.7% (48) of programs providing this content, respectively.

Conclusions: A majority of dermatology residency websites contained adequate information on many of the factors we evaluated. However, many were lacking in areas that matter to applicants. We hope this report will encourage dermatology residency programs to improve their websites and provide adequate content to attract the top residents for their respective programs.
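The presence-of-content percentages reported in the Results (e.g., 52.2% of 115 programs listing current residents) can be tallied directly from the reviewers' extraction sheet. The sketch below is a minimal illustration of that tally; the field names and example records are hypothetical, not the authors' actual data.

```python
# Minimal sketch: compute the share of programs whose website contains a
# predefined content item. Records and field names are hypothetical.
records = [
    {"program": "Example A", "current_residents": True,  "away_rotations": False},
    {"program": "Example B", "current_residents": False, "away_rotations": True},
    # ... one record per functioning website, filled in by the two reviewers
]

def presence_rate(records, item):
    """Percentage of programs whose website contains the given content item."""
    n_present = sum(1 for r in records if r.get(item))
    return 100.0 * n_present / len(records)

for item in ("current_residents", "away_rotations"):
    print(f"{item}: {presence_rate(records, item):.1f}%")
```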
To increase the timeliness, objectivity, and efficiency of evaluating ophthalmology residents' learning of cataract surgery, an automatic analysis system for cataract surgery videos is developed to assess performance, particularly in the capsulorhexis step on the Kitaro simulator. We use computer vision techniques to measure performance of this critical step, including the duration, size, centrality, and circularity of the capsulorhexis, as well as motion stability during the procedure. A grading mechanism is then built on these computed measures using either linear regression or non-linear classification with a Support Vector Machine. Comparisons of expert graders with the computer vision-based approach have demonstrated the accuracy and consistency of the computerized technique.
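The grading pipeline described above can be illustrated with a short sketch: shape measures such as circularity and centrality are computed from a traced capsulorhexis contour, and the resulting per-video feature vectors are fed to either a linear regression (for a continuous score) or an SVM (for a grade category). The contour formulas and scikit-learn estimators below are assumptions for illustration, not the authors' implementation, and the training data shown is placeholder random data.

```python
# Sketch: grading capsulorhexis performance from per-video measures
# (duration, size, centrality, circularity, motion stability).
import numpy as np
from sklearn.svm import SVC
from sklearn.linear_model import LinearRegression

def capsulorhexis_features(contour, pupil_center):
    """Illustrative shape measures from an (N, 2) array of contour points."""
    # Perimeter: sum of distances between consecutive points (closed contour).
    diffs = np.diff(np.vstack([contour, contour[:1]]), axis=0)
    perimeter = np.sum(np.linalg.norm(diffs, axis=1))
    # Area via the shoelace formula.
    x, y = contour[:, 0], contour[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    # Circularity: 1.0 for a perfect circle, smaller for irregular tears.
    circularity = 4 * np.pi * area / perimeter**2
    # Centrality: offset of the contour centroid from the pupil center.
    centrality = np.linalg.norm(contour.mean(axis=0) - pupil_center)
    return area, circularity, centrality

# X: one row of measures per training video; y_score: expert numeric scores;
# y_label: expert grade bands -- all hypothetical placeholder data.
X = np.random.rand(40, 5)
y_score = np.random.rand(40)
y_label = np.random.randint(0, 2, size=40)

regressor = LinearRegression().fit(X, y_score)   # linear-regression grading
classifier = SVC(kernel="rbf").fit(X, y_label)   # non-linear SVM grading

new_video = np.random.rand(1, 5)
print("predicted score:", regressor.predict(new_video)[0])
print("predicted grade band:", classifier.predict(new_video)[0])
```

In practice the feature rows would come from frame-by-frame tracking of the capsulorhexis edge and instrument tip, with expert grades supplying the training targets.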
Cost-based analyses of ophthalmology resident surgical training tools are needed so that residency programs can implement tools that are valid, reliable, objective, and cost-effective. There is no perfect training system at this time.