Peer Learning in Neuroradiology: Not as Easy as It Sounds

K. Mani, K. Shah, N. Kadom, D. Seidenwurm and A.J. Nemeth
American Journal of Neuroradiology October 2023, 44 (10) 1109-1115; DOI: https://doi.org/10.3174/ajnr.A7973
K. Mani, University Radiology Group, Rutgers University School of Medicine, Newark, New Jersey
K. Shah, MD Anderson Cancer Center, Houston, Texas
N. Kadom, Emory University School of Medicine, Children’s Healthcare of Atlanta, Atlanta, Georgia
D. Seidenwurm, Sutter Health, Sacramento, California
A.J. Nemeth, Northwestern University Feinberg School of Medicine, Northwestern Memorial Hospital, Chicago, Illinois

Peer Learning (PL) is an engaging activity in which practicing radiologists come together to review cases from which they can learn jointly. The major impetus for PL lies in the overarching goal of improving diagnosis in radiology through a team-based culture of viewing mistakes as an opportunity to learn.1 In its book Improving Diagnosis in Health Care, the Institute of Medicine (IOM) found that most people will be affected by at least 1 diagnostic error in health care.2

As a result of the IOM book Improving Diagnosis in Health Care, several external drivers are in place to change the practices of health care providers toward improved diagnosis. Radiologists realized that PL, but not random score-based peer review, best meets the IOM goals of establishing effective teamwork, educating practitioners in the diagnostic process, learning from mistakes, creating a culture that improves diagnostic performance, and establishing a reporting mechanism for discrepancies.1 Following a 2020 Radiology Peer Learning Summit, the American College of Radiology (ACR) developed a new accreditation pathway that replaces score-based random peer review with PL.3,4 The American Board of Radiology (ABR) added PL as an alternative participatory activity for meeting Maintenance of Certification (MOC) Part 4 criteria.5 The Joint Commission (TJC) serves as another external driver of improved practitioner performance, eg, peer review, through Ongoing Professional Practice Evaluation (OPPE) requirements.6

Assessments of agreement among university neuroradiologists showed disagreement rates of up to 12.4%,7 much higher than the 2.9% reported for score-based random peer review.8 Among errors in neuroradiology are discrepancies regarding vascular, neoplastic, and congenital disorders, as well as artifacts.7 Neuroradiology errors can also arise from test-selection errors, protocolling errors, technical errors, and failure to communicate results in a timely fashion. Education can decrease errors: neuroradiologists with high rates of participation in tumor boards have lower diagnostic error rates.9 Awareness of “blind spots,” for example in complex head and neck anatomy and pathology, may decrease interpretive errors and could conceivably be improved with PL.10,11 In fact, there is some evidence that PL creates learning opportunities,12,13 but there is still a lack of adoption of PL programs nationwide14 and a lack of scientific evidence demonstrating its effectiveness.

This Perspectives comes from members of the American Society of Neuroradiology (ASNR) Quality, Safety and Value Committee. Here, we describe several challenges faced by neuroradiologists who are interested in serving as PL champions. Among these challenges are an inability to recruit volunteer champions to drive PL programs, a lack of resources for running a PL program, unknown effects on required reporting to TJC, and a lack of evidence favoring PL over score-based random peer review.

Challenge 1: Who Wants to Be a PL Champion?

Barriers to appointing PL champions may stem from implicit expectations that reward clinical rather than noninterpretive performance and from the scope of the role, which depends on the existing culture within the neuroradiology practice and the resources available to support a PL program.

The current practice environment in neuroradiology is characterized by rising clinical volumes, tight finances, and the unfolding Great Resignation. As a result, neuroradiologists may cut back on noninterpretive duties.15 One pediatric neuroradiology program reported that serving as a PL champion requires, at a minimum, several hours of preparation for each PL meeting and may require additional time for managing discrepancies and for external reporting, such as generating and submitting data for the ACR accreditation program or TJC, or for claiming Continuing Medical Education (CME) credits.13 There are currently no established physician roles that would provide PL champions with protected time for these tasks.

PL champions face additional challenges. Some radiologists believe that participating in interesting case conferences is the same as PL. Although such conferences do involve learning among peers, the term PL must be viewed in the context of meeting the goals of the IOM book Improving Diagnosis in Health Care, which include several layers of accountability, foremost a process for handling discrepancies.1 There needs to be a clear process for reporting discrepancies, for consistently notifying the original interpreting radiologist of the discrepancy, and for ensuring optimal patient care. To meet TJC and ACR requirements, PL must include cases with discrepancies; additionally, the inclusion of great catches, interesting cases, and so forth can foster shared learning. PL champions may need to drive this culture shift, emphasizing the importance of reporting discrepancies for learning purposes.

The ACR acknowledges the importance of separating the performance review of radiologists from learning and professional growth by creating the ACR Accreditation Pathway for PL,4 which replaces agreement/disagreement ratings with measures of participation in a PL program. For neuroradiologists who are used to randomized score-based peer review, the culture needs to shift away from perceiving discrepancy reporting as punishment or as a performance-assessment tool and toward sharing learning opportunities so the group can learn and grow. This shift in measuring performance is crucial because traditional score-based peer review has been used as a punitive measure in the past.14,16 Achieving such a culture shift may be impossible for neuroradiologists acting alone; leadership support is critical to the success of any PL program.

Another culture shift revolves around how discrepant opinions are handled. Absolute certainty in medicine is hard to come by, and there are often differences of opinion regarding a diagnosis in radiology. In the randomized score-based peer review system, a voting system would be used to decide which image interpretation is more “correct.” This culture may be founded on the traditional “learn-what” approach, such as reading an article or taking an online course, which may provide a sense of certainty about a diagnosis. Instead, PL emphasizes the idea of “learn-how”: sharing knowledge, offering suggestions, and discussing alternative diagnostic approaches when opinions differ.17

Thankfully, it is not necessary to design a PL program de novo. The ACR’s PL checklist and sample policies provide a great starting point for building a PL program founded on the IOM goals for Improving Diagnosis in Health Care.1,2 The main pillars of a PL program are a mechanism for managing discrepancies, a safe learning environment, a clear separation of learning from performance evaluation, and anonymization. There are ample opportunities to identify discrepancies in neuroradiology: during any comparison with a prior study, from secondary review for the Neuro-Oncology Tumor Board and other neuroimaging conferences, from reading room second-opinion consultations, during teaching sessions, and from clinical error reporting.

The PL champion is also tasked with creating a safe learning environment. Groups with higher psychological safety have a “shared belief that the team is safe for interpersonal risk-taking.”18 As a result, members of such groups practice open communication, are not afraid to voice concerns or ask questions, and seek feedback without fear of being judged. Achieving such a culture requires deliberate effort to flatten authority gradients and eliminate any language that implies blaming, shaming, or judging. To create safe learning environments, PL champions should set clear ground rules, serve as role models, foster nonjudgmental behavior by demonstrating openness to different perspectives, and actively discourage any dismissiveness or hostility.19

Challenge 2: Are There Resources for Running a PL Program?

Beyond a PL champion to drive and execute the program, resources are required for external regulatory reporting, such as to the ACR Accreditation Pathway for PL or TJC, and possibly for claiming CME credits. In addition to the PL champion’s time and enthusiasm, the required resources include software tools and support staff time.

There are a few commercial tools that help manage various aspects of PL, but no comprehensive commercial tool currently manages the entire process: starting when a radiologist submits a case, managing discrepancies, ensuring optimal clinical care, aiding PL champions in preparing for the PL conference, documenting PL performance targets, running a conference with anonymized PACS cases, capturing learning and improvement initiatives, and generating an annual report. Many commercial tools facilitate case submissions and case rating/classification (discrepancy, great catch, and so forth) but cannot extract data for monthly and annual tracking of radiologist performance targets, such as the number of cases submitted each month and participation in PL conferences. Meeting attendance can be tracked separately, which is easier with virtual meeting platforms that automatically generate attendance reports at the conclusion of the meeting.

At a coauthor’s (N.K.) institution, a pediatric neuroradiology program uses the REDCap (Research Electronic Data Capture; projectredcap.org) research tool to drive a large portion of the PL process. Specifically, REDCap serves as the case-reporting tool: it notifies original readers of their reported cases and the reasons for reporting, it indicates whether the original reader needs to take any actions for clinical care, and it supports the data tracking and data summaries required by the ACR Accreditation Pathway for PL (Fig 1). Having to create such tools and processes and then implement them can be time-consuming and may require collaboration with other subject matter experts, which can delay implementation.

FIG 1.

REDCap tool for PL. A, The submitter can indicate his or her name to receive credit against the monthly case submission requirement per the ACR Accreditation Pathway for PL. The submitter selects the reason for case submission, which includes discrepancies as well as interesting cases, good catches, and more. We use the PACS accession number as the case identifier. Any additional required actions can be entered, and the submitter attests to being responsible for ensuring optimal patient care. B, After submitting the content in the survey, a PDF is created that contains all survey input, except the name of the case submitter. The submitter can input the original reader’s email to quickly share the feedback. C, The REDCap tool allows the creation of reports that easily summarize information such as learning and improvement actions resulting from the PL program, which can be used for annual reporting on the ACR Accreditation Pathway for PL. D, We also have an administrative assistant monitor monthly case submissions and send an email with current submissions to every participating radiologist midway through the monthly reporting period.
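To make the data flow concrete, the following is a minimal sketch, in Python, of the kind of case-submission record and monthly submission tracking that Fig 1 describes. The class, field names, and demo values are illustrative assumptions modeled on the figure, not the actual REDCap project schema.

```python
# Illustrative sketch only: field names and demo values are assumptions
# modeled on Fig 1, not the actual REDCap project schema.
from collections import Counter
from dataclasses import dataclass
from datetime import date


@dataclass
class CaseSubmission:
    submitter: str              # captured for credit toward the monthly target
    reason: str                 # "discrepancy", "great catch", "interesting case", ...
    accession_number: str       # PACS accession number as the case identifier
    action_required: bool       # whether the original reader must act on patient care
    original_reader_email: str  # used to share the anonymized feedback PDF
    submitted_on: date


def monthly_submission_counts(submissions: list[CaseSubmission],
                              year: int, month: int) -> Counter:
    """Count case submissions per radiologist for one reporting month,
    e.g., to support the midmonth reminder email described in Fig 1D."""
    return Counter(
        s.submitter for s in submissions
        if s.submitted_on.year == year and s.submitted_on.month == month
    )


demo = [
    CaseSubmission("Dr. A", "discrepancy", "ACC123", True,
                   "reader1@example.org", date(2023, 5, 2)),
    CaseSubmission("Dr. A", "great catch", "ACC124", False,
                   "reader2@example.org", date(2023, 5, 20)),
]
print(monthly_submission_counts(demo, 2023, 5))  # Counter({'Dr. A': 2})
```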

Presenting cases in an anonymized fashion can be another challenge. Preserving anonymity in PL is important because it can positively impact learners’ perceptions of the value of PL, can foster more critical peer feedback, and can lead to improved performance.20 The extent of anonymity required may depend on the maturity of the safety culture within a group of neuroradiologists but generally involves anonymity in the notification of a discrepancy as well as anonymous case presentation during PL meetings. Anonymity facilitates a nonpunitive atmosphere during review of cases among a group of attendings and trainees. The cases should be prepared by the PL champion to minimize the number of people who can identify cases and readers. Most interesting, to meet the ACR PL Accreditation Pathway criteria, the identity of anyone submitting cases needs to be captured to track performance targets, but the identity of the original reader whose report was flagged as a discrepancy is not required to be captured. Inclusion of interesting cases or great catches, however, can increase PL participation and transparency21 and represents an opportunity to celebrate individuals by name. Of note, many PACS systems do not completely anonymize patient identification, and PL champions need to be cautious when screensharing the entire PACS window. Certain virtual meeting applications allow sharing only a portion of the screen, which may be better suited to preserving anonymity. Another way to preserve anonymity is to create slide presentations, which can be very time-consuming.13
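As one illustrative technique for preparing anonymized cases (our assumption, not a method described in this article), the PL champion could replace accession numbers with salted one-way hashes, so that conference slides and worklists carry stable but non-identifying labels:

```python
# Illustrative assumption: pseudonymize a PACS accession number with a
# salted one-way hash before presenting the case at a PL conference.
import hashlib


def pseudonymize(accession_number: str, salt: str) -> str:
    """Return a short, stable, non-identifying label for a case.
    The salt should be known only to the PL champion."""
    digest = hashlib.sha256((salt + accession_number).encode()).hexdigest()
    return f"PL-{digest[:8]}"


print(pseudonymize("ACC123", salt="keep-this-private"))  # e.g., PL-91c4f0d2
```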

A key outcome of a PL program from the perspective of the ACR Accreditation Pathway for PL is the documentation of quality improvements that arose from the PL program. PL meetings and case discussions can lead to the discovery of process and system issues that can be addressed. Many issues may be addressed directly by the neuroradiologists in the PL group, such as changing CT and MR imaging protocols or reporting templates, but larger issues, such as a broken system for providing feedback from neuroradiologists to technologists, may require escalation to a dedicated improvement team.22 It may be challenging to set up a process for handing off such projects to a dedicated quality team, if the practice has access to one at all.23

Challenge 3: How Does Peer Learning Meet TJC Requirements?

Another barrier affecting the transition from random score-based peer review to PL relates to external reporting of radiologists’ performance. Specifically, it is still unknown whether TJC will accept metrics derived from PL in place of the widely accepted performance-evaluation metrics from random score-based review, namely agreement/disagreement rates between radiologists.

TJC requires health care entities to provide both qualitative and quantitative data for OPPE and has traditionally accepted data from random score-based peer review to meet this requirement in radiology. It is technically possible to maintain random score-based peer review for external reporting purposes while also participating in PL, but this practice could cause confusion and mistrust among practicing radiologists, which would counteract the basic principles of a safe PL environment.

It may be better to replace score-based random peer review data reporting for OPPE with a different set of performance data. For example, 1 coauthor (N.K.) is proposing the use of report turnaround times (TAT) in conjunction with PL metrics (the number of cases submitted per radiologist per month, PL meeting participation, and so forth) as the quantitative data reported for OPPE21 (Table). Additional qualitative data that TJC may require for OPPE could be collected through other pathways. For example, annual peer evaluations could be collected, similar to those commonly used in the credentialing process (Fig 2). In addition, data from systems for reporting issues with physician practices could be used to reflect a qualitative assessment of radiologists’ performance (Fig 3).
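As a sketch of how such a quantitative bundle might be assembled per radiologist per reporting period, consider the following Python example. The function name, inputs, and the case target are illustrative assumptions, not values from the proposal or the Table.

```python
# Illustrative sketch: summarize one radiologist's quantitative OPPE data
# (TAT plus PL participation). All names and thresholds are assumptions.
from statistics import median


def oppe_quantitative_summary(tat_hours: list[float],
                              cases_submitted: int,
                              meetings_attended: int,
                              meetings_held: int,
                              monthly_case_target: int = 2) -> dict:
    """Combine report turnaround time (TAT) with PL metrics for one period."""
    return {
        "median_tat_hours": median(tat_hours),
        "cases_submitted": cases_submitted,
        "met_case_target": cases_submitted >= monthly_case_target,
        "meeting_attendance_pct": round(100 * meetings_attended / meetings_held, 1),
    }


print(oppe_quantitative_summary(
    tat_hours=[2.5, 4.0, 1.2, 3.3],
    cases_submitted=3,
    meetings_attended=5,
    meetings_held=6,
))
# {'median_tat_hours': 2.9, 'cases_submitted': 3,
#  'met_case_target': True, 'meeting_attendance_pct': 83.3}
```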

FIG 2.

A sample OPPE form allowing peers to evaluate their peers. This qualitative evaluation aligns with the 6 Accreditation Council for Graduate Medical Education (ACGME) core competencies and can serve to identify any practice concerns.

FIG 3.

A sample report describing Focused Professional Practice Evaluation (FPPE) events to division directors, which represents a qualitative assessment that can be used for OPPE reporting to TJC. Division directors will not know the nature of the events that were investigated, but from this type of reporting they can still easily glean over time whether a radiologist’s practice raises concern, in terms of a higher-than-usual number of relevant issues and dispositions, such as behavior concerns, contract violations, and verbal or administrative interventions.

Table. Sample approach to defining quantitative data for OPPE use that could replace score-based random review data

Neuroradiologists who are interested in discontinuing random score-based peer review will have to consider external reporting requirements and work with representatives from those agencies to ensure that any new metrics meet existing requirements.

Challenge 4: What Is the Scientific Evidence Favoring PL?

It may be difficult for PL champions to convince leadership to abandon traditional peer review in favor of PL. There is some evidence in the scientific literature that PL is a better approach than randomized score-based peer review for improving diagnosis, but not necessarily that it is a better performance-evaluation tool for radiologists. There is, however, ample evidence that score-based peer review is a flawed performance-evaluation tool24 that has failed to demonstrate learning25 and has failed to engage radiologists.14,26

If one considers addendum rates a surrogate marker of improved patient care, then the effect of PL far exceeds that of score-based random peer review,27 but data directly linking PL to improved patient outcomes are still missing.

There is some evidence that PL may lead to greater radiologist engagement.12,13 Physician burnout is associated with an increased risk of patient safety incidents as well as poor quality of care and low patient satisfaction.28 A recent report found that burnout among US neuroradiologists ranged from 49% to 79%.29 There remains an opportunity to generate additional scientific evidence linking PL to radiologist engagement metrics and linking improved engagement to improved patient outcomes.

Overall, the field of PL offers neuroradiologists an opportunity for scholarship. For example, we need evidence that PL improves a neuroradiologist’s ability to reliably make an accurate diagnosis, that PL improves the cohesiveness of neuroradiology teams, and that PL can reduce burnout. There is a traditional view that high clinical volumes lead to lower academic output in neuroradiology, as measured by peer-reviewed articles, presentations, and abstracts.30 However, this simplistic linkage, which disregards factors like seniority and work schedules, has been criticized by other neuroradiologists31 and should not deter neuroradiologists from engaging in roles that do not contribute to clinical output.

The field of PL in radiology is still evolving. Neuroradiologists have an opportunity to become leaders in this field. Meeting the challenges presented in this article can result in professional and personal growth, improved job satisfaction, and reduced feelings of burnout. These are important possible gains to consider when weighing the commitment required to fill a PL champion role.

Footnotes

  • Disclosure forms provided by the authors are available with the full text and PDF of this article at www.ajnr.org.

References

1. Larson DB, Donnelly LF, Podberesky DJ, et al. Peer feedback, learning, and improvement: answering the call of the Institute of Medicine report on diagnostic error. Radiology 2017;283:231–41 doi:10.1148/radiol.2016161254 pmid:27673509
2. Balogh EP, Miller BT, Ball JR, eds; Committee on Diagnostic Error in Health Care. Improving Diagnosis in Health Care. National Academies Press; December 29, 2015
3. Larson DB, Broder JC, Bhargavan-Chatfield M, et al. Transitioning from peer review to peer learning: report of the 2020 Peer Learning Summit. J Am Coll Radiol 2020;17:1499–508 doi:10.1016/j.jacr.2020.07.016 pmid:32771491
4. American College of Radiology. Peer Learning Resources. www.acr.org/Practice-Management-Quality-Informatics/Peer-Learning-Resources. Accessed May 11, 2023
5. American Board of Radiology. Participatory Activities. March 17, 2022. https://www.theabr.org/diagnostic-radiology/maintenance-of-certification/improvement-medical-practice/participatory-activities. Accessed May 18, 2023
6. The Joint Commission. What are the key elements needed to meet the Ongoing Professional Practice Evaluation (OPPE) requirements? https://www.jointcommission.org/standards/standard-faqs/hospital-and-hospital-clinics/medical-staff-ms/000001500. Accessed May 21, 2023
7. Babiarz LS, Yousem DM. Quality control in neuroradiology: discrepancies in image interpretation among academic neuroradiologists. AJNR Am J Neuroradiol 2012;33:37–42 doi:10.3174/ajnr.A2704 pmid:22033725
8. Borgstede JP, Lewis RS, Bhargavan M, et al. RADPEER quality assurance program: a multifacility study of interpretive disagreement rates. J Am Coll Radiol 2004;1:59–65 doi:10.1016/S1546-1440(03)00002-4 pmid:17411521
9. Ivanovic V, Assadsangabi R, Hacein-Bey L, et al. Neuroradiology diagnostic errors at a tertiary academic centre: effect of participation in tumour boards and physician experience. Clin Radiol 2022;77:607–12 doi:10.1016/j.crad.2022.04.006 pmid:35589432
10. Assadsangabi R, Maralani P, Chen AF, et al. Common blind spots and interpretive errors of neck imaging. Clin Imaging 2022;82:29–37 doi:10.1016/j.clinimag.2021.10.019 pmid:34773810
11. Vong S, Chang J, Assadsangabi R, et al. Analysis of perceptual errors in skull-base pathology. Neuroradiol J 2022 Jun 18 [Epub ahead of print] doi:10.1177/19714009221108679 pmid:35722674
12. Sharpe RE Jr, Huffman RI, Congdon RG, et al. Implementation of a peer learning program replacing score-based peer review in a multispecialty integrated practice. AJR Am J Roentgenol 2018;211:949–56 doi:10.2214/AJR.18.19891 pmid:30207788
13. Kadom N, Reddy KM, Khanna G, et al. Peer learning program metrics: a pediatric neuroradiology example. AJNR Am J Neuroradiol 2022;43:1680–84 doi:10.3174/ajnr.A7673 pmid:36229162
14. Lee CS, Neumann C, Jha P, et al. Current status and future wish list of peer review: a national questionnaire of U.S. radiologists. AJR Am J Roentgenol 2020;214:493–97 doi:10.2214/AJR.19.22194 pmid:31939700
15. Chen JY, Vedantham S, Lexa FJ. Burnout and work-work imbalance in radiology—wicked problems on a global scale: a baseline pre-COVID-19 survey of US neuroradiologists compared to international radiologists and adjacent staff. Eur J Radiol 2022;155:110153 doi:10.1016/j.ejrad.2022.110153 pmid:35058099
16. Ngo AV, Stanescu AL, Swenson DW, et al. Practical considerations when implementing peer learning conferences. Pediatr Radiol 2019;49:526–30 doi:10.1007/s00247-018-4305-7 pmid:30923885
17. Edmondson AC. The Fearless Organization. John Wiley & Sons; 2018:175–77
18. Edmondson A. Psychological safety and learning behavior in work teams. Adm Sci Q 1999;44:350–83 doi:10.2307/2666999
19. Holley LC, Steiner S. Safe space: student perspectives on classroom environment. J Soc Work Educ 2005;41:49–64 doi:10.5175/JSWE.2005.200300343
20. Panadero E, Alqassab M. An empirical review of anonymity effects in peer assessment, peer feedback, peer review, peer evaluation and peer grading. Assessment & Evaluation in Higher Education 2019;44:1253–78 doi:10.1080/02602938.2019.1600186
21. Khader A, Ali S, Wald C, et al. Abdominal peer learning: advantages and lessons learned. Abdom Radiol (NY) 2023;48:1526–35 doi:10.1007/s00261-023-03846-9 pmid:36801958
22. Donnelly LF, Larson DB, Heller RE III, et al. Practical suggestions on how to move from peer review to peer learning. AJR Am J Roentgenol 2018;210:578–82 doi:10.2214/AJR.17.18660 pmid:29323555
23. Broder JC, Scheirey CD, Wald C. Step by step: a structured approach for proposing, developing and implementing a radiology peer learning program. Curr Probl Diagn Radiol 2021;50:457–60 doi:10.1067/j.cpradiol.2021.02.007 pmid:33663894
24. Bender LC, Linnau KF, Meier EN, et al. Interrater agreement in the evaluation of discrepant imaging findings with the Radpeer system. AJR Am J Roentgenol 2012;199:1320–27 doi:10.2214/AJR.12.8972 pmid:23169725
25. Eisenberg RL, Cunningham ML, Siewert B, et al. Survey of faculty perceptions regarding a peer review system. J Am Coll Radiol 2014;11:397–401 doi:10.1016/j.jacr.2013.08.011 pmid:24144835
26. Chaudhry H, Del Gaizo AJ, Frigini LA, et al. Forty-one million RADPEER reviews later: what we have learned and are still learning. J Am Coll Radiol 2020;17:779–85 doi:10.1016/j.jacr.2019.12.023 pmid:31991118
27. Trinh TW, Boland GW, Khorasani R. Improving radiology peer learning: comparing a novel electronic peer learning tool and a traditional score-based peer review system. AJR Am J Roentgenol 2019;212:135–41 doi:10.2214/AJR.18.19958 pmid:30403533
28. Panagioti M, Geraghty K, Johnson J, et al. Association between physician burnout and patient safety, professionalism, and patient satisfaction: a systematic review and meta-analysis. JAMA Intern Med 2018;178:1317–31 (Retraction published in JAMA Intern Med 2020;180:931) doi:10.1001/jamainternmed.2018.3713 pmid:30193239
29. Weissman IA, Van Geertruyden P, Prabhakar AM, et al. Practice resources to address radiologist burnout. J Am Coll Radiol 2023;20:494–99 doi:10.1016/j.jacr.2023.03.007 pmid:36934890
30. Eschelman DJ, Sullivan KL, Parker L, et al. The relationship of clinical and academic productivity in a university hospital radiology department. AJR Am J Roentgenol 2000;174:27–31 doi:10.2214/ajr.174.1.1740027 pmid:10628448
31. Yousem DM. Academic and clinical productivity: relative value units do not tell the whole story. AJR Am J Roentgenol 2001;176:1598–600 doi:10.2214/ajr.176.6.1761598 pmid:11373242
  • Received May 24, 2023.
  • Accepted after revision July 21, 2023.
  • © 2023 by American Journal of Neuroradiology